tooling Scripting experience (Python, Perl, Bash, etc.), ELK (Elastic Stack), JavaScript, Cypress, Linux experience, search engine technology (e.g., Elasticsearch), Big Data technology experience (Hadoop, Spark, Kafka, etc.), microservice and cloud-native architecture. Desirable Skills: Able to demonstrate experience of troubleshooting and diagnosis of technical issues. Able to demonstrate excellent more »
Swansea, Wales, United Kingdom Hybrid / WFH Options
Inspire People
processing, and analytics. Programming Skills: Proficiency in Python, SQL, and other relevant programming languages. Big Data Technologies: Experience with big data technologies such as Apache Spark. Data Warehousing: Strong knowledge of data warehousing concepts and solutions. Problem-Solving: Excellent problem-solving skills with a detail-oriented approach. Leadership: Proven more »
or similar, with 6+ years of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with Cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate would be more »
Solid experience with programming languages like Python and proficiency in SQL Experience working with cloud platforms (e.g., Azure, GCP) and big data technologies like Spark Working knowledge of ETL and ELT frameworks and tools like dbt Working knowledge of version control and CI/CD pipelines Hiring, developing, and retaining cyber more »
C++, Java, or similar. Experience using SQL, Python (Pandas, NumPy, Scikit-Learn, etc.)/R or equivalent. Experience with Data Storage, Ingestion, and Transformation (Spark, Kafka or similar tools). Experience developing cloud infrastructure services, preferably with Kubernetes. You are passionate about solving data problems at Scale. Expertise with more »
SQL Data Warehouse, Azure Data Lake, Azure Databricks, Azure Cosmos DB, Azure Data Factory, Azure Search, Azure Stream Analytics, Delta Lake and Data Lakes, Apache Spark Pools, SQL Pools (dedicated and serverless) Experience in Python and C# coding, Spark, PySpark, and Unix shell/Perl scripting. Experience in more »
or similar technologies. Hands-on experience with AWS and Snowflake. Financial services industry experience (highly desirable). Experience with Big Data technologies such as Spark or Hadoop. Bachelor's degree in Computer Science, Engineering, or equivalent. Further information available upon application. ECS Recruitment Group Ltd is acting as an more »
tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra MySQL, SQL Server, Oracle, Snowflake, PostgreSQL and NoSQL databases Programming languages such as Spark or Python Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits : Base Salary more »
complex data warehouses and/or data lakes. Familiarity with cloud-based analytics platforms such as AWS, Azure, Snowflake, Google Cloud Platform (Big Query), Spark, and Splunk. Proficiency in SQL and experience using one or more of the following languages: R, Python, Scala, and Julia, including relevant frameworks/ more »
machine learning or more general statistical analysis Strong software development skills with proficiency in Python or C++ Experience with analytics frameworks such as Pandas, Apache Spark, Dask, or Flink Experience with machine learning frameworks such as TensorFlow, JAX, PyTorch, Spark MLlib, Keras, or scikit-learn Experience in … cloud-based infrastructures such as AWS or GCP Exposure to orchestration platforms such as Apache Airflow or Kubeflow Proven attention to detail, critical thinking, and the ability to work independently within a cross-functional team What benefits do we offer? Condé Nast Learning Hub where you'll find you more »
Modelling. Experience with at least one of these programming languages: Python, Scala/Java Experience with distributed data and computing tools, mainly Apache Spark & Kafka Understanding of critical path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal more »
with Git for version control and project management, alongside some knowledge of Linux/Shell. Data platform familiarity - previous experience of working with both Apache Spark and MapReduce data processing and analytics frameworks. Reporting expertise - experience with Tableau, Power BI, Excel alongside notebooks for experiment documentation. What more »
Platforms. Must have 8 years' experience with relational databases, NoSQL databases and/or Big Data technologies (e.g. Oracle, SQL Server, Postgres, Spark, Hadoop, other open source). Must have experience in Data Security Solutions, Identity and Access Management, and Data Security Access Management. Must have 3 years more »
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
Bachelor's degree in Computer Science or a related field (Master's degree preferred) Nice to have: experience with LLMs, Vector Databases, AWS EMR, Spark, and Python Our commitment: Equal opportunities are important to us. We believe that diversity and inclusion at The Stepstone Group are critical to our more »
API design, MLOps) Building machine learning models and pipelines in Python, using common libraries and frameworks (e.g., PyTorch, MLflow, JAX) Distributed computing frameworks (e.g., Spark, Dask) Cloud platforms (e.g., AWS, Azure, GCP) and HPC Containerization and orchestration (Docker, Kubernetes) Ability to scope and effectively deliver projects Strong problem more »
Guildford, Surrey, South East, United Kingdom Hybrid / WFH Options
Allianz Insurance Plc
with monitoring tools to track model performance, resource utilization, and system health. Proficiency in programming languages such as Python, and knowledge of PySpark and Spark pool clusters as well as ML libraries and frameworks. Proficiency with observability tools such as Prometheus and Grafana. Infrastructure-as-Code (IaC): Terraform and more »
ML libraries (TensorFlow, PyTorch, scikit-learn, transformers, XGBoost, ResNet), geospatial libraries (shapely, geopandas, rasterio), CV libraries (scikit-image, OpenCV, YOLO, Detectron2). AWS, Postgres, Apache Airflow, Apache Kafka, Apache Spark. Mandatory requirements: You have at least 5 years of experience in a DS role, deploying models into production … You have proven experience delivering end-to-end ML solutions that produced business value. You have proven experience with big data technologies, specifically Spark and Kafka You are proficient in Python. You have expert knowledge of at least one cloud computing platform (preferably AWS). You are fluent in more »
quality of data. Key Requirements: Strong experience designing data pipelines/warehouses using AWS and Snowflake. Exposure to big data technologies such as Kafka, Spark, or Hadoop. Solid experience with Snowflake, including performance optimisation and cost management. Strong experience with SQL and Data modelling. Excellent understanding of AWS architecture more »
Wandsworth, Greater London, Dundonald, United Kingdom
DataBuzz
Bricks setup using Terraform experience. * Experience of MLOps and DataOps. * Experience of using container technologies, cloud platforms (ideally AWS), and distributed processing frameworks like Spark and Dask. * Experience in JavaScript application development and UI design. * Expertise in developing mobile applications. * Familiarity with the agile software development process. If you more »
of different platforms. The data will be stored and transported securely while remaining efficiently queryable. Technologies used include: Data Technologies: Kafka, Spark, Debezium, GraphQL Programming Languages: Java, Scripting Database Technologies: MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE Micro Service Technologies: REST, Spring Boot, Jersey Build and more »
tasks, both for oneself and others, ensuring progress aligns with project goals. Nice to have: Previous experience in startup environments. Proficiency or experience with Apache Spark. Familiarity or background in working with Azure. Experience orchestrating workflows, particularly within distributed system environments. Knowledge of MLOps principles and practices, especially in more »
and partners Preferred Requirements Experience or strong interest in blockchain and other Web 3.0 technologies Experience with OLAP technologies such as Presto/Trino, Spark, Hadoop, Athena, or BigQuery is a plus Experience in Golang or any other strongly-typed programming language Experience mentoring and supporting fellow engineers Our more »
of AI techniques including graph data analytics, time series, NLP, deep learning, supervised and unsupervised machine learning, etc. Programming skills in Python or R, Spark and SQL Worked with open source data science libraries and understand how to apply them to various problem types Experience of using the latest more »