to adopt in order to enhance our platform. What you'll do: Develop across our evolving technology stack - we're using Python, Java, Kubernetes, Apache Spark, Postgres, ArgoCD, Argo Workflows, Seldon, MLflow and more. We are migrating to the AWS cloud and adopting many of the services available there …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team, are comfortable in a fast-paced environment, and have excellent written and verbal English …
TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data …
and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: Lead post-incident reviews and perform root cause analysis for data disruptions …
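For illustration only (not part of the original listing): the kind of lightweight data-anomaly check that this sort of automation typically wires into an Airflow or cron job could look like the sketch below. The history, counts, and tolerance are hypothetical.

```python
# Illustrative anomaly check: flag a day whose row count deviates sharply
# from the trailing average. Numbers and the tolerance are placeholders.
from statistics import mean

def detect_count_anomaly(daily_counts: list[int], today: int, tolerance: float = 0.5) -> bool:
    """Return True if today's count deviates from the trailing average by more than `tolerance`."""
    if not daily_counts:
        return False
    baseline = mean(daily_counts)
    return abs(today - baseline) > tolerance * baseline

if __name__ == "__main__":
    history = [10_250, 9_980, 10_410, 10_120]          # counts from previous loads
    print(detect_count_anomaly(history, today=4_300))  # True: likely a partial load
```

In practice a check like this would run as a scheduled task and page the on-call engineer instead of printing.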
processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack, including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the …
implement elegant solutions for them. Are a data enthusiast who wants to be surrounded by brilliant teammates and huge challenges. Bonus Points: Experience with Apache Airflow, including designing, managing, and troubleshooting DAGs and data pipelines. Experience with CI/CD pipelines and tools like Jenkins, including automating the process …
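Several of these listings ask for hands-on Airflow DAG design. As a rough, employer-agnostic illustration, a minimal Airflow 2.x DAG with two dependent tasks might look like the following; the dag_id, schedule, and task bodies are placeholders, not anyone's actual pipeline.

```python
# Minimal Airflow 2.x DAG sketch: extract then load, run daily.
# The dag_id, schedule, and task bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling source data")

def load():
    print("writing to the warehouse")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older 2.x releases use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task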
CD pipelines and integrating automated tests within them - Jenkins, BitBucket required. Familiarity with performance testing, security testing, and other non-functional testing approaches - JMeter, Apache Benchmark preferred. Good experience working with cloud technologies and services on AWS. Strong practical experience in Flyway or Liquibase. Strong understanding of modern …
with NoSQL databases (e.g., MongoDB) and relational databases (e.g., PostgreSQL, MySQL). 5+ years of Python and SQL experience. Knowledge of ETL tools (e.g., Apache Airflow) and cloud platforms (e.g., AWS, Azure, GCP). Understanding of data modelling concepts and best practices. Experience with healthcare data standards (e.g., HL7, FHIR) …
FOR THE SENIOR SOFTWARE ENGINEER TO HAVE: cloud-based experience; microservice or serverless architecture; Big Data/messaging technologies such as Apache NiFi/MiNiFi/Kafka. TO BE CONSIDERED: please either apply by clicking online or email me directly. For further information please …
A passion for building and participating in highly effective teams and development processes. Strong debugging, testing/validation, and analytics/SQL skills; Apache experience. Experience working with Agile methodologies (Scrum) and cross-functional teams. Desirable: experience in the CDN, datacenter, hyperscaler, media, news, and/or entertainment industries. Knowledge …
Python. Develop real-time streaming features using big data tools such as Spark. SKILLS AND EXPERIENCE: Extensive experience using big data tools such as Apache Spark. Experience working with and maintaining an AWS database. Strong Python coding background. Good working knowledge of SQL. THE BENEFITS: Generous holiday plan. …
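Real-time streaming with Spark, as mentioned in the listing above, usually means Structured Streaming. A minimal, self-contained sketch using the built-in rate source (so it runs without a Kafka cluster) is shown below; the window size, row rate, and console sink are illustrative choices, not the employer's setup.

```python
# Minimal Spark Structured Streaming sketch using the built-in "rate" source.
# Window size, row rate, and the console sink are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# The rate source emits (timestamp, value) rows at a fixed rate.
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Count events per 30-second window, a stand-in for a real-time feature.
counts = events.groupBy(F.window("timestamp", "30 seconds")).count()

query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```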
Big Data, and Cloud Technologies. Hands-on expertise in at least 2 Cloud platforms (Azure, AWS, GCP, Snowflake, Databricks) and Big Data processing (e.g., Apache Spark, Beam). Proficiency in key technologies like BigQuery, Redshift, Synapse, Pub/Sub, Kinesis, Event Hubs, Kafka, Dataflow, Airflow, and ADF. Strong ETL …
London, South East England, United Kingdom Hybrid / WFH Options
DATAHEAD
ensure high availability and accessibility. Experience & Skills: Strong experience in data engineering. At least some commercial hands-on experience with Azure data services (e.g., Apache Spark, Azure Data Factory, Synapse Analytics). Proven experience in leading and managing a team of data engineers. Proficiency in programming languages such as …
London, South East England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
What you'll bring: Strong experience in AWS cloud platform architecture and solving complex business issues. Proficiency in Java programming and Linux, with desirable knowledge of Apache NiFi, Node.js, JSON/XML, Jenkins, Maven, BitBucket, or JIRA. Hands-on experience with scripting (Shell, Bash, Python) and a solid understanding of Linux …
engineering, including infrastructure-as-code (e.g., Terraform, CloudFormation), CI/CD pipelines, and monitoring (e.g., CloudWatch, Datadog). Familiarity with big data technologies like Apache Spark, Hadoop, or similar. ETL/ELT tools and creating common data sets across on-prem (IBM DataStage ETL) and cloud data stores. Leadership & Strategy: …
Central London, London, United Kingdom Hybrid / WFH Options
Cathcart Technology
the user experience. Key skills: Senior Data Scientist experience; commercial experience in Generative AI and recommender systems; strong Python and SQL experience; Spark/Apache Airflow; LLM experience; MLOps experience; AWS. Additional information: This role offers a strong salary of up to £95,000 (depending on experience/skill …
Product or domain expertise. A blend of technical expertise with 5+ years of experience, analytical problem-solving, and collaboration with cross-functional teams. Azure DevOps. Apache Spark, Python. Strong SQL proficiency. Data modeling understanding. ETL processes, Azure Data Factory. Azure Databricks knowledge. Familiarity with data warehousing. Big data technologies. Data …