Mathematics, or a similar field of study, or equivalent work experience. Proven experience with object-oriented programming languages, preferably Python. Experience with infrastructure automation or orchestration frameworks (e.g. Ansible, Airflow, Terraform, Chef, Salt). A bachelor's or master's degree in Computer Science, Engineering, Mathematics, a similar field of study, or equivalent work experience. Desirable: experience with telemetry (Splunk).
ML evaluation methodologies and key IR metrics. Passion for shipping high-quality products and a self-motivated drive to take ownership of tasks. Tech Stack - Core: Python, FastAPI, asyncio, Airflow, Luigi, PySpark, Docker, LangGraph; Data Stores: Vector Databases, DynamoDB, AWS S3, AWS RDS; Cloud & MLOps: AWS, Databricks, Ray. Unlimited vacation time - we strongly encourage all of our employees to take …
CD for ML, and production monitoring. Experience building robust backend systems and APIs to serve ML models at scale. Strong understanding of big data technologies and data orchestration tools (Airflow, DBT). Familiarity with LLM integration and optimisation in production environments. Excellent problem-solving, analytical, and communication skills. Experience fine-tuning LLMs (e.g. Unsloth, cloud-based methods).
data quality, or other areas directly relevant to data engineering responsibilities and tasks. Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake). Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines. Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize …
of building performant, maintainable, and testable systems. Solid background in microservices architecture. Proficiency with Postgres & MongoDB (relational + non-relational). Experience with event-driven architectures and asynchronous workflows (Kafka, Airflow, etc.). Solid coding practices (clean, testable, automated). The mindset of a builder: thrives in fast-paced startup environments, takes ownership, solves complex challenges. Bonus points if you've worked with …
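As a rough illustration of the event-driven, asynchronous work this listing describes, here is a minimal Python consumer sketch; the topic name, broker address, and handler logic are hypothetical, and kafka-python is only one of several client libraries that could be used.

```python
from kafka import KafkaConsumer  # pip install kafka-python
import json

# Hypothetical topic and broker address, for illustration only.
consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="localhost:9092",
    group_id="order-processor",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

def handle(event: dict) -> None:
    # Placeholder business logic: a real service might update a read model
    # in Postgres or MongoDB, or kick off a downstream workflow.
    print(f"processing order {event.get('order_id')}")

for message in consumer:
    handle(message.value)
```

In practice the handler would be idempotent so that redelivered messages do not double-apply changes, which is the usual design concern in this kind of consumer.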
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools such as Airflow. Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC vetted. All profiles will be reviewed against the required skills and experience. Due …
Job Title: Airflow/AWS Data Engineer. Location: Manchester Area (3 days per week in the office). Rate: Up to £400 per day inside IR35. Start Date: 03/11/2025. Contract Length: Until 31st December 2025. Job Type: Contract. Company Introduction: An exciting opportunity has become available with one of our sector-leading financial services clients. They … to join their growing data engineering function. This role will play a key part in designing, deploying, and maintaining modern cloud infrastructure and data pipelines, with a focus on Airflow, AWS, and data platform automation. Key Responsibilities: Deploy and manage cloud infrastructure across Astronomer Airflow and AccelData environments. Facilitate integration between vendor products and core systems, including data … Establish and enforce best practices for cloud security, scalability, and performance. Configure and maintain vendor product deployments, ensuring reliability and optimized performance. Ensure high availability and fault tolerance for Airflow clusters. Implement and manage monitoring, alerting, and logging solutions for Airflow and related components. Perform upgrades, patches, and version management for platform components. Oversee capacity planning and resource …
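For context on the Airflow reliability and alerting responsibilities above, the following is a minimal sketch of an Airflow 2.x DAG with retries and an on-failure callback; the DAG id, schedule, and alert hook are hypothetical stand-ins rather than details of the client's Astronomer deployment.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def alert_on_failure(context):
    # Placeholder alert hook: a real deployment might push to Slack,
    # PagerDuty, or CloudWatch instead of printing.
    ti = context["task_instance"]
    print(f"Task {ti.task_id} in DAG {ti.dag_id} failed")


def extract_source_data():
    # Placeholder extract step, for illustration only.
    print("extracting source data")


with DAG(
    dag_id="example_platform_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": alert_on_failure,
    },
) as dag:
    PythonOperator(task_id="extract", python_callable=extract_source_data)
```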
London (City of London), South East England, United Kingdom
Kubrick Group
and consumption layers. Design and implement secure, performant Snowflake environments, including RBAC, data masking, and policies/entitlement understanding. Build and optimise ELT pipelines (using tools such as dbt, Airflow, Fivetran, or native Snowflake tasks) to support batch and real-time use cases. Collaborate with Kubrick and client stakeholders to inform delivery planning, data strategy, and architecture decisions. Promote … language (Python preferred) for automation. Experience with cloud platforms (AWS, Azure, or GCP), including security, IAM, and storage services. Experience deploying and maintaining production pipelines using tools such as Airflow or Dagster. Understanding of CI/CD principles, version control (Git), and the software development lifecycle. Strong communication and stakeholder-management skills with the ability to influence technical and business …
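To make the Snowflake RBAC and masking work above a little more concrete, here is a small sketch using the snowflake-connector-python client to create and apply a column masking policy; the connection details, table, column, and role names are hypothetical placeholders.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection details and object names below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="CURATED",
    role="SECURITYADMIN",
)

cur = conn.cursor()
try:
    # Mask email addresses for every role except an approved PII role.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
        RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
             ELSE '***MASKED***' END
    """)
    cur.execute(
        "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask"
    )
finally:
    cur.close()
    conn.close()
```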
in delta one, store of value, and/or FICC options trading. Experience with Linux-based, concurrent, high-throughput, low-latency software systems. Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster). Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases. Have a bachelor's or advanced degree in Computer Science, Mathematics …
Role: Build and operate data pipelines and analytics platforms in the cloud. Work with engineers and analysts to design, implement, and maintain reliable, observable ETL/ELT workflows using Airflow and managed cloud services. Focus on Python-first implementations, high-quality SQL, Airflow orchestration, and query engines such as Athena, Trino, or ClickHouse. Required Skills: Hands-on software …
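As an illustration of the query-engine side of the role, below is a rough boto3 sketch that submits an Athena query and polls for completion; the database, query, and S3 output location are hypothetical, and a library such as PyAthena or an Airflow operator could serve the same purpose.

```python
import time

import boto3  # assumes AWS credentials are configured in the environment

athena = boto3.client("athena", region_name="eu-west-2")

# Hypothetical database, table, and results bucket.
response = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS n FROM events GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query reaches a terminal state, then fetch the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```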
London (City of London), South East England, United Kingdom
Mondrian Alpha
with researchers, technologists, and analysts to enhance the quality, timeliness, and accessibility of data. Contribute to the evolution of modern cloud-based data infrastructure, working with tools such as Airflow, Kafka, Spark, and AWS. Monitor and troubleshoot data workflows, ensuring continuous delivery of high-quality, analysis-ready datasets. Play a visible role in enhancing the firm's broader … Strong programming ability in Python (including libraries such as pandas and NumPy) and proficiency with SQL. Confident working with ETL frameworks, data modelling principles, and modern data tools (Airflow, Kafka, Spark, AWS). Experience working with large, complex datasets from structured, high-quality environments (e.g. consulting, finance, or enterprise tech). STEM degree in Mathematics, Physics, Computer Science, Engineering …
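To ground the point about delivering analysis-ready datasets, here is a small pandas sketch of the kind of validation step a pipeline might run before publishing data; the column names and checks are hypothetical.

```python
import pandas as pd

# Hypothetical extract: in practice this might come from S3, Kafka, or a warehouse query.
df = pd.DataFrame(
    {
        "trade_id": [1, 2, 3, 3],
        "price": [101.5, None, 99.25, 99.25],
        "trade_date": ["2025-01-02", "2025-01-02", "2025-01-03", "2025-01-03"],
    }
)

def validate(frame: pd.DataFrame) -> pd.DataFrame:
    """Basic checks before a dataset is published downstream."""
    issues = []
    if frame["trade_id"].duplicated().any():
        issues.append("duplicate trade_id values")
    if frame["price"].isna().any():
        issues.append("missing prices")
    if issues:
        # A production pipeline might alert or quarantine rather than raise.
        raise ValueError("data quality check failed: " + "; ".join(issues))
    return frame.assign(trade_date=pd.to_datetime(frame["trade_date"]))

try:
    clean = validate(df)
except ValueError as err:
    print(err)
```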
ECS images – largely in NodeJS.) ETL Pipeline Management: Develop and optimise data pipelines to enable seamless data flow and transformation. (We currently use a mix of SSIS, ETL Works, Airflow, and Snowflake, and are moving to an Airflow/Snowflake-only architecture.) Snowflake Management: Create production-ready procedures in Snowflake for moving and analysing data. System Optimisation: Improve existing backend … Can handle sensitive and confidential information. Experience working with non-data stakeholders to translate their needs and generate useful results presented in an understandable way. Familiarity with orchestration tools (Airflow, DBT) and data warehouse modelling. Managing other data engineers. Experience with customer and commercial datasets, especially in retail or FMCG. A love of pets! About Jollyes Pets: Jollyes are …
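Given the stated move to an Airflow/Snowflake-only architecture, a rough sketch of how an Airflow task might invoke a Snowflake procedure is shown below; the connection id, schedule, and procedure name are assumptions, and the SnowflakeOperator (from the apache-airflow-providers-snowflake package) is only one way to wire this up.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# Connection id, schedule, and procedure name are hypothetical placeholders.
with DAG(
    dag_id="snowflake_nightly_load",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",
    catchup=False,
) as dag:
    refresh_sales = SnowflakeOperator(
        task_id="refresh_sales_mart",
        snowflake_conn_id="snowflake_default",
        sql="CALL analytics.curated.refresh_sales_mart()",
    )
```

Keeping the transformation logic in Snowflake procedures and using Airflow only for scheduling and dependency management is one common way to structure this kind of two-tool stack.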
Staffordshire, England, United Kingdom Hybrid / WFH Options
MSA Data Analytics Ltd
and strengthen the organisation’s data engineering and analytics capability within its AWS-based environment. We’re ideally looking for someone with strong hands-on experience across AWS services, Airflow, Python, and SQL. You’ll play a key role in designing, building, and maintaining modern data infrastructure that powers insight-led decision-making across the business. Working within a … and key stakeholders to deliver practical, scalable solutions that make a real impact. Key Responsibilities: Design, build, and maintain robust, scalable ETL/ELT pipelines using tools such as Airflow and AWS services (S3, Redshift, Glue, Lambda, Athena). Integrate new data sources and continuously optimise performance and cost efficiency. Ensure data quality, integrity, and security across all systems. … with new tools and trends in data engineering, particularly within the AWS ecosystem. Skills & Experience: Strong hands-on experience with AWS (S3, Redshift, Glue, Lambda, Athena). Skilled in Airflow for workflow orchestration. Advanced SQL and proficient in Python for data engineering. Experience with data modelling (e.g. dimensional) and familiarity with NoSQL databases (e.g. Elasticsearch). Confident using Git …
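As a rough companion to the AWS pipeline responsibilities above, here is a boto3 sketch that starts a Glue job and waits for a terminal state; the job name and region are hypothetical, and in an Airflow deployment the Amazon provider's Glue operator could replace this polling loop.

```python
import time

import boto3  # assumes AWS credentials are available in the environment

glue = boto3.client("glue", region_name="eu-west-2")

# Hypothetical Glue job that loads curated data into Redshift.
run = glue.start_job_run(JobName="load_sales_to_redshift")
run_id = run["JobRunId"]

# Poll the run until Glue reports a terminal state.
while True:
    status = glue.get_job_run(JobName="load_sales_to_redshift", RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
        break
    time.sleep(30)

print(f"Glue job finished with state: {state}")
```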