Arlington, Virginia, United States Hybrid / WFH Options
Full Visibility LLC
AWS S3, Azure Blob, MinIO, or similar) Proficiency in data parsing and transformation, handling structured and unstructured data Hands-on experience with ETL tools and data workflow orchestration (e.g., Apache Airflow, Luigi, Prefect) Strong programming skills in Python, SQL, or Scala Experience with open-source data processing tools (e.g., Kafka, Spark, Flink, Hadoop) Familiarity with database technologies (PostgreSQL …
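The parsing-and-transformation skill listed above can be sketched minimally in plain Python. This is a toy example, standard library only; the field names and the cast-and-derive logic are hypothetical, not taken from any listing:

```python
import csv
import io
import json

def transform_orders(raw_csv: str) -> list[dict]:
    """Parse structured CSV input and normalise each record.

    A toy transformation: cast types, derive a total, and skip
    malformed rows rather than failing the whole batch.
    """
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        try:
            qty = int(rec["qty"])
            price = float(rec["unit_price"])
        except (KeyError, ValueError):
            continue  # drop rows that cannot be parsed
        rows.append({
            "order_id": rec["order_id"].strip(),
            "total": round(qty * price, 2),
        })
    return rows

raw = "order_id,qty,unit_price\nA1,2,9.99\nA2,x,5.00\n"
print(json.dumps(transform_orders(raw)))  # the malformed A2 row is dropped
```

The same skip-bad-rows pattern scales up directly to frameworks like Spark or Flink, where a parse failure on one record should not abort the job.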
with MLOps practices and model deployment pipelines Proficient in cloud AI services (AWS SageMaker/Bedrock) Deep understanding of distributed systems and microservices architecture Expert in data pipeline platforms (Apache Kafka, Airflow, Spark) Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases Strong containerization and orchestration skills (Docker, Kubernetes) Experience with infrastructure as code (Terraform, CloudFormation …
Brighton, Sussex, United Kingdom Hybrid / WFH Options
Burns Sheehan
Lead Data Engineer £75,000-£85,000 AWS, Python, SQL, Airflow Brighton, hybrid working Analyse customer behaviour using AI & ML We are partnered with a private equity-backed company who provide an AI-powered, guided selling platform that helps businesses improve online sales and customer experience. They are looking for a Lead Data Engineer to lead a small team … experience in a Senior Data Engineering role. Comfortable owning and delivering technical projects end-to-end. Strong in Python, SQL, and cloud platforms (AWS or comparable). Experience with Airflow, Snowflake, Docker (or similar). Familiarity with coaching and mentoring more junior engineers, leading 1-1s and check-ins. Wider tech stack: AWS, Python, Airflow, Fivetran, Snowflake … Enhanced parental leave and pay If you are interested in finding out more, please apply or contact me directly! Lead Data Engineer £75,000-£85,000 AWS, Python, SQL, Airflow Brighton, hybrid working Analyse customer behaviour using AI & ML Burns Sheehan Ltd will consider applications based only on skills and ability and will not discriminate on any grounds. …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
within ambitious software businesses. Specifically, you can expect to be involved in the following: Designing and developing full-stack data pipelines and platforms using modern tools such as dbt, Airflow, and cloud infrastructure Cleansing, enriching and modelling data to generate commercial insights and power C-level dashboards Delivering scalable solutions that support internal use cases and extend directly to … sales) and building tools that serve business needs Background in startups or scale-ups with high adaptability and a hands-on approach Experience with modern data tools (e.g. dbt, Airflow, CI/CD) and at least one cloud platform (AWS, GCP, Azure) Strong communication skills and a track record of credibility in high-pressure or client-facing settings BENEFITS …
Employment Type: Full-Time
Salary: £100,000 - £110,000 per annum, Inc benefits
platforms such as Azure or AWS, leveraging tools like Databricks, Data Factory, and Synapse Analytics. Ensuring best practices in DevOps, version control, and data governance. Managing orchestration workflows using Airflow or similar tools. Supporting AI teams by preparing high-quality, structured data for machine learning applications. Communicating technical concepts to non-technical stakeholders and making data solutions accessible to … and Python. Experience in Azure with tools like Databricks, Data Factory, and Synapse Analytics. Knowledge of data modeling techniques. Familiarity with DevOps and version control best practices. Experience with Airflow or other orchestration tools is a plus. Expertise in Machine Learning, especially NLP, is a plus. Certifications such as Azure DP-203 are a plus. Soft Skills: Ability to …
Strong SQL and Python skills for building and optimising data pipelines Experience working with cloud platforms (e.g., AWS, GCP, or Azure) Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery) Understanding of data modelling and warehousing principles Experience working with large datasets and distributed systems What's in it for you? Up to £70k Hybrid …
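As a rough illustration of the "SQL and Python for optimising pipelines" skill listed above: the usual optimisation is to push aggregation into the database rather than looping over rows in Python. A minimal sqlite3 sketch, with made-up table and column names:

```python
import sqlite3

# Toy dataset; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, amount REAL);
    INSERT INTO events VALUES ('u1', 10.0), ('u1', 5.0), ('u2', 7.5);
""")

# Aggregate in SQL so the database engine does the heavy lifting,
# instead of pulling every row into Python and summing in a loop.
totals = dict(conn.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id ORDER BY user_id"
))
print(totals)  # {'u1': 15.0, 'u2': 7.5}
```

On a warehouse such as Snowflake, Redshift, or BigQuery, the same principle holds at much larger scale: the query plan, not client-side Python, determines pipeline performance.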
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
engineers Collaborate across engineering, data science, and product teams to deliver business impact Skills & Experience: Expert in SQL , dbt , and cloud data warehouses (e.g., BigQuery, Redshift) Strong experience with Airflow , Python , and multi-cloud environments (AWS/GCP) Proven background in designing and scaling analytics solutions in agile environments Proven experience as an Analytics Engineer Nice to Have: Experience …
trusted, reliable and available. The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Tableau, DBT, Talend, Collibra, Kafka/Confluent, Astronomer/Airflow, and Kubernetes. This forms part of a longer-term strategic direction to implement Data Mesh, and with it establish shared platforms that enable a connected collection of … and driving a culture of iterative improvement. Modern data stack - hands-on deployment and governance of enterprise technologies at scale (e.g. Snowflake, Tableau, DBT, Fivetran, Airflow, AWS, GitHub, Terraform, etc.) for self-service workloads. Thought leadership and influencing - deep interest in the data platforms landscape to build well-articulated proposals that are supported by strong …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow) Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices in …
(The same Tenth Revolution Group listing also appears for Liverpool, Leeds, Newcastle upon Tyne, and Birmingham, each with Hybrid/WFH options.)
more. Technical Requirements: Advanced proficiency in Python and modern software engineering practices. Experience architecting solutions using major cloud platforms (Azure, AWS, GCP). Familiarity with technologies such as Databricks, Airflow, dbt, Snowflake, GitHub CI/CD, and infrastructure-as-code. Strong background across at least several of the following areas: Cloud Engineering, Data Platform Architecture, DevOps, MLOps/LLMOps. …
(The same listing also appears under McGregor Boyall for West London and London, South East, England, both with Hybrid/WFH options.)
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
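The Airflow responsibility described here boils down to running tasks in dependency order. This toy sketch uses only the standard library's `graphlib`, not Airflow itself, and the task names are hypothetical; it illustrates the scheduling idea that Airflow implements at production scale:

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks: dict, deps: dict) -> list:
    """Run callables in an order that respects their dependencies.

    `deps` maps each task name to the set of tasks it depends on,
    mirroring how an Airflow DAG wires operators together.
    """
    executed = []
    for name in TopologicalSorter(deps).static_order():
        tasks[name]()
        executed.append(name)
    return executed

results = {}
tasks = {
    "extract":   lambda: results.setdefault("raw", [3, 1, 2]),
    "transform": lambda: results.setdefault("clean", sorted(results["raw"])),
    "load":      lambda: results.setdefault("loaded", len(results["clean"])),
}
deps = {"transform": {"extract"}, "load": {"transform"}}  # task -> prerequisites

order = run_pipeline(tasks, deps)
print(order)  # ['extract', 'transform', 'load']
```

Airflow adds what this toy lacks: scheduling, retries, backfills, and per-task observability, which is why orchestrators appear in nearly every listing on this page.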
skills, with the ability to work cross-functionally in an Agile environment Exposure to data product management principles (SLAs, contracts, ownership models) Familiarity with orchestration tools and observability platforms (Airflow, dbt, Monte Carlo, etc.) Exposure to real-time/streaming pipelines Understanding of information security best practices Familiarity with BI tools (QuickSight, Power BI, Tableau, Looker, etc.) Interest or …
have advanced SQL skills; experience with dbt and/or Looker is strongly preferred You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow) You will have experience with version control tools such as Git You will have exposure to Python for data or analytics engineering tasks (preferred) You will demonstrate excellent problem …
Join us on our mission to make a better world of work. Culture Amp is the world's leading employee experience platform, revolutionizing how 25 million employees across more than 6,500 companies create a better world of work. …
Walsall, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Adecco
Senior Data Engineer Hybrid/remote - North-West based £65-80,000 + Bonus + Benefits Are you a data enthusiast eager to work on innovative solutions that impact millions? We're looking for an experienced Senior Data Engineer to …
Reston, Virginia, United States Hybrid / WFH Options
CGI
governance. Skilled in leveraging S3, Redshift, AWS Glue, EMR, Azure Data Lake, and Power BI to deliver secure, high-performance solutions and self-service BI ecosystems. Skilled in leveraging Apache Airflow, Apache Flink, and other data tools Experienced in distributed data compute architecture using Apache Spark and PySpark. Education: Bachelor's degree in computer science, Information …