privacy, and security, ensuring our AI systems are developed and used responsibly and ethically. Tooling the Future: Get hands-on with cutting-edge technologies like Hugging Face, PyTorch, TensorFlow, Apache Spark, Apache Airflow, and other modern data and ML frameworks. Collaborate and Lead: Partner closely with ML Engineers, Data Scientists, and Researchers to understand their data needs … their data, compute, and storage services. Programming Prowess: Strong programming skills in Python and SQL are essential. Big Data Ecosystem Expertise: Hands-on experience with big data technologies like Apache Spark, Kafka, and data orchestration tools such as Apache Airflow or Prefect. ML Data Acumen: Solid understanding of data requirements for machine learning models, including feature engineering …
Astronomer empowers data teams to bring mission-critical software, analytics, and AI to life and is the company behind Astro, the industry-leading unified DataOps platform powered by Apache Airflow. Astro accelerates building reliable data products that unlock insights, unleash AI value, and power data-driven applications. Trusted by more than 700 of the world's leading enterprises, Astronomer … our product's evolution through client feedback. This role is ideal for someone who wants to make a visible impact while growing into an expert in workflow orchestration and Apache Airflow. This is a hybrid role requiring a minimum of 3 days per week onsite, and includes up to 40% travel for business and customer needs. What you get … production. Be a Trusted Advisor: Conduct demos and provide technical guidance to engineering teams, showing them how our platform can transform their workflows. Drive Community Impact: Contribute to the Apache Airflow community by creating technical content and best practices, positioning Astronomer as a thought leader in workflow orchestration. Influence Product Direction: Act as a liaison by gathering field …
doing? Evolve the Data Platform by designing and building the next generation of the stack. Develop, run, and support our batch and real-time data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, and Tableau, all in AWS. Collaborate with product managers, data engineers, analysts, and data scientists to develop tooling and infrastructure to support their needs. Develop automation and … quality issues. Recent projects the team has delivered: Refactoring of our MySQL ingestion pipeline for reduced latency and 10x scalability. Redshift → Snowflake migration. Unified Local Analytics Development Environment for Airflow and DBT. Building our next-generation company metrics framework, adding anomaly detection and alerting, and enabling easier discovery and consumption. About you: You have 5+ years of full-time … might be more valuable than your direct technical contributions on a project. You care about your craft. In addition, it would be a bonus if you have: Worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating Airflow at scale would …
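For readers unfamiliar with what "orchestrating and scheduling data workflows with Airflow" looks like in practice, here is a minimal sketch of a daily batch DAG. The dag_id, schedule, and task bodies are hypothetical illustrations, not taken from any listing above; Airflow 2.4+ is assumed for the `schedule` parameter.

```python
# Minimal illustrative Airflow DAG; names, schedule, and task logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Placeholder: pull a batch of records from a source system.
    print("extracting batch")


def load() -> None:
    # Placeholder: write transformed records to the warehouse.
    print("loading batch")


with DAG(
    dag_id="example_daily_batch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run extract before load
```

Roles like the ones above typically layer retries, alerting, and backfill policy on top of a skeleton like this; the `>>` operator is Airflow's way of declaring task dependencies.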
or other testing methodologies. Preferred: Familiarity with PostgreSQL and Snowflake. Preferred: Familiarity with web frameworks such as Django, Flask, or FastAPI. Preferred: Familiarity with event streaming platforms such as Apache Kafka. Preferred: Familiarity with data pipeline platforms such as Apache Airflow. Preferred: Familiarity with Java. Preferred: Experience in one or more relevant financial areas (market data, order …
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (large language models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
business glossary, and data mapping framework using metadata management and data catalog tools. Automate data classification, lineage tracking, and policy enforcement through scripts, APIs, and orchestration tools (e.g., dbt, Airflow). Map & Visualize Data Flows: Design and maintain clear documentation and visualizations of data movement across systems, focusing on sensitive and business-critical data. Drive Cross-Functional Alignment: Collaborate … governance SME; support teams with tooling, guidance, and best practices. About You: Strong technical foundation in data governance architecture and tooling. You've worked with tools such as DataHub, Apache Airflow, AWS, dbt, Snowflake, BigQuery, or similar. Hands-on experience building and maintaining centralized data inventories, business glossaries, and data mapping frameworks. Proficient in automating data classification and … lineage using scripting languages like Python, SQL, or Java, along with orchestration tools such as Airflow and dbt. 5+ years of experience in data governance, privacy, or data engineering roles, especially in settings that integrate governance tightly into data platform design. Familiarity with privacy-by-design, data minimization, and regulatory standards including GDPR, ISO 27001, SOC 2, and …
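To make the "automate data classification" responsibility above concrete, here is a minimal, dependency-free Python sketch that tags columns as sensitive based on a naming ruleset. The rules and example schema are hypothetical; a production version would typically push the resulting tags into a catalog such as DataHub rather than print them.

```python
# Illustrative rule-based column classification sketch.
# The rules and example schema below are hypothetical, not from any listing.
import re

CLASSIFICATION_RULES = {
    "pii": re.compile(r"(email|phone|ssn|date_of_birth)", re.IGNORECASE),
    "financial": re.compile(r"(iban|card_number|salary)", re.IGNORECASE),
}


def classify_columns(columns: list[str]) -> dict[str, str]:
    """Return a column -> classification map; unmatched columns are 'unclassified'."""
    tags = {}
    for col in columns:
        tags[col] = "unclassified"
        for label, pattern in CLASSIFICATION_RULES.items():
            if pattern.search(col):
                tags[col] = label
                break
    return tags


if __name__ == "__main__":
    schema = ["user_id", "email_address", "salary_gbp", "created_at"]
    print(classify_columns(schema))
    # {'user_id': 'unclassified', 'email_address': 'pii',
    #  'salary_gbp': 'financial', 'created_at': 'unclassified'}
```

A script like this is usually wired into an Airflow DAG or dbt run so that new tables are classified automatically as they land, which is the kind of policy-enforcement automation the listing describes.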
building production data pipelines. Advanced Python skills (NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms. Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift). Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines. Experience with cloud platforms (AWS, Azure, GCP) and …
TransferGo. Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave, and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you … methodologies. Collaborating with stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data …
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources including CMS, analytics, ad tech, and social platforms. Lead engineering efforts to automate workflows using tools like Airflow, dbt, and Spark. Build robust data models to support dashboards, A/B testing, and revenue analytics. Collaborate with cross-functional teams to deliver actionable insights and support strategic …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/BI role. Advanced SQL skills with hands-on experience using dbt for data modeling. Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow. Experience with version control tools (e.g., Git). Ability to design, build, and document scalable, reliable data models. Comfortable gathering business requirements and translating them into data architecture. Strong problem …
experience. What We're Looking For: Strong hands-on experience with Python, Java, or Scala. Proficiency in cloud environments (AWS, Azure, or GCP) and big data tech (Spark, Hadoop, Airflow). Solid understanding of SQL, ETL/ELT approaches, and data modelling techniques. Experience building CI/CD pipelines with tools like Jenkins or CircleCI. Knowledge of data security protocols …
Experience with Python or other scripting languages. Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker). About Our Process: We can be flexible with the structure of our interview process if someone's circumstances or timescales require it, but our general …
record migrating large-scale systems (e.g., BigQuery → Redshift). Infrastructure as Code - Experience with tools like Terraform. Data Engineering: ELT pipeline mastery - Experience with tools like Fivetran, Dataform, dbt, and Airflow for building reliable data workflows. Custom integrations - Strong Python skills for building data ingestion from third-party APIs, and developing cloud functions. Data governance - Experience implementing RBAC, data masking …
architectures, data pipelines, and ETL processes. Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services. Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT). Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.). Strong understanding of data modeling, data governance, and data quality principles. Excellent communication skills with the ability to …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
secure use of machine learning. Key Focus Areas: Own and execute enterprise data strategy. Build and lead a multi-disciplinary data & AI team. Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI). Deliver business-critical analytics and reporting. Support responsible AI/ML initiatives. Define data governance, privacy, and compliance frameworks. What We're Looking For …
South West London, London, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs. Optimise data storage and retrieval for performance. Work with batch and real-time processing frameworks. Implement and manage ETL processes. Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, and BigQuery. Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 1+ years in a Data Engineer or similar role. Proficiency in SQL and …
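As an illustration of the batch ETL work this kind of role involves, here is a minimal PySpark sketch that extracts raw events, filters and aggregates them, and loads the result as partitioned Parquet. The bucket paths, column names, and aggregation are hypothetical placeholders.

```python
# Minimal illustrative PySpark batch ETL job; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_batch_etl").getOrCreate()

# Extract: read raw JSON events from object storage.
events = spark.read.json("s3://example-bucket/raw/events/")

# Transform: keep completed orders and aggregate daily revenue.
daily_revenue = (
    events.filter(F.col("status") == "completed")
    .groupBy(F.to_date("created_at").alias("day"))
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write partitioned Parquet for downstream warehouse ingestion.
daily_revenue.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)

spark.stop()
```

In practice a job like this would be scheduled by Airflow and its output loaded into Snowflake, Redshift, or BigQuery, matching the toolchain the listing names.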
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
looking for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
quality data assets. Strong architectural acumen and software engineering fundamentals. Experience driving adoption of data governance and improving data platform usage across internal teams. Tech stack including: Snowflake, AWS, DBT, Airflow, Python, Kinesis, Terraform, and CI/CD tools. BENEFITS: The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000. Hybrid working: 2 days per week …
transformation. Deep understanding of cloud-based data architecture, particularly with GCP (BigQuery, Cloud Functions, Pub/Sub, etc.) or AWS equivalents. Hands-on experience with orchestration tools such as Airflow or DBT. 3+ years in data engineering, preferably including at least one role supporting a live or F2P game. Experience with analytics and marketing APIs (e.g. AppsFlyer, AppLovin, ironSource …
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
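As a sketch of the Kafka side of a stack like that one, here is a minimal consumer loop using the confluent-kafka Python client. The broker address, topic, and group id are hypothetical placeholders, and the processing step is deliberately a stub.

```python
# Minimal illustrative Kafka consumer using the confluent-kafka client.
# Broker, topic, and group id below are hypothetical placeholders.
from confluent_kafka import Consumer

consumer = Consumer(
    {
        "bootstrap.servers": "localhost:9092",
        "group.id": "example-pipeline",
        "auto.offset.reset": "earliest",
    }
)
consumer.subscribe(["example-events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # block up to 1s waiting for a message
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Placeholder processing: a real pipeline might upsert into PostgreSQL
        # or MongoDB, or append to an Apache Iceberg table.
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()  # commit offsets and leave the group cleanly
```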