privacy, and security, ensuring our AI systems are developed and used responsibly and ethically. Tooling the Future: Get hands-on with cutting-edge technologies like Hugging Face, PyTorch, TensorFlow, Apache Spark, Apache Airflow, and other modern data and ML frameworks. Collaborate and Lead: Partner closely with ML Engineers, Data Scientists, and Researchers to understand their data needs … their data, compute, and storage services. Programming Prowess: Strong programming skills in Python and SQL are essential. Big Data Ecosystem Expertise: Hands-on experience with big data technologies like Apache Spark, Kafka, and data orchestration tools such as Apache Airflow or Prefect. ML Data Acumen: Solid understanding of data requirements for machine learning models, including feature engineering …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing environment … platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able to develop and maintain ELT, and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is … a Snowflake enthusiast who can write solid SQL queries, within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, good AWS exposure. Naturally you will have a good understanding of AWS. I'd love you to be an advocate of Agile too - these guys are massive …
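As an illustration of the orchestration work this listing describes, below is a minimal sketch of an Apache Airflow DAG that schedules a periodic load towards Snowflake. The dag_id, task name, and load logic are hypothetical placeholders, not details of the advertised platform.

```python
# Minimal sketch, not the advertiser's actual pipeline: an Airflow DAG that
# runs an hourly load step. dag_id, task_id, and the load logic are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_events_to_snowflake(**context):
    # A real implementation would stage files and run COPY INTO in Snowflake;
    # here we only log the logical date of the run.
    print(f"Loading events for {context['ds']} into Snowflake")


with DAG(
    dag_id="events_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",  # Airflow 2.4+ spelling; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_events",
        python_callable=load_events_to_snowflake,
    )
```

Airflow passes the run context (including the logical date, `ds`) into the callable automatically in Airflow 2.x, which is why no explicit provide_context flag appears above.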
Astronomer empowers data teams to bring mission-critical software, analytics, and AI to life and is the company behind Astro, the industry-leading unified DataOps platform powered by Apache Airflow. Astro accelerates building reliable data products that unlock insights, unleash AI value, and power data-driven applications. Trusted by more than 700 of the world's leading enterprises, Astronomer … our product's evolution through client feedback. This role is ideal for someone who wants to make a visible impact while growing into an expert in workflow orchestration and Apache Airflow. This is a hybrid role requiring a minimum of 3 days per week onsite, and includes up to 40% travel for business and customer needs. What you get … production. Be a Trusted Advisor: Conduct demos and provide technical guidance to engineering teams, showing them how our platform can transform their workflows. Drive Community Impact: Contribute to the Apache Airflow community by creating technical content and best practices, positioning Astronomer as a thought leader in workflow orchestration. Influence Product Direction: Act as a liaison by gathering field …
doing? Evolve the Data Platform by designing and building the next generation of the stack. Develop, run and support our batch and real-time data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, Tableau, all in AWS. Collaborate with product managers, data engineers, analysts and data scientists to develop tooling and infrastructure to support their needs. Develop automation and … quality issues. Recent projects the team has delivered: Refactoring of our MySQL ingestion pipeline for reduced latency and 10x scalability. Redshift -> Snowflake migration. Unified Local Analytics Development Environment for Airflow and DBT. Building our next generation company metrics framework, adding anomaly detection and alerting, and enabling easier discovery and consumption. About you You have 5+ years of full-time … might be more valuable than your direct technical contributions on a project. You care about your craft. In addition, it would be a bonus if you have: Worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating Airflow at scale would …
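The metrics-framework project mentioned above (anomaly detection and alerting over company metrics) often reduces to a small core. As a hedged illustration only - the column name, 28-day window, and 3-sigma threshold are invented, not taken from the posting - a rolling z-score check might look like:

```python
# Sketch of a rolling z-score anomaly flag over a daily metrics table.
# The "value" column, window length, and threshold are hypothetical choices.
import pandas as pd


def flag_anomalies(metrics: pd.DataFrame, window: int = 28, z: float = 3.0) -> pd.DataFrame:
    rolling = metrics["value"].rolling(window)
    zscore = (metrics["value"] - rolling.mean()) / rolling.std()
    return metrics.assign(is_anomaly=zscore.abs() > z)
```

A production framework would layer alerting (for example, paging when is_anomaly is set) and discovery tooling on top of a check like this.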
or other testing methodologies Preferred: Familiarity with PostgreSQL and Snowflake Preferred: Familiarity with Web Frameworks such as Django, Flask or FastAPI Preferred: Familiarity with event streaming platforms such as Apache Kafka Preferred: Familiarity with data pipeline platforms such as Apache Airflow Preferred: Familiarity with Java Preferred: Experience in one or more relevant financial areas (market data, order …
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
business glossary, and data mapping framework using metadata management and data catalog tools. Automate data classification, lineage tracking, and policy enforcement through scripts, APIs, and orchestration tools (e.g., dbt, Airflow). Map & Visualize Data Flows: Design and maintain clear documentation and visualizations of data movement across systems, focusing on sensitive and business-critical data. Drive Cross-Functional Alignment: Collaborate … governance SME; support teams with tooling, guidance, and best practices. About You: Strong technical foundation in data governance architecture and tooling. You've worked with tools such as DataHub, Apache Airflow, AWS, dbt, Snowflake, BigQuery, or similar. Hands-on experience building and maintaining centralized data inventories, business glossaries, and data mapping frameworks. Proficient in automating data classification and … lineage using scripting languages like Python, SQL, or Java, along with orchestration tools such as Airflow and dbt. 5+ years of experience in data governance, privacy, or data engineering roles - especially in settings that integrate governance tightly into data platform design. Familiarity with privacy-by-design, data minimization, and regulatory standards including GDPR, ISO 27001, SOC 2, and …
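To make the "automate data classification" duty concrete, here is a deliberately simple sketch. The regex patterns, tags, and column names are hypothetical examples; real platforms such as DataHub expose richer metadata APIs for this kind of tagging.

```python
# Illustrative sketch only: a tiny rule-based classifier that tags columns
# as PII so downstream policy enforcement can act on them.
import re

PII_PATTERNS = {
    "email": re.compile(r"email", re.IGNORECASE),
    "phone": re.compile(r"phone|mobile", re.IGNORECASE),
    "name": re.compile(r"(first|last|full)_?name", re.IGNORECASE),
}


def classify_columns(columns: list[str]) -> dict[str, list[str]]:
    """Return {tag: [matching column names]} for a table's columns."""
    tags: dict[str, list[str]] = {}
    for col in columns:
        for tag, pattern in PII_PATTERNS.items():
            if pattern.search(col):
                tags.setdefault(tag, []).append(col)
    return tags


# Example: classify_columns(["user_email", "signup_ts", "first_name"])
# -> {"email": ["user_email"], "name": ["first_name"]}
```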
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience …
building production data pipelines Advanced Python skills (NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and …
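For readers new to the stack named in that listing, a minimal sketch of the pandas-plus-SQLAlchemy pattern follows. The connection string, table names, and columns are invented placeholders, not details from any of these employers.

```python
# Sketch of a small extract-transform-load step with pandas + SQLAlchemy.
# Requires a Postgres driver such as psycopg2; the DSN and schema are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@localhost/warehouse")

orders = pd.read_sql("SELECT * FROM raw_orders", engine)          # extract
orders["order_date"] = pd.to_datetime(orders["order_date"])       # transform
daily = orders.groupby(orders["order_date"].dt.date)["amount"].sum().reset_index()
daily.to_sql("daily_order_totals", engine, if_exists="replace", index=False)  # load
```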
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
But if not? We’ll help you get there: Understanding of cloud computing security concepts Proficiency with IaC tools like Terraform or CloudFormation. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Good understanding of Cloud providers – AWS, Microsoft Azure, Google Cloud Familiarity with DBT, Delta Lake, Databricks Experience working in Agile environments with tools like Jira and Git.
TransferGo. Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you … methodologies. Collaborating with stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data …
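As a flavour of working against that stack, a query through the clickhouse-driver Python package might look like the following. The host, table, and columns are hypothetical.

```python
# Hypothetical sketch: daily event counts from a ClickHouse "events" table
# via the clickhouse-driver package. Host and schema are placeholders.
from clickhouse_driver import Client

client = Client(host="localhost")
rows = client.execute(
    "SELECT toDate(created_at) AS day, count() AS events "
    "FROM events GROUP BY day ORDER BY day"
)
for day, events in rows:
    print(day, events)
```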
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan) Strong communication, stakeholder management, and documentation skills Preferred (but not essential): AWS or Snowflake certifications Knowledge of Apache Airflow, DBT, GitHub Actions Experience with Iceberg tables and data product thinking Why Apply? Work on high-impact, high-scale client projects Join a technically elite team with …
Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Navtech, Inc
of Science Degree in software engineering or a related field Proficiency in English, spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM …
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources including CMS, analytics, ad tech, and social platforms. Lead engineering efforts to automate workflows using tools like Airflow, dbt, and Spark. Build robust data models to support dashboards, A/B testing, and revenue analytics. Collaborate with cross-functional teams to deliver actionable insights and support strategic …
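Since the role above pairs Spark with A/B-testing dashboards, a compact PySpark sketch of a variant-level conversion aggregate is shown below. The input path and column names are invented for illustration.

```python
# Sketch: conversion rate per experiment variant, the kind of aggregate
# that feeds an A/B-test dashboard. Path and schema are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ab_test_metrics").getOrCreate()

events = spark.read.parquet("s3://bucket/events/")
conversion = (
    events.groupBy("experiment_variant")
    .agg(
        F.count(F.lit(1)).alias("users"),
        F.sum(F.col("converted").cast("int")).alias("conversions"),
    )
    .withColumn("conversion_rate", F.col("conversions") / F.col("users"))
)
conversion.show()
```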
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and document scalable, reliable data models Comfortable gathering business requirements and translating them into data architecture Strong problem …
experience What We're Looking For Strong hands-on experience with Python, Java, or Scala Proficiency in cloud environments (AWS, Azure, or GCP) and big data tech (Spark, Hadoop, Airflow) Solid understanding of SQL, ETL/ELT approaches, and data modelling techniques Experience building CI/CD pipelines with tools like Jenkins or CircleCI Knowledge of data security protocols …
Experience with Python or other scripting languages Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker) About Our Process We can be flexible with the structure of our interview process if someone's circumstances or timescales require it but our general …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
architectures, data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to …
record migrating large-scale systems (e.g., BigQuery, Redshift) Infrastructure as Code - Experience with tools like Terraform Data Engineering: ELT pipeline mastery - Experience with tools like Fivetran, Dataform, dbt, and Airflow for building reliable data workflows Custom integrations - Strong Python skills for building data ingestion from third-party APIs, and developing cloud functions Data governance - Experience implementing RBAC, data masking …