and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving skills and attention to detail Inclusive and curious, continuously seeks to build knowledge …
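For readers less familiar with the orchestration tools this listing names, here is a minimal sketch of an Apache Airflow 2.x DAG. The DAG id, task names, and transform logic are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: a daily extract/transform DAG in Apache Airflow 2.x.
# schedule= requires Airflow 2.4+; older 2.x versions use schedule_interval=.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step returning raw rows.
    return [{"id": 1, "value": 10}]


def transform(ti):
    # Pull the upstream task's output from XCom and reshape it.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "value_doubled": r["value"] * 2} for r in rows]


with DAG(
    dag_id="example_daily_etl",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```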
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Jaguar & Land Rover
data extraction, transformation, analysis, and process automation Hands-on experience with Google Cloud Platform (GCP) or Amazon Web Services (AWS) Proficient in Data Engineering and Orchestration tools such as Apache Airflow, Glue or Dataform Skilled in creating impactful data visualisations using Tableau, Power BI or Python Background in engineering sectors such as automotive, aerospace, or transport BENEFITS This …
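As a point of reference for the "visualisations using Python" line, a quick pandas-plus-matplotlib chart might look like the following sketch; the CSV path and column names are assumptions for illustration only.

```python
# Minimal sketch: a simple Python data visualisation of pipeline output.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("vehicle_telemetry.csv")  # hypothetical extract
daily = df.groupby("date", as_index=False)["fault_count"].sum()

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(daily["date"], daily["fault_count"], marker="o")
ax.set_xlabel("Date")
ax.set_ylabel("Faults per day")
ax.set_title("Daily fault volume")
fig.tight_layout()
fig.savefig("daily_faults.png")
```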
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
secure use of machine learning. Key Focus Areas Own and execute enterprise data strategy Build and lead a multi-disciplinary data & AI team Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI) Deliver business-critical analytics and reporting Support responsible AI/ML initiatives Define data governance, privacy, and compliance frameworks What We're Looking For …
and secure data handling Requirements: 5+ years in data engineering or a similar role Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake Proficiency in SQL, Python, and tools like dbt and Airflow Familiarity with DevOps practices in a data context Benefits: Work on impactful, enterprise-wide data projects Collaborate with architects, analysts, and data scientists Be part of a supportive, innovative …
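For the Snowflake-plus-Python requirement above, a minimal query using the official snowflake-connector-python package could look like this sketch; the credentials and table name are placeholders, not details from the posting.

```python
# Minimal sketch: querying Snowflake from Python with snowflake-connector-python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # hypothetical account identifier
    user="etl_user",          # hypothetical user
    password="***",
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT order_id, amount FROM orders LIMIT 10")
    for order_id, amount in cur.fetchall():
        print(order_id, amount)
finally:
    conn.close()
```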
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and …
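The real-time processing line above typically means something like the following kafka-python consumer; the topic, broker address, and message shape are illustrative assumptions.

```python
# Minimal sketch: consuming a real-time event stream with kafka-python.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                              # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="etl-demo",
)

for message in consumer:
    event = message.value
    # Placeholder transform step before loading downstream.
    print(event.get("user_id"), event.get("event_type"))
```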
Manchester, North West, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
support data needs Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and …
London, Victoria, United Kingdom Hybrid / WFH Options
Boston Hale
looking for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP …
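Supporting A/B testing, as mentioned above, usually comes down to statistics like a two-proportion z-test; here is a minimal sketch with made-up counts.

```python
# Minimal sketch: two-proportion z-test for an A/B experiment.
from math import sqrt

from scipy.stats import norm

conversions_a, visitors_a = 120, 2400   # control group (hypothetical)
conversions_b, visitors_b = 150, 2380   # variant group (hypothetical)

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)

se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * norm.sf(abs(z))           # two-sided test

print(f"z = {z:.2f}, p = {p_value:.4f}")
```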
London, South East, England, United Kingdom Hybrid / WFH Options
Boston Hale
looking for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
quality data assets Strong architectural acumen and software engineering fundamentals Experience driving adoption of data governance and improving data platform usage across internal teams stack including: Snowflake, AWS, DBT, Airflow, Python, Kinesis, Terraform, CI/CD tools BENEFITS The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000 Hybrid working: 2 days per week …
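For the Kinesis piece of the stack above, publishing an event from Python with boto3 looks roughly like this sketch; the stream name and payload are illustrative assumptions.

```python
# Minimal sketch: writing a record to AWS Kinesis with boto3.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-2")

record = {"user_id": 42, "action": "page_view"}  # hypothetical event
kinesis.put_record(
    StreamName="events-stream",                  # hypothetical stream name
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=str(record["user_id"]),
)
```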
in software engineering principles. Expertise with AWS services, including Lambda, ECS/EC2, S3 and RDS. Deep experience with Terraform and infrastructure-as-code practices. Familiarity with tools like Airflow or DBT, and data platforms such as Snowflake or Databricks. Solid experience with CI/CD, observability, and platform reliability practices in cloud-native environments. Understanding of distributed …
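Combining two of the services named above, an AWS Lambda handler that reacts to an S3 upload follows the standard S3 event shape; the processing step is a stub for illustration.

```python
# Minimal sketch: a Lambda handler processing an S3 object-created event.
import boto3

s3 = boto3.client("s3")


def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj["Body"].read()
        # Placeholder: validate/transform the object, then load downstream.
        print(f"Processed s3://{bucket}/{key} ({len(body)} bytes)")
    return {"status": "ok"}
```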
applications to the Cloud (AWS) We'd love to hear from you if you Have strong experience with Python & SQL Have experience developing data pipelines using dbt, Spark and Airflow Have experience in data modelling (building optimised and efficient data marts and warehouses in the cloud) Work with Infrastructure as code (Terraform) and containerising applications (Docker) Work with AWS, S3 …
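A data-mart build of the kind this listing describes might look like the following PySpark sketch; the S3 paths and column names are assumptions, not from the posting.

```python
# Minimal sketch: a PySpark job aggregating raw orders into a mart table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_mart").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path
mart = (
    orders
    .groupBy("customer_id")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("lifetime_value"),
    )
)
mart.write.mode("overwrite").parquet("s3://example-bucket/marts/customer_orders/")
```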
job responsibilities - Design, build, and operate highly scalable, fault-tolerant data processing systems using modern AWS services like Redshift, S3, Glue, EMR, Kinesis, and Lambda, and orchestration systems using Airflow - Leverage your expertise in Python, Scala, or other modern programming languages to develop custom data processing frameworks and automation tools - Collaborate closely with data scientists, analysts, and product managers …
Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - GitHub Actions/Jenkins Business Intelligence - Looker Experience and Attributes we'd like to see Platform Engineering Expertise Extensive experience in platform engineering; designing, building, and …
concepts to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks Join Us! This role can be based in either of our amazing offices …
transformation. Deep understanding of cloud-based data architecture, particularly with GCP (BigQuery, Cloud Functions, Pub/Sub, etc.) or AWS equivalents. Hands-on experience with orchestration tools such as Airflow or DBT. 3+ years in data engineering, preferably including at least one role supporting a live or F2P game. Experience with analytics and marketing APIs (e.g. Appsflyer, Applovin, IronSource …
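Running an analytics query against BigQuery, the warehouse named above, uses the official GCP client library; the project, dataset, and SQL below are illustrative assumptions.

```python
# Minimal sketch: a BigQuery aggregation query with google-cloud-bigquery.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
    SELECT event_date, COUNT(*) AS installs
    FROM `my_project.analytics.installs`   -- hypothetical table
    GROUP BY event_date
    ORDER BY event_date
"""
for row in client.query(sql).result():
    print(row["event_date"], row["installs"])
```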
Power markets. Expert in Python and SQL; strong experience with data engineering libraries (e.g., Pandas, PySpark, Dask). Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open …
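Of the libraries named above, Dask is the one built for larger-than-memory work; a minimal out-of-core aggregation might look like this sketch, with the file pattern and columns assumed for illustration.

```python
# Minimal sketch: an out-of-core daily average with Dask.
import dask.dataframe as dd

prices = dd.read_csv("power_prices_*.csv", parse_dates=["timestamp"])
daily_avg = (
    prices
    .assign(day=prices["timestamp"].dt.date)
    .groupby("day")["price_gbp_mwh"]   # hypothetical column
    .mean()
    .compute()  # triggers the lazy computation
)
print(daily_avg.head())
```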
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Interested in the role? Complete the online application. We look forward to getting to know you. Discover more about LGT Wealth Management A message from our CEO Ben …
stack Python and associated ML/DS libraries (scikit-learn, numpy, LightGBM, Pandas, LangChain/LangGraph, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, Athena, etc. MLOps: Terraform, Docker, Airflow, MLflow More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad policy, 2-for-1 share purchase plans, an EV Scheme to further reduce …
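MLflow, the experiment tracker in the MLOps stack above, is typically wrapped around a training run like this sketch; the dataset is synthetic and the parameters are illustrative.

```python
# Minimal sketch: MLflow tracking around a scikit-learn training run.
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
```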
through coursework, Kaggle competitions, or personal data projects You've shown initiative in teaching yourself new technical tools or concepts beyond what was required - such as exploring BigQuery, dbt, Airflow, Docker, or other data engineering technologies on your own time Progression This is an initial six-month engagement. If you perform well, the expectation is that you'll move …
scalable, fault-tolerant ETL pipelines with minimal manual intervention. Knowledge of data modelling best practices, including the medallion architecture or comparable frameworks. Experience in workflow orchestration using Flyte, dbt, Airflow, or Prefect. Strong understanding of unit, integration, and data validation testing using tools like Pytest or Great Expectations. Familiarity with cloud infrastructure (preferably Azure) for managing pipelines and storage …
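Data validation testing with Pytest, as mentioned above, often reduces to small assertions over a pipeline's output; the loader and expected schema in this sketch are illustrative assumptions.

```python
# Minimal sketch: pytest-style data validation over a pandas DataFrame.
import pandas as pd


def load_orders() -> pd.DataFrame:
    # Placeholder for the real pipeline output under test.
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [9.99, 4.50, 12.00]})


def test_orders_have_no_nulls():
    df = load_orders()
    assert not df.isnull().any().any()


def test_order_ids_are_unique():
    df = load_orders()
    assert df["order_id"].is_unique


def test_amounts_are_positive():
    df = load_orders()
    assert (df["amount"] > 0).all()
```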
unsupervised learning, and operations research methods. Solid background in software engineering for data science products: version control (Git), testing (unit, regression, E2E), CI/CD (GitHub Actions), and orchestration (Airflow, Dagster). Proficient in SQL and cloud platforms (AWS preferred), with exposure to model/data versioning tools (e.g. DVC), containerised solutions (Docker, ECS), and experiment tracking (e.g. MLflow …
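Dagster, the second orchestrator named above, models pipelines as software-defined assets; a minimal two-asset sketch follows, with asset names and logic assumed for illustration.

```python
# Minimal sketch: two dependent software-defined assets in Dagster.
from dagster import asset, materialize


@asset
def raw_orders():
    # Placeholder extract step.
    return [{"order_id": 1, "amount": 9.99}]


@asset
def order_totals(raw_orders):
    # Depends on raw_orders via the matching parameter name.
    return sum(o["amount"] for o in raw_orders)


if __name__ == "__main__":
    result = materialize([raw_orders, order_totals])
    assert result.success
```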
systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with DBT or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the …
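The SHAP interpretability technique mentioned above attributes a model's output to individual features; a minimal sketch on synthetic data follows, with the model choice assumed for illustration.

```python
# Minimal sketch: per-feature attributions for a tree model with SHAP.
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=500, n_features=5, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:10])

# Each row now holds a per-feature contribution to the model output.
print(shap_values.shape)
```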