Astronomer empowers data teams to bring mission-critical software, analytics, and AI to life and is the company behind Astro, the industry-leading unified DataOps platform powered by Apache Airflow. Astro accelerates building reliable data products that unlock insights, unleash AI value, and power data-driven applications. Trusted by more than 800 of the world's leading enterprises, Astronomer … our field team to deliver world-class technical solutions to customers' and prospects' business problems. In this role, you'll engage with a diverse set of use cases where Apache Airflow plays a central role in the overall solution architecture. Your work will directly enable customers to scale their operations and achieve their goals by adopting Astronomer's … What You'll Do: Implement Astronomer's software and services at the core of some of the world's largest businesses and organizations. Guide our largest and most complex customers in their Apache Airflow journeys. Act as the core interface between the Customer, Sales, and Product teams to ensure that the broader solution around the platform is solving pain points and bringing …
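As context for the Airflow-centric work these listings describe, here is a minimal sketch of an Apache Airflow DAG (assuming Airflow 2.4+). The DAG id, schedule, and task logic are hypothetical placeholders, not drawn from any listing:

```python
# Minimal illustrative Airflow DAG -- dag_id, schedule, and task names are
# hypothetical, not taken from any specific deployment.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder extract step; a real task would pull from a source system.
    print("extracting orders...")


with DAG(
    dag_id="example_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_orders",
        python_callable=extract_orders,
    )
```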
/ELT processes, and Lakehouse concepts. Experience with data quality frameworks, data governance, and compliance requirements. Familiarity with version control (Git), CI/CD pipelines, and workflow orchestration tools (Airflow, Prefect). Soft Skills: A strong analytical and problem-solving mindset with attention to detail. A good team player with effective communication skills and the ability to tell stories with data and insights. Consulting skills, including …
London, South East, England, United Kingdom Hybrid/Remote Options
Adecco
you the chance to work on cutting-edge solutions that make a real impact.
Key Responsibilities:
* Data Engineering: Design and implement data pipelines, lakes, and warehouses using tools like Spark, Airflow, or dbt.
* API & Microservices Development: Build secure, efficient APIs and microservices for data integration.
* Full Stack Development: Deliver responsive, high-performance web applications using React (essential), plus Angular or …
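As an illustration of the "APIs for data integration" responsibility above, a minimal sketch using FastAPI might look like this; the route names, model, and in-memory store are hypothetical stand-ins for a real service:

```python
# Minimal FastAPI sketch of a data-integration endpoint; the routes, model,
# and in-memory store are hypothetical placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Record(BaseModel):
    id: int
    payload: dict


_store: dict[int, dict] = {}  # stand-in for a real database


@app.post("/records")
def ingest(record: Record) -> dict:
    _store[record.id] = record.payload
    return {"status": "accepted", "id": record.id}


@app.get("/records/{record_id}")
def fetch(record_id: int) -> dict:
    return _store.get(record_id, {})
```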
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
Desirable Skills for the AWS Data Engineer: Experience with Databricks, Kafka, or Kinesis for real-time data streaming. Knowledge of containerisation (Docker, ECS) and modern orchestration tools such as Airflow. Familiarity with machine learning model deployment pipelines or data lakehouse architectures. Data Engineer, AWS Data Engineer …
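To illustrate the real-time streaming skills mentioned above, here is a minimal producer sketch using the kafka-python client; the broker address and topic name are hypothetical:

```python
# Illustrative real-time ingestion sketch using kafka-python; the broker
# address and topic name are hypothetical.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Emit a single event; a real pipeline would stream these from a source.
producer.send("clickstream-events", value={"user_id": 42, "action": "page_view"})
producer.flush()
```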
South West London, London, United Kingdom Hybrid/Remote Options
ARC IT Recruitment Ltd
… AWS Platform Build: Demonstrable experience designing and building modern data platforms in AWS. ETL/Orchestration Expertise: Expertise in ETL/ELT design and data orchestration, specifically with Apache Airflow. SQL Mastery: Strong SQL skills with significant experience in query tuning and performance optimisation. Programming Proficiency: Proficiency in Python and Bash (for data processing, scripting, and automation). …
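As a small illustration of the query-tuning work described above, query plans can be inspected from Python with psycopg2; the DSN, table, and predicate here are hypothetical:

```python
# Sketch of query-plan inspection for tuning work; connection details and
# the table/query are hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=analytics user=etl")  # hypothetical DSN
with conn, conn.cursor() as cur:
    cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE status = %s", ("open",))
    for (line,) in cur.fetchall():
        print(line)  # look for seq scans, bad row estimates, slow nodes
conn.close()
```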
following engineering disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith …
Deep understanding of data warehousing concepts, ETL/ELT pipelines, and dimensional modelling. Proficiency in advanced programming languages (Python/PySpark, SQL). Experience in data pipeline orchestration (e.g. Airflow, Data Factory). Familiarity with DevOps and CI/CD practices (Git, Azure DevOps, etc.). Ability to communicate technical concepts to both technical and non-technical audiences. Proven experience in …
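To illustrate the dimensional-modelling and PySpark skills above, here is a minimal sketch of building a fact table against a conformed dimension; all paths and column names are hypothetical:

```python
# Illustrative PySpark sketch of loading a fact table against a conformed
# dimension; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dim_model_example").getOrCreate()

orders = spark.read.parquet("s3://bucket/raw/orders/")          # hypothetical path
dim_customer = spark.read.parquet("s3://bucket/dims/customer/")  # hypothetical path

fact_orders = (
    orders.join(dim_customer, on="customer_id", how="left")
    .withColumn("order_date_key", F.date_format("order_ts", "yyyyMMdd"))
    .select("order_id", "customer_sk", "order_date_key", "amount")
)
fact_orders.write.mode("overwrite").parquet("s3://bucket/facts/orders/")
```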
Google Cloud, Databricks) are a strong plus. Technical Skills: • Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL) • Familiarity with data pipeline and workflow management tools (e.g., Apache Airflow) • Experience with programming languages such as Python, Java, or Scala; Python is highly preferred • Basic understanding of cloud platforms and services (e.g., AWS, Azure, Google Cloud) • Knowledge …
using AWS PaaS such as Glue, EMR, SageMaker, Redshift, Aurora, and Snowflake. Building data processing and analytics pipelines as code, using Python, SQL, PySpark/Spark, CloudFormation, Lambda, Step Functions, and Apache Airflow. Monitoring and reporting on the data platform's performance, usage, and security. Designing and applying security and access-control architectures to secure sensitive data. You will have: 6+ …
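As a sketch of "pipelines as code" on AWS, a Glue job can be started and polled with boto3 as below; the job name and region are hypothetical:

```python
# Minimal boto3 sketch for kicking off and polling a Glue job as part of a
# pipeline; the job name and region are hypothetical.
import time

import boto3

glue = boto3.client("glue", region_name="eu-west-2")

run = glue.start_job_run(JobName="nightly-orders-etl")  # hypothetical job
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="nightly-orders-etl", RunId=run_id)
    status = state["JobRun"]["JobRunState"]
    if status in ("SUCCEEDED", "FAILED", "STOPPED"):
        print(f"Glue job finished with state {status}")
        break
    time.sleep(30)  # poll every 30s rather than busy-waiting
```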
For: Hands-on experience with Snowflake. Production experience with dbt (mandatory). Strong SQL and Python programming skills. Experience with Git-based workflows and DevOps practices. Familiarity with orchestration tools (Airflow, Prefect) and ETL/ELT patterns. Knowledge of cloud platforms (AWS, Azure) and security best practices. …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
time and compute costs. Develop modular, reusable transformations using SQL and Python. Implement CI/CD pipelines and manage deployments via Git. Automate workflows using orchestration tools such as Airflow or dbt Cloud. Configure and optimise Snowflake warehouses for performance and cost efficiency. Required Skills & Experience: 7+ years in data engineering roles. 3+ years of hands-on experience with Snowflake. …
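To illustrate the warehouse cost-tuning duties above, here is a minimal sketch using the Snowflake Python connector; the account, credentials, and warehouse name are hypothetical, and AUTO_SUSPEND = 60 simply suspends an idle warehouse after a minute to cut compute spend:

```python
# Sketch of warehouse right-sizing with the Snowflake Python connector;
# account, credentials, and warehouse name are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # hypothetical
    user="etl_user",
    password="...",        # use a secrets manager in practice
)
cur = conn.cursor()
# Auto-suspend quickly to cut idle compute cost; resize only when needed.
cur.execute("ALTER WAREHOUSE transform_wh SET WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 60")
cur.close()
conn.close()
```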
Linux environments. Cloud & Containerization: Docker, Kubernetes, and experience with cloud platforms (GCP or similar). Familiarity with data engineering concepts (ETL/ELT, SQL) and big data tools (Spark, Airflow) is a plus. Strong problem-solving skills and the ability to work independently under pressure. LA International is an HMG-approved ICT Recruitment and Project Solutions Consultancy, operating globally from …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
/ETL processes for improved performance, cost efficiency, and scalability.
* Troubleshoot and resolve data pipeline issues swiftly and effectively across the data platform.
* Work with orchestration tools such as Airflow, ADF, or Prefect to schedule and automate workflows.
* Keep abreast of industry trends and emerging technologies in data engineering, and continuously improve your skills and knowledge.
Profile:
* Minimum …
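As an illustration of the orchestration tooling named above, a minimal Prefect 2.x flow might look like this; the task names and logic are hypothetical:

```python
# Minimal Prefect flow illustrating scheduled/automated workflow structure;
# task names and logic are hypothetical.
from prefect import flow, task


@task(retries=2)
def extract() -> list[dict]:
    return [{"id": 1, "value": 10}]  # placeholder for a real source pull


@task
def load(rows: list[dict]) -> None:
    print(f"loaded {len(rows)} rows")


@flow
def daily_refresh():
    load(extract())


if __name__ == "__main__":
    daily_refresh()
```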
AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark, etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge, etc. Industrial Data Integration: Familiarity with OT data schemas originating from OSIsoft PI, SCADA, MES, and Historian systems. Information Modeling: Experience in defining …
London, South East, England, United Kingdom Hybrid/Remote Options
Searchability
are giving express consent for us to process (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: GCP | Python | SQL | MongoDB | Airflow | dbt | Terraform | Docker | ETL | AI | Machine Learning …
data services, including ADF, Synapse, ADLS, and Azure Functions. Proven track record of designing and implementing metadata-driven data pipelines. Deep expertise in orchestration and data workflow automation (e.g., Airflow, dbt). Strong understanding of CI/CD practices for data engineering. Experience with infrastructure as code (Terraform, Bicep, or ARM templates) (preferred). Solid development and coding standards, with …
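To illustrate what "metadata-driven data pipelines" can mean in practice, here is a toy sketch where per-table config drives a generic loader; all table names and the dispatch logic are hypothetical:

```python
# Toy sketch of a metadata-driven pipeline: table configs drive a generic
# loader instead of hand-written per-table code. All names are hypothetical.
PIPELINE_METADATA = [
    {"source": "crm.accounts", "target": "raw.accounts", "load_type": "full"},
    {"source": "crm.contacts", "target": "raw.contacts",
     "load_type": "incremental", "watermark": "updated_at"},
]


def run_load(cfg: dict) -> None:
    # A real implementation would dispatch to ADF/Synapse activities here.
    print(f"loading {cfg['source']} -> {cfg['target']} ({cfg['load_type']})")


for cfg in PIPELINE_METADATA:
    run_load(cfg)
```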
and dimensional data modelling (SCDs, fact/dim, conformed dimensions). Experience with PostgreSQL optimisation. Advanced Python skills. ETL/ELT Pipelines: Hands-on experience building pipelines using SSIS, dbt, Airflow, or similar. Strong understanding of enterprise ETL frameworks, lineage, and data quality. Cloud & Infrastructure: Experience designing and supporting AWS-based analytical infrastructure. Skilled in working with S3 and integrating …
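As an illustration of the SCD handling mentioned above, here is a common Type 2 close-and-insert pattern in PostgreSQL, executed from Python; the dimension, staging table, and tracked column are hypothetical:

```python
# Illustrative SCD Type 2 close-and-insert pattern for a customer dimension;
# table and column names are hypothetical. Shown as SQL executed from Python.
import psycopg2

SCD2_SQL = """
-- Close out current rows whose tracked attribute changed
UPDATE dim_customer d
SET valid_to = now(), is_current = false
FROM staging_customer s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND d.segment IS DISTINCT FROM s.segment;

-- Insert a new current version for changed and brand-new customers
INSERT INTO dim_customer (customer_id, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.segment, now(), NULL, true
FROM staging_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;
"""

with psycopg2.connect("dbname=warehouse") as conn, conn.cursor() as cur:
    cur.execute(SCD2_SQL)
```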
London, South East, England, United Kingdom Hybrid/Remote Options
Oscar Technology
roles. Strong expertise in Azure cloud services and Databricks. Advanced proficiency in Python and SQL for data engineering. Experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, dbt). Strong knowledge of data warehousing and cloud-based data architectures. Understanding of BI tools such as Power BI or Tableau. Strong problem-solving and debugging skills. Ability …
focused, and able to contribute quickly in a fast-moving environment. Nice to Have: Experience with Power BI or other data visualisation tools. Familiarity with orchestration tools such as Airflow, Prefect, or Dagster. Understanding of CI/CD practices in data and analytics engineering. Knowledge of data governance, observability, and security best practices in cloud environments.
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
ingestion pipelines using Azure Data Factory (ADF) and Python. Ensure high-quality raw datasets to enable accurate analytics and data modeling. Deploy and maintain data tools on Kubernetes (Airflow, Superset, RStudio Connect). Support data analytics initiatives through dbt, DevOps, and deployment governance. You will work closely with a small, focused team, contributing directly to strategic data initiatives.