Required: Bachelor's degree in Computer Science, Software Engineering, Data Science, or a closely related field. Advantageous: Certifications or substantial hands-on experience with modern data pipeline tools (e.g., Apache Airflow, Spark, Kafka, dbt, or similar). Desirable: Familiarity with financial services regulatory frameworks (e.g., MiFID II, GDPR, SOX) and best practices for data governance. Required Knowledge and … Engineering: Hands-on experience with Java (Spring Boot), React, and Python, covering backend, frontend, and data engineering. Data Engineering Tools: Proficient with modern data engineering and analytics platforms (e.g., Apache Airflow, Spark, Kafka, dbt, Snowflake, or similar). DevOps & Cloud: Experience with containerisation (Docker, Kubernetes), CI/CD pipelines, and cloud platforms (e.g., AWS, Azure, GCP) is highly …
at least one cloud data platform (e.g. AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark). · Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow, etc. · Good knowledge of stream and batch processing solutions like Apache Flink and Apache Kafka. · Good knowledge of log management, monitoring, and …
London, South East England, United Kingdom Hybrid/Remote Options
Yapily
Preferred Skills Python: Knowledge for data automation and scripting. Containerization: Familiarity with tools like Docker and Kubernetes. Workflow/Orchestration Tools: Familiarity with workflow/orchestration tools (e.g., Airflow, Dagster, Prefect). Cloud-based Data Services: Exposure to cloud-based data services (GCP preferred; AWS/Azure also considered). Data Lineage & Metadata Management: Understanding of best practices. …
London, South East, England, United Kingdom Hybrid/Remote Options
Adecco
you the chance to work on cutting-edge solutions that make a real impact. Key Responsibilities * Data Engineering: Design and implement data pipelines, lakes, and warehouses using tools like Spark, Airflow, or dbt. * API & Microservices Development: Build secure, efficient APIs and microservices for data integration. * Full Stack Development: Deliver responsive, high-performance web applications using React (essential), plus Angular or …
Profile: Proven experience as a Data Engineer, with strong expertise in designing and managing large-scale data systems. Hands-on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery …
offs. Experience with multiple data formats and serialization systems (e.g. Arrow, Parquet, Protobuf/gRPC, Avro, Thrift, JSON, etc.) Experience managing data pipeline orchestration systems (e.g. Kubernetes, Argo Workflows, Airflow, Prefect, Dagster, etc.) Proven experience in managing the operational aspects of large data pipelines such as backfilling datasets, rerunning batch jobs, and handling dead-letter queues. Prior experience triaging …
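The operational tasks this listing names — rerunning batch jobs and handling dead-letter queues — follow a common pattern: attempt each message, retry a bounded number of times, and divert persistent failures to a dead-letter queue for later triage rather than losing them. A minimal sketch in plain Python (the handler and message shape are illustrative assumptions, not taken from the role):

```python
def run_batch(messages, handler, max_retries=2):
    """Process a batch of messages with bounded retries.

    Messages that still fail after max_retries attempts are parked on a
    dead-letter list for manual triage or a later backfill, instead of
    silently failing the whole batch.
    """
    processed, dead_letter = [], []
    for msg in messages:
        for attempt in range(max_retries + 1):
            try:
                processed.append(handler(msg))
                break
            except Exception:
                if attempt == max_retries:
                    dead_letter.append(msg)  # park for inspection/rerun
    return processed, dead_letter
```

Rerunning the batch later with only the dead-letter list as input gives a simple replay mechanism once the underlying fault is fixed.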
Liverpool, North West England, United Kingdom Hybrid/Remote Options
Intuita - Vacancies
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and dbt is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience …
South West London, London, United Kingdom Hybrid/Remote Options
ARC IT Recruitment Ltd
AWS Platform Build: Demonstrable experience designing and building modern data platforms in AWS. ETL/Orchestration Expertise: Expertise in ETL/ELT design and data orchestration, specifically with Apache Airflow. SQL Mastery: Strong SQL skills with significant experience in query tuning and performance optimisation. Programming Proficiency: Proficiency in Python and Bash (for data processing, scripting, and automation). …
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
modern data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform. What you get in return: Up to £60,000 per annum + benefits Hybrid working (3 in office) Opportunity to lead …
following engineering disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith …
Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling. Proficiency in advanced programming languages (Python/PySpark, SQL). Experience in data pipeline orchestration (e.g. Airflow, Data Factory). Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc). Ability to communicate technical concepts to both technical and non-technical audiences. Proven experience in …
London (Brentford), South East England, United Kingdom
NBCUniversal
to lead and mentor a team of data engineers Experience in using techniques such as infrastructure as code and CI/CD Experience with graph-based data workflows using Apache Airflow Programming skills in one or more of the following: Python, Java, Scala, R and experience in writing reusable/efficient code to automate analysis and data processes … Experience in processing large volumes of data using parallelism techniques/tooling, such as Apache Spark Experience in processing structured and unstructured data into a form suitable for analysis and reporting with integration with a variety of data metric providers ranging from advertising, web analytics, and consumer devices Experience in basic Machine Learning techniques is a big plus Experience …
Greater Manchester, Lancashire, England, United Kingdom
Sagacity
Google Cloud, Databricks) are a strong plus Technical Skills: • Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL) • Familiarity with data pipeline and workflow management tools (e.g., Apache Airflow) • Experience with programming languages such as Python, Java, or Scala. Python is highly preferred • Basic understanding of cloud platforms and services (e.g., AWS, Azure, Google Cloud) • Knowledge …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
time and compute costs. Develop modular, reusable transformations using SQL and Python. Implement CI/CD pipelines and manage deployments via Git. Automate workflows using orchestration tools such as Airflow or dbt Cloud. Configure and optimise Snowflake warehouses for performance and cost efficiency. Required Skills & Experience 7+ years in data engineering roles. 3+ years hands-on experience with Snowflake. …
Bristol, Avon, South West, United Kingdom Hybrid/Remote Options
Indotronix Avani UK Ltd
data modelling and warehousing activities using best practices in schema design and data governance. Contribute to ETL/ELT development, testing, and automation using tools such as Python, SQL, Airflow, or Azure Data Factory. Collaborate closely with analysts, scientists, and architects to ensure data solutions meet business needs. Participate in Agile delivery teams, including sprint planning, code reviews, and …
with SQL and Snowflake performance tuning. Hands-on expertise with Fivetran and dbt. Good understanding of data modelling, governance, and security best practices. Familiarity with orchestration tools such as Airflow or Prefect (advantageous). Experience working in Azure (or AWS/GCP). Strong analytical and collaboration skills, with great attention to detail. A degree in Computer Science, Data …
in-person collaboration is crucial at this early stage) You may be a great fit if you have experience with any of the following... Workflow orchestration tooling (e.g. Prefect, Airflow, Dagster) Experience with cloud data warehouses (e.g. BigQuery, Snowflake, Redshift) Data transformation tools (e.g. dbt) and data quality frameworks (e.g. Great Expectations) Backend Python frameworks (e.g. Django, FastAPI, Flask …
and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving skills and attention to detail Inclusive and curious, continuously seeks to build knowledge …
Manchester, North West, United Kingdom Hybrid/Remote Options
Client Server
As a Senior Data Engineer you will take ownership of the data platform, optimising it for scalability to ensure successful client onboarding. You'll use modern tools (such as Airflow, Prefect, Dagster or AWS Step Functions) for ETL design and orchestration, and work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and de-duplication …
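The transformation responsibilities described here — handling missing values, standardising formats, and de-duplication — can be sketched in plain Python. The field names, date format, and sentinel value below are illustrative assumptions, not taken from the role description:

```python
from datetime import datetime

def clean_records(records):
    """Clean, validate and enrich raw records: drop duplicates by id,
    fill missing values with a sentinel, and standardise date formats."""
    seen = set()
    cleaned = []
    for rec in records:
        # De-duplicate on the (assumed) 'id' key, keeping the first copy
        if rec.get("id") in seen:
            continue
        seen.add(rec.get("id"))
        # Fill a missing country with a sentinel rather than dropping the row
        rec = {**rec, "country": rec.get("country") or "UNKNOWN"}
        # Standardise dates from DD/MM/YYYY to ISO 8601
        try:
            rec["date"] = datetime.strptime(rec.get("date", ""), "%d/%m/%Y").date().isoformat()
        except ValueError:
            pass  # leave unparseable/already-ISO dates untouched for later triage
        cleaned.append(rec)
    return cleaned
```

In a production pipeline each of these steps would typically be a separate, testable task in the orchestrator, but the shape of the logic is the same.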
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid/Remote Options
Fruition Group
Data Vault, Kimball, and dimensional modelling techniques. Experience designing data marts and semantic layers for BI tools (Power BI, Tableau, Looker). Familiarity with analytics engineering tools including dbt, Airflow, and Git for version control. Excellent collaboration and communication skills, with strong attention to detail and data quality. Desirable: Exposure to AI/ML data preparation, Python or Spark …
Engineering: Proficiency in SQL and relational databases (e.g., PostgreSQL, DuckDB). Experience with the modern data stack, building data ingestion pipelines and working with ETL and orchestration tools (e.g., Airflow, Luigi, Argo, dbt), big data technologies (Spark, Kafka, Parquet), and web frameworks for model serving (e.g. Flask or FastAPI). Data Science: Familiarity or experience with classical NLP techniques …
AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge etc. Industrial Data Integration: Familiarity with OT data schema originating from OSIsoft PI, SCADA, MES, and Historian systems. Information Modeling: Experience in defining …
Birmingham, West Midlands, England, United Kingdom
Harnham - Data & Analytics Recruitment
pipelines. Work within an Azure-based environment (preferred). Provide input into best practices across the data function and help "keep the lights on." Tech Stack Core: Python, SQL, Airflow, dbt, Terraform, CI/CD, Power BI Nice to have: Kimball methodology Cloud: Azure (preferred) What They're Looking For 2-4 years' experience in a Data Engineering role. …