field. Proficiency in programming languages and tools such as Python, Spark and SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're More ❯
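As a rough illustration of the workflow tooling named in this listing, below is a minimal Airflow DAG sketch; the DAG id, schedule and task logic are assumptions for illustration only, not details taken from the posting.

```python
# Minimal sketch of an Airflow DAG of the kind this role would maintain.
# The DAG id, schedule and task logic are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder extract step; a real task would pull from a source system.
    print("extracting orders for", context["ds"])


with DAG(
    dag_id="example_orders_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_orders",
        python_callable=extract_orders,
    )
```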
Engineer position will require 2 days per week in Heathrow. The key skills required for this Data Engineer position are: Snowflake, Python, AWS, DBT and Airflow. If you have the relevant experience for this Data Engineer position, please apply. More ❯
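For context on how the Snowflake, dbt and Airflow pieces in this role commonly fit together, here is a hedged sketch of an Airflow DAG that runs dbt against a Snowflake target; the DAG id, project directory and target name are illustrative assumptions.

```python
# Hedged sketch: an Airflow DAG that runs and tests dbt models against a
# Snowflake target. Paths, target and DAG id are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --target prod",
    )
    dbt_run >> dbt_test
```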
Data Engineers! What skills you need: well versed in working with Python; deep understanding of SQL; experienced in Spark and/or PySpark; AWS Glue and Airflow experience is ideal for this position; building, developing and maintaining robust data pipelines; good understanding of working with APIs, databases, etc.; deep cloud knowledge More ❯
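As an illustration of the PySpark pipeline work described above, a small sketch follows that reads raw records, cleans them and writes partitioned Parquet; the S3 paths and column names are hypothetical, and on AWS this would typically run as a Glue job or on EMR.

```python
# Small PySpark transform: read raw JSON, clean it, write partitioned Parquet.
# Paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_clean").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/orders/")   # hypothetical path
clean = (
    raw.filter(F.col("order_id").isNotNull())              # drop rows without a key
       .withColumn("order_date", F.to_date("order_ts"))    # derive a partition column
       .dropDuplicates(["order_id"])                       # de-duplicate on the key
)
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```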
. Understanding of software development life cycle and agile methodologies. Proven experience designing, developing, and deploying machine learning models. Experience with orchestration frameworks (e.g., Airflow, MLflow, etc.). Experience deploying machine learning models to production environments. Knowledge of MLOps practices and tools for model monitoring and maintenance. Hands-on experience More ❯
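To illustrate the MLOps tooling mentioned (MLflow in particular), here is a minimal sketch of logging and registering a model so it can later be promoted to production; the experiment name, model name and toy training data are assumptions.

```python
# Illustrative sketch of the MLflow side of the model lifecycle: train a toy
# model, log its parameters and metrics, and register it. Names are assumptions.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = LogisticRegression(max_iter=200).fit(X, y)

mlflow.set_experiment("churn-demo")          # hypothetical experiment name
with mlflow.start_run():
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model", registered_model_name="churn-demo-model")
```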
Greater Manchester, England, United Kingdom Hybrid / WFH Options
ECOM
communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master’s or PhD in Data Science, Statistics, Computer Science, or related field. Benefits: Competitive salary and performance bonuses Flexible working options More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Square One Resources
Job Type: Permanent Job Responsibilities/Objectives Design and manage data warehouses using SQL, NoSQL, and cloud platforms. Develop ETL/ELT pipelines using Airflow and dbt. Collaborate with stakeholders to deliver impactful solutions. Ensure data quality, security, and governance. Required Skills/Experience The ideal candidate will have More ❯
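As a sketch of the data-quality side of such a pipeline, the snippet below shows a simple gate that fails a load if a key column contains NULLs; the connection factory, table and column names are assumptions, and any DB-API compatible driver (Snowflake, Postgres, etc.) would be used the same way.

```python
# Hedged sketch of a data-quality gate run after loading a warehouse table.
# The connection, table and column names are assumptions.
def check_no_null_keys(conn, table: str, key_column: str) -> None:
    """Fail the pipeline if the key column contains NULLs."""
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
        null_count = cur.fetchone()[0]
    if null_count:
        raise ValueError(f"{table}.{key_column} has {null_count} NULL values")

# Example usage inside an Airflow task or a plain script (hypothetical helper):
# check_no_null_keys(get_connection(), "analytics.orders", "order_id")
```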
with AWS services (S3, Glue, Lambda, SageMaker, Redshift, etc.) Strong Python and SQL skills; experience with PySpark a bonus Familiarity with containerization (Docker), orchestration (Airflow, Step Functions), and infrastructure as code (Terraform/CDK) Solid understanding of machine learning model lifecycle and best practices for deployment at scale Excellent More ❯
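For a flavour of the AWS orchestration referenced above, here is a hedged sketch that starts a Glue job with boto3 and polls its status; the job name and region are illustrative assumptions.

```python
# Sketch: kick off an AWS Glue job from Python and poll until it finishes.
# The job name and region are illustrative assumptions.
import time

import boto3

glue = boto3.client("glue", region_name="eu-west-2")

run = glue.start_job_run(JobName="orders-nightly-etl")   # hypothetical job name
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="orders-nightly-etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("Glue job finished with state:", state)
        break
    time.sleep(30)
```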
london, south east england, United Kingdom Hybrid / WFH Options
ENI – Elizabeth Norman International
business needs into technical solutions using SQL and cloud tools. 🧠 What you’ll bring: Strong SQL skills Proven experience building and maintaining data pipelines (Airflow, DBT, etc.). Familiarity with cloud data platforms like Snowflake, BigQuery, or Redshift. Solid experience with BI tools like Looker, Tableau, or similar. Understanding More ❯
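As an example of querying one of the cloud warehouses listed (BigQuery is used here purely for illustration), a minimal sketch follows; the project, dataset and table names are hypothetical.

```python
# Minimal sketch of pulling a metric from a cloud warehouse (BigQuery here)
# before surfacing it in a BI tool. The project, dataset and table are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT order_date, SUM(revenue) AS revenue
    FROM `example_project.analytics.orders`   -- hypothetical table
    GROUP BY order_date
    ORDER BY order_date
"""
for row in client.query(query).result():
    print(row["order_date"], row["revenue"])
```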
time data processing Proficiency in shell scripting and experience with AWS cloud integration Familiarity with Refinitiv/Bloomberg market data and exposure to Python, Airflow, Observability, and CI/CD platforms Experience in maturing development practices within an agile-based team A learning mindset with the ability to adapt More ❯
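To give a rough sense of the ingestion plumbing implied here, below is a hedged sketch that lands a vendor market-data file in S3 with basic logging; the bucket, prefix and file path are assumptions, and the actual Refinitiv/Bloomberg delivery mechanism is not shown.

```python
# Hedged sketch: pick up a vendor market-data file and land it in S3, with
# logging so observability tooling can track the run. Names are assumptions.
import logging
from pathlib import Path

import boto3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("market_data_ingest")


def upload_market_data(local_file: Path, bucket: str, prefix: str) -> None:
    s3 = boto3.client("s3")
    key = f"{prefix}/{local_file.name}"
    log.info("uploading %s to s3://%s/%s", local_file, bucket, key)
    s3.upload_file(str(local_file), bucket, key)
    log.info("upload complete")

# Hypothetical usage:
# upload_market_data(Path("/data/inbound/eod_prices.csv"), "example-bucket", "market-data/eod")
```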
coding practices. Results-driven, with the capacity to meet strict deadlines and service level agreements (SLAs). Ideally, familiarity with workflow management tools like Airflow, SQL database management, Linux command-line interface, Microsoft Dynamics 365 Finance and Operations, and Power Apps. The Package: Basic annual salary up to More ❯
understanding of data architecture, data modelling, and best practices in data engineering Proficient in Python and SQL; experience with data processing frameworks such as Airflow, TensorFlow, or Spark is advantageous Willingness to gain working knowledge of backend development (e.g., Python with Django) for pipeline integration Familiarity with data versioning More ❯
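Because the role mentions backend work in Python with Django for pipeline integration, here is a hedged sketch of a Django management command that triggers a pipeline function; the app layout and the run_pipeline helper are hypothetical.

```python
# Hedged sketch of a Django management command that triggers a data pipeline.
# The pipelines.jobs module and run_pipeline helper are hypothetical.
from django.core.management.base import BaseCommand

from pipelines.jobs import run_pipeline   # hypothetical module


class Command(BaseCommand):
    help = "Run the nightly data pipeline from within the Django project"

    def add_arguments(self, parser):
        parser.add_argument("--date", help="Logical date to process, YYYY-MM-DD")

    def handle(self, *args, **options):
        run_pipeline(logical_date=options.get("date"))
        self.stdout.write(self.style.SUCCESS("pipeline completed"))
```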
related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and More ❯
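As an illustration of the data-quality checks implied above, expressed in PySpark as one reasonable choice, a short sketch follows; the input path and column names are illustrative.

```python
# Small PySpark data-quality check: count NULL and duplicate keys in a dataset.
# The input path and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/orders/")   # hypothetical path

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
duplicate_keys = total - df.dropDuplicates(["order_id"]).count()

print(f"rows={total} null_keys={null_keys} duplicate_keys={duplicate_keys}")
assert null_keys == 0, "order_id must not be NULL"
assert duplicate_keys == 0, "order_id must be unique"
```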
Databricks (PySpark, SQL, Delta Lake, Unity Catalog). You have extensive experience in ETL/ELT development and data pipeline orchestration (Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions). You're proficient in SQL and Python, using them to transform and optimize data. You know your way around More ❯
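To illustrate the Databricks/Delta Lake work described, here is a hedged sketch of a Delta upsert (assuming a Databricks runtime or the delta-spark package); the staging path and target table are illustrative assumptions.

```python
# Hedged sketch of a Delta Lake upsert (MERGE) as a Databricks pipeline step.
# The staging path and target table name are illustrative assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("s3://example-bucket/staging/orders/")  # hypothetical path
target = DeltaTable.forName(spark, "analytics.orders")               # hypothetical table

(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()      # update existing rows
    .whenNotMatchedInsertAll()   # insert new rows
    .execute()
)
```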
Oxford, Oxfordshire, United Kingdom Hybrid / WFH Options
Connect Centric LLC
to enhance scalability and efficiency. Collaboration & Leadership: Work closely with software and AI engineering teams while mentoring junior engineers. Legacy Workflow Integration: Manage ArgoCD, Airflow, Jenkins, Bitbucket, and Bamboo pipelines. Technical Ownership: Act as a tech owner for software products, liaising with stakeholders and presenting cloud solutions. Continuous Learning More ❯