Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Realtime Recruitment
for smooth deployments, ensuring platform stability and scalability through upgrades, and managing the data analytics platform. Responsibilities: Utilize modern data platform tools (dbt Core, Airflow, Airbyte, Snowflake, etc.). Collaborate with DevOps, Data Engineering, Infrastructure, and InfoSec for seamless application integration. Design, implement, and maintain scalable cloud data platform …
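The orchestration stack this role names (Airflow scheduling Airbyte syncs, Snowflake loads, and dbt Core runs) boils down to dependency-ordered task execution. A minimal stdlib-only sketch of that idea, assuming hypothetical task names for an Airbyte → Snowflake → dbt pipeline (a real platform would express this as an Airflow DAG):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps; each key depends on the tasks in its set.
DEPS = {
    "airbyte_sync": set(),
    "load_snowflake": {"airbyte_sync"},
    "dbt_run": {"load_snowflake"},
    "dbt_test": {"dbt_run"},
}

def run_order(deps):
    """Return task names in an order that respects all dependencies."""
    return list(TopologicalSorter(deps).static_order())

if __name__ == "__main__":
    print(run_order(DEPS))
```

Schedulers like Airflow add retries, backfills, and parallelism on top, but the core contract is the same: no task runs before its upstreams.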
Greater Manchester, England, United Kingdom Hybrid / WFH Options
ECOM
communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master’s or PhD in Data Science, Statistics, Computer Science, or related field. Benefits: Competitive salary and performance bonuses Flexible working options …
Kafka, Cloud Dataproc, Cloud Dataflow, Cloud Data Fusion, GCS, Cloud Pub/Sub, Cloud Functions, Bigtable, Python, Scala, Java, .NET, ANSI SQL, Spark, Airflow, shell scripts, Control-M, Git, CI/CD, HDFS, Unix File System, RDBMS, Azure DevOps, Harness. More about the role: Able to define Data Product Definition …
City of London, London, United Kingdom Hybrid / WFH Options
Square One Resources
Job Type: Permanent Job Responsibilities/Objectives Design and manage data warehouses using SQL, NoSQL, and cloud platforms. Develop ETL/ELT pipelines using Airflow and dbt. Collaborate with stakeholders to deliver impactful solutions. Ensure data quality, security, and governance. Required Skills/Experience The ideal candidate will have …
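The ETL/ELT pattern these responsibilities describe can be sketched end to end with the stdlib alone — here using `sqlite3` as a stand-in for a cloud warehouse, with hypothetical table names; in practice the transform step would be a dbt model executed inside Snowflake or similar:

```python
import sqlite3

def etl(rows):
    """Tiny extract-load-transform sketch: stage raw rows, then build a
    cleaned table with an in-warehouse SQL step (the ELT style dbt uses)."""
    con = sqlite3.connect(":memory:")  # stand-in for a warehouse connection
    con.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
    con.executemany("INSERT INTO raw_orders VALUES (?, ?)", rows)  # load
    # transform: filter out null/non-positive amounts inside the database
    con.execute(
        "CREATE TABLE orders AS "
        "SELECT id, amount FROM raw_orders "
        "WHERE amount IS NOT NULL AND amount > 0"
    )
    return con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

if __name__ == "__main__":
    print(etl([(1, 9.99), (2, None), (3, -5.0), (4, 20.0)]))  # → 2
```

The design choice worth noting is that the transform runs as SQL inside the store rather than in application code — that is the distinction between ELT (dbt-style) and classic ETL.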
with AWS services (S3, Glue, Lambda, SageMaker, Redshift, etc.) Strong Python and SQL skills; experience with PySpark a bonus Familiarity with containerization (Docker), orchestration (Airflow, Step Functions), and infrastructure as code (Terraform/CDK) Solid understanding of machine learning model lifecycle and best practices for deployment at scale Excellent …
Manchester, North West, United Kingdom Hybrid / WFH Options
InterQuest Group (UK) Limited
communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master's or PhD in Data Science, Statistics, Computer Science, or related field. Benefits: Competitive salary and performance bonuses Flexible working options …
into production. We are looking for an individual who is interested in working with the latest big data technology (Spark, EMR, Glue, SageMaker, and Airflow) and in collaborating with economists and scientists to create scalable solutions for our multiple Retail Businesses. Key job responsibilities - Partnering with economists and senior team …
time data processing Proficiency in shell scripting and experience with AWS cloud integration Familiarity with Refinitiv/Bloomberg market data and exposure to Python, Airflow, Observability, and CI/CD platforms Experience in maturing development practices within an agile-based team A learning mindset with the ability to adapt …
understanding of data architecture, data modelling, and best practices in data engineering Proficient in Python and SQL; experience with data processing frameworks such as Airflow, TensorFlow, or Spark is advantageous Willingness to gain working knowledge of backend development (e.g., Python with Django) for pipeline integration Familiarity with data versioning …
related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and …
Databricks (PySpark, SQL, Delta Lake, Unity Catalog). You have extensive experience in ETL/ELT development and data pipeline orchestration (Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions). You're proficient in SQL and Python, using them to transform and optimize data. You know your way around …
Oxford, Oxfordshire, United Kingdom Hybrid / WFH Options
Connect Centric LLC
to enhance scalability and efficiency. Collaboration & Leadership: Work closely with software and AI engineering teams while mentoring junior engineers. Legacy Workflow Integration: Manage ArgoCD, Airflow, Jenkins, Bitbucket, and Bamboo pipelines. Technical Ownership: Act as a tech owner for software products, liaising with stakeholders and presenting cloud solutions. Continuous Learning …
Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using dbt, Snowflake, Python, SQL, Terraform and Airflow * Experience in designing and implementing data products and solutions on cloud-based architectures. * Cloud Platforms: Experience working with cloud data warehouses and analytics platforms …
analysis, extraction, transformation, and loading, data intelligence, data security and proven experience in their technologies (e.g. Spark, cloud-based ETL services, Python, Kafka, SQL, Airflow) You have experience in assessing the relevant data quality issues based on data sources and use cases, and can integrate the relevant data quality checks …
data platform (PaaS) Ensure scalable storage solutions (data lakes, data warehouses) to handle structured and unstructured data. Implement ETL/ELT pipelines using Dagster, Airflow, or similar tools. Optimize performance and scalability for large data volumes. Govern data security, compliance, and access controls. Development & DevOps: Strong programming and scripting …
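"Optimize performance and scalability for large data volumes" usually starts with partitioned storage layout in the lake. A minimal stdlib sketch of the idea, grouping records into Hive-style daily partitions (field names here are hypothetical):

```python
from collections import defaultdict

def partition_by_day(records):
    """Group records into Hive-style daily partitions (dt=YYYY-MM-DD),
    the layout lake writers commonly use so queries can prune whole days."""
    parts = defaultdict(list)
    for rec in records:
        parts[f"dt={rec['ts'][:10]}"].append(rec)  # assumes ISO-8601 timestamps
    return dict(parts)

if __name__ == "__main__":
    demo = [{"ts": "2024-01-01T10:00:00", "v": 1},
            {"ts": "2024-01-02T09:30:00", "v": 2},
            {"ts": "2024-01-01T23:59:59", "v": 3}]
    print(sorted(partition_by_day(demo)))
```

Tools like Dagster or Airflow would schedule one materialization per partition; the payoff is that a query scoped to one day touches one directory instead of the whole dataset.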
development lifecycle and agile methodologies. Proven experience designing, developing, and deploying machine learning models. Experience with debugging ML models. Experience with orchestration frameworks (e.g., Airflow, MLflow). Experience deploying machine learning models to production environments. Knowledge of MLOps practices and tools for model monitoring and maintenance. Familiarity with …