AWS, Azure, GCP) Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks) Expertise with multiple data analytics tools (e.g. Power BI) Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling Proficiency in advanced programming languages (Python/PySpark, SQL) Experience in data pipeline orchestration (e.g. Airflow, Data Factory) Familiarity with DevOps and CI/CD …
SQL for complex queries and data extraction. Cloud Architecture & Data Management: Strong understanding of cloud platforms (e.g., AWS, Google Cloud, Microsoft Azure) and their architecture. Knowledge of data warehousing, ETL processes, data pipelines, and data governance principles. Data Visualisation & Reporting: Skilled in using data visualisation tools (e.g., Tableau, Power BI) to create insightful dashboards and reports for stakeholders. Business Acumen …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
to define requirements and deliver solutions Requirements: Hands-on experience with Azure Data Factory, Databricks, SQL, Data Lake, Power BI Strong skills in T-SQL and DAX Experience in ETL development, data modelling, and data governance Familiarity with CI/CD pipelines and Azure DevOps Excellent communication and problem-solving skills …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
governance, and supporting analytics teams with high-quality, reliable data. Key Responsibilities: Develop and maintain scalable data pipelines using Azure Data Factory, Synapse, Databricks, and Microsoft Fabric. Build efficient ETL/ELT processes and data models to support analytics, reporting, and dashboards. Optimise existing pipelines for performance, reliability, and cost efficiency. Implement best practices for data quality, error handling, automation …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
from the warehouse. Maintaining and improving the existing Python-based dynamic pricing model and web scraping tools. What we're looking for: Experience across a modern data stack for ETL/ELT processes and data warehousing. Strong SQL and Python skills, with an understanding of Kimball-style data modelling. Experience with DBT, Dagster, or Airflow for transformation and orchestration. …
engineering. Mandatory Skills: Cloud Platforms: Deep experience with AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge etc. Industrial Data Integration: Familiarity with OT data schema originating from OSIsoft PI, SCADA, MES, and Historian …
and reliable flow of data, building foundational skills in a supportive environment. Key Responsibilities Data Pipeline Development • Contribute to the design, building, and maintenance of scalable data pipelines • Implement ETL (Extract, Transform, Load) processes to ingest data from various sources • Ensure data quality, integrity, and reliability through thorough testing and validation Database Management • Help manage and optimise data storage solutions …
suited to someone who's confident communicating with data, product, and engineering teams, not a 'heads-down coder' type. Top 4 Core Skills Python - workflow automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing …
and shape the direction of the platform as it evolves, pushing the boundaries of what's possible with data and AI. What You'll Do Design & build high-performance ETL/ELT pipelines in modern cloud environments (including Azure, AWS, GCP, Snowflake or Databricks). Lead CI/CD automation, environment versioning, and production deployments for data products. Integrate AI …
quality Liaise with analytics capabilities across departments and co-ordinate to deliver business-centric solutions Engage in strategic data transformation projects to provide technical guidance Data Engineering & Automation: Develop ETL processes using SQL and Python; implement Power Automate and Power Apps for workflow automation. AI & Emerging Technologies: understand how to leverage emerging capabilities How This Opportunity Is Different This is …
this is your chance to make an impact. What You'll Work With Azure Data Services: Data Factory, Data Lake, SQL Databricks: Spark, Delta Lake Power BI: Advanced dashboards ETL & Data Modelling: T-SQL, metadata-driven pipelines DevOps: CI/CD Bonus: Python What you'll do Design and implement scalable Azure-based data solutions Build and optimize data pipelines …
standards, models, and frameworks. Design data solutions leveraging Azure services such as Azure Data Lake, Azure SQL Database, Azure Synapse Analytics, Azure Data Factory, and Azure Databricks. Data Integration & ETL: Develop and optimize data pipelines for ingestion, transformation, and storage using Azure Data Factory and Databricks. Governance & Security: Implement data governance, security, and compliance practices aligned with financial services regulations …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
design, build, and optimise scalable data pipelines and lakehouse architectures on Azure, enabling advanced analytics and data-driven decision making across the business. Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. Build and optimise data lakehouse architectures on Azure Data Lake Storage (ADLS). Develop high-performance data solutions using Azure …
London, South East, England, United Kingdom Hybrid/Remote Options
Proactive Appointments
model drifts, data-quality alerts, scheduled re-training pipelines. Data Management and Preprocessing: Collect, clean and preprocess large datasets to facilitate analysis and model training. Implement data pipelines and ETL processes to ensure data availability and quality. Software Development: Write clean, efficient and scalable code in Python. Utilize CI/CD practices for version control, testing and code review. Work …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
and external sources, ensuring data quality, governance, and integrity. * Implement efficient data models and schemas within Snowflake, and use DBT for transformation, orchestration, and workflow management. * Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability. * Troubleshoot and resolve data pipeline issues swiftly and effectively across the data platform. * Work with orchestration tools such as Airflow, ADF, or …
Java exposure beneficial Delta Lake/Delta table optimisation experience Git/GitLab, CI/CD pipelines, DevOps practices Strong troubleshooting and problem-solving ability Experience with lakehouse architectures, ETL workflows, and distributed computing Familiarity with time-series, market data, transactional data or risk metrics Nice to Have Power BI dataset preparation OneLake, Azure Data Lake, Kubernetes, Docker Knowledge of …
programming proficiency in Python and Spark (PySpark) or Scala, with the ability to build scalable and efficient data processing applications. Advanced understanding of data warehousing concepts, including dimensional modelling, ETL/ELT patterns, and modern data integration architectures. Extensive experience working with Azure data services, particularly Azure Data Factory, Azure Blob Storage, Azure SQL Database, and related components within the …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
experience with Snowflake. 2+ years production experience with dbt (mandatory). Advanced SQL and strong Python programming skills. Experience with Git, CI/CD, and DevOps practices. Familiarity with ETL/ELT tools and cloud platforms (AWS, Azure). Knowledge of Snowflake features such as Snowpipe, streams, tasks, and query optimisation. Preferred Qualifications Snowflake certifications (SnowPro Core or Advanced). …
City of London, London, United Kingdom Hybrid/Remote Options
Michael Page
innovative solutions and maintaining a strong reputation for excellence in analytics and data-driven decision-making. Description Senior Data Engineer Develop and maintain robust and scalable data pipelines and ETL processes. Optimise data workflows and ensure efficient data storage solutions. Collaborate with analytics and engineering teams to meet business objectives. Ensure data integrity and implement best practices for data governance. …