error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity with data warehousing solutions (e.g., Snowflake …
london (city of london), south east england, united kingdom
Luxoft
work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration - Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets - Design, implement …
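The extract → transform → load pattern this listing asks for can be sketched in a few lines. This is a minimal stdlib illustration of the shape only: the role's stack (PySpark on AWS EMR, Snowflake) runs the same pattern at cluster scale, and the table and column names here are invented for the example.

```python
# Minimal sketch of the extract -> transform -> load pattern.
# sqlite3 stands in for the warehouse; all names are illustrative.
import sqlite3

def extract(rows):
    """Extract: pretend these rows arrived from an upstream source."""
    return rows

def transform(rows):
    """Transform: drop malformed rows and normalise the amount to pennies."""
    cleaned = []
    for name, amount in rows:
        if name and amount is not None:
            cleaned.append((name.strip().lower(), int(round(amount * 100))))
    return cleaned

def load(conn, rows):
    """Load: insert the cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (payer TEXT, pennies INTEGER)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

raw = [(" Alice ", 12.50), ("Bob", 3.99), ("", 1.00), ("Carol", None)]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
total = conn.execute("SELECT SUM(pennies) FROM payments").fetchone()[0]
print(total)  # 1649: the two valid rows, 1250 + 399
```

The same three stages map one-to-one onto a PySpark job (read, DataFrame transformations, write), which is what makes the pattern portable across the tools these listings name.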
london (city of london), south east england, united kingdom
83data
suited to someone who’s confident communicating with data, product, and engineering teams, not a “heads-down coder” type. Top 4 Core Skills Python — workflow automation, data processing, and ETL/ELT development. Snowflake — scalable data architecture, performance optimisation, and governance. SQL — expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) — modular data modelling, testing …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
management position. Financial services experience is essential, ideally covering lending, servicing, or securitisation data. Deep technical expertise in Microsoft and Azure data technologies. Strong knowledge of data modelling (Kimball), ETL/ELT, and hybrid cloud architectures. Proven ability to drive quality, governance, and best practices within engineering teams. Excellent communication, stakeholder management, and leadership skills. If you're interested …
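Kimball-style dimensional modelling, named above, pairs a central fact table of measurable events with denormalised dimension tables. A minimal sketch using stdlib sqlite3 — the lending-flavoured table and column names are hypothetical, chosen only to echo the listing's domain:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# One dimension table per business entity, one fact table of events.
cur.execute("CREATE TABLE dim_borrower (borrower_key INTEGER PRIMARY KEY, name TEXT, region TEXT)")
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER)")
cur.execute("CREATE TABLE fact_repayment (borrower_key INTEGER, date_key INTEGER, amount REAL)")
cur.executemany("INSERT INTO dim_borrower VALUES (?, ?, ?)",
                [(1, "Acme Ltd", "UK"), (2, "Globex", "EU")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", 2024), (20240201, "2024-02-01", 2024)])
cur.executemany("INSERT INTO fact_repayment VALUES (?, ?, ?)",
                [(1, 20240101, 100.0), (1, 20240201, 150.0), (2, 20240101, 75.0)])
# A typical star-schema query: facts aggregated by dimension attributes.
rows = cur.execute("""
    SELECT b.region, SUM(f.amount)
    FROM fact_repayment f JOIN dim_borrower b USING (borrower_key)
    GROUP BY b.region ORDER BY b.region""").fetchall()
print(rows)  # [('EU', 75.0), ('UK', 250.0)]
```

The design choice the star schema encodes is that analytical queries join one wide fact table to small dimensions, which keeps joins shallow and aggregations fast.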
London, South East, England, United Kingdom Hybrid / WFH Options
Akkodis
WFH. Duration: 3 months rolling contract. Type of contract: Freelance, Inside IR35. Level: Mid-Senior. Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues. Requirements: Proven …
london (city of london), south east england, united kingdom
Sanderson
Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling) DAX, Visual Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies ETL/ELT orchestration for structured and unstructured data Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation Performance monitoring and orchestration of Fabric solutions Power BI semantic models and …
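"SCD handling" in the skills list above most commonly means slowly changing dimensions Type 2: rather than overwriting a changed attribute, the current dimension row is closed out and a new versioned row opened, preserving history. A minimal plain-Python sketch of that mechanic — in this role it would be done in PySpark or T-SQL, and the field names here are invented:

```python
from datetime import date

def scd2_apply(dim_rows, key, new_attrs, today):
    """SCD Type 2: expire the current row for `key`, append a new version.

    dim_rows: list of dicts with a natural key, attributes, valid_from,
    and valid_to (None marks the current row). History is never deleted.
    """
    for row in dim_rows:
        if row["key"] == key and row["valid_to"] is None:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows  # attributes unchanged, nothing to do
            row["valid_to"] = today  # close out the old version
    dim_rows.append({"key": key, **new_attrs, "valid_from": today, "valid_to": None})
    return dim_rows

dim = [{"key": "C1", "segment": "retail", "valid_from": date(2023, 1, 1), "valid_to": None}]
scd2_apply(dim, "C1", {"segment": "corporate"}, date(2024, 6, 1))
current = [r for r in dim if r["valid_to"] is None]
print(len(dim), current[0]["segment"])  # 2 corporate
```

The validity-date pair is what lets fact rows join to the dimension "as of" their event date, which is the whole point of Type 2 over a simple overwrite (Type 1).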
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
internal best practices What You’ll Bring: Strong experience in data engineering with Microsoft Fabric Solid understanding of DataOps, CI/CD, and automation Hands-on experience with Jira, ETL/ELT, and data modelling Familiarity with Power BI, DAX, or Azure DevOps Excellent communication and stakeholder engagement skills Consulting or client-facing experience is a plus Career Progression: Clear …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
chance to make tangible impact across data quality, scalability, and reliability. Requirements 5+ years' experience in Data Engineering with strong Python and SQL skills. Background in building and maintaining ETL/ELT pipelines in a cloud environment (AWS or Azure). Knowledge of data warehousing (e.g., Redshift, Postgres) and tools like DBT. Familiarity with CI/CD, data modelling …
Banbury, Oxfordshire, United Kingdom Hybrid / WFH Options
Cornwallis Elt Ltd
solutions. Develop and maintain data models, schemas, and documentation. Lead by example - setting standards for coding, design, and delivery in Databricks. Design, build, and maintain scalable data pipelines and ETL processes across cloud platforms, databases, and APIs. Optimise data systems for performance, reliability, and scalability. Collaborate with the Data Architect to shape and deliver the data architecture roadmap. Maintain strong …
London, South East, England, United Kingdom Hybrid / WFH Options
Asset Resourcing Limited
Extensive experience in designing cloud data platforms using Azure, AWS, or exceptional on-premise design expertise. - At least 5 years in data engineering or business intelligence roles. - Proficiency in ETL and data pipeline design, with a technology-agnostic approach. - A solid understanding of data warehouse and data lake principles. - Expert SQL skills and demonstrable data modelling capabilities. About the Company …
london (city of london), south east england, united kingdom
Next Ventures
Salesforce Administrator) are a plus. Preferred Skills Strong knowledge of integration patterns and authentication protocols. Knowledge of DevOps tools. Familiarity with the finance industry is a plus. Experience with ETL tools and data visualization platforms (e.g., Tableau, Power BI). Knowledge of programming languages (e.g., Python, Apex) for data manipulation and automation. Familiarity with cloud computing concepts and technologies.
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
in most of the below areas... Experience in a Data Engineering (or similar) role Strong scripting skills in SQL (and Python would be a bonus) Experience designing and developing ETL/ELT processes using the Azure platform - Azure Synapse, Data Factory, Databricks or Fabric Knowledge of data lakes and medallion lake house design Working knowledge of Power BI or similar …
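The "medallion lake house design" mentioned above layers data as bronze (raw as landed), silver (validated and de-duplicated), and gold (business-level aggregates). A stdlib sketch of that layering — in this role it would live in Synapse, Databricks, or Fabric, and the record shapes below are invented for illustration:

```python
# Bronze layer: raw records exactly as they landed, warts and all.
bronze = [
    {"id": "1", "city": " London", "sales": "100"},
    {"id": "2", "city": "leeds", "sales": "40"},
    {"id": "2", "city": "leeds", "sales": "40"},   # duplicate landing
    {"id": "3", "city": "London", "sales": "bad"}, # unparseable value
]

# Silver layer: typed, cleaned, de-duplicated on the business key.
silver, seen = [], set()
for rec in bronze:
    try:
        sales = int(rec["sales"])
    except ValueError:
        continue  # a real pipeline would quarantine these rows
    if rec["id"] in seen:
        continue
    seen.add(rec["id"])
    silver.append({"id": rec["id"], "city": rec["city"].strip().title(), "sales": sales})

# Gold layer: aggregate ready for a Power BI report.
gold = {}
for rec in silver:
    gold[rec["city"]] = gold.get(rec["city"], 0) + rec["sales"]
print(gold)  # {'London': 100, 'Leeds': 40}
```

Keeping bronze immutable is the key design choice: silver and gold can always be rebuilt from it when cleaning rules change.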
london, south east england, united kingdom Hybrid / WFH Options
Tenth Revolution Group
for scalable, reliable, and governed data solutions. Strong leadership and mentorship capabilities, guiding teams through complex deliveries, fostering collaboration, and ensuring adoption of best practices. Skilled in orchestrating complex ETL workflows, integrating hybrid cloud environments, and delivering high-quality data for advanced analytics and reporting. Experience with Power BI, and building dynamic dashboards to uncover actionable insights. Excellent communication and …