London, England, United Kingdom Hybrid / WFH Options
nudge
means we need to stay agile, meaning the responsibilities of a role are never set in stone. Responsibilities Design, build, and maintain robust, scalable, and secure data pipelines and ETL processes from various structured and unstructured data sources Leverage data mining techniques to extract meaningful insights from large-scale datasets, including customer transactions, behavioural data, and third-party data providers More ❯
Kirkby on Bain, England, United Kingdom Hybrid / WFH Options
ANGLIAN WATER-2
Modeller, design robust, secure and supportable corporate data solutions to meet business requirements following dimensional modelling methodology, considering privacy by design and self-service capabilities by default. As an ETL Developer, develop, test and/or quality assure extracts of data from corporate systems into the Data Lake. As a Semantic Layer Developer, develop, test and/or quality assure More ❯
on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices Familiarity with data visualisation tools (Tableau, Power BI, Looker) and analytics frameworks Leadership & Communication Proven experience leading technical More ❯
London, England, United Kingdom Hybrid / WFH Options
DATAPAO
with a focus on cloud platforms (AWS, Azure, GCP); You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); You’re proficient in SQL and Python, using them to transform and optimize More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Primus
offered on an Outside IR35 basis. Key Responsibilities: Design, build, and maintain scalable data pipelines using Microsoft Fabric, Azure Synapse, and Azure Data Factory (ADF). Develop and optimise ETL/ELT processes to support business intelligence and analytics solutions. Collaborate with data architects, analysts, and stakeholders to understand business requirements and translate them into technical solutions. Contribute to the … needed. Key Skills & Experience: Proven experience as a Data Engineer working in cloud-native environments. Microsoft Fabric Azure Synapse Analytics Azure Data Factory (ADF) Strong understanding of data modelling, ETL/ELT development, and data warehouse best practices. Experience working across greenfield and brownfield projects. Comfortable working independently in a remote setup. Strong communication and stakeholder engagement skills. Nice to More ❯
London, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Fashion Company - London (Tech Stack: Data Engineer, Databricks, Python, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) We're recruiting on behalf of a leading fashion brand based in London that's recognised for combining creativity with cutting-edge technology. They're on the lookout for a talented Data Engineer to join their growing data team. … analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow) BigQuery Cloud Storage Dataflow Pub/Sub Cloud Functions IAM Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. Ensure data quality and compliance with governance standards. Collaborate with data analysts, scientists, and stakeholders to deliver clean, usable data. … Looking For 3+ years' experience in Data Engineering roles. Strong skills in Python and SQL . Hands-on experience with GCP , particularly the services listed above. Solid understanding of ETL , data warehousing , and data modelling principles. Familiarity with Terraform or similar infrastructure-as-code tools. Knowledge of data governance and data quality management . Experience with version control systems like More ❯
London, England, United Kingdom Hybrid / WFH Options
Supermercados Guanabara
and mentoring a team of data engineers, setting standards and best practices. Architecting and building end-to-end data solutions and streaming platforms. Designing and implementing data pipelines and ETL workflows using Pentaho BA. Creating engaging BI reports and dashboards using Power BI. Managing cloud-based data environments (AWS, Azure), ensuring scalability and resilience. Collaborating on project planning, risk management … environments. Essential Experience Includes 5+ years in data engineering with leadership responsibilities. Deep expertise in AWS and/or Azure data platforms and services. Proficiency with Pentaho BA for ETL processes and data workflows. Strong skills in Power BI, Python, and Java. Agile development experience and a pragmatic, team-first approach. Excellent problem-solving and communication skills, including the ability … SSIS, AWS or Azure Data Factory. Familiarity with Hadoop, Jenkins, or DevOps practices including CI/CD. Cloud certifications (Azure or AWS). Knowledge of additional programming languages or ETL tools. This is a fantastic opportunity to take a leadership role on meaningful, large-scale government programmes while continuing to develop your skills and experience in an inclusive, innovative environment. More ❯
London, England, United Kingdom Hybrid / WFH Options
freemarketFX Limited
Gold layers), working with modern tools such as Databricks , dbt , Azure Data Factory , and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities ETL Development : Design and build robust and reusable ETL/ELT pipelines through the Medallion architecture in Databricks . Data Transformation : Create and manage data models and transformations using dbt , ensuring More ❯
London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
helping them make informed, data-driven decisions. Key Responsibilities: Develop and implement data engineering solutions using tools like Azure Data Factory, Azure Synapse, and Fabric . Build and optimize ETL/ELT pipelines , data lakes, and data warehouses. Engage with stakeholders to translate business requirements into effective data strategies . Support data governance, architecture, and best practices to ensure high More ❯
London, England, United Kingdom Hybrid / WFH Options
Circana
ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using More ❯
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
following product/solution development lifecycles using frameworks/methodologies such as Agile, SAFe, DevOps and use of associated tooling (e.g., version control, task tracking). Demonstrable experience writing ETL scripts and code to ensure ETL processes perform optimally. Experience in other programming languages for data manipulation (e.g., Python, Scala). Extensive experience of data engineering and the More ❯
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
following product/solution development lifecycles using frameworks/methodologies such as Agile, SAFe, DevOps and use of associated tooling (e.g., version control, task tracking). Demonstrable experience writing ETL scripts and code to ensure ETL processes perform optimally. Experience in other programming languages for data manipulation (e.g., Python, Scala). Extensive experience of data engineering and the More ❯
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or Power BI. Excellent communication and stakeholder management skills. Google Cloud Professional certifications. Experience in alternative cloud More ❯
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or Power BI. Excellent communication and stakeholder management skills. Desirable Experience: Google Cloud Professional certifications. Experience in More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
with relational SQL databases either on premises or in the cloud. Power platform experience is desirable. Experience delivering multiple solutions using key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI. A solid understanding of key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and More ❯
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or Power BI. Excellent communication and stakeholder management skills. Google Cloud Professional certifications. Experience in alternative cloud More ❯
Manchester, England, United Kingdom Hybrid / WFH Options
Version 1
analytics to enhance decision-making capabilities. Provide technical expertise on Oracle FDI architecture, including semantic models, data pipelines, and performance optimisation. Oversee data integration and transformation processes, ensuring seamless ETL from Oracle Cloud ERP, HCM, SCM, and third-party applications. Collaborate with business users to identify key performance indicators (KPIs) and develop self-service analytics using FDI tools. Optimise data … focus on data modelling and analytics. Extensive knowledge of Oracle Analytics offerings, including Fusion Data Intelligence, OAC, OBIEE, Machine Learning, and ADW. Strong expertise in data warehousing concepts, including ETL, data pipelines, and semantic modelling within Oracle FDI. Proven ability to build advanced analytics, KPIs, and data visualisations within FDI to drive business insights. Experience in Oracle Cloud ERP, HCM More ❯
London, England, United Kingdom Hybrid / WFH Options
DATAPAO
to fit the bill? Technical Expertise 5+ years in Data Engineering , focusing on cloud platforms (AWS, Azure, GCP); Proven experience with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); Extensive ETL/ELT and data pipeline orchestration experience (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions); Proficiency in SQL and Python for data transformation and optimization; Knowledge of CI/ More ❯
London, England, United Kingdom Hybrid / WFH Options
Just Eat Takeaway.com
with data visualization tools such as Looker, ThoughtSpot, or Tableau. Possess highly proficient SQL skills for querying and manipulating large, complex datasets. Exhibit strong experience with data warehousing concepts, ETL/ELT processes, and various database technologies. Show knowledge of a relevant data engineering programming language or framework such as Python, Bash, or Django. Hold practical experience working with major cloud platforms, including More ❯
London, England, United Kingdom Hybrid / WFH Options
Nadara
of enterprise data architecture patterns, including data lake, data warehouse, lakehouse, and cloud-native designs. Experience with Inmon, Data Vault 2.0, Kimball, and dimensional modelling. Knowledge of integration patterns, ETL/ELT processes, and tools (e.g., Apache Airflow, Azure Data Factory, Informatica, Talend) to orchestrate data workflows. Familiarity with DevOps/MLOps principles, CI/CD pipelines, and infrastructure as More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
AWS, Azure, or GCP, including infrastructure as code (e.g., Terraform) Understanding of network protocols, security frameworks, and best practices for secure system architecture Familiarity with data engineering concepts, including ETL/ELT pipelines, big data tools, and AI/ML workflows Ability to troubleshoot complex system issues, perform root-cause analysis, and implement effective solutions Excellent communication, teamwork, and organizational More ❯