Essential: Strong programming skills in Python and SQL; knowledge of Java or Scala is a plus. Solid experience with relational databases and data modelling (Data Vault, Dimensional). Proficiency with ETL tools and cloud platforms (AWS, Azure, or GCP). Experience working in Agile and DevOps environments. Knowledge of AI/ML applications in data workflows is desirable. Familiarity with visualisation tools …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
DevOps, and GitHub. Data Modelling: Apply hands-on expertise in enterprise data modelling using tools such as ERwin, ER/Studio, or PowerDesigner to ensure performance, scalability, and maintainability. ETL/ELT Frameworks: Design robust data pipelines using tools like Cloud Composer, Dataflow, Dataproc, Informatica, or IBM DataStage, supporting both batch and streaming ingestion. Data Governance & Quality: Implement governance frameworks …
markets, balancing mechanisms, and regulatory frameworks (e.g., REMIT, EMIR). Expert in Python and SQL; strong experience with data engineering libraries (e.g., Pandas, PySpark, Dask). Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. …
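As an illustration of the ETL/ELT work these roles describe, here is a minimal sketch in plain Python: extract raw records, transform them (filter and derive a field), and load them into a warehouse table. The trade data, field names, and table schema are all hypothetical, chosen only to show the extract-transform-load shape.

```python
import sqlite3

# Extract: raw records as they might arrive from an upstream source
# (illustrative data; real pipelines would read from files, APIs, or queues).
raw_trades = [
    {"id": 1, "market": "UK", "volume_mwh": 120.0, "price": 85.5},
    {"id": 2, "market": "DE", "volume_mwh": 0.0, "price": 91.0},  # zero volume: dropped
    {"id": 3, "market": "UK", "volume_mwh": 40.0, "price": 79.0},
]

# Transform: drop empty trades and derive a notional value per trade.
clean = [
    {**t, "notional": t["volume_mwh"] * t["price"]}
    for t in raw_trades
    if t["volume_mwh"] > 0
]

# Load: insert the transformed rows into an in-memory warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (id INTEGER, market TEXT, volume_mwh REAL, price REAL, notional REAL)"
)
conn.executemany(
    "INSERT INTO trades VALUES (:id, :market, :volume_mwh, :price, :notional)", clean
)
row_count = conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0]
```

Production pipelines built with Pandas, PySpark, or Dask follow the same three stages, just with distributed data structures in the transform step.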
Experience with Tableau, Alteryx, Tableau Prep. Hands-on experience with SAP SuccessFactors, SAP Analytics Cloud (SAC). Knowledge of data visualisation best practices and dashboard design principles. Experience with ETL processes and data transformation workflows. Understanding of HR data structures, key metrics, and reporting requirements. Experience supporting HR system migrations or implementations. Experience supporting technology implementations or large-scale transformation …
into the reporting layer, enabling analysts and stakeholders to uncover insight. Location: London Contract: 6 Months (Outside IR35) Rate: £325-400 p/day Key Responsibilities: Designing and maintaining scalable ETL/ELT pipelines. Integrating data from multiple sources into a centralised warehouse (SQL Server, PostgreSQL, or Snowflake). Working with Azure Data Factory and cloud-native tooling for data …
City Of Westminster, London, United Kingdom Hybrid / WFH Options
Bennett and Game Recruitment LTD
background in data analysis, cleansing, enrichment, metadata management, and data innovation. Proficiency in SQL and MySQL. Experience with business intelligence tools: Power BI, Sisense, Quicksight. Hands-on experience with ETL tools, ideally Sagemaker Unified Studio. Ability to validate, test, and build data engineering solutions. Strong communication skills with ability to translate between technical and non-technical stakeholders. Desirable …
analytics and reporting. You'll work with diverse datasets and collaborate with cross-functional teams to ensure data is accurate, accessible, and actionable. Key Responsibilities Build and maintain scalable ETL/ELT pipelines. Design data models and schemas to support analytics and reporting. Integrate data from APIs, internal systems, and streaming sources. Monitor and ensure data quality and availability. Collaborate …
Databricks Engineer London - hybrid - 3 days per week on-site 6 Months+ UMBRELLA only - Inside IR35 Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in DBT running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with analytics …
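The orchestration responsibility above — running extract, transform, and load steps in dependency order — is the core idea behind Airflow. A minimal sketch of that idea in standard-library Python (the task names and bodies are hypothetical stand-ins; a real Airflow DAG would wrap each in an operator, with the transform step perhaps delegating to a DBT run on Databricks):

```python
from graphlib import TopologicalSorter

# Hypothetical task callables standing in for real pipeline steps.
def extract():
    return "raw"

def transform():
    return "clean"  # in the role above, e.g. a DBT model run

def load():
    return "loaded"

tasks = {"extract": extract, "transform": transform, "load": load}

# Each task maps to the set of tasks it depends on (its upstream tasks).
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

# Resolve a valid execution order, then run each task in turn --
# exactly what a scheduler like Airflow automates (plus retries,
# backfills, and scheduling).
run_order = list(TopologicalSorter(deps).static_order())
results = {name: tasks[name]() for name in run_order}
```

`graphlib` is available from Python 3.9; Airflow performs the same topological resolution over operator dependencies declared with `>>`.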
skills, Experience in risk management (Market, Credit, Regulatory). Familiarity with risk measures: VAR, CE/PE, PFE. Success in managing multi-terabyte data warehouses. Skilled in data warehousing, ETL/ELT, and reporting tools. Scripting skills (Python, PowerShell). Knowledge of applications, data governance, and cybersecurity. Preferred: Experience with data modelling tools like dbt. Knowledge of orchestration tools and …
London, South East, England, United Kingdom Hybrid / WFH Options
Deerfoot Recruitment Solutions Ltd
focus on Snowflake and Power BI. Strong expertise in data modelling, query optimisation, and Power BI development (DAX, dashboards). Hands-on with dbt, SQL, Python, and ELT/ETL pipeline development. Experience with Azure Data Factory (ADF), and migrating legacy data platforms to cloud solutions. Financial services exposure, ideally with market or credit risk analytics. Ability to produce technical …
building and guiding high-performing engineering teams • Strong hands-on expertise in cloud platforms (GCP, AWS, or Azure) with emphasis on cloud-native architectures • Deep understanding of data engineering - ETL/ELT, real-time/batch processing, data lakes/warehouses, and data governance • Track record of delivering complex, scalable systems that drive measurable business impact • Strong software engineering foundation …
technology domains including CMS, CRM, Martech platforms, data pipelines, analytics, and cloud services. Exposure to CDPs (e.g., Bloomreach, Segment, BlueConic, Tealium) and data integration pipelines. Understanding of data modelling, ETL processes, and basic analytics or BI tools like Power BI or Tableau. Experience in the sports, entertainment, or fan engagement domain is a strong plus. What can we offer you …
Platform proficiency - comprehensive experience with Power BI Desktop and Service environments DAX mastery - advanced skills in creating complex calculations and measures Data architecture - proven experience in data modeling and ETL process design Database skills - thorough understanding of relational databases and SQL proficiency Integration capabilities - connecting diverse data sources to Power BI platforms Stakeholder engagement - strong communication skills for requirements gathering …
scientists, analysts, and software engineers to ensure the company's data strategy underpins their innovative financial products. Key Responsibilities: Lead the design, development, and optimisation of data pipelines and ETL processes. Architect scalable data solutions to support analytics, machine learning, and real-time financial applications. Drive best practices for data engineering, ensuring high levels of data quality, governance, and security. …
Bonus if you have - Proficiency in SQL Server, Azure SQL Database and other enterprise database platforms Experience with Azure data services (Azure Data Factory, Azure Synapse, Azure Data Lake) ETL/ELT: experience designing and implementing data integration workflows Knowledge of reporting and analytics platforms (Power BI, SSRS, or similar) Experience with dimensional modelling and data warehouse architecture patterns API …
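The dimensional modelling this listing asks for typically means a star schema: a central fact table keyed to surrounding dimension tables. A minimal sketch using SQLite (the table names, keys, and sample rows are all hypothetical):

```python
import sqlite3

# Build a tiny star schema: two dimensions and one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);

INSERT INTO dim_date    VALUES (20240101, '2024-01-01'), (20240102, '2024-01-02');
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO fact_sales  VALUES (20240101, 1, 100.0), (20240101, 2, 50.0),
                               (20240102, 1, 75.0);
""")

# The characteristic star-schema query: join facts to a dimension
# and aggregate by a dimension attribute.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
```

The same pattern scales up directly to Azure Synapse or a SQL Server warehouse; surrogate keys and slowly changing dimensions are the main additions in production designs.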
Azure DevOps Consulting: Requirements gathering, stakeholder engagement, agile delivery Desirable Experience Power BI/Tableau Azure Functions (Python/C#) Streaming (Event Hubs, Stream Analytics) CI/CD, SDLC, ETL best practices Please only apply if you have full right to work in the UK. This is remote with occasional travel as required to client sites and falls outside of …
their team on an initial 6-month contract. Key skills/knowledge required: Proficiency in Python and SQL for data engineering and application development Experience building data pipelines and ETL processes for capital markets data Hands-on experience with AWS cloud services and Snowflake data platform Experience with cloud-native development patterns and AWS services (Lambda, S3, RDS, etc.) Hands …
related field. Proven expertise in Azure cloud services, including Infrastructure as Code deployment and CI/CD pipelines (GitHub/Azure DevOps). Strong experience in SQL, data warehousing, ETL design, and best practices. Proficiency in Python for data engineering tasks. Analytical skills to work with structured and unstructured datasets. Excellent communication skills with the ability to explain technical solutions …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
decisions and shape the company's long-term data strategy. What you'll be doing: Designing, developing, and owning scalable data pipelines and APIs. Building infrastructure for optimal ETL/ELT processes across diverse data sources. Working with Azure, dbt, SQL, and Python to deliver cloud-first solutions. Taking ownership of data quality and identifying opportunities for continuous improvement. …
appropriate architecture design, opting for modern architectures where possible. Data Modeling: Design and optimize data models and schemas for efficient storage, retrieval, and analysis of structured and unstructured data. ETL Processes: Develop, optimize and automate ETL workflows to extract data from diverse sources, transform it into usable formats, and load it into data warehouses, data lakes or lakehouses. Big Data … teams, including data scientists, analysts, and software engineers, to understand requirements, define data architectures, and deliver data-driven solutions. Documentation: Create and maintain technical documentation, including data architecture diagrams, ETL workflows, and system documentation, to facilitate understanding and maintainability of data solutions. Best Practices: Stay current with emerging technologies and best practices in data engineering, cloud architecture, and DevOps. Mentoring … and Flink. Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS …