intelligence and reporting tools like Tableau, Power BI or similar. Experience with version control systems (e.g. Git). Ability to work in an Agile environment. Experience with Microsoft SQL. Experience with ETL Tools and Data Migration. Experience with Data Analysis, Data mapping and UML. Experience with programming languages (Python, Ruby, C++, PHP, etc.). The ability to work with large datasets across More ❯
Spark, Databricks, or similar data processing tools. Strong technical proficiency in data modeling, SQL, NoSQL databases, and data warehousing. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud and cloud-based More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
SQL, Spark SQL, and Python for data processing and automation. Knowledge of Microsoft Fabric and Azure Data Factory would be useful but not essential. Power BI. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats. Familiarity with workflow automation tools (eg, Power Automate) and/ More ❯
of actuaries, data scientists and developers. Our role in this mission is to pioneer advancements in the field of pensions and beyond, leveraging state-of-the-art technology to extract valuable and timely insights from data. This enables the consultant to better advise Trustees and Corporate clients on a wide range of actuarial-related areas. The Role As a Machine … model drift, data-quality alerts, scheduled re-training pipelines. Data Management and Preprocessing. Collect, clean and preprocess large datasets to facilitate analysis and model training. Implement data pipelines and ETL processes to ensure data availability and quality. Software Development. Write clean, efficient and scalable code in Python. Utilize CI/CD practices for version control, testing and code review. Work More ❯
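By way of illustration only, a minimal sketch of the kind of preprocessing and data-quality gate described above; the column names, threshold and failure behaviour are assumptions for the example, not details from the advert.

```python
# Minimal sketch of a data-quality gate ahead of model training.
# Column names, the null threshold and the alerting behaviour are illustrative assumptions.
import pandas as pd

REQUIRED_COLUMNS = ["member_id", "scheme_id", "valuation_date", "liability"]
MAX_NULL_FRACTION = 0.02  # fail the batch if more than 2% of a column is null


def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality issues for this batch."""
    issues = []
    for col in REQUIRED_COLUMNS:
        if col not in df.columns:
            issues.append(f"missing column: {col}")
        elif df[col].isna().mean() > MAX_NULL_FRACTION:
            issues.append(f"too many nulls in {col}: {df[col].isna().mean():.1%}")
    return issues


def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    """Clean a raw extract before it is handed to model training."""
    issues = check_quality(df)
    if issues:
        # In a scheduled pipeline this would raise a data-quality alert rather than fail silently.
        raise ValueError("; ".join(issues))
    df = df.drop_duplicates(subset="member_id")
    df["valuation_date"] = pd.to_datetime(df["valuation_date"])
    return df
```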
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
workshops, and shape pre-sales proposals (40–50% of the role). Advise senior client stakeholders (CDOs, CIOs, Heads of Data) on data strategy, governance, and platform modernisation. Design robust ETL/ELT frameworks with Azure Data Factory, SSIS, Informatica, or IBM DataStage. Lead data modelling using ERwin, ER/Studio, or PowerDesigner. Implement data governance and quality frameworks using Unity More ❯
to RfP response. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients. Excellent ETL skills, data modeling skills. Excellent communication skills. Ability to define the monitoring, alerting, deployment strategies for various services. Experience providing solutions for resiliency, failover, monitoring etc. Good to have More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
knowledge of Microsoft Fabric, Azure Data Factory, Power BI, and related Azure tools. Strong proficiency in SQL, Spark SQL, and Python for data processing and automation. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc). Familiarity with workflow automation tools (eg More ❯
AWS - Data source exploration, data warehousing expansion, horizontal datasets for downstream consumption - Work with customers to build dashboards with the right KPIs and metrics for decision making - Data quality checks, ETL/ELT processes, automation. Technical Requirements: - Strong proficiency in SQL and Python programming - Extensive experience with data modeling and data warehouse concepts - Advanced knowledge of AWS data services, including: S3 … Redshift, AWS Glue, AWS Lambda - Experience with Infrastructure as Code using AWS CDK - Proficiency in ETL/ELT processes and best practices - Experience with data visualization tools (QuickSight). Required Skills: - Strong analytical and problem-solving abilities - Excellent understanding of dimensional modeling and star schema design (facts, dimensions, SCD Type 2) - Experience with agile development methodologies - Strong communication skills and ability More ❯
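As a hedged illustration of the star-schema skill listed above, a minimal SCD Type 2 upsert sketched in pandas; the dimension layout (customer_id, address, valid_from, valid_to, is_current) and run-date handling are assumptions for the example, not anything specified in the advert.

```python
# Sketch of an SCD Type 2 upsert: expire changed rows, append new current versions.
# All column names are illustrative assumptions.
import pandas as pd


def scd2_upsert(dim: pd.DataFrame, staged: pd.DataFrame, run_date: str) -> pd.DataFrame:
    """Apply a staged snapshot to a Type 2 dimension table."""
    current = dim[dim["is_current"]]
    merged = current.merge(staged, on="customer_id", suffixes=("_old", "_new"))
    changed_ids = merged.loc[merged["address_old"] != merged["address_new"], "customer_id"]

    # Expire the old versions of rows whose tracked attribute changed.
    expire_mask = dim["customer_id"].isin(changed_ids) & dim["is_current"]
    dim.loc[expire_mask, ["valid_to", "is_current"]] = [run_date, False]

    # Append new current versions for changed keys and brand-new keys.
    new_keys = set(changed_ids) | (set(staged["customer_id"]) - set(dim["customer_id"]))
    new_rows = staged[staged["customer_id"].isin(new_keys)].copy()
    new_rows["valid_from"] = run_date
    new_rows["valid_to"] = None
    new_rows["is_current"] = True
    return pd.concat([dim, new_rows], ignore_index=True)
```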
e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics and BI tools More ❯
City of London, London, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
Key Responsibilities 🔹 Build & Develop – Design and maintain a robust Databricks Data Platform, ensuring performance, scalability, and availability. 🔹 Data Pipelines – Connect APIs, databases, and data streams to the platform, implementing ETL/ELT processes. 🔹 Data Integrity – Embed quality measures, monitoring, and alerting mechanisms. 🔹 CI/CD & Automation – Create deployment pipelines and automate workflows. 🔹 Collaboration – Work with stakeholders across Global IT, Data … hands-on experience with Databricks , and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling expertise for analytics use cases. ✅ Strong ETL/ELT development skills. ✅ Python scripting experience for data automation. ✅ Experience with CI/CD methodologies for data platforms. ✅ Knowledge of MS SQL Server, SSIS, Visual Studio, and SSDT projects More ❯
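A rough PySpark sketch of the pipeline and data-integrity responsibilities listed above; the table names and the null-ratio threshold are assumptions for the example rather than details from the role.

```python
# Illustrative PySpark ETL step of the kind that runs on a Databricks-style platform.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read a raw table, deduplicate and standardise types.
raw = spark.read.table("raw.sales_orders")
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

# Simple data-integrity check before publishing to the curated layer.
null_ratio = cleaned.filter(F.col("customer_id").isNull()).count() / max(cleaned.count(), 1)
if null_ratio > 0.01:
    raise ValueError(f"customer_id null ratio too high: {null_ratio:.2%}")

cleaned.write.mode("overwrite").saveAsTable("curated.sales_orders")
```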
Expertise in Microsoft Azure data services and hands-on experience with Microsoft Fabric (OneLake, Lakehouse, DirectLake, Power BI integration, etc.) would be a distinct advantage. Strong understanding of data modelling, ETL/ELT pipelines, and data warehousing principles. Skilled in designing scalable and secure solutions using best practices and industry frameworks. Excellent communication and stakeholder engagement skills. Bonus Points For Certifications More ❯
Employment Type: Permanent
Salary: £90,000 - £110,000/annum plus bonus and package
City of London, London, United Kingdom Hybrid / WFH Options
Hanson Lee
models, database design and development, data mining, and segmentation techniques. Strong knowledge of and experience with reporting packages (Business Objects, Tableau, Power BI), databases (SQL, etc.), programming (XML, JavaScript, or ETL frameworks), and data visualization (Power BI, Tableau, etc.). Demonstrated analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and More ❯
City of London, London, England, United Kingdom Hybrid / WFH Options
GreatFind Recruitment
and commitment to data accuracy Desirable: Certifications in data management or architecture (e.g., CDMP, TOGAF) Project management experience (Agile methodologies preferred) Familiarity with tools such as SQL, NoSQL, and ETL platforms Benefits 33 days annual leave plus bank holidays 8% employer pension contribution or NHS Pension continuation (if applicable) Enhanced family leave policies Free health cashback plan and employee assistance More ❯
Loving Heart. These values guide how we serve our clients, grow our business, and support each other. Key Responsibilities Design, develop, and maintain interactive Power BI dashboards and reports Extract, transform, and load (ETL) data from Salesforce, Simpro, Unleashed and other systems into the Microsoft Fabric Data Lake (OneLake) Build and manage data pipelines into Fabric using tools like … on experience extracting data from systems like Salesforce, Simpro, and ERP platforms into a data lake environment Strong DAX, Power Query (M), and SQL skills Familiarity with data modeling, ETL frameworks, and structured/unstructured data handling Knowledge of Power BI administration, service workspaces, and security practices Understanding of business processes and workflows across CRM, ERP, and field service systems More ❯
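Purely as a sketch of the extract-and-land pattern mentioned above; the endpoint, authentication and lake path below are placeholders, not the actual Salesforce/Simpro/Unleashed or OneLake integration.

```python
# Hedged sketch: pull records from a REST source and land them as Parquet in a lake folder.
# The URL, token handling, response shape and path are illustrative assumptions.
import requests
import pandas as pd

API_URL = "https://example.invalid/api/jobs"                  # placeholder source system
LAKE_PATH = "/lakehouse/default/Files/raw/jobs.parquet"       # placeholder OneLake-style path


def extract() -> pd.DataFrame:
    resp = requests.get(API_URL, headers={"Authorization": "Bearer <token>"}, timeout=30)
    resp.raise_for_status()
    return pd.json_normalize(resp.json()["records"])  # assumed response shape


def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.rename(columns=str.lower).drop_duplicates(subset="id")
    df["extracted_at"] = pd.Timestamp.now(tz="UTC")
    return df


if __name__ == "__main__":
    transform(extract()).to_parquet(LAKE_PATH, index=False)
```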
City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse. Shape cloud migration and modernization strategies with a strong focus on DevOps practices. Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks. Implement data governance and quality frameworks to ensure data integrity and compliance. Collaborate with clients’ senior leadership to influence data-driven transformation initiatives. More ❯
tools such as GA4 and Adobe Analytics to track the right data. Data engineering: For smaller clients: centralise and clean marketing data using proprietary tools. For larger clients: manage ETL processes and build scalable, clean tables. Lay strong data foundations to enable analytics, reporting and modelling. Consulting & insight activation: Translate analysis into actionable guidance for media, CRO, CRM and creative More ❯
Proven experience designing scalable data models, architectures, and pipelines, with proficiency in cloud platforms (AWS, GCP, or Azure) and data warehousing solutions. Hands-on experience with data integration tools, ETL processes, and statistical analysis tools (e.g., SAS, SPSS, STATA, R, Matlab) or general programming. Familiarity with SQL for large-scale datasets; training provided if needed. Prior experience at an insights … Architecture Leadership: Design and implement robust data architectures to support generative AI capabilities, ensuring scalability, performance, and compliance with data governance and security standards. Develop and maintain data pipelines, ETL processes, and integration tools to enable seamless data flow for AI-driven initiatives. Collaborate with data scientists, engineers, and business stakeholders to define data strategies and roadmaps aligned with business More ❯
data warehouses and ensuring data quality on GCP. Strong hands-on experience with BigQuery, Dataproc, Dataform, Composer, Pub/Sub. Skilled in PySpark, Python and SQL. Solid understanding of ETL/ELT processes. Clear communication skills and ability to document processes effectively. Desirable Skills: GCP Professional Data Engineer certification. Exposure to Agentic AI systems or intelligent/autonomous data workflows More ❯
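For illustration, a minimal BigQuery quality-check-and-load step in Python; the project, dataset and table names are placeholders assumed for the example.

```python
# Sketch: run a quality-check query on a staging table, then promote it to the curated dataset.
# Project/dataset/table names and the 1% threshold are illustrative assumptions.
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client()  # relies on default GCP credentials

check_sql = """
    SELECT COUNTIF(customer_id IS NULL) AS null_ids, COUNT(*) AS total
    FROM `my-project.staging.orders`
"""
row = next(iter(client.query(check_sql).result()))
if row.total == 0 or row.null_ids / row.total > 0.01:
    raise ValueError(f"staging.orders failed quality check: {row.null_ids}/{row.total} null ids")

# Light transform in pandas, then load into the curated dataset.
df = client.query("SELECT * FROM `my-project.staging.orders`").to_dataframe()
df["order_date"] = pd.to_datetime(df["order_date"])
client.load_table_from_dataframe(df, "my-project.curated.orders").result()
```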
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
use cases. Architect cloud-native data platforms using tools like Databricks, Airflow, Snowflake, and Spark. Collaborate with AI/ML teams to align data processing with model requirements. Develop ETL/ELT workflows to support feature engineering, model training, and inference. Optimise data workflows for scalability, reliability, and cost-efficiency. Ensure security, compliance, and data governance standards (e.g. GDPR, RBAC More ❯
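As a small, hedged example of how ETL/ELT workflows like those above are often orchestrated in Airflow; the task bodies, DAG name and schedule are placeholders, not the client's actual pipeline.

```python
# Illustrative Airflow DAG wiring an extract -> transform -> load sequence.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    pass  # e.g. pull raw data from a source system into object storage


def transform(**_):
    pass  # e.g. run feature-engineering logic on the extracted data


def load(**_):
    pass  # e.g. publish curated tables for model training and inference


with DAG(
    dag_id="feature_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```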
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Fabric or strong expertise in related technologies such as Power BI, Azure Synapse, Data Factory, Azure Data Lake etc. A solid understanding of data engineering principles, including data modelling, ETL/ELT processes, and data warehousing. Hands-on experience with Power BI and DAX, and ideally some exposure to Notebooks, Pipelines, or Lakehouses within Fabric. Strong communication and stakeholder management More ❯
Python experience in a collaborative, version-controlled production environment. Familiarity with financial markets and instruments, especially derivatives and fixed income products. Exposure to systems involving data movement or transformation (ETL processes) is beneficial though not central to the role. Strong verbal and written communication skills, with a track record of collaborating across both technical and non-technical teams. Practical knowledge More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
Proven track record leading multi-million-pound projects within consulting or enterprise-level engagements. • Strong stakeholder engagement at CxO or Director level. • Deep experience in cloud data lake architectures, ETL/ELT patterns, and metadata/data quality management. • Expertise in Matillion, Redshift, Glue, Lambda, DynamoDB, and data pipeline automation. • Familiarity with data visualisation platforms such as QuickSight, Tableau, or More ❯
Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
experience in a Data Engineer role and a strong academic background. Python & SQL: Advanced-level Python for data applications and high proficiency in SQL for complex querying and performance tuning. ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud More ❯
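To illustrate the Dataflow (Apache Beam) requirement, a minimal batch pipeline in the Beam Python SDK; the bucket paths and CSV layout are assumptions for the example.

```python
# Minimal Apache Beam batch pipeline of the kind that runs on Google Cloud Dataflow.
# GCS paths and the two-column CSV layout are illustrative assumptions.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions()  # add runner/project/region flags to execute on Dataflow
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/events.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(lambda line: line.split(","))
            | "ToKV" >> beam.Map(lambda parts: (parts[0], float(parts[1])))
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/curated/user_totals")
        )


if __name__ == "__main__":
    run()
```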
working with franchisees or external partners on performance management. Proficiency in data visualization tools (Power BI, Tableau, or similar). Familiarity with data storage and integration platforms (Snowflake, APIs, ETL processes). Understanding of POS systems and financial reporting. Strong stakeholder management and communication skills. Ability to work cross-functionally with IT, operations, and franchise partners. Problem-solving mindset with More ❯