responsible for designing, developing, and maintaining robust ETL (Extract, Transform, Load) solutions using Informatica tools. YOUR PROFILE Assist in designing and developing ETL processes to extract, transform, and load data from various sources into data warehouses or data marts. Very good in Informatica development, setup and IDMC cloud migration Strong in writing SQL, joining between tables and … comparing table data Collaborate with team members to understand data requirements and translate them into technical specifications. Support the maintenance and enhancement of existing ETL processes to ensure data accuracy and reliability. Conduct data quality checks and troubleshoot issues related to ETL processes. Participate in code reviews and provide feedback to improve ETL processes and performance. … records. Learn and apply best practices in ETL development and data integration. Knowledge of scripting languages (Python, Shell scripting) is advantageous. Very good knowledge of data warehouse and ETL concepts ABOUT CAPGEMINI Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact
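The "joining between tables and comparing table data" task above is a common ETL validation step. A minimal sketch in Python using the built-in sqlite3 module, finding source rows that are missing from or different in a target table (all table and column names here are illustrative, not from the role):

```python
import sqlite3

# Hypothetical source and target tables; names and data are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customers (id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE tgt_customers (id INTEGER PRIMARY KEY, email TEXT);
    INSERT INTO src_customers VALUES (1, 'a@x.com'), (2, 'b@x.com'), (3, 'c@x.com');
    INSERT INTO tgt_customers VALUES (1, 'a@x.com'), (2, 'b@old.com');
""")

# LEFT JOIN on the key, keeping rows where the target is absent or mismatched.
cur.execute("""
    SELECT s.id, s.email, t.email AS tgt_email
    FROM src_customers s
    LEFT JOIN tgt_customers t ON t.id = s.id
    WHERE t.id IS NULL OR t.email <> s.email
    ORDER BY s.id
""")
diffs = cur.fetchall()
print(diffs)  # [(2, 'b@x.com', 'b@old.com'), (3, 'c@x.com', None)]
```

The same join-and-filter pattern applies in Informatica mappings or Teradata SQL; only the dialect changes.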
Business requirements to technical solutions and the production of specifications, Designing and implementing business intelligence & modern data analytics platform technical solutions, Data architecture design and implementation, Data modelling, Databricks ETL, data integration and data migration design and implementation, Master data management system and process design and implementation, Data quality system and process design and implementation, Major focus on data … data modelling, data staging, and data extraction processes, including data warehouse and cloud infrastructure. Experience with multi-dimensional design, star schemas, facts and dimensions. Experience and demonstrated competencies in ETL development techniques. Experience in data warehouse performance optimization. Experience on projects across a variety of industry sectors is an advantage. Comprehensive understanding of data management best practices including demonstrated … Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
dynamic team and help shape the future of their data strategy. 🔍 Your Role: You’ll collaborate across the business to deliver high-impact data, applications, and services. From designing ETL pipelines to automating key processes, you’ll be at the heart of our data transformation journey. 🎯 What We’re Looking For: Proven experience in data engineering … and solution design Strong skills in Azure Data Factory, Databricks, Unity Catalog, and SQL Server Practical knowledge of data modelling and ETL development Hands-on coding and development with a focus on quality Excellent communication and stakeholder engagement skills A collaborative, problem-solving mindset 💡 What You’ll Do: Build strong stakeholder relationships to shape effective data solutions … Design and implement data models and ETL pipelines Develop and maintain applications and automation tools Champion best practices in data quality, lineage, and governance Act as SME in Azure Data Factory, Azure Databricks, Unity Catalog, and SQL Server Support agile delivery, code control, and QA standards Monitor and improve existing data sets and processes Enable colleagues through testing, training …
the Media industry, who are looking for a Data Migration Specialist to join their team on a contract basis. This is a highly technical role that blends hands-on development with strategic data management, offering the chance to lead critical data migration activities within this well-respected organisation. As the Data Migration Specialist, you will take ownership of designing … and executing data migrations from legacy systems, developing standardised ETL packages using Azure Data Factory. You'll be responsible for ensuring data quality, accuracy, and integrity throughout the migration process, while establishing robust processes and documentation that ensure consistency across technology. You'll work closely with functional and business leads, testing teams, and the Integration Team to align data … migration activities with overall integration strategies and business objectives. We are looking for: Extensive experience in ETL development using SQL, Azure Data Factory, Synapse Pipelines, and Stored Procedures Proven experience in legacy data migration and ERP implementations, including at least two full life-cycle implementations in a Data Analyst role Strong SQL and SSIS experience It would be …
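The migration-integrity responsibility described above is typically checked by reconciling row counts and content fingerprints between legacy and target tables. A small sketch using sqlite3 as a stand-in database (table names, columns, and the fingerprint scheme are all assumptions for illustration):

```python
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE legacy_orders (id INTEGER, amount REAL);
    CREATE TABLE migrated_orders (id INTEGER, amount REAL);
    INSERT INTO legacy_orders VALUES (1, 10.0), (2, 25.5);
    INSERT INTO migrated_orders VALUES (1, 10.0), (2, 25.5);
""")

def table_fingerprint(table):
    # Row count plus a deterministic hash over the key-ordered rows:
    # equal fingerprints give strong evidence the migration preserved the data.
    rows = cur.execute(f"SELECT id, amount FROM {table} ORDER BY id").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

src = table_fingerprint("legacy_orders")
tgt = table_fingerprint("migrated_orders")
print(src == tgt)  # True when counts and contents match
```

In an Azure Data Factory or Synapse pipeline the same idea is usually expressed as a post-load validation query or stored procedure rather than Python.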
We are seeking a Data Engineer (Databricks) to support the growth of a global technology provider within the Insurtech space. The role focuses on designing and delivering ETL pipelines and scalable solutions using the Azure ecosystem, with an emphasis on enabling advanced analytics and data-driven decision-making. As a key player in a high-performing data engineering team … opportunity to join a team that invests in your growth, with comprehensive training and certification programs and a real opportunity to showcase your talents. Key Responsibilities Design and develop ETL pipelines using Azure Data Factory for data ingestion and transformation Work with Azure Data Lake, Synapse, and SQL DW to manage large volumes of data Develop data transformation logic … solutions Create mapping documents, transformation rules, and ensure quality delivery Contribute to DevOps processes, CI/CD deployments, and agile delivery models Requirements 10+ years' experience in data engineering, ETL development and big data solutions Experience within Insurance Technology or Finance Technology is essential Experience with Reinsurance is advantageous (please outline in your CV) Solid expertise with Azure …
this position. What You’ll Bring Strong Python, SQL, and dbt experience in production settings. Knowledge of cloud data warehouses (BigQuery, Redshift, Snowflake) and denormalised data models. Experience developing ETL/ELT pipelines across varied data sources. Experience deploying and managing data infrastructure in a cloud environment, e.g. GCP, AWS or Azure. Ability to create dashboards for scientific users.
City of London, London, United Kingdom Hybrid/Remote Options
TrueNorth®
with Business Objects SAP (Data Services and Information Design Tool) Experience with mortgage lending services (must have) Extensive experience with SQL, preferably MySQL Desirable experience with Redshift Experience developing ETL processes from multiple data sources/types Ability to communicate and operate with globally distributed teams (internal and off-shore) If you're interested in joining a fintech that …
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
Data Engineer Salary £60,000–£75,000 + Equity, 25 days holiday, Pension + more Location - London (Hybrid) | Python | ETL | Impact-Driven Team I’m working with a fast-growing healthtech client that’s reshaping access to fertility care across Europe. They’ve just hired one Data Engineer and urgently need another to join their lean, high-performing team. This … driven, ideal for a Data Engineer who thrives on solving real problems and wants their work to make a tangible difference. What You’ll Be Doing Build and optimize ETL pipelines in Python Migrate legacy databases to modern platforms Clean and transform unstructured data into usable formats Support ML workflows and data-driven product features Collaborate with engineering, product … and leadership to deliver fast, meaningful results What We’re Looking For Solid experience as a Data Engineer, ideally in fast-moving environments Strong Python and ETL development skills Understanding of database structures and migration challenges Problem-solving mindset with a bias for action Bonus: experience in healthcare, ML, or startup settings Why This Role? Mission-led product …
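"Build and optimize ETL pipelines in Python" and "clean and transform unstructured data" can be sketched as a minimal extract/transform/load flow using only the standard library (the raw export, field names, and in-memory sink are hypothetical stand-ins for a real source and database):

```python
import csv
import io

# Hypothetical raw export: inconsistent casing, stray whitespace, blank ages.
raw = """name,age,city
 Alice ,34,london
BOB,,Paris
carol,29, berlin """

def extract(text):
    # Parse the raw CSV into dict records.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Normalise casing/whitespace and coerce types; blank age becomes None.
    return [{
        "name": r["name"].strip().title(),
        "age": int(r["age"]) if r["age"].strip() else None,
        "city": r["city"].strip().title(),
    } for r in rows]

def load(rows, sink):
    # Stand-in for a database write; returns rows loaded.
    sink.extend(rows)
    return len(rows)

sink = []
n = load(transform(extract(raw)), sink)
print(n, sink[0])  # 3 {'name': 'Alice', 'age': 34, 'city': 'London'}
```

A production pipeline would add logging, schema validation, and idempotent loads, but the extract/transform/load decomposition is the same.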
world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Your Role: Seeking a skilled Data Analyst with expertise in Teradata, Informatica ETL, and risk modeling using SAS and Python. Design and implement ETL workflows to manage data extraction, transformation, and loading into Teradata. Build and maintain risk models for credit … deliver data-driven solutions. Perform exploratory data analysis to uncover trends and support risk mitigation. Automate reporting and dashboard creation using Python and BI tools. Optimize Teradata queries and ETL performance for efficient data processing. Document data flows, model logic, and technical specifications for transparency. Ensure compliance with data governance and contribute to continuous improvement initiatives. Your Profile: 3+ … years of experience in data analysis and ETL development. Strong proficiency in Teradata SQL and Informatica PowerCenter. Experience building and validating risk models using SAS and Python. Solid understanding of statistical techniques and risk modeling frameworks. Familiarity with data governance and compliance standards. Excellent problem-solving and communication skills. ABOUT CAPGEMINI Capgemini is a global business and technology transformation …
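The "automate reporting" duty above usually reduces to aggregating record-level risk data into segment summaries that a dashboard can chart. A small Python sketch under assumed field names (segment, balance, defaulted are illustrative, not a real schema):

```python
from collections import defaultdict

# Hypothetical loan records; in practice these would come from a Teradata query.
loans = [
    {"segment": "retail", "balance": 1200.0, "defaulted": False},
    {"segment": "retail", "balance": 800.0,  "defaulted": True},
    {"segment": "sme",    "balance": 5000.0, "defaulted": False},
]

def risk_summary(records):
    # Roll up balance, count, and defaults per segment.
    totals = defaultdict(lambda: {"balance": 0.0, "defaults": 0, "count": 0})
    for r in records:
        seg = totals[r["segment"]]
        seg["balance"] += r["balance"]
        seg["count"] += 1
        seg["defaults"] += r["defaulted"]  # bool counts as 0/1
    # Add the default rate per segment, the figure a dashboard would chart.
    return {s: {**v, "default_rate": v["defaults"] / v["count"]}
            for s, v in totals.items()}

report = risk_summary(loans)
print(report["retail"]["default_rate"])  # 0.5
```

Scheduling this as a recurring job and pushing the output to a BI tool turns a manual report into an automated one.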