sector is responsible for designing, developing, and maintaining robust ETL (Extract, Transform, Load) solutions using Informatica tools.
YOUR PROFILE
Assist in designing and developing ETL processes to extract, transform, and load data from various sources into data warehouses or data marts. Very good in Informatica development, setup, and IDMC cloud migration. Strong in writing SQL, joining between tables and comparing … the table data (see the sketch below). Collaborate with team members to understand data requirements and translate them into technical specifications. Support the maintenance and enhancement of existing ETL processes to ensure data accuracy and reliability. Conduct data quality checks and troubleshoot issues related to ETL processes. Participate in code reviews and provide feedback to improve ETL processes and performance. Document ETL processes and … clear and comprehensive records. Learn and apply best practices in ETL development and data integration. Knowledge of scripting languages (Python, shell scripting) is advantageous. Very good knowledge of data warehouse and ETL concepts.
ABOUT CAPGEMINI
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact …
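As an illustration of the SQL comparison skill in the profile above, here is a minimal sketch of joining and diffing two tables with a set operation. It uses SQLite purely so the example is self-contained; the table and column names are invented, not taken from the listing.

```python
# Minimal sketch: find rows present in a staging table but missing from
# the warehouse table, using SQL EXCEPT. All names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_customers (id INTEGER, name TEXT);
    CREATE TABLE dw_customers  (id INTEGER, name TEXT);
    INSERT INTO stg_customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO dw_customers  VALUES (1, 'Ada');
""")

missing = cur.execute("""
    SELECT id, name FROM stg_customers
    EXCEPT
    SELECT id, name FROM dw_customers
""").fetchall()
print(missing)  # [(2, 'Grace')]
```

The same EXCEPT pattern (MINUS in some dialects) scales to comparing full table loads after an ETL run.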
Gen2), Databricks, Power BI, Event Hubs, and Functions. Strong background in data modeling, data warehousing, and real-time/batch data processing. Experience implementing data integration pipelines, ETL/ELT workflows, and data transformation frameworks on Azure. Knowledge of Azure security and identity management (RBAC, Key Vault, Managed Identities, networking, and encryption). Familiarity with cost optimization techniques …
best practices. Participate in Agile delivery using Azure DevOps for backlog management, sprint planning, and CI/CD.
Technical Skills
Azure Data Factory: Expert in building, automating, and optimising ETL pipelines.
Azure Synapse Analytics: Strong experience with dedicated SQL pools, data warehousing concepts, and performance tuning.
Power BI: Advanced experience managing enterprise models, datasets, and governance processes.
SQL: Expert-level …
solutions within Microsoft Fabric (including Data Factory, Synapse, and OneLake). Advanced proficiency in Power BI, including DAX, Power Query (M), and data modelling. Deep understanding of data warehousing, ETL, and data lakehouse concepts. Strong working knowledge of Databricks, including Delta Lake and notebooks. Strong interpersonal skills with the ability to influence and communicate complex data topics clearly. Excellent analytical …
City of London, London, United Kingdom Hybrid/Remote Options
Oscar Associates (UK) Limited
This role has a direct opportunity to grow into a Head of Data and AI position.
Key Responsibilities
Data Engineering & Architecture
Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and …
work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable.
Responsibilities:
- Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration (a hedged sketch of such a pipeline follows this listing)
- Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets
- Design, implement …
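For the PySpark responsibility above, a hedged sketch of the skeleton such a job might have. The bucket, paths, and column names are hypothetical; on AWS EMR this would typically be packaged and submitted with spark-submit.

```python
# Sketch of a PySpark ETL job: extract raw JSON from S3, apply basic
# transformations, and load partitioned Parquet. Names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw events landed in S3
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: deduplicate, filter bad records, derive a partition column
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
)

# Load: partitioned Parquet for downstream consumers
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```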
suited to someone who's confident communicating with data, product, and engineering teams, not a 'heads-down coder' type.
Top 4 Core Skills
Python - workflow automation, data processing, and ETL/ELT development.
Snowflake - scalable data architecture, performance optimisation, and governance.
SQL - expert-level query writing and optimisation for analytics and transformations.
dbt (Data Build Tool) - modular data modelling, testing …
City of London, London, United Kingdom Hybrid/Remote Options
Focused Futures Consultancy LTD
and lakehouses using Databricks, with additional exposure to Snowflake, Azure Synapse, or Fabric being a bonus.
Data Modelling – Build and optimise enterprise-grade data models across varied data layers.
ETL/ELT Engineering – Use tooling such as Databricks, SSIS, ADF, Informatica, and IBM DataStage to drive efficient data ingestion and transformation (a sketch of a Databricks-style upsert follows below).
Data Governance – Implement governance and MDM using tools like Unity …
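As flagged in the ETL/ELT Engineering item, incremental loads into a Databricks lakehouse are often expressed as a Delta Lake MERGE (upsert). A minimal sketch using the open-source delta-spark DeltaTable API; the paths and join key are assumptions for illustration.

```python
# Sketch: upsert an incoming batch into a silver-layer Delta table.
# Paths, layer names, and the join key are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("s3://example-bucket/incoming/customers/")
target = DeltaTable.forPath(spark, "/mnt/lakehouse/silver/customers")

(target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # refresh existing rows
    .whenNotMatchedInsertAll()   # add new rows
    .execute())
```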
City of London, London, United Kingdom Hybrid/Remote Options
Tata Consultancy Services
Apache Spark
Proven experience in Snowflake data engineering, including: Snowflake SQL, Snowpipe, Streams & Tasks, and performance optimization (a sketch of streams and tasks follows below)
Integration with AWS services and orchestration tools
Expertise in data integration patterns, ETL/ELT, and data pipeline orchestration
Experience with data quality frameworks, metadata management, and data lineage
Hands-on experience with machine learning pipelines and generative AI engineering
Familiarity with DevOps …
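On the Streams & Tasks point above, a hedged sketch of the usual change-capture pattern, driven from Python with the official snowflake-connector-python package. All connection parameters and object names are placeholders, not details from the listing.

```python
# Sketch: a stream captures changes on a source table, and a scheduled
# task periodically moves them downstream. Names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# Change capture on the source table
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

# A task that drains the stream on a schedule
cur.execute("""
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    AS
      INSERT INTO curated_orders
      SELECT * FROM orders_stream
""")
cur.execute("ALTER TASK merge_orders RESUME")  # tasks start suspended
```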
City of London, London, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions
internal best practices
What You’ll Bring:
Strong experience in data engineering with Microsoft Fabric
Solid understanding of DataOps, CI/CD, and automation
Hands-on experience with Jira, ETL/ELT, and data modelling
Familiarity with Power BI, DAX, or Azure DevOps
Excellent communication and stakeholder engagement skills
Consulting or client-facing experience is a plus
🌱 Career Progression: Clear …
City of London, London, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions
Data Factory, Lakehouse, Power BI)
Strong proficiency in SQL, DAX, and Power Query (M)
Experience with Azure Data Services (Synapse, Data Lake, Azure SQL)
Solid understanding of data modelling, ETL processes, and BI architecture
Familiarity with CI/CD pipelines, DevOps, and version control (Git)
Excellent communication and stakeholder management skills
Ability to work independently and lead technical delivery
Desirable …
City of London, London, United Kingdom Hybrid/Remote Options
Asset Resourcing
Extensive experience in designing cloud data platforms using Azure, AWS, or exceptional on-premise design expertise.
- At least 5 years in data engineering or business intelligence roles.
- Proficiency in ETL and data pipeline design, with a technology-agnostic approach.
- A solid understanding of data warehouse and data lake principles.
- Expert SQL skills and demonstrable data modelling capabilities.
About the Company …
background in hands-on development of platforms/dashboards, including SQL, ADF, Power BI, etc.
Management and leadership of multi-disciplinary teams within data – Data Science, Data Analysis, Engineering/ETL, Data Visualisation
Experience of developing and, critically, delivering AI/NLP/ML and predictive analytics capability within commercial environments.
Experience of ensuring that all data-related activities comply with …
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
to data engineering teams, driving innovation and best practices in data cloud implementations
Design, develop, and implement scalable data solutions using modern cloud data platforms
Architect and deliver robust ETL/ELT pipelines and data integration solutions for enterprise clients
Drive technical excellence across projects, establishing coding standards, best practices, and quality assurance processes
Collaborate with cross-functional teams including …
data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar
Understanding of data modelling principles, dimensional modelling, and database design
Proficiency in SQL and query optimization
Comprehensive knowledge of ETL/ELT processes and data pipeline architecture
Excellent communication skills with the ability to collaborate across cross-functional teams
Experience managing client relationships at various levels
Strong problem-solving abilities …
and business intelligence (BI) systems
document source-to-target mappings (a sketch of a mapping-driven ETL step follows below)
re-engineer manual data flows to enable scaling and repeatable use
support the build of data streaming systems
write ETL (extract, transform, load) scripts and code to ensure the ETL process performs optimally
develop business intelligence reports that can be reused
build accessible data for analysis
Skills needed for this …
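Since this listing pairs documenting source-to-target mappings with writing ETL scripts, here is a small illustrative sketch (all field names invented) in which the mapping itself is the documented artifact that drives the transform:

```python
# Sketch: a source-to-target mapping as data, applied in a tiny ETL step.
# Field names and file paths are hypothetical.
import csv

MAPPING = {                      # source column -> target column
    "cust_no":   "customer_id",
    "cust_name": "customer_name",
    "crt_dt":    "created_date",
}

def transform(row: dict) -> dict:
    """Rename source fields to their target names per the mapping."""
    return {target: row[source] for source, target in MAPPING.items()}

with open("source.csv") as src, open("target.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=list(MAPPING.values()))
    writer.writeheader()
    for row in reader:
        writer.writerow(transform(row))
```

Keeping the mapping in one declarative structure means the documentation and the code cannot drift apart.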
in Python and SQL
Experience with modern data tools (dbt, Airflow, Prefect, Dagster, etc.) (a minimal Airflow sketch follows below)
Knowledge of cloud platforms like AWS, GCP, or Azure
An understanding of data modelling and ETL best practices
Curiosity, creativity, and a mindset that thrives in fast-moving environments
Why You’ll Love It
Work on meaningful data challenges that directly impact their products
Small, high …
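Of the orchestration tools named above, Airflow is probably the most widely used; a minimal DAG sketch under Airflow 2.x (IDs, schedule, and task bodies are placeholders):

```python
# Sketch of a two-step Airflow DAG: extract, then load, daily.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")    # placeholder for real extraction logic

def load():
    print("write to warehouse")  # placeholder for real load logic

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2  # load runs only after extract succeeds
```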
City of London, London, United Kingdom Hybrid/Remote Options
Harrington Starr
transforming raw data into actionable intelligence, working closely with data scientists, quants, and business stakeholders to shape cutting-edge betting products.
Key Responsibilities
Build and optimise data pipelines and ETL workflows in AWS using Python and SQL.
Partner with analysts and quants to deliver reliable datasets for predictive modelling and pricing.
Design and maintain data models supporting trading, risk, and …
City of London, London, United Kingdom Hybrid/Remote Options
Recann
critical business functions.
What you’ll be doing
Building and maintaining scalable data pipelines using Azure Data Factory, Azure Data Fabric, and Azure Synapse Analytics.
Developing robust ELT/ETL processes to integrate data from multiple business systems.
Ensuring data consistency, security, and compliance (including GDPR).
Supporting analytics/reporting teams with clean, structured datasets.
Collaborating with IT, Finance …
portfolio managers, quants, and analysts to design and deliver scalable, cloud-based data solutions that power trading and investment strategies.
Key Responsibilities
Design, build, and optimise data pipelines and ETL workflows using AWS, Python, and SQL.
Develop and maintain data models, ensuring accuracy and reliability of trading and market data.
Deliver Power BI dashboards and reports to provide real-time …
years of experience in data engineering or a similar role
Strong SQL skills and proficiency in at least one programming language (ideally Python)
Understanding of data warehousing concepts and ETL/ELT patterns
Experience with version control (Git), testing, and code review practices (a small pytest sketch follows below)
Familiarity with cloud-based data environments (e.g. AWS, GCP, or Azure)
Exposure to modern data tools such …
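On the testing and code review point above, data teams commonly unit-test transformation logic with pytest. A hedged sketch; the function under test is invented for illustration:

```python
# Sketch: parametrised pytest cases for a small transformation. Run with
# `pytest`. The transformation itself is a hypothetical example.
import pytest

def normalise_email(value: str) -> str:
    """Transformation under test: trim whitespace and lower-case emails."""
    return value.strip().lower()

@pytest.mark.parametrize("raw,expected", [
    ("  Alice@Example.COM ", "alice@example.com"),
    ("bob@example.com", "bob@example.com"),
])
def test_normalise_email(raw, expected):
    assert normalise_email(raw) == expected
```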
to detail. Strong interpersonal and communication skills, with experience liaising between technical and business teams. Understanding of data quality principles and governance frameworks.
Desirable Skills and Experience
Familiarity with ETL or data pipeline concepts and tools (e.g. Pentaho, MuleSoft, dbt Labs). Knowledge of data warehousing and reporting best practices. Understanding of data modelling and metadata management. Awareness of GDPR …
Effective communicator with experience engaging both technical and non-technical stakeholders. Understanding of data quality principles and data governance frameworks within an education setting.
Desirable Skills & Experience
Familiarity with ETL or data pipeline tools (e.g., Pentaho, MuleSoft, dbt Labs). Knowledge of data warehousing and reporting best practices. Understanding of data modelling and metadata management. Awareness of GDPR and data …
JavaScript is a plus but not required. Experience implementing development best practices including writing automated testing and CI/CD deployment.
Responsibilities:
Build and maintain reliable data pipelines and ETL processes for data ingestion and transformation.
Support the development and maintenance of data models and data warehouses used for reporting and analytics.
Collaborate with senior engineers, analysts, and product teams …
metadata management, and data governance. Experience with modern data platforms such as Azure Data Lake, Databricks, Power BI, and SAP BTP. Solid grasp of enterprise integration patterns (APIs, streaming, ETL/ELT, event-driven architectures). Ability to translate complex data concepts into clear, value-focused business outcomes. Excellent stakeholder management and communication skills across technical and non-technical audiences.
the business build a new Azure platform, so exposure to Azure, ADF and Azure DevOps would also be highly beneficial.
Principal Responsibilities:
Perform the day-to-day running of the ETL processes that feed into the central data repository.
Work with key stakeholders and other teams to gather requirements, identify where the data is located and to then implement the required …