sector is responsible for designing, developing, and maintaining robust ETL (Extract, Transform, Load) solutions using Informatica tools. YOUR PROFILE Assist in designing and developing ETL processes to extract, transform, and load data from various sources into data warehouses or data marts. Very good in Informatica development, setup and IDMC cloud migration. Strong in writing SQL, joining tables and comparing … the table data. Collaborate with team members to understand data requirements and translate them into technical specifications. Support the maintenance and enhancement of existing ETL processes to ensure data accuracy and reliability. Conduct data quality checks and troubleshoot issues related to ETL processes. Participate in code reviews and provide feedback to improve ETL processes and performance. Document ETL processes and … clear and comprehensive records. Learn and apply best practices in ETL development and data integration. Knowledge of scripting languages (Python, Shell scripting) is advantageous. Very good knowledge of data warehouse and ETL concepts. ABOUT CAPGEMINI Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact …
target state from current DWH estate towards data products/marketplace model on AWS/Snowflake. Review AWS Infrastructure components design & usage and implement enhancements. Design and implement an ETL (Extract, Transform, Load) engine using AWS EMR (Elastic MapReduce) for efficient data processing. Design, review, and implement reporting solutions integrating Tableau with AWS services for seamless data visualization. Design and … tools, and practices. Troubleshoot and resolve infrastructure-related issues, providing technical support and guidance. Your Profile Essential Skills/Knowledge/Experience: extensive AWS service knowledge; Lambda; Avaloq experience; ETL (Extract, Transform, Load); integrating Tableau with AWS services; Amazon EKS (Elastic Kubernetes Service); Infrastructure as Code; scripting (Python/Bash); Helm charts, Docker, Kubernetes; tools like Terraform, Ansible, and Jenkins …
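For illustration only (not part of the advertised role), a minimal PySpark sketch of the kind of ETL step that might run on AWS EMR; the bucket names, paths, and column names are hypothetical assumptions:

```python
# Minimal PySpark ETL sketch of the kind that might run as an AWS EMR step;
# bucket names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-emr-etl").getOrCreate()

# Extract: read raw CSV files landed in S3
raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/transactions/")

# Transform: type the amount column and aggregate per account and day
daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("txn_date", F.to_date("txn_timestamp"))
       .groupBy("account_id", "txn_date")
       .agg(F.sum("amount").alias("daily_amount"))
)

# Load: write partitioned Parquet back to S3 for downstream tools (e.g. Snowflake, Tableau)
daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://example-curated-bucket/daily_transactions/"
)

spark.stop()
```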
AWS, Azure, GCP). Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks). Expertise with multiple data analytics tools (e.g. Power BI). Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling. Proficiency in advanced programming languages (Python/PySpark, SQL). Experience in data pipeline orchestration (e.g. Airflow, Data Factory). Familiarity with DevOps and CI/CD …
Gen2), Databricks, Power BI, Event Hubs, and Functions. Strong background in data modeling, data warehousing, and real-time/batch data processing. Experience implementing data integration pipelines, ETL/ELT workflows, and data transformation frameworks on Azure. Knowledge of Azure security and identity management (RBAC, Key Vault, Managed Identities, networking, and encryption). Familiarity with cost optimization techniques …
best practices. Participate in Agile delivery using Azure DevOps for backlog management, sprint planning, and CI/CD. Technical Skills Azure Data Factory: Expert in building, automating, and optimising ETL pipelines. Azure Synapse Analytics: Strong experience with dedicated SQL pools, data warehousing concepts, and performance tuning. Power BI: Advanced experience managing enterprise models, datasets, and governance processes. SQL: Expert-level …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
from the warehouse. Maintaining and improving the existing Python-based dynamic pricing model and web scraping tools. What we're looking for: Experience across a modern data stack for ETL/ELT processes and data warehousing. Strong SQL and Python skills, with an understanding of Kimball-style data modelling. Experience with DBT, Dagster, or Airflow for transformation and orchestration. …
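For illustration only, a minimal Dagster sketch of the asset-based transformation and orchestration style the listing refers to; the asset names and sample data are hypothetical assumptions, not details from the role:

```python
# Minimal Dagster sketch: two assets, the downstream one shaped like a
# Kimball-style fact table. Names and data are hypothetical placeholders.
import pandas as pd
from dagster import asset, materialize

@asset
def raw_orders() -> pd.DataFrame:
    # In a real pipeline this would pull from the warehouse or an API
    return pd.DataFrame({"order_id": [1, 2], "customer_id": [10, 11], "amount": [25.0, 40.0]})

@asset
def fct_orders(raw_orders: pd.DataFrame) -> pd.DataFrame:
    # Fact table: one row per order, keyed to a customer dimension
    return raw_orders.rename(columns={"customer_id": "customer_key"})

if __name__ == "__main__":
    # Materialise both assets in-process (default filesystem IO manager)
    materialize([raw_orders, fct_orders])
```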
solutions within Microsoft Fabric (including Data Factory, Synapse, and OneLake). Advanced proficiency in Power BI, including DAX, Power Query (M), and data modelling. Deep understanding of data warehousing, ETL, and data lakehouse concepts. Strong working knowledge of Databricks, including Delta Lake and notebooks. Strong interpersonal skills with the ability to influence and communicate complex data topics clearly. Excellent analytical …
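As an illustrative aside (not drawn from the listing), a minimal PySpark sketch of writing a Delta Lake table from a Databricks notebook; the landing path and table name are assumed placeholders:

```python
# Minimal Delta Lake write from a Databricks notebook; the source path and
# target table name are hypothetical assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

raw = spark.read.json("/mnt/landing/sales/")              # assumed landing path
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_timestamp"))
)

# Delta Lake provides ACID writes and time travel on the lakehouse storage layer
(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.fct_sales"))
```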
engineering. Mandatory Skills: Cloud Platforms: Deep experience with AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge etc. Industrial Data Integration: Familiarity with OT data schema originating from OSIsoft PI, SCADA, MES, and Historian …
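Purely as an illustration of the streaming ingestion skills listed above (not something specified in the listing), a minimal Python sketch consuming OT sensor readings from Kafka; the topic, broker address, and message fields are hypothetical:

```python
# Minimal streaming-ingestion sketch using kafka-python; topic, broker, and
# message fields are hypothetical placeholders for OT sensor data.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "plant.sensor.readings",                       # hypothetical topic
    bootstrap_servers=["localhost:9092"],          # assumed broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    reading = message.value
    # e.g. {"tag": "PI:FLOW_101", "timestamp": "...", "value": 42.7}
    if reading.get("value") is not None:
        # Stand-in for a write to the data lake or downstream pipeline
        print(reading["tag"], reading["value"])
```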
etc.). Strong proficiency in SQL, DAX, and Power Query (M). Experience with Azure Data Services (e.g., Azure Synapse, Azure Data Lake, Azure SQL). Solid understanding of data modelling, ETL processes, and BI architecture. Familiarity with CI/CD pipelines, DevOps, and version control (e.g., Git). Excellent communication and stakeholder management skills. Ability to work independently and lead technical …
and shape the direction of the platform as it evolves, pushing the boundaries of what's possible with data and AI. What You'll Do Design & build high-performance ETL/ELT pipelines in modern cloud environments (including Azure, AWS, GCP, Snowflake or Databricks). Lead CI/CD automation, environment versioning, and production deployments for data products. Integrate AI …
City of London, London, United Kingdom Hybrid/Remote Options
Oscar Associates (UK) Limited
This role has a direct opportunity to grow into a Head of Data and AI position. Key Responsibilities Data Engineering & Architecture: Lead the development and maintenance of data pipelines, ETL processes, and warehouse architecture (GCP, Azure). Ensure high-quality, scalable, and secure data infrastructure that supports campaign reporting and advanced analytics. Design and support the delivery of AI and …
work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration - Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets - Design, implement …
to have: Microsoft Power BI certification (e.g., PL-300: Power BI Data Analyst). Experience of financial reporting, FP&A and Excel modelling. Knowledge of data model design principles and ETL. Understanding of Azure services (e.g., Azure Synapse, Azure Data Factory, Azure SQL). Knowledge of Power Automate and Power Apps integration with Power BI. Knowledge of non-Microsoft data sources (Snowflake …
suited to someone who's confident communicating with data, product, and engineering teams, not a 'heads-down coder' type. Top 4 Core Skills: Python - workflow automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing …
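As a hedged illustration of the Python plus Snowflake plus SQL combination described above, a minimal ELT sketch using snowflake-connector-python; the account details, warehouse, and table names are placeholder assumptions, not details from the listing:

```python
# Minimal Python-driven ELT step against Snowflake; credentials and object
# names are hypothetical placeholders read from the environment.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",      # assumed warehouse
    database="ANALYTICS",          # assumed database
    schema="MARTS",                # assumed schema
)

try:
    cur = conn.cursor()
    # Push the transformation down to Snowflake rather than pulling data out
    cur.execute("""
        CREATE OR REPLACE TABLE fct_daily_orders AS
        SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS total_amount
        FROM raw.orders
        GROUP BY order_date
    """)
finally:
    conn.close()
```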
documentation. Skills To Create Thrills: Strong SQL skills, able to write complex and performant queries with ease. Solid experience in Python development for data workflows. Experience building and maintaining ETL pipelines, ideally with Apache Airflow or a similar orchestration tool. Hands-on experience with Google Cloud Platform (BigQuery, GCS, etc.) or another major cloud provider. Good understanding of data modelling …
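To illustrate the Airflow and Google Cloud Platform experience mentioned above (an assumption-laden sketch, not part of the advertisement), a minimal DAG that loads JSON files from GCS into BigQuery; the project, bucket, and table identifiers are hypothetical:

```python
# Minimal Airflow 2.x DAG loading GCS files into BigQuery; all identifiers
# (bucket, project, dataset, table) are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery

def load_events_to_bigquery():
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition="WRITE_APPEND",
    )
    load_job = client.load_table_from_uri(
        "gs://example-bucket/events/*.json",        # hypothetical bucket
        "example-project.analytics.raw_events",     # hypothetical table
        job_config=job_config,
    )
    load_job.result()  # wait for the load job to complete

with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_events", python_callable=load_events_to_bigquery)
```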
London, South East, England, United Kingdom Hybrid/Remote Options
Akkodis
WFH. Duration: 3 months rolling contract. Type of contract: Freelance, Inside IR35. Level: mid-Senior. Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues. Requirements: Proven …
standards, models, and frameworks. Design data solutions leveraging Azure services such as Azure Data Lake, Azure SQL Database, Azure Synapse Analytics, Azure Data Factory, and Azure Databricks. Data Integration & ETL: Develop and optimize data pipelines for ingestion, transformation, and storage using Azure Data Factory and Databricks. Governance & Security: Implement data governance, security, and compliance practices aligned with financial services regulations …
City of London, London, United Kingdom Hybrid/Remote Options
Focused Futures Consultancy LTD
and lakehouses using Databricks, with additional exposure to Snowflake, Azure Synapse, or Fabric being a bonus. Data Modelling – Build and optimise enterprise-grade data models across varied data layers. ETL/ELT Engineering – Use tooling such as Databricks, SSIS, ADF, Informatica, IBM DataStage to drive efficient data ingestion and transformation. Data Governance – Implement governance and MDM using tools like Unity …
scientists, analysts, and software engineers to ensure the company's data strategy underpins their innovative financial products. Key Responsibilities: Lead the design, development, and optimisation of data pipelines and ETL processes. Architect scalable data solutions to support analytics, machine learning, and real-time financial applications. Drive best practices for data engineering, ensuring high levels of data quality, governance, and security. …