takes pride in engineering fundamentals, thrives in a small team, and wants to see the commercial impact of their work. Key Responsibilities Develop, maintain, and enhance data pipelines and ETL processes using Python and SQL. Manage and integrate API connections and FTP data feeds into internal systems. Build and support dashboards and reports to provide visibility across trading and operations. … Maths, Physics). 2–4 years’ experience in a technical, data, or engineering-focused role. Strong skills in Python, SQL, and Excel/VBA. Experience building or maintaining ETL/data pipelines, particularly around APIs or FTP processes. Working knowledge of Microsoft Azure and Git. Excellent analytical, communication, and problem-solving skills. A proactive, curious mindset and a …
London (City of London), South East England, United Kingdom
Mondrian Alpha
the development and delivery of AI solutions to a Government customer Design, develop, and maintain data processing pipelines using Apache Spark Implement ETL/ELT workflows to extract, transform, and load large-scale datasets efficiently Develop and optimize Python-based applications for data ingestion Collaborate on development of machine learning models Ensure data quality, integrity, and performance across distributed environments …
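The extract → transform → load workflow this role describes can be sketched end-to-end in plain Python. This is illustrative only: the listing names Apache Spark, but the pattern is the same at small scale, and the `metrics` table and field names here are hypothetical.

```python
import sqlite3

def extract(rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(records):
    """Transform: drop incomplete records and normalise field types."""
    return [
        {"id": int(r["id"]), "value": float(r["value"])}
        for r in records
        if r.get("id") is not None and r.get("value") is not None
    ]

def load(records, conn):
    """Load: write cleaned records into a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS metrics (id INTEGER, value REAL)")
    conn.executemany(
        "INSERT INTO metrics VALUES (?, ?)",
        [(r["id"], r["value"]) for r in records],
    )
    conn.commit()

raw = [
    {"id": "1", "value": "2.5"},
    {"id": None, "value": "9"},   # incomplete record, filtered out in transform
    {"id": "2", "value": "4.0"},
]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT COUNT(*), SUM(value) FROM metrics").fetchone())  # → (2, 6.5)
```

In Spark the same three stages would map onto a read, a chain of DataFrame transformations, and a write; the separation of stages is what keeps each step independently testable.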
Northampton, Northamptonshire, East Midlands, United Kingdom
Experis
Ab Initio Developer Northampton - expected 2-3 days on site each week 6-12 months Umbrella Only Job description: Design, develop, and maintain ETL solutions using Ab Initio (Graphical Development Environment, Co>Operating System, EME, and Conduct>It). Integrate data from multiple heterogeneous sources into unified, high-quality datasets. Perform performance tuning, debugging, and optimization of existing ETL processes. … shell scripting, and data warehouse concepts. Familiarity with big data ecosystems (Hadoop, Hive, Spark) and cloud platforms (AWS, Azure, GCP) is a plus. Proven ability to troubleshoot complex ETL jobs and resolve performance issues. Experience working with large-scale datasets and enterprise data environments. Experience in the banking and financial services domain preferred. Exposure to CI/CD pipelines and … DevOps tools for ETL automation. Knowledge of Ab Initio Control Center (AICC) and Metadata Hub advantageous. All profiles will be reviewed against the required skills and experience. Due to the high number of applications, we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply.
databases like PostgreSQL, MySQL, or similar Cloud data ecosystem (AWS): hands-on experience with core AWS data services. Key services include: S3 for data lake storage AWS Glue for ETL and data cataloging Amazon Redshift or Athena for data warehousing and analytics Lambda for event-driven data processing. ETL/ELT pipeline development: experience in designing, building, and maintaining robust …
London, South East England, United Kingdom Hybrid / WFH Options
Route Research Ltd
validation checks on our release data sets · Automation: Automate all routine data checks, data loading processes, and reporting workflows · Data Insights & Tooling: Build internal tools and analytical solutions to extract and surface valuable insights from our proprietary data · Reporting & Analysis: Provide direct data analysis to support the insight team, as well as handling customer queries and producing scheduled data extracts … Apply basic DevOps concepts to maintain services and work across our server and cloud platforms, including contributing to networking solutions Skill Set: Essential: · Programming & Pipelines: Proven experience building robust ETL/ELT pipelines using any major programming language, with a strong preference for Python · Databases & SQL: Extensive experience working with databases, SQL, and data warehouses. Familiarity with Postgres and BigQuery …
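The routine data checks mentioned above might, as a minimal sketch, look like the following in Python. The column names (`route`, `passengers`) and the specific checks are assumptions for illustration, not the company's actual schema:

```python
def check_release(rows, required_cols=("route", "passengers")):
    """Run routine validation checks on a release data set.

    Returns a list of human-readable failure messages (empty list = all checks pass).
    """
    failures = []
    for i, row in enumerate(rows):
        # Completeness: every required column must be present and non-empty.
        for col in required_cols:
            if row.get(col) in (None, ""):
                failures.append(f"row {i}: missing {col}")
        # Range check: passenger counts must not be negative.
        p = row.get("passengers")
        if isinstance(p, int) and p < 0:
            failures.append(f"row {i}: negative passengers ({p})")
    return failures

good = [{"route": "LHR-JFK", "passengers": 180}]
bad = [{"route": "", "passengers": -5}]
print(check_release(good))  # → []
print(check_release(bad))   # two failures for row 0
```

Returning messages rather than raising on the first failure lets an automated job report every problem in a release in one pass, which suits the "automate all routine data checks" responsibility described above.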
a real mix of experience, knowledge and skills. Essential Skills DataBricks or similar and Azure Data Factory Languages: SQL, Python Data Platform: Azure data lake and SQL Server RDBMS ETL: SQL Server Integration Services and ADF Experienced in working collaboratively in a large, fast-paced environment within a multifunctional technical team Nice to have Languages: PySQL … C#, R Reporting: PowerBI or similar Modelling: SQL Server SSAS Tabular (DAX)/Dimensional Modelling/Data Warehousing Development Management: Azure DevOps SQL MI/DB ETL: Active Batch, API, Event Hub, Kafka AI/ML We’re grateful for your interest in joining our team. Kindly note that only applicants whose experience and qualifications most closely align with the …
a data engineering strategy that aligns with organisational goals and technological advancements. Design and implement a scalable, reliable, and cost-efficient modern cloud data platform. Build and maintain robust ETL/ELT pipelines for processing and managing large volumes of structured and unstructured data. Create and manage Power BI dashboards, reports, and data models to provide strategic insights. Integrate cutting … GCP) and data processing services. Advanced skills in Power BI, including DAX, Power Query, and data modelling. Strong programming abilities in Python, SQL, and/or Scala. Expertise in ETL/ELT processes, data warehousing, and data mesh architectures. Familiarity with AI/ML concepts and their application in data analytics. Experience with metadata management, data lineage tracking, and data …
London (City of London), South East England, United Kingdom
DGH Recruitment
Bradford, Yorkshire and the Humber, United Kingdom
Maria Mallaband Care Group Ltd
from multiple sources Creating advanced Power BI dashboards and reports to support strategic decisions Conducting statistical and trend analysis to drive operational improvements Data Warehouse Management Designing and maintaining ETL processes and scalable data models Collaborating with IT and engineering teams to ensure seamless data integration Collaboration & Strategy Partnering with stakeholders to understand analytical needs and deliver solutions Contributing to … innovative techniques and staying ahead of BI trends What You’ll Bring Advanced Power BI skills (including DAX, Power Query) Strong SQL and data warehousing experience Solid understanding of ETL pipelines and data modelling Minimum 2–3 years in data analytics or BI roles Bonus points for Python or R knowledge Education & Certifications: Bachelor’s or Master’s in Data …
and analysts, collaborating with senior stakeholders to drive a data-driven culture. Key Responsibilities: Design and implement a modern cloud data platform (Azure) to support scalable analytics. Build ETL/ELT pipelines to process structured and unstructured data, enabling real-time insights. Develop and maintain Power BI dashboards, forecasting models, and business intelligence tools. Establish data governance frameworks … Advanced proficiency in Power BI, including DAX, Power Query, and data modeling. Strong programming skills in Python, SQL, and/or Scala for data processing and automation. Experience with ETL/ELT, data warehousing, and event-driven architectures. Knowledge of AI/ML applications in data analytics and business intelligence. Proven leadership experience, with the ability to manage and …
London, South East England, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
Key Responsibilities 🔹 Build & Develop – Design and maintain a robust Databricks Data Platform, ensuring performance, scalability, and availability. 🔹 Data Pipelines – Connect APIs, databases, and data streams to the platform, implementing ETL/ELT processes. 🔹 Data Integrity – Embed quality measures, monitoring, and alerting mechanisms. 🔹 CI/CD & Automation – Create deployment pipelines and automate workflows. 🔹 Collaboration – Work with stakeholders across Global IT, Data … hands-on experience with Databricks and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling/DWH design ✅ Medallion architecture ✅ Strong ETL/ELT development skills. ✅ Python scripting experience for data automation. ✅ Experience with CI/CD methodologies for data platforms. ✅ Previous London Market insurance experience Why Join? 🚀 Greenfield Project – Work on …
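The medallion architecture named in the requirements (bronze → silver → gold layers) can be illustrated with a plain-Python sketch. Databricks/Delta Lake specifics are omitted, and the policy/premium fields are hypothetical examples, not taken from the role:

```python
# Bronze: raw records exactly as ingested from the source, warts and all.
bronze = [
    {"policy": "P1", "premium": "1000", "region": "UK "},
    {"policy": "P1", "premium": "1000", "region": "UK "},  # duplicate feed row
    {"policy": "P2", "premium": "2500", "region": "US"},
]

# Silver: cleaned and de-duplicated, with types and whitespace normalised.
seen = set()
silver = []
for r in bronze:
    if r["policy"] not in seen:
        seen.add(r["policy"])
        silver.append({
            "policy": r["policy"],
            "premium": float(r["premium"]),
            "region": r["region"].strip(),
        })

# Gold: business-level aggregate ready for reporting (premium by region).
gold = {}
for r in silver:
    gold[r["region"]] = gold.get(r["region"], 0.0) + r["premium"]

print(gold)  # → {'UK': 1000.0, 'US': 2500.0}
```

Keeping the raw bronze layer untouched is the point of the pattern: silver and gold can always be rebuilt from it when cleaning rules or aggregations change.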
join our collaborative team, where you'll be supporting an OAS (Oracle Analytics Server) application and the data it holds. You'll play a vital part in designing, developing, and maintaining ETL processes using Oracle Data Integrator (ODI), ensuring the smooth running of services while helping our customers get the insights they need. Click apply for full job details.
robust, scalable, and secure integration solutions across diverse platforms. With hands-on experience in: API development & integration (REST, SOAP) Middleware platforms (MuleSoft, Dell Boomi, Azure Logic Apps) Data transformation & ETL pipelines Cloud and on-premise system integration If you have the technical expertise and the strategic mindset, then please apply or send a copy of your CV. Robert Half Ltd acts …
C++, C#, .NET, Java etc.) (Essential) Experience of using configuration and version control tools (Rational ClearCase, Git) (Essential) Desirable Experience with database technologies (Oracle, SQL, etc.) (Preferred) Knowledge of ETL toolsets (Pentaho etc.) (Preferred) Understanding of middleware technologies (IBM MQ, RabbitMQ) (Preferred)