Ipswich, Suffolk, England, United Kingdom Hybrid / WFH Options
365Jobs
paced project delivery team, working closely with internal business stakeholders across different geographies as well as 3rd party vendors. It is essential that the successful candidate has experience of data mappings, data migration and dealing with 3rd party software implementations, ideally within a commercial insurance setting. The successful candidate will take the lead in gathering requirements from business stakeholders … tangible technology outcomes, defining & developing technical and non-technical requirements, whilst ensuring these requirements fit in with the strategy of the application. You will also be responsible for analysing data, process mapping/modelling and technical requirements, e.g. APIs etc. Working alongside the project team, excellent written communication and document preparation skills are also vital for this role. … with existing technology. Collaborate with change, training, and support teams to ensure readiness for go-live. Support country-specific adaptations and regulatory compliance (e.g., retention periods, legal warehousing). Data Analysis & Migration: Conduct data mapping for claims data capture and document structures. Prepare and validate migration files for open and closed claims, ensuring historical reporting capabilities. Coordinate More ❯
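By way of illustration, the data mapping and migration-file preparation described above might look like the following minimal pandas sketch. The legacy field names, target schema, and validation rules are hypothetical placeholders rather than the actual vendor specification.

```python
# Illustrative sketch: map legacy claim fields to a target schema and run
# basic validation before producing a migration file. Column names and rules
# are hypothetical, not a real vendor schema.
import pandas as pd

FIELD_MAP = {
    "CLM_REF": "claim_reference",
    "LOSS_DT": "date_of_loss",
    "STATUS_CD": "claim_status",
    "RESERVE_AMT": "outstanding_reserve",
}

def build_migration_file(source_csv: str, target_csv: str) -> pd.DataFrame:
    df = pd.read_csv(source_csv, dtype=str)

    # Apply the agreed data mapping and keep only mapped columns.
    df = df.rename(columns=FIELD_MAP)[list(FIELD_MAP.values())]

    # Simple validation: mandatory fields present, dates parseable,
    # status limited to an agreed code set.
    errors = []
    if df["claim_reference"].isna().any():
        errors.append("missing claim_reference values")
    df["date_of_loss"] = pd.to_datetime(df["date_of_loss"], errors="coerce")
    if df["date_of_loss"].isna().any():
        errors.append("unparseable date_of_loss values")
    if not df["claim_status"].isin({"OPEN", "CLOSED"}).all():
        errors.append("unexpected claim_status codes")

    if errors:
        raise ValueError("migration file failed validation: " + "; ".join(errors))

    df.to_csv(target_csv, index=False)
    return df

if __name__ == "__main__":
    build_migration_file("legacy_claims_extract.csv", "claims_migration_file.csv")
```

In practice the mapping would be driven by the agreed data dictionary and the 3rd party vendor's import template rather than a hard-coded dictionary.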
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
Company Description At The Stepstone Group, we have a simple yet very important mission: The right job for everyone. Using our data, platform, and technology, we create opportunities for job seekers and companies around the world to find a perfect match, in a fair and equitable way. With over 20 brands across 30+ countries, we strive for fair and unbiased … becoming the world's leading job-tech platform. Job Description The role of Product Analytics Manager in our company is at the intersection of several disciplines: Product Analytics, Insights Analytics, Data Engineering and Data Science. Your team's projects will cover a broad range of topics, from working with engineers to implement event tracking on our platforms, all the … way to generating data insights and sometimes building predictive models, ultimately helping our Product organisation deliver great products, which solve meaningful user problems. You and your team are part of a larger Product Analytics group, and will be working closely with Product Directors, Heads of Product, Data Engineers, Data Scientists and many other stakeholders to ensure we More ❯
erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit . Job Title: Senior Specialist - Data Engineering (Python Automation Lead) Work Location - Austin, TX Job Description: Seeking a Senior Specialist with 7 to 11 years of experience in Python and data technologies including … Flask, Apache Spark, Scala, and Nginx to design and implement scalable data-driven solutions. Develop and maintain high-performance data processing applications using Apache Spark and Scala. Build and deploy RESTful APIs and web services using the Flask framework. Utilize Python extensively for data manipulation, automation, and integration tasks. Configure and optimize Nginx as a reverse proxy and … load balancer for web applications. Collaborate with cross-functional teams to design scalable data architectures within the Python Data skill cluster. Ensure code quality, performance, and reliability through rigorous testing and best practices. Stay updated with the latest trends and advancements in data engineering and Python ecosystems. Lead the design and development of data processing More ❯
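As a rough sketch of the Flask portion of this stack, the endpoint below serves records that a Spark batch job might have produced; the route name, payload shape, and port are illustrative assumptions rather than anything specified in the role.

```python
# Minimal Flask REST endpoint of the kind described above.
# Route and payload shape are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for output produced by an upstream Spark job.
PROCESSED_RECORDS = [
    {"id": 1, "metric": 42.0},
    {"id": 2, "metric": 17.5},
]

@app.route("/api/v1/records", methods=["GET"])
def list_records():
    # Optional ?min_metric= filter, parsed defensively with a default of 0.
    min_metric = request.args.get("min_metric", default=0.0, type=float)
    rows = [r for r in PROCESSED_RECORDS if r["metric"] >= min_metric]
    return jsonify({"count": len(rows), "records": rows})

if __name__ == "__main__":
    # In production this would typically sit behind Nginx as a reverse proxy,
    # served by a WSGI server; app.run is only for local development.
    app.run(host="127.0.0.1", port=5000, debug=True)
```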
London (City of London), South East England, United Kingdom
Capgemini
us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Your Role Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The person will be responsible for designing, developing … and maintaining robust data transformation pipelines that support business intelligence, analytics, and data science initiatives. Key Responsibilities: Design and implement scalable data models and transformation pipelines using DBT on Snowflake. Write efficient and maintainable PL/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders … to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and resource management. Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. Maintain and enhance CI/CD pipelines for DBT projects. Required Qualifications: 5+ years of experience in data More ❯
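A minimal sketch of how such a DBT-on-Snowflake workflow might be orchestrated and sanity-checked from Python; the model selector, warehouse, database, and table names are placeholders, and a real project would normally run dbt from CI rather than an ad-hoc script.

```python
# Hypothetical sketch: run a dbt model selection, then execute a simple
# row-count check against Snowflake. All object names are placeholders.
import os
import subprocess

import snowflake.connector

def run_dbt_models(selector: str) -> None:
    # Equivalent to running `dbt run --select <selector>` from the project root.
    subprocess.run(["dbt", "run", "--select", selector], check=True)

def check_row_count(table: str, minimum: int) -> int:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",   # placeholder warehouse name
        database="ANALYTICS",       # placeholder database name
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")  # table name is a trusted placeholder
        count = cur.fetchone()[0]
    finally:
        conn.close()
    if count < minimum:
        raise RuntimeError(f"{table} has only {count} rows; expected at least {minimum}")
    return count

if __name__ == "__main__":
    run_dbt_models("staging+")  # placeholder selector: a node and its downstream models
    check_row_count("ANALYTICS.MARTS.FCT_TRANSACTIONS", minimum=1)
```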
Title: Senior Data Engineer Location: Columbus, OH (Locals Only) Position Type: Contract W2 Only Visa Type: GC/USC Job Description: Requirements Bachelor's degree in computer science or related technical field 5+ years of hands-on experience in software or data engineering with coding in Python Proficiency with Azure data services including Azure Data Lake … Azure Data Factory, and Databricks Strong experience with Python for developing and maintaining data solutions Expertise in Cloud Security, including Active Directory, network security groups, and encryption services Proven ability to build and maintain data architectures supporting both real-time and batch processing Solid understanding of database management, legacy and modern data modeling, and system architecture … teamwork, leadership, and cross-functional collaboration skills Experience working within Agile or Lean development frameworks Nice to Have Experience with cloud cost optimization strategies or governance tools Familiarity with data pipeline monitoring and automation Exposure to multi-cloud environments (e.g., AWS, GCP) Responsibilities Design, develop, deploy, and maintain software applications and systems that support client Data Lakes, including More ❯
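As an illustration of the Azure/Databricks work described, the following PySpark sketch reads raw files from Azure Data Lake Storage, applies simple curation rules, and writes a Delta output; the storage account, container, and column names are assumptions rather than a real client environment.

```python
# Illustrative PySpark batch job: read raw files from ADLS Gen2, apply a
# simple transformation, and write a curated Delta dataset. Paths and column
# names are placeholders; on Databricks the SparkSession already exists as `spark`.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims_batch_curation").getOrCreate()

RAW_PATH = "abfss://raw@examplestorageacct.dfs.core.windows.net/claims/"
CURATED_PATH = "abfss://curated@examplestorageacct.dfs.core.windows.net/claims/"

raw_df = spark.read.format("parquet").load(RAW_PATH)

curated_df = (
    raw_df
    .withColumn("ingest_date", F.current_date())   # audit column for lineage
    .filter(F.col("amount").isNotNull())           # drop incomplete rows
    .dropDuplicates(["claim_id"])                  # keep re-runs idempotent
)

# Delta is the usual target format on Databricks; overwrite keeps the example simple.
curated_df.write.format("delta").mode("overwrite").save(CURATED_PATH)
```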
Junior Data Engineers are required by this major client, as they continue to build the cloud engineering capability in their Leeds offices, where you will provide best-in-class Data Engineering services to a wide range of major Public Sector organisations. As a result of the work that they do, this client requires applicants to hold … UK national or dual UK national. Please note your application will not be taken forward if you cannot fulfil these requirements. In order to secure one of these Junior Data Engineer roles you must be able to demonstrate the following experience: Commercial experience gained in a Data Engineering role on any major cloud platform (Azure, AWS or … Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Some experience with the design, build and maintenance of data pipelines and infrastructure Excellent problem-solving skills with experience of troubleshooting and resolving data-related issues Skills they would love to see: Interest in building Machine learning and More ❯
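The "database technologies from an application programming perspective" point might look something like the minimal sketch below, which reads a table through SQLAlchemy, derives a summary in pandas, and writes it back; the connection URL, tables, and columns are illustrative only.

```python
# Small example of working with a relational database from application code.
# Connection string, table names, and columns are placeholders; swapping the
# URL dialect (oracle+oracledb://, postgresql+psycopg2://, ...) adapts it to
# other engines.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mysql+pymysql://app_user:app_password@localhost:3306/sales")

# Pull the source rows into a DataFrame.
orders = pd.read_sql("SELECT customer_id, order_total FROM orders", engine)

# Derive a simple per-customer summary.
summary = (
    orders.groupby("customer_id", as_index=False)["order_total"]
    .sum()
    .rename(columns={"order_total": "lifetime_value"})
)

# Replace the summary table on each run; an append/upsert strategy would be
# used in a production pipeline.
summary.to_sql("customer_lifetime_value", engine, if_exists="replace", index=False)
```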
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
Microsoft Fabric Consultant | Data Engineering & DataOps 📍 Hybrid (2 days onsite in London) 💼 Permanent | Full-time 💰 £70,000 – £80,000 + 10% Bonus & 25 days holiday, pension & other benefits. Are you a Fabric Consultant or Data Engineer looking to work with cutting-edge tech and make a real impact? We’re hiring for a hands-on role where you’ll design and build scalable data pipelines using Microsoft Fabric and Databricks, drive DataOps best practices, and manage agile delivery through Jira. You’ll be part of a high-performing consultancy team delivering modern data platforms for enterprise clients. What You’ll Be Doing as a Fabric Consultant: Building end-to-end data pipelines with Microsoft Fabric & Databricks Driving DataOps workflows, CI/CD automation, and agile delivery via Jira Leading client workshops and collaborating with cross-functional teams Supporting data governance, compliance, and privacy initiatives Mentoring junior engineers and contributing to internal best practices What You’ll Bring: Strong experience in data engineering with Microsoft Fabric Solid understanding of DataOps, CI/CD More ❯
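A hedged sketch of the kind of pipeline step this role describes: a Fabric/Databricks-style notebook cell that reads landed files, cleans them, and publishes a Delta table. The Lakehouse paths and table name are assumptions rather than a specific client setup.

```python
# Illustrative notebook cell: read files landed in a Lakehouse "Files" area,
# clean them, and publish a managed Delta table for downstream reporting.
# Paths and names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In Fabric and Databricks notebooks a SparkSession is provided automatically.
spark = SparkSession.builder.getOrCreate()

# Files previously copied into the Lakehouse by an ingestion pipeline.
landed = spark.read.option("header", True).csv("Files/landing/customers/")

cleaned = (
    landed
    .withColumn("email", F.lower(F.trim(F.col("email"))))   # normalise key field
    .dropDuplicates(["customer_id"])                         # one row per customer
    .withColumn("processed_at", F.current_timestamp())       # audit column
)

# Publish as a Delta table so reports and downstream models can query it.
cleaned.write.mode("overwrite").format("delta").saveAsTable("customers_clean")
```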
Sr. Databricks Engineer (AWS) Location: Glasgow Duration: 31/12/2026 Days on site: 2-3 Rate: £402/day on Umbrella Role Description: We are currently migrating our data pipelines from AWS to Databricks, and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering role focused on … designing, building, and optimizing scalable data solutions using the Databricks platform. Key Responsibilities: Lead the migration of existing AWS-based data pipelines to Databricks. Design and implement scalable data engineering solutions using Apache Spark on Databricks. Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines. Optimize performance and cost … efficiency of Databricks workloads. Develop and maintain CI/CD workflows for Databricks using GitLab or similar tools. Ensure data quality and reliability through robust unit testing and validation frameworks. Implement best practices for data governance, security, and access control within Databricks. Provide technical mentorship and guidance to junior engineers. Must-Have Skills: Strong hands-on experience with More ❯
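The "robust unit testing and validation frameworks" requirement is easiest to meet when transformations are written as plain functions over DataFrames; the sketch below shows one such function and a local test, with the column names and deduplication rule chosen purely for illustration.

```python
# Keep the transformation as a plain function so it can be tested locally
# without a Databricks cluster. Columns and rule are illustrative.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

def deduplicate_latest(events: DataFrame) -> DataFrame:
    """Keep only the most recent event per account_id."""
    w = Window.partitionBy("account_id").orderBy(F.col("event_ts").desc())
    return (
        events.withColumn("rn", F.row_number().over(w))
        .filter(F.col("rn") == 1)
        .drop("rn")
    )

def test_deduplicate_latest():
    # Local Spark session is enough for a unit test run in CI (e.g. GitLab).
    spark = SparkSession.builder.master("local[1]").appName("tests").getOrCreate()
    data = [
        ("A", "2024-01-01", 10),
        ("A", "2024-02-01", 20),  # newer row should win
        ("B", "2024-01-15", 5),
    ]
    df = spark.createDataFrame(data, ["account_id", "event_ts", "value"])
    result = {r["account_id"]: r["value"] for r in deduplicate_latest(df).collect()}
    assert result == {"A": 20, "B": 5}
```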
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
Microsoft Fabric Consultant | Data Engineering & DataOps 📍 Hybrid (2 days onsite in London) 💼 Permanent | Full-time 💰 £70,000 – £75,000 + 10% Bonus & 25 days holiday, pension & other benefits Are you a Fabric Consultant or Data Engineer looking to work with cutting-edge tech and make a real impact? We’re hiring for a hands-on role where … you’ll design and build scalable data pipelines using Microsoft Fabric and Databricks, drive DataOps best practices, and manage agile delivery through Jira. You’ll be part of a high-performing consultancy team delivering modern data platforms for enterprise clients. What You’ll Be Doing as a Fabric Consultant: Building end-to-end data pipelines with … Microsoft Fabric & Databricks Driving DataOps workflows, CI/CD automation, and agile delivery via Jira Leading client workshops and collaborating with cross-functional teams Supporting data governance, compliance, and privacy initiatives Mentoring junior engineers and contributing to internal best practices What You’ll Bring: Strong experience in data engineering with Microsoft Fabric Solid understanding of DataOps, CI More ❯
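One way the DataOps and CI/CD side of this role can surface in practice is a post-deployment quality gate run by the pipeline; the sketch below is a hypothetical example, with the sample file, columns, and thresholds all assumed.

```python
# Hypothetical DataOps-style quality gate a CI/CD pipeline could run after
# deployment: sample the published dataset and fail the job if basic
# expectations are not met. Path, columns, and thresholds are placeholders.
import sys

import pandas as pd

def quality_gate(sample_path: str) -> list[str]:
    df = pd.read_parquet(sample_path)
    failures = []
    if df.empty:
        failures.append("published dataset is empty")
    if df["customer_id"].duplicated().any():
        failures.append("duplicate customer_id values found")
    null_rate = df["email"].isna().mean()
    if null_rate > 0.05:
        failures.append(f"email null rate {null_rate:.1%} exceeds 5% threshold")
    return failures

if __name__ == "__main__":
    problems = quality_gate("exports/customers_clean_sample.parquet")
    for p in problems:
        print(f"QUALITY GATE FAILED: {p}")
    # A non-zero exit code fails the CI job, blocking promotion to production.
    sys.exit(1 if problems else 0)
```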
London, South East, England, United Kingdom Hybrid / WFH Options
Huxley
Party Data Governance Analyst Banking London New opportunity Job title: Party Data Governance Analyst Employer: Investment Banking Location: London hybrid working 50/50 Permanent salary £50,000 - £75,000 Focus of the role: Support the implementation of the data governance strategy and policy Technical stack: Power BI, Tableau and SharePoint, Collibra, Informatica, SQL, Python, R and … Data Engineering Requirements: understanding of party data & other domain data sets such as Human Resources, ESG/Sustainability, General Affairs & Public Relations. This is a new and exclusive opportunity for a Party Data Governance Analyst to join this thriving investment Bank as they are expanding their data team due to growth and investment for … The Party Data Governance Analyst is a very important role as part of the Division's Data Governance and BCBS239 Programme. The ideal candidate must have knowledge of party/customer-related data management, including the implementation and support of related data policies, standards, procedures & processes. The analyst will also have the opportunity to work on More ❯
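As a small, hypothetical illustration of the hands-on side of party data governance, the sketch below profiles completeness and uniqueness of a few party attributes so they could be reported against data-quality standards; the dataset and column names are assumptions, not the bank's actual party model.

```python
# Illustrative party-data quality profile: measure completeness and uniqueness
# of key attributes. Column names are hypothetical placeholders.
import pandas as pd

def profile_party_data(parties: pd.DataFrame) -> pd.DataFrame:
    checks = {
        "party_id_unique": parties["party_id"].is_unique,
        "legal_name_complete": parties["legal_name"].notna().mean(),
        "country_of_incorporation_complete": parties["country_of_incorporation"].notna().mean(),
        "lei_populated": parties["lei"].notna().mean(),
    }
    # One row per check, ready to publish into a governance dashboard.
    return pd.DataFrame({"check": list(checks.keys()), "result": list(checks.values())})

if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "party_id": [1, 2, 3],
            "legal_name": ["Acme Ltd", None, "Globex Plc"],
            "country_of_incorporation": ["GB", "JP", None],
            "lei": ["LEI0000000000000001", None, None],  # dummy identifier
        }
    )
    print(profile_party_data(sample))
```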
City of London, London, United Kingdom Hybrid / WFH Options
Huxley Associates
Party Data Governance Analyst Banking London New opportunity Job title: Party Data Governance Analyst Employer: Investment Banking Location: London hybrid working 50/50 Permanent salary £50,000 - £75,000 Focus of the role: Support the implementation of the data governance strategy and policy Technical stack: Power BI, Tableau and SharePoint, Collibra, Informatica, SQL, Python, R and … Data Engineering Requirements: understanding of party data & other domain data sets such as Human Resources, ESG/Sustainability, General Affairs & Public Relations. This is a new and exclusive opportunity for a Party Data Governance Analyst to join this thriving investment Bank as they are expanding their data team due to growth and investment for … The Party Data Governance Analyst is a very important role as part of the Division's Data Governance and BCBS239 Programme. The ideal candidate must have knowledge of party/customer-related data management, including the implementation and support of related data policies, standards, procedures & processes. The analyst will also have the opportunity to work on More ❯
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
Robert Half
Robert Half is recruiting for a Head of Data on a permanent basis, with flexible and remote working (office based in Birmingham), paying up to £90,000. This is a newly created role, where the Head of Data is responsible for the strategy and delivery of BI and analytics. This is a 50/50 split between strategic … oversight and hands-on delivery, focusing on delivering high-value data products including dashboards and advanced analytics solutions. Key Responsibilities Engage with senior leadership to understand business objectives and align BI initiatives to requirements Lead the evolution of existing dashboards into high-value tools Work closely with other BI leads across sectors to ensure consistency and alignment of solutions … Act as a bridge between the business and the data team to increase the business value of data products Key Skills Expert in BI and data visualisation, leading the data strategy for large enterprises Strong understanding of cloud-based data platforms Proven experience partnering with senior leaders to champion business value behind data analytics and BI All More ❯
London (City of London), South East England, United Kingdom
PIXIE
Overview We are seeking an experienced Data Engineer (Python Enterprise Developer) to design, develop, and optimize data-driven solutions within enterprise environments. The ideal candidate will have deep expertise in Python programming, strong SQL skills, and hands-on experience with modern data platforms and cloud technologies. Key Responsibilities: Develop, test, and deploy scalable data engineering solutions using Python. Build and maintain data pipelines leveraging libraries such as NumPy, pandas, BeautifulSoup, Selenium, pdfplumber, and Requests. Write and optimize complex SQL queries and manage databases including PostgreSQL. Integrate and automate workflows using DevOps tools (e.g., CI/CD, Jenkins, Git). Collaborate on cloud-based data initiatives using AWS (S3) and … zones. Mentor team members and foster a positive, results-driven work environment. Qualifications: 8+ years of professional experience in Python development and scripting. Proven ability to design and implement data solutions within enterprise environments. Strong problem-solving, communication, and leadership skills. Familiarity with modern data engineering best practices and tools. If you believe this job description aligns More ❯
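A compact sketch tying together several of the libraries listed above: fetch a page with Requests, parse a table with BeautifulSoup, load the result into pandas, then persist it to PostgreSQL and S3. The URL, credentials, table, and bucket names are placeholders.

```python
# Illustrative end-to-end flow with the libraries named in the role.
# All endpoints, credentials, and object names are placeholders.
import io

import boto3
import pandas as pd
import requests
from bs4 import BeautifulSoup
from sqlalchemy import create_engine

def scrape_table(url: str) -> pd.DataFrame:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for tr in soup.select("table tr")[1:]:          # skip the header row
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if len(cells) == 2:                          # expect two-column rows
            rows.append({"name": cells[0], "value": cells[1]})
    return pd.DataFrame(rows)

def persist(df: pd.DataFrame) -> None:
    # Write to PostgreSQL via SQLAlchemy.
    engine = create_engine("postgresql+psycopg2://etl_user:etl_password@localhost:5432/analytics")
    df.to_sql("scraped_reference_data", engine, if_exists="replace", index=False)

    # Also drop a CSV copy into S3 for downstream consumers.
    buffer = io.StringIO()
    df.to_csv(buffer, index=False)
    boto3.client("s3").put_object(
        Bucket="example-data-landing", Key="reference/scraped.csv", Body=buffer.getvalue()
    )

if __name__ == "__main__":
    persist(scrape_table("https://example.com/reference-table"))
```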