Cardiff, Wales, United Kingdom Hybrid / WFH Options
Navtech, Inc
extensive experience in designing, implementing, and optimizing database solutions in a microservices-based environment. As a member of our team, you will contribute to the full lifecycle of our data persistence layer, from schema design and performance tuning to ensuring robust replication, disaster recovery, and seamless integration within our cloud-native microservice ecosystem. Responsibilities: Design & Develop Database Solutions: Architect … design, and implement highly optimized relational (e.g., MySQL, PostgreSQL, AWS Aurora, SQL Server) and NoSQL (e.g., MongoDB, DynamoDB, Redis) database schemas, ensuring data integrity, performance, and scalability for microservices. Performance Optimization & Tuning: Proactively analyze and optimize complex queries, implement efficient indexing strategies, and manage partitioning/sharding to ensure peak database performance and handle high throughput. Reliability & Disaster Recovery … in English spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks (Entity Framework, Dapper, SQLAlchemy, Hibernate) from a database performance More ❯
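The indexing duties described above can be illustrated with a minimal, self-contained sketch. It uses Python's stdlib SQLite driver rather than the MySQL/PostgreSQL engines named in the listing, and the table, column, and index names are invented for illustration; the same idea (compare the query plan before and after adding an index) applies to any RDBMS.

```python
import sqlite3

# In-memory database stands in for the production RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id scans the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchone()
print(plan[-1])  # a full-table SCAN of orders

# After adding an index on the filtered column, the planner
# switches to an index search instead of a scan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchone()
print(plan[-1])  # a SEARCH using idx_orders_customer
```

The exact plan wording varies between SQLite versions, but the shift from a table scan to an index search is the behaviour an indexing strategy aims for.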
work permit required: Yes Job Reference: b91a69fc55fe Job Views: 3 Posted: 29.06.2025 Expiry Date: 13.08.2025 Job Description: A Little About Us EDB provides a data and AI platform that enables organizations to harness the full power of Postgres for transactional, analytical, and AI workloads across any cloud, anywhere. EDB empowers enterprises to control risk … manage costs and scale efficiently for a data and AI led world. Serving more than 1,500 customers globally and as the leading contributor to the vibrant and fast-growing PostgreSQL community, EDB supports major government organizations, financial services, media and information technology companies. EDB’s data-driven solutions enable customers to modernize legacy systems and break data … Experience with database monitoring and observability tools. Hands-on experience with cloud platforms (AWS, Azure, GCP) and container orchestration (OpenShift). Knowledge of Change Data Capture (CDC) tools like Debezium. Experience in a remote or virtual team environment. Fluency in English is required; proficiency in French, Spanish, Italian, or German is a strong asset. EDB is committed More ❯
Data Engineer (Enterprise Data Warehouse Developer) Description: As a Data Engineer, you'll design and maintain data scrapers and data pipelines, design & optimize analytics & relational databases, and build analytics models using DBT and bespoke aggregation engines. You'll work closely with business stakeholders, other BI Developers and DataOps as well as System engineers to support both data … bespoke tools written in Python/Java, as well as tools such as Meltano, Airflow, Mulesoft/Snaplogic, Apache NiFi, and Kafka, ensuring a robust, well-modelled, and scalable data analytics infrastructure running on MySQL and Postgres style databases primarily. Requirements: Advanced SQL development and deep understanding of RDBMS concepts and engines Practical knowledge of Data Warehouse infrastructure … with version control (e.g. Git, SVN) and CI/CD workflows for deployment Experience scraping external data sources using Beautiful Soup, Scrapy, or similar Familiarity with Database Replication & CDC technologies such as Debezium Familiarity with message & event-driven architecture, including tools like AWS MQ, Kafka Exposure to cloud database services (e.g., AWS RDS, Snowflake) 25 days of holiday Bonus More ❯
Gridiron IT is seeking a Mid-Level Data Engineer with a passion for learning and problem-solving to join our team. The ideal candidate will be responsible for designing and implementing data migration processes, integrating data from various sources, and ensuring the secure and efficient handling of both structured and unstructured data. Proficiency with Talend, Qlik Replicate … and cloud platforms such as Azure or AWS is highly desirable. The successful candidate will play a key role in optimizing and securing our data migration and integration workflows. Travel - up to 10% You Will: Collaborate with data architects and project managers to develop comprehensive data migration strategies for structured (e.g., relational databases) and unstructured data (e.g., documents, media). Use Qlik Replicate for real-time replication, Change Data Capture (CDC), and data migration from source databases to target environments. Design, implement, and optimize Talend ETL/ELT pipelines for batch processing and transformation of structured and unstructured data. Ensure More ❯
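Change Data Capture, as performed by tools like Qlik Replicate or Debezium, propagates inserts, updates, and deletes from a source database to a target. The toy sketch below only illustrates the concept by diffing two snapshots of a table (keys and row values are invented); production log-based CDC instead reads the database's transaction log and does not need full snapshots.

```python
def capture_changes(before, after):
    """Diff two {primary_key: row} snapshots into CDC-style events."""
    events = []
    for key, row in after.items():
        if key not in before:
            events.append(("insert", key, row))
        elif before[key] != row:
            events.append(("update", key, row))
    for key, row in before.items():
        if key not in after:
            events.append(("delete", key, row))
    return events

# Two snapshots of the same (hypothetical) table.
before = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
after = {1: {"name": "Ada L."}, 3: {"name": "Edsger"}}

events = capture_changes(before, after)
print(events)
# [('update', 1, {'name': 'Ada L.'}), ('insert', 3, {'name': 'Edsger'}), ('delete', 2, {'name': 'Grace'})]
```

Replaying these events in order against the target keeps it in sync with the source without re-copying unchanged rows, which is the point of CDC-based migration.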
a legacy that you can feel proud of. Join us and discover how our people write our story. MAIN PURPOSE OF THE ROLE & RESPONSIBILITIES IN THE BUSINESS: As a Senior Data Engineer (Senior Associate) at Kainos, you will be responsible for designing and developing data processing and data persistence software components for solutions which handle data at … scale. Working in agile teams, Senior Data Engineers provide strong development leadership and take responsibility for significant technical components of data systems. You will work within a multi-skilled agile team to design and develop large-scale data processing software to meet user needs in demanding production environments. Your responsibilities will include: Working to develop data … Experience of data visualisation and complex data transformations Experience with streaming and event-processing architectures including technologies such as Kafka and change-data-capture (CDC) products Expertise in continuous improvement and sharing input on data best practice So what are you waiting for? Let’s write the next incredible chapter of our story together. More ❯
culture that fosters innovation, flexibility, and creativity. At Octopus, great people get stuff done, all whilst being themselves. At Octopus Energy we are passionate about building great technology to change the way customers use, and think about, energy for the good of the planet. This is a fantastic opportunity to work on data problems that genuinely move us … closer to Net Zero and support the energy transition. We have developed a data platform that is used by all our businesses around the world. The platform empowers users with self-service data analytics and automates our data processing workflows, from simple ETL jobs to ML training and prediction. The Data Platform Team works on anything … would be helpful to have experience/expertise/knowledge in the following (in rough priority order): AWS Kubernetes (EKS) Data/network security Python Docker Grafana Postgres CDC systems Data related products (airflow, jupyter, spark, etc) The projects will be varied and we’re looking for someone who can work autonomously and proactively to scope problems and More ❯
ups working in the workplace, home, insurance and wealth areas. Joining us means helping create brighter financial futures for all our customers. Job Description Legal and General Retail's Data Operations team are currently hiring three Lead Data Engineers following the merger of internal divisions resulting in them expanding into a new area. These positions are to focus … on the retirements side of the Retail division and will build out new data pipelines utilising tools such as Synapse, DBT, Azure Devops and Snowflake. This role will see you responsible for designing, building, and implementing a variety of data solutions using modern ETL techniques and tools and you will be driving projects forward while serving as a … following methodologies: Scrum/Kanban, DevOps/DataOps, Dimensional Data Modelling (Kimball) Expertise in a variety of database technologies and data warehousing paradigms Familiarity with cloud platforms, CDC and streaming technologies, and data architecture principles Strong background in delivering data engineering solutions, adhering to project delivery methodologies (Agile, Waterfall) Experience in stakeholder management, complex dataMore ❯
scalable software systems using Java, modern microservices architectures, and RESTful APIs. The ideal candidate has strong experience with object-oriented principles, is well-versed in database technologies, and understands data replication concepts. Experience with cross-domain transfer or access solutions is a plus. This position may be filled as a Software Engineer II, III or Sr. Software Engineer. Please see … object-oriented programming (OOP) principles to produce clean, maintainable code. Design and maintain database schemas, stored procedures, and complex queries. Contribute to system design discussions and documentation. Address secure data handling and access in cross-domain environments. Participate in code reviews and collaborate with peers in a hybrid team environment. Troubleshoot and resolve production issues and defects. Education/… with cross-domain solutions or secure data transfer mechanisms. Familiarity with Kafka or other distributed messaging systems. Experience with Debezium or other change data capture (CDC) tools. Exposure to NoSQL databases or document stores (e.g., MongoDB, Couchbase). Experience writing and maintaining JUnit tests or other unit test frameworks. Familiarity with modern JavaScript frameworks (e.g., React More ❯
London, England, United Kingdom Hybrid / WFH Options
Deel
Join to apply for the Data Engineer role at Deel Who We Are Is What We Do. Deel is the all-in-one payroll and HR platform for global teams. Our vision is to unlock global opportunity for every person, team, and business. Built for the way the … in just over five years—you'll drive meaningful impact while building expertise that makes you a sought-after leader in the transformation of global work. The Team The Data Platform team at Deel is dedicated to enhancing data quality, optimizing pipeline performance, building robust platform tools, and managing costs across the entire data stack—from ingestion … Proficiency in designing efficient database schemas. Workflow Orchestration: Familiarity with tools like Apache Airflow. Data Streaming: Experience with data streaming and Change Data Capture (CDC). Infrastructure: Proficiency in Terraform and GitHub Actions. Compliance: Experience in setting up PII anonymization and RBAC. Collaboration: Strong ability to work with cross-functional teams and communicate technical concepts More ❯
Manchester, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
locations noted above. About this opportunity A great opportunity has arisen for a Software Engineer to join our engineering team and make a real difference. Working in the Enterprise Data Provisioning Platform, you can help shape the Group’s ambition by enabling the bank to gain value from data by providing safe and efficient Group wide data capabilities and services. We recognise the impact data can have on operating the bank, and we appreciate the sizeable opportunities data can bring to better serving our customers and growing the business. It is the foundation of our operation and is a vital component of building the Bank of the Future. Our Data Platform Services team … really useful: Expertise in Google Cloud Platform (GCP) services, including BigQuery, Data Fusion, Cloud Composer, IAM, and others. Hands-on experience with Change Data Capture (CDC) architectural patterns and tools, such as Kafka, Kafka Connect, IBM InfoSphere CDC, and Debezium. Ability to define objectives and measure metrics. About working for us Our ambition is More ❯
London, England, United Kingdom Hybrid / WFH Options
Deel
in just over five years—you'll drive meaningful impact while building expertise that makes you a sought-after leader in the transformation of global work. The Team The Data Platform team at Deel is dedicated to enhancing data quality, optimizing pipeline performance, building robust platform tools, and managing costs across the entire data stack—from ingestion … to outbound integrations and everything in between. As a Data Engineer on this team, you’ll play a critical role in shaping the future of Deel’s data infrastructure, ensuring it scales effectively with 30+ Analytics Engineers and 100+ data professionals embedded across the organization. Our team collaborates cross-functionally with analysts, analytics engineers, data scientists … with cloud-based data warehouses. Workflow Orchestration: Familiarity with tools like Apache Airflow. Data Streaming: Experience with data streaming and Change Data Capture (CDC). Infrastructure: Proficiency in Terraform and GitHub Actions. Compliance: Experience in setting up PII anonymization and RBAC. Collaboration: Strong ability to work with cross-functional teams and communicate technical concepts More ❯
Principal Architect – Data and AI Main Purpose of the Role & Responsibilities: As a Principal Architect for Data & AI, you will be responsible for delivering modern, effective data and AI solutions across a range of clients. You’ll act as a thought leader in your domain, working closely with senior stakeholders to define principles, set strategic direction, and … insight into emerging technologies and maintain a commitment to ongoing learning and growth—both for yourself and those around you. Minimum (Essential) Requirements: Comprehensive understanding of current and emerging data and AI technologies (e.g., IoT, edge processing, event-driven architectures, serverless computing, RDBMS, NoSQL, machine learning, cognitive services). Proven accountability for designing, building, and deploying scalable, data … powered systems in a production environment. Hands-on experience with at least one major public cloud platform (AWS, Azure, or GCP), including IaaS, SaaS, and PaaS offerings. Proficient in data integration approaches such as messaging systems, queuing, change data capture, or data virtualization. Ability to balance and prioritise non-functional requirements in solution design while More ❯
City of London, England, United Kingdom Hybrid / WFH Options
With Intelligence
years our company has transformed from a traditional financial publisher to a product-led Fintech, offering a global interchange connecting investors and managers to the people and insight-enriched data they need to raise and allocate assets. With Intelligence has recently raised a new round of funding from a successful technology investor with a value-creation plan centred around … elevating our product to be the pioneering and market-leading platform we envisage. We are looking for data and user-centric people eager to help drive us through our next growth phase. What you'll do Gather and translate business requirements into user stories and acceptance criteria Prioritise the product backlog, understanding benefits to users and our business, and … user story requirements, SDLCs, unit tests and Gherkin acceptance criteria formats Familiarity with modern data integration approaches Experience working with ETL, data pipelines, data warehousing, and CDC methodologies Familiarity with CI/CD practices. Understand core agile methodologies such as scrum and kanban, be able to pick the right approach for your development team and spearhead ceremonies More ❯
Veterans Lighting. Inc. dba Veterans Electrical Solutions
Minimum Requirements: Must have a 4-year degree (Computer Science, Information Systems or equivalent) 8+ years overall IT experience (5+ years as DBA) Technical Skills: Hands-on expertise in data migration between databases on-prem to AWS cloud RDS. Experience using AWS Data Migration Service (DMS) Experience in export/import of large database schemas … its architecture. Experience in conversion of schemas from one DB to another is an added advantage. Experience in database and server consolidation. Strong hands-on experience in building logical data models, data quality, and data security. Understanding of the application lifecycle and building service continuity documents. Responsible for building the knowledge base - run books, cheat sheets, DR drill books, Escalation More ❯
as Salesforce Developer: Design and implement scalable solutions across Sales, Service, Experience Clouds, and AgentForce using Apex, Visualforce, LWC, and platform tools. Consolidate multiple Salesforce orgs post-acquisition, aligning data models, automation, and access. Migrate legacy systems using APIs, ETL tools, and platforms like Workato or n8n. Develop and maintain integrations with third-party systems via REST/SOAP … APIs. Design, document, and standardize Salesforce data models, moving toward standard objects. Resolve complex 3rd-line support issues and optimize workflows and UI/UX. Define role-based permissions and ensure compliance with security and audit standards. Maintain technical documentation and provide guidance to support teams. Support system upgrades, releases, and end users across Salesforce applications. Identify and implement … performance tuning. Key Skills & Experience for Salesforce Developer: Salesforce Admin (ADM 201) & Platform Developer II certified 3–5 years’ experience in Salesforce development Proficiency in Salesforce Clouds, mobile, APIs, CDC, Platform Events Strong knowledge of Force.com architecture, data modeling, and integrations. Proficient in Apex, Visualforce, LWC, SOQL, and Salesforce admin/developer tools. Solid SQL Server (T-SQL) experience. More ❯
driven architectures/Apache Kafka. Building on AWS using any of the common automation frameworks like Pulumi, Terraform or Ansible. Experience with large-scale systems handling terabytes of data and tens of thousands of messages per second. Experience operating relational (Oracle, PostgreSQL) and NoSQL (DynamoDB, Cassandra, ElasticSearch) databases. Experience with monolith-to-microservices migrations and patterns such as … strangler and change-data-capture. You like music :) What we offer: Competitive local benefits based on your location We promote flexible working regarding time and/or place Both mental and physical health initiatives Comprehensive training and development opportunities Industry insider events, team socials and company events Enhanced holiday allowance We welcome candidates from all backgrounds, regardless of … permit for EU/UK? What is the name of your referrer? (please answer only if you have been referred by an ICE employee) I understand that my personal data will be processed in accordance with our privacy policy. More ❯
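The strangler pattern mentioned in that listing migrates a monolith by routing traffic to new microservices one capability at a time, while the monolith keeps serving everything not yet moved. A minimal façade might look like the sketch below (the handler functions and route prefixes are hypothetical stand-ins for real service calls):

```python
# Hypothetical handlers: in a real migration these would forward the
# request to the legacy monolith and the new microservice respectively.
def legacy_handler(path):
    return f"monolith handled {path}"

def new_service_handler(path):
    return f"microservice handled {path}"

# Routes already migrated to the new service; this set grows over
# time until the monolith can be retired.
MIGRATED_PREFIXES = ("/orders", "/payments")

def strangler_facade(path):
    """Route a request to the new service if its capability has been
    migrated, otherwise fall back to the legacy monolith."""
    if path.startswith(MIGRATED_PREFIXES):
        return new_service_handler(path)
    return legacy_handler(path)

print(strangler_facade("/orders/42"))  # microservice handled /orders/42
print(strangler_facade("/catalog/7"))  # monolith handled /catalog/7
```

In practice the façade is usually an API gateway or reverse proxy rather than application code, but the routing decision is the same.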
you can feel proud of. Join us and discover how our people write our story. MAIN PURPOSE OF THE ROLE & RESPONSIBILITIES IN THE BUSINESS: As a Principal Architect for Data & AI in Kainos, you’ll be accountable for successful delivery of contemporary data/AI solutions across multiple customers. You’ll have gravitas within the data/… ll do this whilst advising about new technologies and approaches, with room to learn, develop and grow. MINIMUM (ESSENTIAL) REQUIREMENTS Broad knowledge of a spectrum of current and emerging data and AI technologies, and able to communicate clearly and influence C-level stakeholders (such as IOT, edge processing, event driven data architectures, serverless processing, file system, object, document … key-value, search, RDBMS, graph, cognitive services, machine learning). Proven experience being accountable for designing, building and productionising modern data or AI intensive applications, including systems that are distributed and scalable. Experience with public cloud data platforms, at least one of AWS, Azure or GCP including use of IaaS, SaaS and PaaS offerings. Comfortable with dataMore ❯
Manchester, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
SAP Cloud Data Solutions Architect, Manchester, United Kingdom Job Category: Other EU work permit required: Yes Job Views: 4 Posted: 26.06.2025 Expiry Date: 10.08.2025 Job Description: SAP Cloud Data Solutions Architect with SAP BW - SAP Datasphere migration experience … and a strong background in consulting, sales, and commercial strategy to join our dynamic team. This role is ideal for a cloud solutions sales professional specializing in data analytics offerings who can serve as overlay sales for our core sales team. The ideal candidate will have experience in building proposals, responding to RFPs, and drafting SOWs, as well … Design and implement scalable, cost-effective architectures across AWS, Azure, or GCP environments. Data Engineering: Develop and manage ETL/ELT pipelines, data integration workflows, and implement CDC and delta load design for effective data management. SAP Tools: Lead SAP BW to SAP Datasphere migration strategies and leverage SAP Data Analytics for cloud solutions. Legacy Data More ❯