City of London, London, United Kingdom Hybrid / WFH Options
8Bit - Games Industry Recruitment
We’re hiring an AI Data Engineer to turn 3D assets and visuals into high-quality datasets for next-gen AI in digital play. Own the full data pipeline and help build scalable, creative systems at the crossroads of AI, data, and fun. RESPONSIBILITIES Design and operate data ingestion pipelines for 3D assets, geometry, images … annotation vendors, and automate quality checks to ensure scalable, reliable datasets. Define schemas and metadata for digital assets, maintaining catalogues, lineage, and versioning to ensure reproducibility and governance. Enforce data quality, drift detection, and compliance (GDPR/COPPA), including watermarking and traceability to safeguard IP and user privacy. Collaborate with AI infrastructure engineers to optimize data storage and … compute performance for large-scale processing. Work with AI engineers to define data specifications, support experimentation, and integrate active learning loops that improve AI models over time. REQUIREMENTS 2 years of proven experience in data engineering for ML/AI, with strong proficiency in Python, SQL, and distributed data processing (e.g., PySpark). Hands-on experience with …
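The schema/metadata and automated quality-check responsibilities above could be sketched roughly as follows; the field names, allowed formats, and checks are hypothetical illustrations, not taken from the listing:

```python
from dataclasses import dataclass, field

# Hypothetical metadata record for a 3D asset; the fields below are
# illustrative, not a schema from the listing.
@dataclass
class AssetRecord:
    asset_id: str
    format: str          # e.g. "glb", "obj", "fbx"
    triangle_count: int
    source: str
    tags: list = field(default_factory=list)

# Assumed allow-list; a real pipeline would drive this from the catalogue.
ALLOWED_FORMATS = {"glb", "obj", "fbx"}

def quality_check(record: AssetRecord) -> list:
    """Return a list of validation errors (an empty list means the record passes)."""
    errors = []
    if not record.asset_id:
        errors.append("missing asset_id")
    if record.format not in ALLOWED_FORMATS:
        errors.append(f"unsupported format: {record.format}")
    if record.triangle_count <= 0:
        errors.append("non-positive triangle_count")
    return errors
```

In practice a check like this would run as an automated gate between annotation-vendor deliveries and the catalogued dataset, so bad records are rejected with an audit trail rather than silently ingested.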
Manchester Area, United Kingdom Hybrid / WFH Options
Inara
Senior Data Engineer ***Open to Contract or Perm*** Location - Remote/North (happy for you to work remotely but ideally you would be based up North where the leadership are based) Contract - £500-600/day, Perm £70-85k (open to negotiation for the right person) We are working with a growing company that is leading the … way in data in their space and what they can offer their client base globally. They work with a variety of customers to lead on decision making using the latest cutting-edge technology. You would be working in a small team, and have autonomy and responsibility to lead and deliver best-in-class data solutions. This role and … as Google Cloud, Python, BigQuery, Kafka and DevOps practices around CI/CD and more. Your responsibilities within your role will include building and working on a quality data framework, helping restructure how you store data, how you validate the data and also looking at how you optimise the data. They have a huge amount …
Data Science Engineer - MLOps, Machine Learning, AI, Artificial Intelligence, Azure, PyTorch, TensorFlow, LangChain, OpenAI, Docker, Kubernetes, GenAI, ETL We are working with a global law firm who are actively looking to bolster their IT team as they undergo a global-scale cloud transformation. At present they are looking to take on a new Data Science Engineer … join a top-tier global law firm who have a long stream of projects in the pipeline alongside a diverse and collaborative team environment. To be considered for this Data Science Engineer (MLOps, Machine Learning, AI, Artificial Intelligence, Azure, PyTorch, TensorFlow, LangChain, OpenAI, Docker, Kubernetes, GenAI, ETL) role, it's ideal you have: Ideal but not required: law … firm experience 2-4 years' experience within AI/ML positions Knowledge of cloud platforms (ideally Azure) AI/ML frameworks Generative AI Data engineering knowledge Solution Delivery Design, build, and deploy data science and AI solutions end-to-end, from design and development through testing, release, monitoring, and support. Operationalize models with CI/CD pipelines, automated …
Central London, London, United Kingdom Hybrid / WFH Options
Nexus Jobs
Data Engineer with C# .NET, ASP.NET and SQL Server (SSIS, SSRS) Our Client is a bank based in Central London who are looking to recruit a Data Engineer with at least 7 years' experience and the ability to work with C# .NET and SQL Server with SSIS. You must have solid expertise of … in-house systems, i.e. CORE, SharePoint interface, Equation, Kondor, Eximbills, end of day cycle. Support Supporting budgeting and financial planning processes for the Finance department, including loading and refreshing of data based on requirements. Understand and conduct the front-end functionality to amend and change hierarchical structures within the environment. Build data flows within the SQL environment in SSIS … processes thereby eliminating the need for duplicate entry. Utilise web services to integrate cutting-edge technology into legacy systems such as Equation. Integrate with all the systems using data abstraction and connectivity layers, i.e. ODBC, ADO.NET. Duties Maintain knowledge of all applicable regulatory requirements including the Bank's Risk and Compliance policies and procedures and adhere to these …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Altech Group Ltd
Data Analytics Engineer Location: Liverpool City Centre/Hybrid Salary: £50k Basic The Company A marketing SaaS technology business that thrives on a modern technical stack. Used by the automotive space (think BMW, Mercedes). They are transforming how people discover, compare, and purchase vehicles. Data is at the heart of everything they do, and they are looking … to add a Data Analytics Engineer to help turn raw data into actionable insights that drive key business decisions. They are a fairly small but tight-knit team and really push a modern tech stack. Everyone in the team is the type of person that reads articles about new technology and experiments with it in the business … available technology. AI naturally is a huge push for them right now along with a modern Azure cloud stack. What You'll Be Doing Design, build, and maintain reliable data pipelines from a variety of sources, including web data, pricing feeds, and customer interactions. Transform raw data into analytics-ready datasets using tools like dbt and SQL. …
As a data engineer, you will contribute to the development of a robust, validated and easily accessible data foundation instrumental in facilitating new analytical workflows and applications. About us Hexegic are a leading technical consultancy providing agile multi-disciplinary teams to high-performing organisations. The company promises exciting, engaging and rewarding projects for those that are keen to … develop and build a successful career. Core Responsibilities Establishing new data integrations within the data foundation Conduct ETL activities as directed by SMEs Configuring connections to other datasets within the data foundation Collaborate with SMEs to create, test and validate data models and outputs Set up monitoring and ensure data health for outputs What we … are looking for Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What's in it for you? Base salary of £50,000-£60,000 £5,000 a year professional development budget Wellness program 25 days annual leave Hybrid working arrangements …
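The "set up monitoring and ensure data health" responsibility above might look something like this minimal sketch; the column names and the 10% null-rate threshold are illustrative assumptions, not from the listing:

```python
def column_null_rates(rows: list) -> dict:
    """Fraction of None values per column across a batch of row dicts."""
    if not rows:
        return {}
    counts = {}
    for row in rows:
        for col, val in row.items():
            counts.setdefault(col, 0)
            if val is None:
                counts[col] += 1
    return {col: n / len(rows) for col, n in counts.items()}

def health_alerts(rows: list, max_null_rate: float = 0.1) -> list:
    """Columns whose null rate exceeds the threshold (assumed default: 10%)."""
    return sorted(c for c, r in column_null_rates(rows).items() if r > max_null_rate)
```

A check like this would typically run after each integration load, with alerting wired to whatever monitoring stack the data foundation uses.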
We're seeking a highly skilled and motivated Senior Data Engineer to join our growing data team. In this role, you'll architect and maintain robust, scalable data pipelines and infrastructure that power our analytics, machine learning, and business intelligence initiatives. You'll work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and … collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration - Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets - Design, implement, and tune Snowflake data warehouses to support analytical workloads … and reporting needs - Partner with data scientists, analysts, and product teams to deliver reliable, well-documented datasets - Ensure data integrity, consistency, and accuracy across multiple sources and systems - Automate data workflows and processes to improve efficiency and reduce manual intervention - Monitor pipeline performance, identify bottlenecks, and resolve issues proactively - Apply best practices in CI/CD, version …
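As a rough illustration of the extract-transform-load pattern the responsibilities describe, here is a minimal stdlib-only sketch; a real pipeline would use PySpark and Snowflake as the listing states, and the table and column names here are hypothetical:

```python
import csv
import io
import sqlite3

def extract(csv_text: str) -> list:
    """Parse a raw CSV feed into row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows: list) -> list:
    """Cast types, normalise strings, and drop rows that fail basic validation."""
    out = []
    for r in rows:
        try:
            out.append((r["id"], r["name"].strip().lower(), float(r["amount"])))
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine bad rows, not discard them
    return out

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Idempotent load: re-running with the same rows does not duplicate data."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id TEXT PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()
```

The key properties carry over regardless of engine: typed, validated transforms and idempotent loads are what make a pipeline safe to retry.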
Data Analytics Engineer Contract Role Outside IR35 | £475-500/day | Remote | Initial 6 Months | Extension Likely We have a new Data Analytics Engineer contract opportunity with a customer of ours, supporting their Global Operations and Customer Service teams. This role is remote-first, with occasional travel to Newcastle. The initial contract is … months, with strong potential to extend. We're looking for someone with a strong technical foundation and a deep understanding of operational data needs - especially around workforce planning and customer service analytics. Key Responsibilities: Build and maintain scalable data pipelines and datasets using ELT/ETL best practices Deliver high-quality insights through reporting and analytics Partner with stakeholders … to define and implement BI solutions Ensure data quality, system reliability, and performance Support production environments and month-end processing Communicate complex insights to technical and non-technical audiences Tech Stack: Snowflake Python Fivetran Azure/AWS Salesforce CRM (preferred) Ideal Candidate: 5+ years in BI, Data Analytics, or Data Engineering Experience supporting Ops and Customer Service …
What you'll be doing • Managing data into and out of our data warehouse and reporting platforms to ensure accuracy and efficiency. • Definition and quality of KPIs that support effective contact centre operations. • Delivering data development roadmaps aligned to either our Consumer or Enterprise business strategies. • Empowering end users with access to reliable, well-governed data sets in a secure environment. • Troubleshooting complex data issues across multiple source systems and ensuring smooth data operations. • Leading strategic, cross-functional projects to deliver impactful, data-driven solutions, while documenting processes for team-wide support. What skills you'll need • Certified Google Cloud (GCP) Data Engineer with strong expertise in data acquisition … modelling, ETL, and data warehousing. • Proficient in Terraform, YAML, and GitLab for environment management. • Skilled in MS SQL, GCP SQL, and Oracle DB design, with a focus on data quality and governance. • Strong understanding of contact centre metrics and systems (e.g. WFM, IVR, Call Routing). • Proven ability to lead technical teams and deliver end-to-end technology …
Data Engineer 4th Floor, Printworks, 35-39 Queen Street, Belfast, BT1 6EA, United Kingdom Full-time Company Description We believe in the power of ingenuity to build a positive human future. As strategies, technologies, and innovation collide, we create opportunity from complexity. Our teams of interdisciplinary experts combine innovative thinking and breakthrough technologies to progress … and own the outcome. We combine strategic thinking, customer-centric service design, and agile engineering practices to accelerate innovation in a tech-driven world. Why consider joining our Digital & Data community? Join our Digital & Data team working alongside product, design and a wide range of other experts and cross-disciplinary teams to bring ideas to life through innovative …
Dunbar Brown Group are delighted to be working with a market-leading business in London to find a Senior Data Engineer with a real specialism in Databricks. This role will involve working with some amazing customers to deliver data solutions that provide real value. They operate across multiple industries, from financial services to pharmaceuticals. The right person … will have at least 5 years' experience as a Data Engineer within a cloud environment, expertise in Databricks, and advanced Python and SQL skills. In addition, experience with CI/CD and automated testing. This will be a hybrid role with 2-3 days per week in their central London office. *Must have full right to work in …
Data Engineer - Sought by Boutique Hedge Fund - Permanent - London Salary: £95-125k + Bonus About the Role: Build and operate data pipelines and analytics platforms in the cloud. Work with engineers and analysts to design, implement, and maintain reliable, observable ETL/ELT workflows using Airflow and managed cloud services. Focus on Python-first implementations, high-quality … SQL, Airflow orchestration, and query engines such as Athena, Trino, or ClickHouse. Required Skills: Hands-on software development and data engineering experience. Must have a strong financial or trading background. Strong Python & SQL skills: writing clean, testable code following SOLID principles. Hands-on experience using Athena, Trino, ClickHouse, or other distributed SQL engines and knowledge of cost/scan optimization. … Experience with cloud platforms (AWS/Azure/GCP): working with object storage, managed query services, and data catalogs.
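On the cost/scan optimization point: engines like Athena and Trino charge or budget by data scanned, and they prune partitions when the WHERE clause constrains the partition column directly. A minimal sketch of building such a filter in Python (the `dt` partition column and `trades` table are hypothetical examples, not from the listing):

```python
from datetime import date, timedelta

def partition_filter(start: date, end: date) -> str:
    """Build a WHERE clause over a dt partition column so the engine
    prunes partitions instead of scanning the whole table."""
    days = []
    d = start
    while d <= end:
        days.append(f"'{d.isoformat()}'")
        d += timedelta(days=1)
    return f"dt IN ({', '.join(days)})"

def build_query(table: str, start: date, end: date) -> str:
    """Assemble an aggregation query restricted to the requested date range."""
    return (
        f"SELECT symbol, SUM(qty) AS qty FROM {table} "
        f"WHERE {partition_filter(start, end)} GROUP BY symbol"
    )
```

The same idea applies whether the filter is generated in an Airflow task or written by hand: queries that express the date range against the partition column scan only the matching partitions.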