London, England, United Kingdom Hybrid / WFH Options
Tecton
experience working in a customer-facing role (IT Support, HelpDesk, Retail, and similar). Experience with AWS, GCP, and Kubernetes. Experience with distributed file systems. Experience with data platforms such as Spark, Databricks, EMR, Snowflake, or BigQuery. Knowledge of JIRA, GitHub, or GitLab. Familiarity with Machine Learning and Data Science tooling, such as Jupyter Notebooks, TensorFlow, scikit-learn, and PyTorch. Nice To Have More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
simultaneously. Strong written and verbal communication skills, with attention to detail and clarity. Preferred Qualifications: Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity More ❯
Haywards Heath, Sussex, United Kingdom Hybrid / WFH Options
First Central Services
practices and tools (Azure DevOps preferred). Experience with microservices architecture, RESTful API development, and system integration. Prior experience in financial services or insurance sectors advantageous. Familiarity with AzureML, Databricks, related Azure technologies, Docker, Kubernetes, and containerization is advantageous. Advanced proficiency in Python, and familiarity with AI frameworks such as LangChain. Skilled in designing and operationalising AI Ops frameworks within More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
practices and tools (Azure DevOps preferred). Experience with microservices architecture, RESTful API development, and system integration. Prior experience in financial services or insurance sectors advantageous. Familiarity with AzureML, Databricks, related Azure technologies, Docker, Kubernetes, and containerization is advantageous. Advanced proficiency in Python, and familiarity with AI frameworks such as LangChain. Skilled in designing and operationalising AI Ops frameworks within More ❯
external vendors to enhance data management capabilities. Provide expert-level troubleshooting, root cause analysis, and performance optimisation for data platforms, such as Azure SQL databases, Fabric Warehouse and OneLake, Databricks, and Azure Data Factory. Document technical solutions, best practices, and knowledge base articles to facilitate effective knowledge transfer and continuous improvement. Diagnose and resolve data-related incidents and problems escalated More ❯
Newport, Gwent, Wales, United Kingdom Hybrid / WFH Options
Intellectual Property Office
The IPO is a modern organisation which depends on its IT and Data services to operate and innovate effectively. In order to provide up-to-date services to our customers both nationally and internationally, our systems need to be developed More ❯
Lead/Principal Python Engineer for Generative AI Backend Development at Trimble Inc. Overview We are seeking a Generative AI Lead/Principal Python More ❯
This range is provided by Xcede. Your actual pay will be More ❯
Newport, Wales, United Kingdom Hybrid / WFH Options
Manchester Digital
£46,262 - £56,996 (please read allowances on the Civil Service Jobs link) Full-time (Permanent) Published on 26 June 2025 Deadline 6 July 2025 More ❯
Newport, Wales, United Kingdom Hybrid / WFH Options
Yolk Recruitment Ltd
Senior Data Engineer Newport (x4 per month) £46,262 - £56,996 **Must be eligible for SC Clearance** The Opportunity Yolk Recruitment are excited to be working with an innovation-driven civil service organisation as they journey through an incredible digital More ❯
Data Engineer | Bristol (Hybrid) | £40,000 - £50,000 P.A | Permanent Peaple Talent have partnered with an existing client in Bristol looking to recruit a Data Engineer. The successful candidate will collaborate to build, enhance, and support both data-driven solutions More ❯
London, England, United Kingdom Hybrid / WFH Options
Capgemini
Capgemini Invent At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client More ❯
Job Description We have an exciting and rewarding opportunity within the Client Onboarding & Know Your Customer technology More ❯
Atherstone, Warwickshire, England, United Kingdom Hybrid / WFH Options
Aldi
An exciting opportunity has arisen for an experienced Data Solution Architect to become part of the National Data and Analytics (NDA) Platform & Engineering Team. The NDA Platform & Engineering Team is working on providing data to drive the business forward. You'll More ❯
Location: London Other locations: Primary Location Only At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse More ❯
Job Description - Data Warehouse Developer About Hiscox: At Hiscox, we care about our people. We hire the best people for the work, and we're committed to diversity and creating a truly inclusive culture, which we believe drives success. We More ❯
Data Pipeline Development: Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services. Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks Workflows … and unstructured data from various sources (APIs, databases, file systems). Implement data transformation logic using Spark, ensuring data is cleaned, transformed, and enriched according to business requirements. Leverage Databricks features such as Delta Lake to manage and track changes to data, enabling better versioning and performance for incremental data loads. Data Publishing & Integration: Publish clean, transformed data to Azure … access control, and metadata management across data assets. Ensure data security best practices, such as encryption at rest and in transit, and role-based access control (RBAC) within Azure Databricks and Azure services. Performance Tuning & Optimization: Optimize Spark jobs for performance by tuning configurations, partitioning data, and caching intermediate results to minimize processing time and resource consumption. Continuously monitor and More ❯
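For context on the kind of work described above, here is a minimal PySpark sketch of an ingestion-to-Delta step. It assumes a Databricks runtime (where Delta Lake is the default table format), and the path, table, and column names are hypothetical placeholders rather than anything taken from the listing.

```python
from pyspark.sql import SparkSession, functions as F

# On a Databricks cluster a `spark` session already exists; getOrCreate() keeps
# the sketch self-contained if run elsewhere.
spark = SparkSession.builder.getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"  # hypothetical source location
target_table = "analytics.orders_silver"                           # hypothetical Delta table

# Ingest semi-structured JSON landed in the data lake
orders = spark.read.json(raw_path)

# Basic cleansing and enrichment: drop rows without a key, standardise types,
# and stamp each record with load metadata
clean = (
    orders
    .dropna(subset=["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Publish to a Delta table, partitioned by date so incremental reads can prune files
(
    clean.write
    .format("delta")
    .mode("append")
    .partitionBy("order_date")
    .saveAsTable(target_table)
)
```

In practice a step like this would typically sit inside a Databricks Workflows task, with scheduling, retries, and dependencies defined in the job configuration rather than in the transformation code itself.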
Solution Architect - Databricks Admin with GCP/AWS Responsibilities will include designing, implementing, and maintaining the Databricks platform, and providing operational support. Operational support responsibilities include platform set-up and configuration, workspace administration, resource monitoring, providing technical support to data engineering, Data Science/ML, and Application/integration … the root causes of issues, and resolving issues. The position will also involve the management of security and changes. The position will work closely with the Team Lead, other Databricks Administrators, System Administrators, and Data Engineers/Scientists/Architects/Modelers/Analysts. Required Skills: 3+ years of production support of the Databricks platform Preferred: 2+ years of experience More ❯
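As a rough illustration of the resource-monitoring side of this kind of administration work, the sketch below uses the databricks-sdk Python client to list clusters and their lifecycle state. It assumes workspace credentials are available via environment variables or a config profile; it is a generic starting point, not this role's specific tooling.

```python
from databricks.sdk import WorkspaceClient

# Authenticates from DATABRICKS_HOST / DATABRICKS_TOKEN (or a configured profile)
w = WorkspaceClient()

# Surface each cluster's name and lifecycle state so an administrator can spot
# clusters that are stuck, terminated unexpectedly, or left running
for cluster in w.clusters.list():
    print(cluster.cluster_name, cluster.state)
```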
join a skilled and collaborative analytics team of eight, working closely with senior leadership. The team has already laid the foundations for a modern data platform using Azure and Databricks and is now focused on building out scalable ETL processes, integrating AI tools, and delivering bespoke analytics solutions across the organisation. THE ROLE As a Data Engineer, you'll play … a pivotal role in designing and implementing robust data pipelines, supporting the migration from legacy Azure systems to Databricks, and working closely with stakeholders to deliver tailored data solutions. This role combines hands-on development with collaborative architecture design, and offers the opportunity to contribute to AI readiness within a fast-paced business. KEY RESPONSIBILITIES Develop and maintain ETL pipelines … loads Connect and integrate diverse data sources across cloud platforms Collaborate with analytics and design teams to create bespoke, scalable data solutions Support data migration efforts from Azure to Databricks Use Terraform to manage and deploy cloud infrastructure Build robust data workflows in Python (e.g., pandas, PySpark) Ensure the platform is scalable, efficient, and ready for future AI use cases More ❯
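As a loose illustration of the "robust data workflows in Python (e.g., pandas, PySpark)" mentioned above, here is a minimal pandas sketch of a single transformation step; the file names and columns are hypothetical placeholders, not anything from the listing.

```python
import pandas as pd


def transform_customers(source_csv: str, target_parquet: str) -> pd.DataFrame:
    """Read a raw extract, apply basic cleaning, and persist a curated copy."""
    df = pd.read_csv(source_csv)

    # Standardise column names and drop exact duplicate rows
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()

    # Parse dates defensively so downstream joins and filters behave predictably
    if "signup_date" in df.columns:
        df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")

    df.to_parquet(target_parquet, index=False)
    return df


if __name__ == "__main__":
    transform_customers("raw_customers.csv", "curated_customers.parquet")
```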
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
Data Engineering Consultant | London (Hybrid) | Permanent £70,000 - £85,000 + Bonus Are you a Databricks expert ready to design the future of data platforms? Join a next-gen data & AI consultancy delivering real-world impact through smart, scalable solutions for some of the UK’s top organisations. 🔍 The Role We’re looking for a Data Engineering Consultant who can … blend deep technical expertise with strong client-facing skills. You’ll lead the design and delivery of modern data architectures using Databricks on AWS or Azure, developing high-performance pipelines and helping businesses unlock the power of their data. What You’ll Do Design and build modern data solutions using Databricks on the cloud ☁️ Create scalable pipelines in Python and … solution design Stay at the forefront of data engineering and AI trends What We’re Looking For 6+ years of hands-on experience in data engineering Proven expertise in Databricks (deployment, pipeline development, performance optimisation) Strong coding skills in Python and SQL Solid experience with AWS or Azure Familiarity with DevOps practices (CI/CD, Infrastructure as Code) Confident in More ❯
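To give a flavour of the "scalable pipelines in Python and SQL" this kind of role involves, here is a minimal incremental-load sketch using a Delta Lake MERGE on a Databricks runtime; the table names are hypothetical and the pattern is a generic upsert, not this consultancy's specific approach.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Expose the latest increment as a temporary view (source table name is illustrative)
spark.table("landing.orders_increment").createOrReplaceTempView("orders_increment")

# A Delta Lake MERGE keeps the curated table in sync without full reloads:
# matching keys are updated in place, new keys are inserted
spark.sql("""
    MERGE INTO curated.orders AS t
    USING orders_increment AS s
      ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```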
role in designing and evolving the BI architecture across a cloud-first data ecosystem, ensuring solutions are fit-for-purpose, secure, and future-ready. You’ll work across Azure, Databricks, and Power BI to support reporting, analytics, and data engineering capabilities. What You’ll Need Proven experience designing BI or analytics platforms in a cloud environment (Azure preferred) Hands-on … experience with Databricks, Power BI, and broader Azure services (storage, security, networking, monitoring) Strong understanding of data platform architecture, including cost optimisation and governance A background in agile, DevOps, and engineering-led environments Excellent stakeholder engagement and documentation skills Ideally certified in Microsoft or Databricks technologies Qualifications Degree in Computer Science, Data Science, Information Systems, or a related field Relevant … cloud/data certifications (e.g., Microsoft Azure, Databricks) If you would like to find out more, please reach out to charntel.dignum@cvmpeople.com More ❯