AI/ML model training infrastructure (e.g., GPU orchestration, model serving) for both diffusion and LLM pipelines. Familiarity with data lake architectures and tools like Delta Lake, LakeFS, or Databricks. Knowledge of security and compliance best practices (e.g., SOC 2, ISO 27001). Exposure to MLOps platforms or frameworks (e.g., MLflow, Kubeflow, Vertex AI). What We Offer Competitive salary More ❯
Kirkby on Bain, England, United Kingdom Hybrid / WFH Options
ANGLIAN WATER
delivery pipelines if the solution is to adopt modern DevOps processes. What does it take to be an Enterprise Data Engineer? Previous strong experience in data engineering, ideally using Databricks, Azure Data Factory, Spark, Python, SQL, Power BI. Strong data engineering experience of at least 3-5 years. Dimensional data modelling. Experience in delivering end-to-end BI solutions from requirements, design to More ❯
particularly in Data & Analytics function. Expert proficiency in Python, R, SQL, and distributed computing frameworks (e.g., Spark, Hadoop). Advanced knowledge of data engineering tools (e.g., Airflow, Kafka, Snowflake, Databricks). Proficiency in machine learning frameworks (TensorFlow, PyTorch, Scikit-learn). Ability to implement robust data governance and AI model explainability frameworks. Commitment to ethical AI practices and responsible data More ❯
Security and Compliance teams to ensure adherence to data privacy regulations (e.g., GDPR) and internal governance standards. Lead evaluation and integration of data tools, platforms, and technologies (e.g., Snowflake, Databricks, Azure Synapse, Kafka, dbt, Power BI). Oversee data integration strategy across the enterprise, including ETL/ELT pipelines, APIs, and event-driven data streaming. Contribute to the development of a More ❯
engineering, with at least 3 years in a management role. Proven track record in architecting, building, monitoring, and optimizing large-scale data systems and analytics infrastructure. Experience working in a Databricks and Azure environment. Experience working with pipeline scheduling tools such as Airflow and Astronomer. Experience working with CI/CD tools such as TeamCity, Terraform, GitHub, Octopus. Knowledge of software coding practices More ❯
Data Engineering Manager - Commercial & Supply. Location: Asda House. Time type: Full time. Posted 9 days ago. End date to apply: June 28, 2025 (5 days left to apply).
London, England, United Kingdom Hybrid / WFH Options
Endeavour Recruitment Solutions
Technologies: Data Engineer, ETL, SQL, Power BI, Azure Data Warehouse. We have an exciting hybrid working opportunity for a Data Engineer to join our client's growing Data team, playing a key role in surfacing data within their fast-growing Finance More ❯
Join to apply for the Ambitious Senior Azure Data Engineer role at Laminar Projects. Want to join a company that works on real-world problems that More ❯
We are seeking a skilled, motivated Data Engineer to join our dynamic and innovative team. As a Data Engineer in the Data & Insight Team you will work as part of a team of data engineers to design, develop, test and More ❯
Senior Databricks Data Engineer | Energy Trading - Contract My client is an Energy Trading Client based in the UK who are looking to appoint a Senior Databricks Data Engineer to help modernise their data infrastructure and deliver end-to-end solutions that drive business value. This is a high-impact role working across data strategy, architecture, and engineering to deliver cloud … AWS, Azure or SAP ETL/ELT Development Data Modeling Data Integration & Ingestion Data Manipulation & Processing Version Control & DevOps: Skilled in GitHub, GitHub Actions, Azure DevOps Azure Data Factory, Databricks, SQL DB, Synapse, Stream Analytics Glue, Airflow, Kinesis, Redshift SonarQube, PyTest If you're ready to take on a new challenge and shape data engineering in a trading-first environment More ❯
Join to apply for the Solutions Architect - Databricks role at Lumenalta . At Lumenalta, we partner with enterprise clients to build cutting-edge digital products that interact with hundreds of millions of customers, transactions, and data points. As a Databricks Solutions Architect, you will be pivotal in designing and implementing scalable, high-performance solutions to tackle the most demanding data … challenges in various industries. You will have the opportunity to drive end-to-end architecture and help our clients unlock the full potential of their data using Databricks and other modern cloud technologies. We are looking for someone who can thrive in a fast-paced, collaborative environment, with experience in industries such as FinTech, Transportation, Insurance, Media, or other complex … multifactor environments. Key Responsibilities: Solutions Design: Architect end-to-end scalable data solutions using Databricks, Azure, AWS, and other cloud-based services to meet client requirements. Data Strategy & Architecture: Develop robust data architectures, implement ETL pipelines, and establish data governance frameworks to ensure data accuracy and consistency. Technical Leadership: Lead the design and implementation of data platforms, define best practices More ❯
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
Job Description My client, based in the London area, is currently looking to recruit an experienced AWS Databricks Data Architect to join their team. They are a leader within the consulting space and are experiencing growth, backed by a large multi-equity firm aiming to strengthen their financial position. The company anticipates year-on-year growth, enabling them to adopt … cutting-edge technologies available, including recent implementations of Gen AI. Your role will include: Designing and implementing effective architectural solutions around AWS serverless technologies (S3, Lambda, Athena, Kafka) and Databricks, including Data Lake and Data Warehousing. Assessing database implementation procedures to ensure GDPR and data compliance. Guiding and influencing technology teams and stakeholders on solution options, benefits, and drawbacks. Setting … technologies (AWS and Azure). The ideal candidate will have: Extensive coding experience, data pipeline development, and digital transformation expertise. Proven experience implementing solutions in AWS cloud environments (S3, Databricks, Athena, Glue). Deep understanding of workflows, asset bundles, SQS, EKS, Terraform. Strong knowledge of data modeling and Kinesis. Understanding of SQL and database management. Hands-on experience with Data More ❯
reference data) Understanding of regulatory reporting processes. Proven ability to work directly with demanding front office stakeholders. Experience with real-time data feeds and low-latency requirements. Preferred Skills: Databricks experience. Capital markets knowledge (equities, fixed income, derivatives). Experience with financial data vendors (Bloomberg, Reuters, Markit). Cloud platforms (Azure preferred) and orchestration tools. Understanding of risk metrics and P&L More ❯
Senior Lead Software Engineer - Python and Databricks Posted 2 weeks ago. Job Description Be an integral part of an agile team that's constantly pushing the envelope to enhance, build, and deliver top-notch technology products. As a Senior Lead Software Engineer at JPMorgan … Big Data technologies (Spark, Impala, Hive, Redshift, Kafka, etc.) Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues. Experience with Databricks, Snowflake, and Iceberg is required. Preferred Qualifications, Capabilities, And Skills Experience in application and data design disciplines with an emphasis on real-time processing and delivery, e.g. Kafka is preferable. Understanding More ❯
to design and optimize cloud-based data solutions in secure Azure environments. Key Responsibilities: Develop and manage Azure data pipelines and architectures. Build ETL processes using Azure Data Factory, Databricks, and Synapse Analytics. Manage scalable, secure cloud data platforms. Implement storage solutions like Azure Data Lake and SQL Database. Optimize data pipelines and monitor performance. Collaborate with data scientists, analysts … and stakeholders. Ensure compliance with data governance and security standards. Requirements: Active DV Clearance (Mandatory). Proficient with Azure Data Factory, Databricks, Synapse, SQL, and Data Lake. Strong SQL, Python, or Scala skills for data transformation. Experience with CI/CD pipelines using Azure DevOps. Familiar with data governance and Power BI for visualization. Contract role with competitive day rates. More ❯
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
VIQU Limited
team based in Leeds, working mostly remotely with just one day on-site per week. You'll lead the design and delivery of scalable, cloud-based data solutions using Databricks, Python, and SQL, while mentoring a team and driving engineering best practices. About You You might currently be a Senior Data Engineer ready to grow your leadership skills. You're … passionate about building robust, efficient data pipelines and shaping cloud data architecture in an agile environment. Key Responsibilities Lead development of data pipelines and solutions using Databricks, Python, and SQL Design and maintain data models supporting analytics and business intelligence Build and optimise ELT/ETL processes on AWS or Azure Collaborate closely with analysts, architects, and stakeholders to deliver … as code Mentor and support your team, taking ownership of technical delivery and decisions Drive continuous improvements in platform performance, cost, and reliability Key Requirements Hands-on experience with Databricks or similar data engineering platforms Strong Python and SQL skills in data engineering contexts Expertise in data modelling and building analytics-ready datasets Experience with AWS or Azure cloud data More ❯
implementation of customers' modern data platforms. The ideal candidate will have extensive experience migrating traditional data warehouse technologies, including Teradata, Oracle, BW, and Hadoop, to modern cloud data platforms like Databricks, Snowflake, Redshift, BigQuery, or Microsoft Fabric. You will be responsible for leading data platform migrations and the design and development of scalable data solutions that support our organization's strategic … and big data platforms. Establish best practices and standards for data modeling, integration, and management. Platform Design and Implementation: Architect, design, and implement data warehouse solutions using platforms like Databricks, Redshift, BigQuery, Synapse, and Snowflake. Develop scalable big data solutions using cloud data technologies and services. Ensure the data architecture supports data quality, security, and governance requirements. Technology Leadership: Evaluate … data architecture, data warehousing, and big data solutions. 5+ years of experience working in the cloud (AWS, GCP or Azure). 5+ years of experience working in modern cloud data platforms (Databricks, Redshift, BigQuery, Synapse, SAP Datasphere, and Snowflake). 5+ years of experience designing cloud infrastructure on AWS, GCP or Azure. Extensive experience with data warehouse platforms such as Teradata, Oracle, SAP More ❯
varied end points to move data at speed and at scale. The right candidate will have a wealth of knowledge in the data world with a strong focus on Databricks, and will be keen to expand upon their existing knowledge, learning new technologies along the way as well as supporting both future and legacy technologies and processes. You will be … data problems and challenges every day. Key Responsibilities: Design, Build, and Optimise Real-Time Data Pipelines: Develop and maintain robust and scalable stream and micro-batch data pipelines using Databricks, Spark (PySpark/SQL), and Delta Live Tables. Implement Change Data Capture (CDC): Implement efficient CDC mechanisms to capture and process data changes from various source systems in near real … and schema evolution, to ensure data quality and reliability. Champion Data Governance with Unity Catalog: Implement and manage data governance policies, data lineage, and fine-grained access control using Databricks Unity Catalog. Enable Secure Data Sharing with Delta Sharing: Design and implement secure and governed data sharing solutions to distribute data to both internal and external consumers without data replication. More ❯
the support team to maintain, enhance, and ensure the reliability of our BI systems hosted on Microsoft Azure. Optimize and manage Azure data services, including Azure Data Factory, Azure Databricks, and Azure SQL Database, as well as Power BI for data analysis and visualization. Monitor and troubleshoot data pipelines to ensure seamless and efficient operations. Stay updated with advancements in … fast-paced, dynamic environment. Being open-minded, motivated, and self-organized. Nice to have: Hands-on experience with a cloud platform; Microsoft Azure is preferable, particularly Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake Storage. Familiarity with programming languages such as Python, Scala, Java, C#. Bachelor's or Master's degree in computer science More ❯