transformation, and we’re looking for a talented Data Engineer to join our growing data team. In this role, you’ll be instrumental in building and deploying modern Azure Databricks-based data solutions, enabling the business to make faster, data-driven decisions. You’ll work hands-on with Azure Databricks, Azure Data Factory, Delta Lake, and Power BI to design … data at Zodiac Maritime while working with cutting-edge cloud technologies. Key responsibilities and primary deliverables: Design, develop, and optimize end-to-end data pipelines (batch & streaming) using Azure Databricks, Spark, and Delta Lake. Implement Medallion Architecture to structure raw, enriched, and curated data layers efficiently. Build scalable ETL/ELT processes with Azure Data Factory and PySpark. Work with … streaming with Kafka/Event Hubs, Knowledge Graphs). Advocate for best practices in data engineering across the organization. Skills profile Relevant experience & education: Hands-on experience with Azure Databricks, Delta Lake, Data Factory, and Synapse. Strong understanding of Lakehouse architecture and medallion design patterns. Proficient in Python, PySpark, and SQL (advanced query optimization). Experience building scalable ETL pipelines …
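The Medallion Architecture mentioned above (raw/bronze, enriched/silver, curated/gold layers) can be sketched without any framework. This is a simplified illustration, not the employer's implementation: in a real Azure Databricks deployment each layer would be a Delta table, and the field names (`vessel_id`, `fuel_tonnes`) are invented for the example.

```python
# Framework-free sketch of medallion layering: bronze (raw) records are
# cleaned into silver, then aggregated into gold. Field names are invented.

def to_silver(bronze_rows):
    """Enrich/clean raw rows: drop records missing a key, normalise types."""
    silver = []
    for row in bronze_rows:
        if row.get("vessel_id") is None:
            continue  # in practice, quarantine incomplete records instead
        silver.append({**row, "fuel_tonnes": float(row["fuel_tonnes"])})
    return silver

def to_gold(silver_rows):
    """Curate a business-level aggregate: total fuel per vessel."""
    totals = {}
    for row in silver_rows:
        totals[row["vessel_id"]] = totals.get(row["vessel_id"], 0.0) + row["fuel_tonnes"]
    return totals

bronze = [
    {"vessel_id": "V1", "fuel_tonnes": "10.5"},
    {"vessel_id": None, "fuel_tonnes": "3.0"},   # malformed raw record
    {"vessel_id": "V1", "fuel_tonnes": "4.5"},
]
gold = to_gold(to_silver(bronze))  # {"V1": 15.0}
```

The key design point the pattern encodes is that each layer only ever reads from the layer below it, so raw data is preserved and downstream tables can be rebuilt.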
and collaborate with others passionate about solving business problems. Key responsibilities: Data Platform Design and Architecture Design, develop, and maintain a high-performing, secure, and scalable data platform, leveraging Databricks Corporate Lakehouse and Medallion Architectures. Utilise our metadata-driven data platform framework combined with advanced cluster management techniques to create and optimise scalable, robust, and efficient data solutions. Implement comprehensive … multiple organisational SQL databases and SaaS applications using end-to-end dependency-based data pipelines, to establish an enterprise source of truth. Create ETL and ELT processes using Azure Databricks, ensuring audit-ready financial data pipelines and secure data exchange with Databricks Delta Sharing and SQL Warehouse endpoints. Governance and Compliance Ensure compliance with information security standards in our highly … regulated financial landscape by implementing Databricks Unity Catalog for governance, data quality monitoring, and ADLS Gen2 encryption for audit compliance. Development and Process Improvement Evaluate requirements, create technical design documentation, and work within Agile methodologies to deploy and optimise data workflows, adhering to data platform policies and standards. Collaboration and Knowledge Sharing Collaborate with stakeholders to develop data solutions, maintain …
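The "end-to-end dependency-based data pipelines" described above boil down to scheduling tasks in dependency order. A minimal sketch of that idea using Kahn's topological sort, with task names invented for the example (orchestrators such as Azure Data Factory or Databricks Workflows do this for you in practice):

```python
# Compute an execution order for pipeline tasks given their upstream
# dependencies (Kahn's algorithm). Task names are illustrative only.
from collections import deque

def execution_order(deps):
    """deps maps task -> set of upstream tasks it depends on."""
    indegree = {t: len(up) for t, up in deps.items()}
    downstream = {t: [] for t in deps}
    for task, upstreams in deps.items():
        for up in upstreams:
            downstream[up].append(task)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for dep in sorted(downstream[task]):
            indegree[dep] -= 1
            if indegree[dep] == 0:
                ready.append(dep)
    if len(order) != len(deps):
        raise ValueError("cycle in pipeline dependencies")
    return order

deps = {
    "ingest_sql": set(),          # pull from organisational SQL databases
    "ingest_saas": set(),         # pull from SaaS applications
    "conform": {"ingest_sql", "ingest_saas"},  # build the source of truth
    "publish": {"conform"},       # expose via SQL Warehouse / Delta Sharing
}
order = execution_order(deps)
```

Declaring dependencies rather than hard-coding run order is what lets a metadata-driven platform add new sources without rewriting the schedule.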
City Of London, England, United Kingdom Hybrid / WFH Options
CipherTek Recruitment
greenfield MLOps pipelines that handle very large datasets. You will be responsible for building out a greenfield standardised framework for Capital Markets. The core platform is built on Azure Databricks Lakehouse, consolidating data from various Front and Middle Office systems to support BI, MI, and advanced AI/ML analytics. As a lead, you will shape the MLOps framework and … data sources (orders, quotes, trades, risk, etc.). Essential Requirements: 2+ years of experience in MLOps and at least 3 years in AI/ML engineering. Knowledge of Azure Databricks and associated services. Proficiency with ML frameworks and libraries in Python. Proven experience deploying and maintaining LLM services and solutions. Expertise in Azure DevOps and GitHub Actions. Familiarity with Databricks … CLI and Databricks Job Bundle. Strong programming skills in Python and SQL; familiarity with Scala is a plus. Solid understanding of AI/ML algorithms, model training, evaluation (including hyperparameter tuning), deployment, monitoring, and governance. Experience in handling large datasets and performing data preparation and integration. Experience with Agile methodologies and SDLC practices. Strong problem-solving, analytical, and communication skills.
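The hyperparameter-tuning step of "model training, evaluation (including hyperparameter tuning)" referenced above has a simple core: evaluate each candidate setting on held-out data and keep the best. A deliberately tiny, framework-free sketch (the "model" is a threshold classifier; in practice this loop would wrap an ML framework model and be tracked with a tool such as MLflow on Databricks):

```python
# Grid search over candidate hyperparameters, selecting by validation accuracy.
# The threshold classifier and the data are invented for illustration.

def train_and_score(threshold, val_data):
    """Accuracy of a classifier that predicts 1 when x >= threshold."""
    correct = sum(1 for x, y in val_data if (x >= threshold) == bool(y))
    return correct / len(val_data)

def grid_search(candidates, val_data):
    """Return the best-scoring hyperparameter and its validation score."""
    best = max(candidates, key=lambda t: train_and_score(t, val_data))
    return best, train_and_score(best, val_data)

val = [(0.2, 0), (0.4, 0), (0.6, 1), (0.9, 1)]  # (feature, label) pairs
best_threshold, best_score = grid_search([0.1, 0.5, 0.8], val)
# best_threshold == 0.5, best_score == 1.0
```

Governance and monitoring then come from recording each (parameters, score) pair and the chosen model, which is exactly what experiment-tracking services automate.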
London, England, United Kingdom Hybrid / WFH Options
Artefact
engineering and a proven track record of leading data projects in a fast-paced environment. Key Responsibilities Design, build, and maintain scalable and robust data pipelines using SQL, Python, Databricks, Snowflake, Azure Data Factory, AWS Glue, Apache Airflow and PySpark. Lead the integration of complex data systems and ensure consistency and accuracy of data across multiple platforms. Implement continuous integration … engineering with a strong technical proficiency in SQL, Python, and big data technologies. Extensive experience with cloud services such as Azure Data Factory and AWS Glue. Demonstrated experience with Databricks and Snowflake. Solid understanding of CI/CD principles and DevOps practices. Proven leadership skills and experience managing data engineering teams. Strong project management skills and the ability to lead … Strong communication and interpersonal skills. Excellent understanding of data architecture including data mesh, data lake and data warehouse. Preferred Qualifications: Certifications in Azure, AWS, or similar technologies. Certifications in Databricks, Snowflake or similar technologies. Experience leading large-scale data engineering projects. Working Conditions This position may require occasional travel. Hybrid work arrangement: two days per week working from …
Python (PySpark). Ingest, transform, and curate data from multiple sources into Azure Data Lake and Delta Lake formats. Build and optimize datasets for performance and reliability in Azure Databricks. Collaborate with analysts and business stakeholders to translate data requirements into robust technical solutions. Implement and maintain ETL/ELT pipelines using Azure Data Factory or Synapse Pipelines. … Monitor and troubleshoot production jobs and processes. Preferred Skills & Experience: Strong proficiency in SQL for data transformation and performance tuning. Solid experience with Python, ideally using PySpark in Azure Databricks. Hands-on experience with Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale …
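Maintaining curated Delta Lake tables as described above typically relies on upsert (MERGE) semantics: incoming rows update matching records and insert new ones. A pure-Python model of that behaviour, with invented column names (on Databricks this would be `DeltaTable.merge(...)` or SQL `MERGE INTO` rather than dict manipulation):

```python
# Model of Delta MERGE semantics over plain dicts: rows whose key matches an
# existing row are updated; unmatched rows are inserted. Columns are invented.

def upsert(target, updates, key="id"):
    """Merge updates into target, returning the new table sorted by key."""
    merged = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in merged:
            merged[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            merged[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
result = upsert(target, updates)
```

The reason MERGE matters for curated layers is idempotency: re-running the same batch of updates leaves the table unchanged, which makes pipeline retries safe.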
schema design. Experience architecting and building data applications using Azure, specifically Data Warehouse and/or Data Lake. Technologies: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks, and Power BI. Experience creating low-level designs for data platform implementations. ETL pipeline development for data source integration and transformations, including documentation. Proficiency working with APIs and integrating them … into data pipelines. Strong programming skills in Python. Experience with data wrangling such as cleansing, quality enforcement, and curation (e.g., Azure Synapse notebooks, Databricks). Data modeling experience to describe data landscape, entities, and relationships. Experience migrating data from legacy systems to the cloud. Experience with Infrastructure as Code (IaC), particularly Terraform. Proficiency in developing Power BI dashboards. Strong focus …
and interest in at least four others: SQL, Python, Power BI/Analysis Services/DAX, Data Modelling/Data Warehouse Theory, Azure Fundamentals. Additional desirable skills include Azure Databricks, Synapse Analytics, Data Factory, DevOps, MSBI stack, PowerShell, Azure Functions, PowerApps, Data Science, and Azure AI services. Certifications such as Databricks Certified Associate/Professional and Microsoft Azure Certifications are …
more than 90 million passengers this year, we employ over 10,000 people. It's big-scale stuff and we're still growing. Job Purpose With a big investment into Databricks, and with a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and … solutions. Job Accountabilities · Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. · Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala. · Work with data scientists, machine learning engineers and DevOps engineers to design, develop and deploy machine learning models and algorithms aimed at addressing specific … workflow and knowledge of when and how to use dedicated hardware. · Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Hadoop, Beam). · Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. · Experience with data quality and/or data lineage frameworks like Great Expectations, dbt data quality, OpenLineage or Marquez …
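Data quality frameworks of the kind named above (Great Expectations, dbt tests) automate row-level expectation checks. A minimal sketch of the underlying idea; the function names mimic, but are not, the real Great Expectations API, and the `pax` column is invented:

```python
# Row-level data quality expectations, returning which rows failed.
# This mimics the shape of expectation-suite results; names are illustrative.

def expect_not_null(rows, column):
    """Fail for any row where the column is missing or None."""
    failures = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"success": not failures, "failed_rows": failures}

def expect_between(rows, column, low, high):
    """Fail for any non-null value outside [low, high]."""
    failures = [i for i, r in enumerate(rows)
                if r.get(column) is not None and not (low <= r[column] <= high)]
    return {"success": not failures, "failed_rows": failures}

rows = [{"pax": 180}, {"pax": None}, {"pax": -3}]
null_check = expect_not_null(rows, "pax")          # fails on row 1
range_check = expect_between(rows, "pax", 0, 500)  # fails on row 2
```

In a pipeline, a failed expectation typically either halts the run or routes failing rows to a quarantine table, and the results feed lineage/observability tooling.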
critical decisions. What You’ll Do Lead the development and deployment of advanced analytics, data science, and machine learning tools and solutions. Use technologies such as Python, R, Azure, Databricks, SQL, Power BI, and Tableau to deliver actionable insights from complex data. Guide and mentor junior data scientists and analysts, fostering a culture of growth and technical excellence. Collaborate with … business objectives. Experience using R and NLP or deep learning techniques (e.g. TF-IDF, word embeddings, CNNs, RNNs). Familiarity with Generative AI and prompt engineering. Experience with Azure Databricks, MLflow, Azure ML services, Docker, Kubernetes. Exposure to Agile development environments and software engineering best practices. Experience working in large or complex organisations or regulated industries. Strong working knowledge of …
emerging data technologies and industry trends, build strong relationships across teams, and implement data quality and observability checks. What you'll be doing: Develop solutions using Microsoft Azure and Databricks, ensuring simplicity, testability, deployability, and operational monitoring. Optimize data pipelines and frameworks for easier development. Support production data applications and infrastructure. Create proofs of concept, evaluate performance, data modeling, and … e-commerce. Knowledge of Big Data and Distributed Computing. Familiarity with streaming technologies like Spark Structured Streaming or Apache Flink. Additional programming skills in PowerShell or Bash. Understanding of Databricks Ecosystem components. Experience with Data Observability or Data Quality frameworks. Additional Information What's in it for you? Competitive salary, pension, private medical care. Performance bonus, flexible benefits, 25 days …
any required data migrations from on-premises or 3rd party hosted databases/repositories. Build and support data pipelines using ETL tools such as MS Azure Data Factory and Databricks. Design and manage a standard access method to both cloud and on-premises data sources for use in data visualisation and reporting (predominantly using Microsoft Power BI). Own and build … government services. Your knowledge and certifications: Any MS Azure data certifications. 2+ years working with Azure data engineering tools, e.g.: Azure Data Factory, Azure Synapse, Azure SQL, Azure Databricks, Microsoft Fabric, Azure Data Lake. Exposure to other data engineering and storage tools: Snowflake; AWS tools - Kinesis/Glue/Redshift; Google tools - BigQuery/Looker. Experience working with open …
Lutterworth, England, United Kingdom Hybrid / WFH Options
PharmaLex
database. Collaborate with Data Analysts and Scientists to optimise data quality, reliability, security, and automation. Skills & Responsibilities: Core responsibility will be using the NHS Secure Data Environment, which utilises Databricks, to design and extract regular datasets. Configure and troubleshoot Microsoft Azure; manage data ingestion using Logic Apps and Data Factory. Develop ETL scripts using MS SQL and Python; handle web scraping and APIs …
Sprints/Agile) and project management software (Jira). Excellent verbal and written communication skills. Technical Skills: Familiarity with SQL Server. Advanced SQL scripting (T-SQL, PL/SQL, Databricks SQL). Familiarity with ETL/ELT tools and experience navigating data pipelines. Experience using scripting languages (e.g. Python, PowerShell, etc.) to extract insights from file-based storage. Familiarity with … software. Knowledge of Orchestration Tools and processes (e.g. SSIS, Data Factory, Alteryx). Power BI Development including the data model, DAX, and visualizations. Relational and Dimensional (Kimball) data modelling. Desirable: Databricks (or an alternative Modern Data Platform such as Snowflake). Experience working in a regulated environment and knowledge of the risk and compliance requirements associated with this. Oracle Database. MongoDB. Cloud Data …
e.g. SSIS, Data Factory, Alteryx) • Power BI Development including the data model, DAX, and visualizations. • Relational and Dimensional (Kimball) data modelling. • Proficiency in SQL (T-SQL, PL/SQL, Databricks SQL). Desirable: • Databricks (or an alternative Modern Data Platform such as Snowflake). • Experience working in a regulated environment and knowledge of the risk and compliance requirements associated with this. • Oracle Database …
Wideopen, England, United Kingdom Hybrid / WFH Options
myGwork - LGBTQ+ Business Community
in this role - Solid experience in data engineering, with exposure to multiple platforms and tools - Expertise in platforms like MS Fabric, Azure data tooling, AWS data tooling, Snowflake or Databricks - Strong proficiency in SQL and scripting languages such as Python - Experience with cloud platforms like Azure, AWS, or GCP and their data services - Solid problem-solving and stakeholder management skills …