London, South East, England, United Kingdom Hybrid / WFH Options
Carrington Recruitment Solutions Ltd
DATA-HEAVY Product Owners who have managed complex, global products. Read on for more details... Experience required: Technical proficiency: Familiarity with Azure services (e.g., Data Lake, Synapse, Fabric) and Databricks for data engineering, analytics, performance optimisation, and governance. Experience with implementing and optimising scalable cloud infrastructure is highly valued. Backlog management: Demonstrated expertise in maintaining and prioritising product backlogs, writing …
Newbury, Berkshire, South East, United Kingdom Hybrid / WFH Options
Fdo Consulting Limited
into a management role (mentoring, coaching, team development, etc.) Very strong technical skills that will include - SQL, SSIS, SSRS, SAS, Power BI, Power Platform, Azure Data Factory, Azure Data Lake, Databricks A good understanding of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle Ability to design hybrid data solutions across on-prem and cloud data sources Expert with data engineering …
across data modelling, integration, governance, and transformation. Experience with AWS (S3, Glue, Redshift, Lambda, Kinesis) and/or Azure (ADF, Synapse, Fabric). Familiarity with modern data platforms (e.g. Databricks, Snowflake, or Lakehouse environments). Ability to operate confidently in highly secure, mission-focused settings. Why Join? Work on meaningful security projects with real-world impact. Join a culture built …
London (City of London), South East England, United Kingdom
Opensourced® Agency
across data modelling, integration, governance, and transformation. Experience with AWS (S3, Glue, Redshift, Lambda, Kinesis) and/or Azure (ADF, Synapse, Fabric). Familiarity with modern data platforms (e.g. Databricks, Snowflake, or Lakehouse environments). Ability to operate confidently in highly secure, mission-focused settings. Why Join? Work on meaningful security projects with real-world impact. Join a culture built …
Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer to join our team and contribute to the development and maintenance of our cutting-edge Azure Databricks platform for economic data. This platform is critical for our Monetary Analysis, Forecasting, and Modelling activities. The Senior Data Engineer will be responsible for building and optimising data pipelines, implementing … Development & Optimisation: Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources (e.g., APIs, databases, financial data providers) into the Azure Databricks platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark … or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and …
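The advert above mentions implementing data quality checks and validation rules within data pipelines. As a minimal, purely illustrative sketch (the record fields and rule names here are invented, not taken from the advert, and a real Databricks pipeline would express these checks over Spark DataFrames), such a quality gate might look like:

```python
# Illustrative sketch only: row-level data-quality checks applied to a batch
# of ingested records before they are loaded downstream. Field names
# (series_id, value, obs_date) are hypothetical examples.
from datetime import date


def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one ingested record."""
    errors = []
    if not record.get("series_id"):
        errors.append("missing series_id")
    if not isinstance(record.get("value"), (int, float)):
        errors.append("value is not numeric")
    obs_date = record.get("obs_date")
    if not isinstance(obs_date, date) or obs_date > date.today():
        errors.append("obs_date missing or in the future")
    return errors


def run_quality_gate(records: list[dict]):
    """Split a batch into clean rows and rejected rows with reasons."""
    clean, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))  # quarantine with reasons
        else:
            clean.append(rec)
    return clean, rejected
```

Rejected rows are kept alongside their violation reasons so they can be quarantined and reviewed rather than silently dropped, which is the usual pattern for validation steps inside a pipeline.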
London (City of London), South East England, United Kingdom
Mastek
Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer to join our team and contribute to the development and maintenance of our cutting-edge Azure Databricks platform for economic data. This platform is critical for our Monetary Analysis, Forecasting, and Modelling activities. The Senior Data Engineer will be responsible for building and optimising data pipelines, implementing … Development & Optimisation: Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources (e.g., APIs, databases, financial data providers) into the Azure Databricks platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark … or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and …
Salford, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
communicating results in a concise manner, both verbally and in writing Desirable: Postgraduate qualification in a relevant field (e.g. Computer Science, Data Science, Operational Research) Experience with modern data platforms (e.g. Databricks, Snowflake, MS Fabric). Familiarity with MLOps practices and version control tools (e.g. Git). Experience deploying and maintaining ML models in production environments. Exposure to A/B …
Salford, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
communicating results in a concise manner, both verbally and in writing Desirable: Postgraduate qualification in a relevant field (e.g. Computer Science, Data Science, Operational Research) Experience with modern data platforms (e.g. Databricks, Snowflake, MS Fabric). Familiarity with MLOps practices and version control tools (e.g. Git). Experience deploying and maintaining ML models in production environments. Exposure to A/B …
Domo, Power BI, Looker, or equivalent Strong understanding of UX/UI principles and visual storytelling through data Working knowledge of creating data connections - e.g. SQL and relational databases, Databricks, JSON files, API services, Data Lakes Meticulous attention to detail with a user-first mindset Excellent communication and collaboration skills across technical and non-technical teams Comfortable working in agile …
Proven experience as a Product Owner, Delivery Lead, Data Analyst or similar role in a data-focused environment. Strong understanding of data manipulation and MI/reporting tools (e.g. Databricks, Tableau, Power BI, SQL). Familiarity with cloud services platforms, preferably AWS and Microsoft Azure. Able to clearly communicate your findings to a business audience and challenge colleagues on taking data …
experience in the energy industry, with a specific focus on forecasting the short-term power markets GitHub or Azure DevOps knowledge is desired SQL knowledge is desired Experience in Databricks is desired, but not necessary You will need to be comfortable working as part of a team, have an enthusiasm for providing an excellent user experience and be keen to learn …
CI/CD pipelines will be key to driving our advanced analytics initiatives forward. You Will Design and manage end-to-end Azure solutions (including App Services, Compute, Networking, Databricks, Data Factory and Function Apps) using Terraform, ensuring they are optimised, secure, and follow best practices. Build and maintain robust CI/CD pipelines using Azure DevOps and GitHub Actions …
will give you the knowledge to understand how critical data is in our decision making to drive our business forwards. Responsibilities: Use tools such as SharePoint, Power BI and Databricks (SQL, Python) to perform basic data analysis. Create visualisations and dashboards to communicate insights effectively. Support the development of regular and ad-hoc reports for the business. Work collaboratively with internal …
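The responsibilities above mention using SQL and Python for basic data analysis. A minimal sketch of what that looks like in practice, using only Python's standard-library sqlite3 module (the table and column names here are invented for illustration, not taken from the advert):

```python
# Illustrative only: a basic SQL aggregation run from Python using the
# stdlib sqlite3 module. The "sales" table is a hypothetical example.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 250.0)],
)
# Total amount per region, largest first
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
# rows -> [('South', 250.0), ('North', 200.0)]
```

The same GROUP BY pattern carries over directly to Databricks SQL; only the connection mechanism changes.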
as modern data platforms, data product engineering, data marketplace architecture, data developer portals, platform engineering. Experience co-selling partner solutions with hyperscalers or platforms (e.g. AWS, Azure, GCP, Snowflake, Databricks). Outstanding communication skills - able to translate complex ideas for both technical and business audiences. Demonstrated thought leadership in AI/ML, such as speaking at industry events, contributing to …
Manchester, North West England, United Kingdom Hybrid / WFH Options
Autotrader
that supports industry-leading data science. These are some technologies that our data scientists use (we don't expect you to have experience with all of these): Python and Databricks; Spark, MLflow, and Airflow for ML workflows; Google Cloud Platform for our analytics infrastructure; dbt and BigQuery for data modelling and warehousing. Some examples of our data science work can …
Senior Data Engineer (Microsoft Fabric/Azure) Digital team Nationwide (London preferred) About us Hoare Lea is a human-centric and planet-conscious engineering consultancy. We offer intelligent and sustainable solutions to complex design challenges for the built environment throughout …