DOING Lead the architectural design, implementation, and ongoing optimization of our international data platform, aligned with enterprise-wide standards. Build and maintain scalable ETL/ELT data pipelines using Databricks, Spark, and related technologies. Optimize Databricks clusters, workflows, and jobs to ensure cost efficiency and high performance. Design and manage data lakes, data warehouses, and associated infrastructure to ensure data … comprehensive documentation of data systems, pipelines, and processes. Provide on-call support as part of a shared team rota to ensure platform availability. ABOUT YOU Proven expertise working with Databricks, including Unity Catalog. Strong programming skills in Scala, Python, Spark & SQL/MySQL. Solid experience with version control systems, particularly Git. Strong background in designing and optimizing complex data pipelines
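The pipeline responsibilities above typically centre on incremental loads into a lakehouse. Below is a minimal, library-free sketch of the upsert ("MERGE") step such ETL/ELT pipelines rely on; in an actual Databricks pipeline this would be expressed with Delta Lake's MERGE INTO, and the table and column names here are purely illustrative, not taken from the posting.

```python
# Library-free sketch of the incremental upsert ("MERGE") step at the heart of
# many ETL/ELT pipelines. A real Databricks job would express the same logic
# with Delta Lake's MERGE INTO; all names here are illustrative.

def merge_upsert(target, updates, key="id"):
    """Apply a batch of source rows to a target table keyed on `key`.

    Rows whose key already exists are updated in place; new keys are inserted.
    Returns a new list of rows (the updated target table).
    """
    merged = {row[key]: dict(row) for row in target}
    for row in updates:
        # setdefault inserts an empty row for unseen keys, then update()
        # overlays the incoming columns (matching MERGE's update/insert arms).
        merged.setdefault(row[key], {}).update(row)
    return list(merged.values())

customers = [{"id": 1, "name": "Ada", "tier": "gold"},
             {"id": 2, "name": "Grace", "tier": "silver"}]
batch = [{"id": 2, "tier": "gold"},                      # update an existing row
         {"id": 3, "name": "Edsger", "tier": "bronze"}]  # insert a new row

result = merge_upsert(customers, batch)
```

The same key-matching idea is what the Delta MERGE condition (`ON target.id = source.id`) encodes; here it is just a dict lookup.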
London, South East, England, United Kingdom Hybrid / WFH Options
E.ON
across the organisation by lifting data literacy and nurturing a self-serve culture Guide your team through a data transformation journey, creating a reliable single source of truth using Databricks Develop and scale meaningful data models, including AI/ML opportunities, that unlock tangible value for our customers and business Champion and manage impactful, reliable data products that support commercial … conversations Experience creating and delivering data roadmaps with strong prioritisation and impact focus A passion for sustainable, well-managed data products that make a difference Hands-on knowledge of Databricks and experience with data transformation programmes A genuine interest in leading and developing people, building happy and high-performing teams A belief in strong data foundations and pragmatic delivery over … lead, space to innovate, and a team who values support, balance and shared success. Technologies E.ON Next loves: Languages: Python, SQL & Spark (PySpark, SparkSQL) Tools: Git (GitHub/GitLab), Databricks, AWS services (inc. Athena/Glue, S3, SageMaker, Lambda, Transcribe, EC2) & Tableau Datastores: Postgres, Databricks, Parquet/Delta Here's what else you need to know: Role may close earlier
Glue, Lake Formation, and Athena Develop scalable and secure ETL/ELT pipelines using Python, PySpark, and SQL Drive decisions on data modeling, lakehouse architecture, and integration strategies with Databricks and Snowflake Collaborate cross-functionally to embed data governance, quality, and lineage into platform design Lead technical evaluations of new tools and approaches to evolve the platform's capabilities Serve … governance, quality frameworks, and security best practices A builder's mindset, comfortable leading architectural decisions and also delivering code in production Bonus Points For: Experience with modern platforms like Databricks, Snowflake, or other lakehouse solutions Familiarity with analytics, ML workflows, or financial market data Why Join Us? This is your chance to shape a foundational data platform from day one.
development, orchestration, and transformation Excellent experience with cloud platforms is required, preferably AWS, though we’re open to Azure or GCP exposure if other technical experience aligns Good knowledge of Databricks would be useful, as implementation of Databricks can start immediately as a main aspect of the data strategy Experience of working with IoT or sensor-based systems would be
lifecycle - from data transformation and SQL-based data modelling to semantic modelling - to enable the development of robust business intelligence solutions. Leveraging tools such as Azure Data Factory and Databricks, you'll design and manage data pipelines that deliver high-quality, analysis-ready datasets, and then go on to develop interactive and insightful reports in Power BI to support data-driven … technologies! Requirements: Strong SQL and data modelling expertise Experience with semantic modelling Proven Power BI development skills Experience working in an Azure environment Bonus: Familiarity with Azure Data Factory & Databricks Benefits: Salary up to around £55,000 depending on experience 25 days annual leave plus a day off for your birthday Contributory pension scheme - matched up to 5% Please Note
lifecycle - from data transformation and SQL-based data modelling to semantic modelling - to enable the development of robust business intelligence solutions. Leveraging tools such as Azure Data Factory and Databricks, you'll design and manage data pipelines that deliver high-quality, analysis-ready datasets, and then go on to develop interactive and insightful reports in Power BI to support data-driven … Strong SQL and data modelling expertise Experience with semantic modelling Proven Power BI development skills including DAX Experience working in an Azure environment Bonus: Familiarity with Azure Data Factory & Databricks Benefits: Salary up to around £55,000 depending on experience 25 days annual leave plus a day off for your birthday Contributory pension scheme - matched up to 5% Please Note
in our data science practices. This is a fantastic opportunity for a curious, solutions-focused data scientist to help build out our capability, working with cutting-edge tools like Databricks, AWS data services, PySpark, and CI/CD pipelines. What's in it for you? You'll be joining a collaborative, supportive team with a real passion for data-led … classification, regression, forecasting, and/or NLP Analytical mindset with the ability to present insights to both technical and non-technical audiences Experience deploying and maintaining ML pipelines within Databricks Comfortable working with AWS data services and modern data architectures Experience with CI/CD pipelines and code versioning best practices Preferred skills: Familiarity with Databricks Asset Bundles (DAB) for
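"Maintaining ML pipelines", as the posting above asks, usually includes monitoring for drift between training data and live data. A common, simple check is the Population Stability Index (PSI). The sketch below is a stdlib-only illustration; the bucket edges, sample scores, and the conventional 0.1/0.25 reading of PSI are illustrative assumptions, not anything from the posting.

```python
import math

def psi(expected, actual, bins):
    """Population Stability Index between two samples over shared bucket edges.

    PSI below ~0.1 is conventionally read as "no significant drift" and above
    ~0.25 as a signal to investigate or retrain. Thresholds are a convention,
    not a law; tune them per model.
    """
    def frac(sample, lo, hi):
        n = sum(1 for x in sample if lo <= x < hi)
        return max(n / len(sample), 1e-6)  # small floor avoids log(0)

    return sum(
        (frac(actual, lo, hi) - frac(expected, lo, hi))
        * math.log(frac(actual, lo, hi) / frac(expected, lo, hi))
        for lo, hi in zip(bins[:-1], bins[1:])
    )

# Illustrative model scores: training-time vs. a recent live batch.
train_scores = [0.1, 0.2, 0.25, 0.4, 0.5, 0.6, 0.7, 0.8]
live_scores = [0.15, 0.2, 0.3, 0.45, 0.5, 0.65, 0.7, 0.75]
drift = psi(train_scores, live_scores, bins=[0.0, 0.25, 0.5, 0.75, 1.01])
```

In a Databricks job this check would run on each scoring batch, with the result logged so retraining can be triggered automatically.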
Data Engineering Manager – Managing 2 Squads Tech Stack Knowledge - Databricks, Kafka, AWS 1 Day a Week Onsite - London We are seeking a customer-centric Data Engineering Manager to lead the teams responsible for building and evolving our core data infrastructure. In this role, you will oversee the development of our foundational data platform — encompassing experimentation frameworks, event ingestion pipelines, data … or platform teams at scale, ideally in consumer-facing or marketplace environments. Strong knowledge of distributed systems and modern data ecosystems, with hands-on experience using technologies such as Databricks, Apache Spark, Apache Kafka, and DBT. Proven success in building and managing data platforms supporting both batch and real-time processing architectures. Deep understanding of data warehousing, ETL/ELT
of leading technologies and platforms. Full training will be provided in SQL, Looker, and DBT for reporting, dashboarding, and data pipeline development. You’ll also get exposure to Python, Databricks, and Azure as you grow into the role. There is also future potential to gain exposure to data science and machine learning projects as the team evolves. Day-to-Day … proactive, entrepreneurial mindset with a desire to learn and solve problems Excellent communication skills and confidence working with business stakeholders Tech Stack You’ll Learn SQL Looker DBT Python Databricks Azure
contributed to the delivery of complex business cloud solutions. The ideal candidate will have a strong background in Machine Learning engineering and be an expert in operationalising models in the Databricks MLflow environment (chosen MLOps platform). Responsibilities: Collaborate with Data Scientists to operationalise the model with auditing enabled, ensuring the run can be reproduced if needed. Implement Databricks best practices
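The reproducibility requirement above ("ensure the run can be reproduced") boils down to pinning every source of randomness and logging parameters alongside results; in MLflow this corresponds to logging params and metrics against the run. Below is a library-free sketch of that idea, with the hyperparameter names and the "train" step invented for illustration.

```python
import hashlib
import json
import random

def run_experiment(params):
    """A train-like step made reproducible by seeding everything from `params`."""
    rng = random.Random(params["seed"])      # all randomness flows from the logged seed
    noise = [rng.random() for _ in range(params["n_samples"])]
    metric = sum(noise) / len(noise)         # stand-in for a real evaluation metric
    # Log what an MLflow run would capture: the params and metrics, together.
    record = {"params": params, "metrics": {"mean_noise": round(metric, 10)}}
    # A digest over the record makes "same inputs -> same outputs" checkable in audit.
    record["digest"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    return record

params = {"seed": 42, "n_samples": 100, "lr": 0.01}  # illustrative hyperparameters
first = run_experiment(params)
rerun = run_experiment(params)   # replaying the logged params reproduces the run exactly
```

The auditing ask in the posting is then a matter of persisting `record` (params, metrics, digest) with each run, which is exactly what an MLflow tracking server stores.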
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Somerset Bridge
pricing actions. Collaborate with cross-functional teams to deliver the platform in an agile manner. Provide guidance on the implementation and management of Azure Cache for Redis, Postgres, Databricks Delta Live Tables, and Snowflake. Ensure the platform supports microservices and API-driven architecture with sub-2-second calls. Develop and maintain documentation, architecture diagrams, and other technical artifacts. Manage … What you'll need: Proven experience in ML Ops engineering, with a focus on Azure and Databricks. Strong knowledge of Postgres and Azure Cache for Redis. Experience with Databricks Delta Live Tables and Snowflake. Experience with Docker and Azure Container Services. Familiarity with API service development and orchestration. Experience in Data (Delta) Lake Architecture (Azure). Excellent problem-solving
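The sub-2-second API requirement above is typically met with a cache-aside pattern: check Redis first, fall back to Postgres, then populate the cache with a TTL. The sketch below substitutes a tiny in-process class for Redis (mimicking GET/SETEX) to show the flow; the payload, key, and TTL are illustrative.

```python
import time

class TtlCache:
    """Tiny in-process stand-in for Redis GET/SETEX, to show the cache-aside flow."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:   # lazy expiry, like a TTL'd Redis key
            del self._store[key]
            return None
        return value

    def setex(self, key, ttl_seconds, value):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

cache = TtlCache()
db_reads = {"count": 0}

def fetch_quote(symbol):
    """Cache-aside: try the cache, fall back to the 'database', then cache the result."""
    cached = cache.get(symbol)
    if cached is not None:
        return cached
    db_reads["count"] += 1                       # stand-in for a slow Postgres query
    value = {"symbol": symbol, "price": 101.5}   # illustrative payload
    cache.setex(symbol, ttl_seconds=30, value=value)
    return value

first = fetch_quote("ABC")   # miss: hits the database and populates the cache
second = fetch_quote("ABC")  # hit: served from the cache, no database read
```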
sharing and documentation for a more effective platform; Open to traveling to Octopus offices across Europe and the US. Our Data Stack: SQL-based pipelines built with dbt on Databricks Analysis via Python Jupyter notebooks PySpark in Databricks workflows for heavy lifting Streamlit and Python for dashboarding Airflow DAGs with Python for ETL running on Kubernetes and Docker Django for
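The Airflow DAGs in the stack above are, at heart, dependency graphs that constrain the order ETL tasks may run in. Python's stdlib graphlib module can illustrate that ordering without Airflow installed; the task names below are invented for illustration, not taken from the posting.

```python
from graphlib import TopologicalSorter

# Each entry reads "task: the upstream tasks it waits for" - the same shape an
# Airflow DAG's `extract >> transform >> load` chains produce. Names are illustrative.
dag = {
    "extract_orders": set(),
    "extract_events": set(),
    "transform": {"extract_orders", "extract_events"},
    "load_warehouse": {"transform"},
    "refresh_dashboard": {"load_warehouse"},
}

# static_order() yields one valid execution order: every task appears only
# after all of its upstream dependencies.
order = list(TopologicalSorter(dag).static_order())
```

Airflow's scheduler does the same resolution, plus retries, scheduling intervals, and parallel execution of independent branches (the two extracts here could run concurrently).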
City of London, Greater London, UK Hybrid / WFH Options
Hunter Bond
Time and batch processing of financial data. This is a long-term contract opportunity. The following skills and experience are essential: Strong Power BI (DAX, Power Query, data modelling) Databricks, Python, SQL, Spark are highly desirable Experience with investment banking data (trades, positions, market data, reference data) Excellent communication skills Rate: Up to £800/day Duration: 6 months+
stack Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow) SQL (Redshift, Snowflake or similar) AWS SageMaker → Azure ML migration, with Docker, Git, Terraform, Airflow/ADF Optional extras: Spark, Databricks, Kubernetes. What you'll bring 3-5+ years building optimisation or recommendation systems at scale. Strong grasp of mathematical optimisation (e.g., linear/integer programming, meta-heuristics) as well … containerise models, deploy via Airflow/ADF, monitor drift, automate retraining. Soft skills: clear comms, concise docs, and a collaborative approach with DS, Eng & Product. Bonus extras: Spark/Databricks, Kubernetes, big-data panel or ad-tech experience.
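A quick sketch of the integer-programming grasp this role asks for: a 0/1 knapsack solved by brute force over binary decision variables. Production systems would hand a problem like this to a solver such as OR-Tools or PuLP; the values and weights below are an invented toy, not data from the posting.

```python
from itertools import product

def solve_knapsack(values, weights, capacity):
    """Exhaustive 0/1 knapsack: maximise total value subject to a weight budget.

    Brute force is fine for a handful of items and makes the integer-programming
    formulation explicit: binary decision variables x_i, a linear objective, and
    one linear constraint.
    """
    best_value, best_pick = 0, ()
    for pick in product((0, 1), repeat=len(values)):       # x_i in {0, 1}
        weight = sum(x * w for x, w in zip(pick, weights))
        value = sum(x * v for x, v in zip(pick, values))
        if weight <= capacity and value > best_value:      # feasibility, then objective
            best_value, best_pick = value, pick
    return best_value, best_pick

# Illustrative toy: values = expected uplift per action, weights = cost of each.
value, chosen = solve_knapsack(values=[10, 7, 12, 3], weights=[4, 3, 5, 1], capacity=8)
```

The enumeration is exponential in the number of items, which is exactly why the posting pairs "linear/integer programming" with meta-heuristics and solver experience for problems at scale.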
timing, and budget requirements. Your coaching style enables you to elevate other project members to a higher level. Required Skills & Knowledge: Data Lake - Azure SQL Server - Data Factory - Azure Databricks - Azure Synapse - Azure Data Explorer - Azure DevOps - Microsoft Fabric - Python & PySpark You have a strong background and a minimum of 5 years of experience working with Microsoft Azure Data Services
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Cadent Gas
thinking culture, and help drive the energy transition for the UK. Code & create - Develop complex SQL and ABAP CDS views for analytics and reporting Transform & optimise - Use PySpark and Databricks to manipulate big data efficiently Automate & schedule - Manage workflows, jobs and clusters for scalable data processing Collaborate & deliver - Engage across agile teams to build high-impact solutions What You'll … into structured insights. Experience in building data pipelines and models in SAP Datasphere or SAP BW/4HANA Advanced skills in SQL, data modelling, and data transformation Familiarity with Databricks, Apache Spark, PySpark, and Delta Lake Agile mindset with experience in DevOps and iterative delivery Excellent communication and stakeholder engagement abilities Sound like a fit? Let's build the future
Strong data visualisation using Power BI and coding ability using normalisation, SQL, or Python. Desirable: Experience working in data warehouse or lake environments, e.g. Snowflake, Redshift, Databricks, and with ELT and data pipelines, e.g. dbt Familiar with predictive analytics techniques Please apply if this sounds like you