teams to align operational strategies with technical and business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous … of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with Azure Data Platform components including Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview, and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python.
Employment Type: Contract, Full time Start: ASAP Location: London - Hybrid Languages: English Key skills: 5+ years as a Data Engineer. Proven expertise in Databricks (including Delta Lake, Workflows, Unity Catalog). Strong command of Apache Spark, SQL, and Python. Hands-on experience with cloud platforms (AWS, Azure, or GCP … Apache Spark. Collaborate with Data Scientists, Analysts, and Product teams to understand data needs and deliver clean, reliable datasets. Optimize data workflows and storage (Delta Lake, Lakehouse architecture). Manage and monitor data pipelines in cloud environments (AWS, Azure, or GCP). Work with structured and unstructured data.
enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Need …
Constructors’ Championship. You will be responsible for: Working with stakeholders to understand their data requirements Building pipelines to ingest data into the data lake Designing, building and deploying dashboards and reports in Power BI Performing data mapping and modelling Developing and updating technical documentation Implementing and owning a … of designing and building data models using Data Analysis Expressions (DAX) Hands-on experience with Azure tools: Azure Data Factory, Synapse, Databricks, SQL, Data Lake, Power BI, Delta Lake, and Spark. Ability to design, build, and deploy interactive user interfaces for interrogating data Experience of Power Automate.
business use cases. Strong knowledge of data governance, data warehousing, and data security principles. Hands-on experience with modern data stacks and technologies (e.g., Delta Lake, SQL, Python, Azure/AWS/GCP). Experience aligning data capabilities with commercial strategy and business performance. Exceptional communication skills.
a MUST! Key expertise and experience we’re looking for: Data Engineering in Databricks – Spark programming with Scala, Python, and SQL; ideally experience with Delta Lake or Databricks workflows, jobs, etc. Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks. Data Governance …
Experience in open-source technologies like Spark, Kafka, and Beam; understanding of cloud tools such as AWS, Microsoft Azure, or Google Cloud; familiarity with DBT, Delta Lake, and Databricks; experience working in an agile environment. About Us We are Citation. We are far from your average service provider. Our colleagues …
potential to extend). This is a fully remote role and sits outside IR35. Key Requirements: Significant experience designing and scaling cloud-based data lake architectures (ideally Delta Lake on Databricks or similar) Deep expertise in workflow orchestration using Airflow, with production-grade DAGs and solid dependency …
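The orchestration requirement above centres on expressing pipelines as DAGs with explicit task dependencies, so that every task runs only after its upstreams succeed. A minimal, framework-free sketch of that idea (plain Python standard library, no Airflow; the task names are hypothetical) is a topological sort over the dependency graph:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each key maps a task to the set of tasks it
# depends on, mirroring how an Airflow DAG wires upstream >> downstream.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_delta_table": {"transform_join"},
}

# static_order() yields each task only after all of its dependencies,
# i.e. one valid sequential execution order for the pipeline.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

A real scheduler adds retries, parallelism among ready tasks, and backfills on top of exactly this ordering guarantee.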
code repositories, and automation. Requirements 5+ years’ experience in the Data and Analytics domain. Previous management experience is preferred. Strong expertise in Databricks (Spark, Delta Lake, Notebooks). Advanced knowledge of SQL development. Familiarity with Azure Synapse for orchestration and analytics. Working experience with Power BI for reporting.
standards and drive excellence in engineering practices. Architect and oversee the development of cloud-native data infrastructure and pipelines using Databricks, Python, PySpark, and Delta Lake. Guide the implementation of embedded analytics, headless APIs, and real-time dashboards for customer-facing platforms. Partner with Product Owners and … in data/analytics engineering, including 2+ years in a leadership or mentoring role. Strong hands-on expertise in Databricks, Spark, Python, PySpark, and Delta Live Tables. Experience designing and delivering scalable data pipelines and streaming data processing (e.g., Kafka, AWS Kinesis, or Azure Stream Analytics). Background …
team of 45 people, including Data Scientists, ML Engineers and 2 Data Engineers. Day‑to‑day you will: Monitor, optimise and rebuild ETL/Delta Lake workflows in Databricks. Migrate legacy ingestion jobs to modern, cloud‑native patterns (Azure preferred, some AWS/GCP). Collaborate with scientists …
and building end-to-end data pipelines. Proficient in Python and/or Scala; solid understanding of SQL and distributed computing principles. Experience with Delta Lake, Lakehouse architecture, and data governance frameworks. Excellent client-facing and communication skills. Experience in Azure Data Services is desirable (e.g. Azure Data Lake, Synapse, Data Factory, Fabric).
with 3+ years leading Databricks-based solutions. Proven experience in a consulting environment delivering large-scale data platform projects. Hands-on expertise in Spark, Delta Lake, MLflow, Unity Catalog, and DBSQL. Strong proficiency in Python, SQL, and at least one major cloud platform (AWS, Azure, or GCP).
Modeler/Architect. London Market insurance industry experience. Data modelling experience using tools such as Erwin. Experience with modelling methodologies including Kimball, etc. Usage of data lake formats such as Parquet and Delta Lake. Strong SQL skills. Rate: £600 - £700 P/D Outside IR35. Contract Duration: 6 months.
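Data lake formats such as Parquet and Delta Lake conventionally organise table files into Hive-style partition directories (`column=value` path segments), which is what makes partition pruning possible at query time. A small illustrative sketch of that layout (plain Python; the table and column names are hypothetical, and this only builds paths rather than writing any Parquet data):

```python
def partition_path(table_root: str, **partitions: str) -> str:
    """Build a Hive-style partition directory path, e.g.
    policies/year=2024/month=03, the layout used by partitioned
    Parquet and Delta Lake tables."""
    parts = [f"{col}={val}" for col, val in partitions.items()]
    return "/".join([table_root, *parts])

path = partition_path("policies", year="2024", month="03")
print(path)  # policies/year=2024/month=03
```

Engines that understand this convention can skip entire directories when a filter such as `year = '2024'` is applied.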
implement code ready for deployment. Clear communication and the capacity to articulate technical choices effectively are crucial. Must-Have Skills: 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL) 5+ years SQL Python Azure Excellent client-facing communication skills Experience deploying Databricks pipelines Experience provisioning Databricks as code.
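"Provisioning Databricks as code" typically means declaring workspace resources in an infrastructure-as-code tool rather than clicking through the console. A hedged sketch using the Databricks Terraform provider (the cluster name, runtime version, and node type below are placeholder values, not recommendations):

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# Hypothetical all-purpose cluster declared as code; values are placeholders.
resource "databricks_cluster" "example" {
  cluster_name            = "etl-dev"
  spark_version           = "13.3.x-scala2.12"
  node_type_id            = "Standard_DS3_v2"
  num_workers             = 2
  autotermination_minutes = 20
}
```

Keeping such definitions in version control gives the review, rollback, and environment-parity benefits the listing is alluding to.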
understanding of cloud security, networking, and DR principles Experience working on major digital transformation or IT consolidation programmes 💻 Tech Stack/Tooling Databricks, Spark, Delta Lake Terraform, GitHub/GitLab, Azure DevOps Cloud-native services: e.g. GCP (Dataflow, BigQuery), Azure, or AWS equivalents Jira, Confluence CI/CD …
This role is ideal for someone who enjoys blending technical precision with innovation. You’ll: Build and manage ML pipelines in Databricks using MLflow, Delta Lake, Spark, and Mosaic AI. Train and deploy generative models (LLMs, GANs, VAEs) for NLP, content generation, and synthetic data. Architect scalable solutions …
and data engineers, lead small project teams, and help recruit top talent as we grow. Technical Delivery: Provide hands-on support with Databricks notebooks, Delta Lake, Spark optimization, MLflow integration, and related areas when necessary. Business Development Support: Partner with sales and leadership teams to support presales activities.