Science, Mathematics, Engineering or other STEM. A strong team player with empathy, humility and dedication to joint success and shared development. Desirable Experience and Qualifications: Experience with Databricks or Delta Lake architecture. Experience building architecture and Data Warehousing within the Microsoft stack. Experience with development source control (e.g. Bitbucket, GitHub). Experience with low-code analytical tools (e.g. …
Python for data engineering tasks. • Familiarity with GitLab for version control and CI/CD. • Strong understanding of unit testing and data validation techniques. Preferred Qualifications: • Experience with Databricks Delta Lake, Unity Catalog, and MLflow. • Knowledge of CloudFormation or other infrastructure-as-code tools. • AWS or Databricks certifications. • Experience in large-scale data migration projects. • Background in Finance …
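As one way to picture the "unit testing and data validation" requirement above, here is a minimal pytest sketch. The `validate_readings` helper, its `reading` column and its rule are invented for illustration, not anything specified in the listing.

```python
# Minimal sketch of unit-testing a data validation helper with pytest.
# The helper, the 'reading' column and the positivity rule are all
# hypothetical -- invented for this example, not from the listing.
import pandas as pd
import pytest


def validate_readings(df: pd.DataFrame) -> pd.DataFrame:
    """Keep only rows whose 'reading' is present and positive."""
    if "reading" not in df.columns:
        raise ValueError("missing 'reading' column")
    return df[df["reading"].notna() & (df["reading"] > 0)]


def test_drops_null_and_nonpositive_rows():
    df = pd.DataFrame({"reading": [1.5, None, -2.0, 3.0]})
    assert validate_readings(df)["reading"].tolist() == [1.5, 3.0]


def test_rejects_missing_column():
    with pytest.raises(ValueError):
        validate_readings(pd.DataFrame({"other": [1]}))
```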
across SQL Server, PostgreSQL, and cloud databases. Proven track record with complex data migration projects (terabyte+ datasets, multiple legacy source systems, structured and unstructured data). Proficiency with Parquet/Delta Lake or other modern data storage formats. Experience with streaming architectures using Kafka, Event Hubs, or Kinesis for real-time data processing. Knowledge of data architectures supporting AI …
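To make the Parquet/Delta Lake contrast concrete, a minimal PySpark sketch follows. It assumes the delta-spark package is installed (`pip install delta-spark`); the paths, columns and sample rows are illustrative.

```python
# Minimal sketch contrasting Parquet and Delta Lake as output formats.
# Assumes delta-spark is installed; paths and columns are illustrative.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("storage-formats")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

df = spark.createDataFrame(
    [(1, "orders", 120.0), (2, "returns", -40.0)],
    ["id", "source", "amount"],
)

# Plain columnar Parquet: compact and fast to scan, but no transaction log.
df.write.mode("overwrite").parquet("/tmp/demo_parquet")

# Delta Lake: the same Parquet files plus an ACID transaction log, which
# is what enables time travel, MERGE upserts and schema enforcement.
df.write.format("delta").mode("overwrite").save("/tmp/demo_delta")
```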
Cambridge, East Anglia, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
a scalable, company-wide Databricks Lakehouse platform on AWS. Be the hands-on technical expert, building and optimising robust ELT/ETL pipelines using Python, Spark, and Databricks (e.g., Delta Live Tables, Databricks Workflows). Work with unique, complex, and high-volume datasets from IoT-enabled robotic systems, manufacturing lines, and core business functions. Partner with data scientists and … BI teams to establish best-in-class data models, governance, and data quality standards within Delta Lake. Evangelise the benefits of the Lakehouse across the organisation, championing best practices and mentoring other engineers to build their Databricks capability. Own the data platform's roadmap, ensuring it is scalable, reliable, and secure as the company grows. What You'll Need … Proven, deep commercial experience with Databricks. You must have hands-on expertise with Delta Lake and the Lakehouse paradigm. Strong expertise in the AWS data ecosystem (e.g., S3, AWS Glue, Kinesis, IAM) and a deep understanding of how to build, secure, and optimise a Databricks platform within it. Expert-level Python and SQL skills, specifically for data engineering …
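For flavour, a minimal Delta Live Tables sketch of the kind of bronze/silver pipeline this listing describes. It only runs inside a Databricks DLT pipeline (which supplies the `dlt` module and the `spark` session); the S3 path, table names and quality rule are illustrative assumptions, not taken from the posting.

```python
# Minimal Delta Live Tables sketch (Databricks-only: the `dlt` module
# and the `spark` session are injected by the DLT runtime).
# Source path, table names and the quality rule are hypothetical.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw IoT telemetry landed as-is (bronze).")
def telemetry_bronze():
    return (
        spark.readStream.format("cloudFiles")       # Auto Loader ingestion
        .option("cloudFiles.format", "json")
        .load("s3://example-bucket/telemetry/")     # hypothetical path
    )


@dlt.table(comment="Cleaned telemetry with a basic quality gate (silver).")
@dlt.expect_or_drop("valid_reading", "reading IS NOT NULL AND reading > 0")
def telemetry_silver():
    # Rows failing the expectation above are dropped and counted
    # in the pipeline's data quality metrics.
    return (
        dlt.read_stream("telemetry_bronze")
        .withColumn("ingested_at", F.current_timestamp())
    )
```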
models and reports. Experience required: Strong background in data engineering, warehousing, and data quality. Proficiency in Microsoft 365, Power BI, and other BI tools. Familiarity with Azure Databricks and Delta Lake is desirable. Ability to work autonomously in a dynamic environment and contribute to team performance. Strong communication, influencing skills, and a positive, can-do attitude. Knowledge of …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
maintain real-time streaming data systems (Kafka, Kinesis, or Flink). Build robust feature pipelines using Airflow, Prefect, or Dagster. Manage and optimise data storage solutions (Snowflake, BigQuery, Redshift, or Delta Lake). Automate and scale model training pipelines in close partnership with ML engineers. Deploy, observe, and improve pipelines using Docker, Kubernetes, Terraform, or dbt. Champion data reliability, scalability …
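As an illustration of the "feature pipelines using Airflow" line, a minimal TaskFlow-style DAG sketch follows (Airflow 2.4+). The tasks, the toy feature and the table it implies are invented for the example; a real pipeline would read from and write to the systems named above.

```python
# Hedged sketch of a daily feature pipeline as an Airflow TaskFlow DAG.
# Requires Airflow 2.4+; all task contents are illustrative stand-ins.
import math
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def feature_pipeline():

    @task
    def extract() -> list[dict]:
        # Stand-in: a real task would pull from Kafka/Kinesis or a warehouse.
        return [{"user_id": 1, "events": 42}, {"user_id": 2, "events": 7}]

    @task
    def build_features(rows: list[dict]) -> list[dict]:
        # Toy feature: log-scaled event count.
        return [{**r, "log_events": math.log1p(r["events"])} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in: a real task would write to Snowflake/BigQuery/Delta Lake.
        print(f"loaded {len(rows)} feature rows")

    load(build_features(extract()))


feature_pipeline()
```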
models are learning and improving in production. Architect and maintain real-time streaming data systems (Kafka, Kinesis, or Flink). Manage and optimise data storage solutions (Snowflake, BigQuery, Redshift, or Delta Lake). Automate and scale model training pipelines in close partnership with ML engineers. Champion data reliability, scalability, and performance across the platform. Languages: Python, Scala, Go. Data storage …
the technical lead and design authority. Ability to partner with and influence senior client stakeholders to drive the programme to the required outcomes. Hands-on experience of Databricks, including Delta Lake and Unity Catalog. Experience of cloud architectures; we favour Azure and AWS. You have guided data engineers and analysts through optimising their workloads and take FinOps at …
unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the …
Gloucestershire, England, United Kingdom Hybrid / WFH Options
Hexegic
requirements. Updating security, software, dependencies, libraries, etc. Set up and maintain Spark clusters on current platforms, migrating to new platforms. Manage user accounts and permissions across the IdP. Maintain the Delta Lake. Ensure secure-by-design assurance across the platform. What we are looking for: Excellent Linux engineering experience. Strong Kubernetes and Docker engineering experience. Confident in scripting languages …
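A small sketch of what "maintain the Delta Lake" can involve day to day: file compaction and vacuuming via the delta-spark Python API. The table path is hypothetical and the session setup assumes delta-spark is installed; the retention window shown is simply the library's 7-day default.

```python
# Hedged sketch of routine Delta Lake maintenance with delta-spark.
# The table path is hypothetical; retention below is the 7-day default.
from delta import configure_spark_with_delta_pip
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("delta-maintenance")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

table = DeltaTable.forPath(spark, "/data/lake/telemetry")  # hypothetical

# Compact many small files into fewer large ones for read performance.
table.optimize().executeCompaction()

# Delete data files no longer referenced by the transaction log,
# keeping the default 168-hour (7-day) retention window.
table.vacuum(168)
```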
Kubernetes stack with secure-by-design tools. Update security, software, dependencies and libraries. Set up and migrate Spark clusters across platforms. Manage user accounts and IdP permissions. Maintain the Delta Lake. Ensure secure-by-design assurance throughout the platform. Experience Required: Strong Linux engineering background. Expertise in Kubernetes and Docker. Proficient in scripting (Python, Bash). Experience with air …
engineering experience. Strong Kubernetes and Docker knowledge. Confident scripting in Python and Bash. Experience with secure or air-gapped environments. Familiarity with HPC or distributed data systems (e.g. Spark, Delta Lake). Knowledge of security, encryption, and compliance standards. TO BE CONSIDERED: Please either apply through this advert or email me directly at . For further information, call me …