Key Responsibilities Lead the design, development, and maintenance of scalable, high-performance data pipelines on Databricks. Architect and implement data ingestion, transformation, and integration workflows using PySpark, SQL, and Delta Lake. Guide the team in migrating legacy ETL processes to modern cloud-based data pipelines. Ensure data accuracy, schema consistency, row counts, and KPIs during migration and transformation. Collaborate … and analytics. ________________________________________ Required Skills & Qualifications 10-12 years of experience in data engineering, with at least 3+ years in a technical lead role. Strong expertise in Databricks, PySpark, Delta Lake, and DBT. Advanced proficiency in SQL, ETL/ELT pipelines, and data modelling. Experience with Azure Data Services (ADLS, ADF, Synapse) or other major cloud platforms (AWS … of data warehousing, transformation logic, SLAs, and dependencies. Hands-on experience with real-time streaming and near-real-time batch processing is a plus, as is optimisation of Databricks and DBT workloads and Delta Lake. Familiarity with CI/CD pipelines, DevOps practices, and Git-based workflows. Knowledge of data security, encryption, and compliance frameworks (GDPR, SOC 2, ISO) is good to have. Excellent …
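As an illustration of the migration-validation responsibility mentioned in the listing above (checking row counts and schema consistency after moving a legacy ETL output into Delta Lake), a minimal PySpark sketch might look like the following. The paths and table names are hypothetical assumptions for the example, not details from the role.

```python
# Minimal sketch: compare a legacy extract with its migrated Delta Lake table.
# Paths below are made up for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("migration-validation").getOrCreate()

legacy = spark.read.parquet("/mnt/landing/legacy_sales")              # legacy ETL output
migrated = spark.read.format("delta").load("/mnt/lake/silver/sales")  # new pipeline output

checks = {
    "row_count_matches": legacy.count() == migrated.count(),
    "columns_match": set(legacy.columns) == set(migrated.columns),
}
print(checks)
```

In practice such checks would usually extend to KPI-level comparisons (sums, distinct counts per key) rather than row counts alone.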
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
MYO Talent
Data Engineer/Data Engineering/Data Consultant/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL/Based in the West Midlands/Solihull/Birmingham area, Permanent role, £ + car/allowance (£5,000) + 15% bonus. One of our leading clients is looking to recruit … role Salary £ + car/allowance + bonus Experience: Experience in a Data Engineer/Data Engineering role Large and complex datasets Azure, Azure Databricks Microsoft SQL Server Lakehouse, Delta Lake Data Warehousing ETL Database Design Python/PySpark Azure Blob Storage Azure Data Factory Desirable: Exposure to ML/Machine Learning/AI/Artificial Intelligence …
Reston, Virginia, United States Hybrid / WFH Options
ICF
/Durham, NC, Reston, Virginia; Atlanta, GA. What you'll be doing: Enable secure, scalable, and efficient data exchange between a federal client and external data-sharing partners using Databricks Delta Sharing. Support the design and development of data pipelines and ETL routines in an Azure cloud environment for many source system types, including RDBMS, API, and unstructured data, using CDC … Data Analytics, or a related discipline. Minimum 5+ years in data engineering, data security practices, data platforms, and analytics. 3+ years of Databricks platform expertise (SME-level proficiency) including: Databricks, Delta Lake, and Delta Sharing. Deep experience with distributed computing using Apache Spark. Knowledge of Spark runtime internals and optimization. Ability to design and deploy performant end-to …
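To illustrate the Delta Sharing data-exchange work described above, a minimal sketch using the open-source delta-sharing Python client might look like this; the profile file path and the share/schema/table coordinates are hypothetical, not details from the role.

```python
import delta_sharing

# Profile file issued by the data provider (hypothetical path)
profile = "config/partner_profile.share"

# URL format is <profile>#<share>.<schema>.<table>; names here are made up
table_url = profile + "#partner_share.claims_schema.claims"

# Load the shared table into a pandas DataFrame for inspection
df = delta_sharing.load_as_pandas(table_url)
print(df.head())

# On a Spark cluster the same table could be loaded as a Spark DataFrame:
# spark_df = delta_sharing.load_as_spark(table_url)
```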
support advanced analytics, AI, and business intelligence use cases. Proven experience in designing architectures for structured, semi-structured, and unstructured data, leveraging technologies like Databricks, Snowflake, Apache Kafka, and Delta Lake to enable seamless data processing and analytics. Hands-on experience in data integration, including designing and optimising data pipelines (batch and streaming) and integrating cloud-based platforms …
City of London, London, United Kingdom Hybrid / WFH Options
Robert Half
years as a Data Engineer Hands-on with Databricks, Spark, Python, SQL Cloud experience (Azure, AWS, or GCP) Strong understanding of data quality, governance, and security Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure You'll: Build and optimise ETL pipelines Enable analytics and reporting teams Drive automation and best practices Why …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Lambda, EMR) Strong communication skills and a collaborative mindset Comfortable working in Agile environments and engaging with stakeholders Bonus Skills Experience with Apache Iceberg or similar table formats (e.g., Delta Lake, Hudi) Exposure to CI/CD tools like GitHub Actions, GitLab CI, or Jenkins Familiarity with data quality frameworks such as Great Expectations or Deequ Interest in …
facing consulting environment, with the ability to manage stakeholder expectations, navigate complex requirements, and deliver tailored solutions across diverse industries. 5+ years' experience working with Databricks, including Spark and Delta Lake. Strong skills in Python and/or Scala for data engineering tasks. Comfortable working with cloud platforms like Azure, AWS, and/or Google Cloud. A problem …
Washington, Washington DC, United States Hybrid / WFH Options
Prism, Inc
stakeholder engagement and executive presentations. Bachelor's or Master's degree in Data Science, Computer Science, or a related field. Nice to Have: Familiarity with Azure Synapse, Databricks, or Delta Lake. Microsoft certifications (e.g., Azure Data Scientist Associate, Power BI). Exposure to CI/CD pipelines and DevOps practices for data workflows. Proficiency in Python for statistical modeling …
Oak Brook, Illinois, United States Hybrid / WFH Options
Ace Hardware Corporation
performance and cost across the Databricks platform. Collaborate with teams to support data pipelines, ML workflows, and analytics use cases. Integrate Databricks with cloud storage solutions (S3, ADLS) and Delta Lake. Troubleshoot issues related to jobs, performance, or user access. Maintain platform documentation and conduct user training. Stay current with Databricks updates, new features, and best practices. Support hybrid … administering Cloudera Hadoop clusters. 2-3+ years of Databricks experience in production environments. 2+ years of Databricks administration experience on Azure (preferred). Strong knowledge of Spark and Delta Lake architecture. Experience with IAM, Active Directory, and SSO integration. Familiarity with DevOps and CI/CD for data platforms. Deep understanding of the Hadoop ecosystem: Hive, Impala, Spark … Python for automation and system administration. Solid foundation in Linux/Unix system administration. Preferred Qualifications Experience with cloud technologies (Azure or GCP preferred). Exposure to modern data lake and hybrid cloud architectures. Cloudera and/or Databricks certifications. Familiarity with infrastructure-as-code and automation tools (e.g., Terraform). Experience supporting Oracle database administration (e.g., backups, user …
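As a small illustration of the storage-integration duties named in the listing above, a hedged sketch of reading the same Delta table from ADLS Gen2 and S3 on a Databricks cluster could look like this; the storage accounts, buckets, and paths are made up for the example, and credentials are assumed to be configured on the cluster.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("storage-integration-check").getOrCreate()

# Hypothetical ADLS Gen2 path (Azure workspaces)
adls_df = spark.read.format("delta").load(
    "abfss://bronze@examplestorage.dfs.core.windows.net/events"
)

# Hypothetical S3 path (AWS workspaces)
s3_df = spark.read.format("delta").load("s3://example-lake/bronze/events")

# Quick sanity checks an administrator might run when validating access
print(adls_df.count(), len(adls_df.columns))
print(s3_df.count(), len(s3_df.columns))
```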
Hollywood, Florida, United States Hybrid / WFH Options
INSPYR Solutions
hybrid role (60% administration, 40% development/support) to help us scale our data and DataOps infrastructure. You'll work with cutting-edge technologies like Databricks, Apache Spark, Delta Lake, AWS CloudOps, and cloud security while supporting mission-critical data pipelines and integrations. If you're a hands-on engineer with strong Python skills, deep AWS experience … in integration framework development with a strong emphasis on Databricks, AWS, and ETL. Required Technical Skills Strong programming skills in Python and PySpark. Expertise in Databricks, Apache Spark, and Delta Lake. Proficiency in AWS CloudOps and cloud security, including configuration, deployment, and monitoring. Strong SQL skills and hands-on experience with Amazon Redshift. Experience with ETL development, data transformation, and …
with the ability to manage stakeholder expectations, navigate complex requirements, and deliver tailored solutions across diverse industries. 7 years' experience working with Databricks. Good hands-on experience with Spark, Delta Lake, and Unity Catalog. Strong understanding of cloud platforms like Azure, AWS, and/or Google Cloud. Experience designing data lakes, lakehouses, and modern data platforms. Proven experience …
Herefordshire, West Midlands, United Kingdom Hybrid / WFH Options
IO Associates
and maintain platform software, libraries, and dependencies. Set up and manage Spark clusters, including migrations to new platforms. Manage user accounts and permissions across identity platforms. Maintain the Delta Lake and ensure platform-wide security standards. Collaborate with the wider team to advise on system design and delivery. What we're looking for: Strong Linux engineering …