London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
… testing methodologies and development team collaboration
• Experience working with Power BI and DAX
• Strong documentation, communication, and stakeholder engagement skills
Preferred Qualifications:
• Experience with Lakehouse architecture, Delta Lake, or Databricks
• Exposure to Agile/Scrum working practices
• Microsoft certifications (e.g., Azure Data Engineer Associate)
• Background in consulting or professional services
• Understanding of data governance and data security principles
Nice to …
Derby, Derbyshire, East Midlands, United Kingdom Hybrid / WFH Options
Morson Talent
… candidate. If you meet around 75% of the criteria, we'd still love to hear from you.
• Strong experience with data engineering tools and platforms (e.g. Azure Data Factory, Databricks, SQL, Python).
• Proven ability to engage stakeholders and present insights through data visualisation tools and storytelling.
• Experience with data modelling, warehousing, and integration.
• Interest in and awareness of AI …
City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
… CDO level and translating requirements into solution blueprints, proposals, and presentations.
• Enterprise Solution Design – Architect end-to-end data and cloud solutions across platforms such as Azure, AWS, GCP, Databricks, Snowflake, Synapse, or Microsoft Fabric.
• Cloud Strategy & Adoption – Define and lead cloud migration, modernisation, and optimisation strategies using tools such as Terraform, Azure DevOps, and CI/CD pipelines.
• Data …
… can influence promised outcomes. Drive high client value and broaden relationships at senior levels with current and prospective clients.
Our Tech Stack:
• Cloud: Azure, sometimes GCP & AWS
• Data Platform: Databricks, Snowflake, BigQuery
• Data Engineering tools: PySpark, Polars, DuckDB, Malloy, SQL
• Infrastructure-as-code: Terraform, Pulumi
• Data Management and Orchestration: Airflow, dbt
• Databases and Data Warehouses: SQL Server, PostgreSQL, MongoDB, Qdrant …
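As an illustrative aside (not part of the advert): two tools from this stack compose neatly, since DuckDB can run SQL directly over a Polars DataFrame held in memory. A minimal sketch, assuming the duckdb and polars packages are installed; the table and columns are hypothetical:

```python
import duckdb
import polars as pl

# Hypothetical sample data standing in for a real source table.
orders = pl.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "amount": [120.0, 80.0, 45.5, 300.0],
})

# DuckDB resolves "orders" from the local scope and queries it in place.
totals = duckdb.sql(
    "SELECT customer_id, SUM(amount) AS total FROM orders GROUP BY customer_id"
).pl()  # hand the result back as a Polars DataFrame

print(totals)
```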
Richmond, Virginia, United States Hybrid / WFH Options
CarMax
… drive organization-wide solutions
Data & Analytics Specific Qualifications:
• Experience building enterprise-grade cloud-native and SaaS Data & Analytics platforms and solutions. Microsoft Azure experience is preferred.
• Experience with Snowflake, Databricks, Azure Data Factory and other Azure services, Tableau/Power BI
• Experience with Data Orchestration and Observability technologies and platforms
• Familiarity with Azure AI services (Azure Machine Learning, Azure Cognitive …
Remote, Oregon, United States Hybrid / WFH Options
INSPYR Solutions
… and maintain scalable data pipelines using Apache Spark on Databricks.
• Build and manage Delta Lake architectures for efficient data storage and retrieval.
• Implement robust ETL/ELT workflows using Databricks notebooks, SQL, and Python.
• Collaborate with AI/ML teams to operationalize models within the Databricks environment.
• Optimize data workflows for performance, reliability, and cost-efficiency in cloud platforms (AWS, Azure, or GCP).
• Ensure data quality, lineage, and governance using tools like Unity Catalog and MLflow.
• Develop CI/CD pipelines for data and ML workflows using Databricks Repos and Git integrations.
• Monitor and troubleshoot production data pipelines and model deployments.
Key Qualifications:
• Strong hands-on experience with Databricks, including Spark, Delta Lake, and MLflow.
• Proficiency in Python, SQL … architecture, and real-time streaming (Kafka, Spark Structured Streaming).
• Experience with version control, CI/CD, and infrastructure-as-code tools.
• Excellent communication and collaboration skills.
• Certifications in Databricks (e.g., Databricks Certified Data Engineer Associate/Professional).
• Experience with feature engineering and feature stores in Databricks.
• Exposure to MLOps practices and tools.
• Bachelor's or Master's degree …
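To make the Delta Lake/ETL responsibilities above concrete, here is a minimal sketch (not from the advert) of one such pipeline step. It assumes a Databricks cluster, or local pyspark with the delta-spark package; all paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

# Extract: read raw JSON events from cloud storage (hypothetical path).
raw = spark.read.json("/mnt/raw/events/")

# Transform: deduplicate and enrich with derived columns.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
       .withColumn("loaded_at", F.current_timestamp())
)

# Load: append to a partitioned Delta table for efficient retrieval.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("event_date")
      .save("/mnt/silver/events/"))
```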
… to clean, transform, and enrich data from various sources.
• Ensure that pipelines are automated, scalable, and fault-tolerant to accommodate large volumes of data.
• Experience with Notebooks (e.g., Jupyter, Databricks) for data exploration, analysis, and reporting.
• Design and optimise data workflows to streamline key processing tasks, enhancing operational efficiency.
API Integration & Data Ingestion:
• Integrate external and internal APIs to ingest …
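As a hedged illustration of the API-ingestion duty above (the endpoint, token, and response shape are invented for the example; only the requests package is assumed):

```python
import requests

def fetch_page(url: str, token: str, page: int) -> list[dict]:
    """Fetch one page of records from a hypothetical paginated JSON API."""
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {token}"},
        params={"page": page},
        timeout=30,
    )
    resp.raise_for_status()  # fail fast so a scheduler can retry the task
    return resp.json()["results"]

def ingest(url: str, token: str) -> list[dict]:
    """Walk pages until the API returns an empty batch."""
    records, page = [], 1
    while batch := fetch_page(url, token, page):
        records.extend(batch)
        page += 1
    return records
```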
… Airflow, dbt, Talend, AWS Glue, GCP Dataform/Cloud Composer)
• Proven ability to design, deploy, and optimize data warehouses and lakehouse architectures using technologies like BigQuery, Redshift, Snowflake, and Databricks
• Experience with Infrastructure as Code tools (e.g., Terraform, AWS CloudFormation, GCP Deployment Manager) for cloud resource provisioning and management
• Proficiency with CI/CD pipelines and DevOps practices for data …
… basis your varied role will include, but will not be limited to:
• Design, build, and optimize high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks or Microsoft Fabric.
• Implement scalable solutions to ingest, store, and transform vast datasets, ensuring data availability and quality across the organization.
• Write clean, efficient, and reusable Python code tailored to …
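One reading of "clean, efficient, and reusable Python" in pipeline work is keeping transformations as pure functions over DataFrames, free of I/O, so they can be reused across Synapse, Databricks, or local runs. A small sketch under that assumption (column names hypothetical):

```python
from pyspark.sql import DataFrame, functions as F

def standardise_customers(df: DataFrame) -> DataFrame:
    """Drop rows missing the key, trim names, and normalise emails."""
    return (
        df.dropna(subset=["customer_id"])
          .withColumn("name", F.trim("name"))
          .withColumn("email", F.lower("email"))
    )
```

Because the function depends only on its input DataFrame, a unit test can feed it a tiny in-memory frame and assert on the output, with no cluster or storage account involved.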
… or similar platforms.
• Strong understanding of data warehousing concepts, data modelling, and ETL processes.
• Strong understanding of SAP Datasphere and basic knowledge of SAP Business Data Cloud and Azure Databricks
• Excellent analytical and problem-solving skills, with the ability to work with complex data sets.
• Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical …
Salary: €50,000–€60,000 per year
Requirements:
• 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark
• Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL)
• Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables
• Solid understanding of data warehousing principles, ETL … and German (min. B2 levels)
• Ability to work independently as well as part of a team in an agile environment
Responsibilities: As a Data Engineer with a focus on Databricks, you will play a key role in building modern, scalable, and high-performance data solutions for our clients. You'll be part of our growing Data & AI team and work hands-on with the Databricks platform, supporting clients in solving complex data challenges.
• Designing, developing, and maintaining robust data pipelines using Databricks, Spark, and Python
• Building efficient and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses
• Leveraging the Databricks ecosystem (SQL, Delta Lake, Workflows, Unity …
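Since this advert names Delta Live Tables specifically, a hedged sketch of what a two-layer DLT flow can look like. The dlt module exists only inside a Databricks DLT pipeline (where spark is provided as a global); table names, paths, and the quality rule are hypothetical:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders as landed from cloud storage.")
def bronze_orders():
    return spark.read.format("json").load("/mnt/landing/orders/")  # noqa: F821

@dlt.table(comment="Validated orders ready for analytics.")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # drop rows failing the rule
def silver_orders():
    return (
        dlt.read("bronze_orders")
           .withColumn("order_date", F.to_date("order_ts"))
    )
```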
Reading, England, United Kingdom Hybrid / WFH Options
HD TECH Recruitment
… Promote user adoption, training, and change management initiatives
• Ensure high standards of documentation and data security compliance
Technical Skills (desirable):
• Microsoft Azure Data Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric)
• Data warehousing and lakehouse design
• ETL/ELT pipelines
• SQL, Python for data manipulation and machine learning
• Big Data frameworks (e.g., Hadoop, Spark)
• Data visualisation (e.g., Power BI)
• Understanding …
Slough, South East England, United Kingdom Hybrid / WFH Options
HD TECH Recruitment
… Promote user adoption, training, and change management initiatives
• Ensure high standards of documentation and data security compliance
Technical Skills (desirable):
• Microsoft Azure Data Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric)
• Data warehousing and lakehouse design
• ETL/ELT pipelines
• SQL, Python for data manipulation and machine learning
• Big Data frameworks (e.g., Hadoop, Spark)
• Data visualisation (e.g., Power BI)
• Understanding …
… as an Azure Data Architect or Lead Data Engineer.
• Expertise in Microsoft Fabric, as well as in one or more of the following platforms: Azure Synapse Analytics and Azure Databricks, and supporting data technologies like Power BI, Microsoft Purview, Azure SQL Database, Azure Data Lake, etc.
• Excellent stakeholder engagement and communication skills.
• Able to consult, advise, architect and estimate project …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Pharmaceutical Company - Manchester
(Tech Stack: Data Engineer, Databricks, Python, Power BI, Azure, TSQL, ETL, Agile Methodologies)
About the Role: We are seeking a talented and experienced Data Engineer on behalf of our client, a leading Software House. This is a fully remote position, offering the opportunity to work with cutting-edge technologies and contribute to exciting projects …
Lincolnshire, England, United Kingdom Hybrid / WFH Options
Akkodis
… real-world applications. We're looking for someone with a good foundation in Data Engineering, ideally with exposure to cloud platforms like AWS, Azure, or GCP, and tools like Databricks or Snowflake. If you've worked in Agile teams, used BI tools like Power BI or Tableau, or have a keen interest in data governance and automation, even better. In …
East Midlands, United Kingdom Hybrid / WFH Options
Akkodis
… real-world applications. We're looking for someone with a good foundation in Data Engineering, ideally with exposure to cloud platforms like AWS, Azure, or GCP, and tools like Databricks or Snowflake. If you've worked in Agile teams, used BI tools like Power BI or Tableau, or have a keen interest in data governance and automation, even better. In …
… equivalent regulated environments).
• Proven track record designing and deploying data platforms end-to-end (architecture through production).
• Hands-on expertise with:
  • Azure Synapse Analytics, Data Factory, Databricks, Event Hubs, Data Lake Storage (Gen2).
  • Database technologies (SQL Server, PostgreSQL, Cosmos DB).
  • Programming languages (Python, PySpark, SQL).
• Strong experience in data security, access management …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
… modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use
• Strong understanding and/or use of Unity Catalog alongside core Databricks functionality to drive metadata management
• Strong understanding of cloud economics, including cost management strategies and optimising solutions for customer needs.
• Experience with infrastructure as code, proficiency using tools such as …
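A hedged sketch of the medallion flow this advert describes: bronze holds data as landed, silver holds cleaned and conformed records, and gold holds the curated dimensional model for analytics. Assumes Databricks with Delta tables; every table and column name is hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw data exactly as ingested.
bronze = spark.read.table("bronze.sales_raw")

# Silver: deduplicated and conformed.
silver = (
    bronze.dropDuplicates(["sale_id"])
          .withColumn("sale_date", F.to_date("sale_ts"))
)
silver.write.mode("overwrite").saveAsTable("silver.sales")

# Gold: a curated, analytics-ready aggregate.
gold = (
    silver.groupBy("sale_date", "store_id")
          .agg(F.sum("amount").alias("daily_revenue"))
)
gold.write.mode("overwrite").saveAsTable("gold.daily_store_revenue")
```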
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
… challenges. Culture builder, driving continuous improvement and operational excellence.
• Deep expertise in data compliance frameworks, cost management, and platform optimisation.
• Strong hands-on experience with modern cloud data warehouses (Databricks, Snowflake, AWS), SQL, Spark, Airflow, Terraform.
• Advanced Python skills with orchestration tooling; solid experience in CI/CD (Git, Jenkins).
• Proven track record in data modelling, batch/real-time …
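For the orchestration-tooling point above, a minimal Airflow DAG sketch (not from the advert; assumes a recent apache-airflow 2.x install, and the task bodies and schedule are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")   # stand-in for real extract logic

def transform():
    print("clean and model data")    # stand-in for real transform logic

with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run transform only after extract succeeds
```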
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
… Skilled in team collaboration and exhibits good interpersonal skills.
• Able to prioritise, multi-task, and deliver at pace with an iterative mindset.
• Experience with modern data platforms such as Databricks or Snowflake is advantageous; LLM API experience is a bonus.
Additional Information
What's in it for you?:
• Competitive salary that reflects your skills, experience and potential.
• Discretionary bonus scheme …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Amtis Professional Ltd
… CD and testing practices for data workflows
• Troubleshoot issues and continuously improve data infrastructure
• Explore AI-driven enhancements to boost data accuracy and productivity
Requirements:
Strong experience with:
• Azure Databricks, Data Factory, Blob Storage
• Python/PySpark
• SQL Server, Parquet, Delta Lake
Deep understanding of:
• ETL/ELT, CDC, stream processing
• Lakehouse architecture and data warehousing
• Scalable pipeline design and …
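To ground the CDC requirement above: with Delta Lake, change records are typically applied to a target table with MERGE. A hedged sketch, assuming Databricks (or pyspark with delta-spark) and a hypothetical change feed carrying an op column of I/U/D flags:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

target = DeltaTable.forName(spark, "silver.customers")
changes = spark.read.table("bronze.customer_changes")  # hypothetical CDC feed

(target.alias("t")
    .merge(changes.alias("c"), "t.customer_id = c.customer_id")
    .whenMatchedDelete(condition="c.op = 'D'")        # apply deletes
    .whenMatchedUpdateAll(condition="c.op = 'U'")     # apply updates
    .whenNotMatchedInsertAll(condition="c.op = 'I'")  # apply inserts
    .execute())
```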
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Bowerford Associates
… experience (including QGIS)
• FME
• Advanced Database and SQL skills
Certifications: AWS or FME certifications are a real plus. Experience with ETL tools such as AWS Glue, Azure Data Factory, Databricks or similar is a bonus.
The role comes with excellent benefits to support your well-being and career growth.
KEYWORDS: Principal Geospatial Data Engineer, Geospatial, GIS, QGIS, FME, AWS, On…
Employment Type: Temporary
Salary: £80,000 - £500,000/annum + Pension, Good Holiday, Insurances