Databricks Data Engineer: £60,000. I am looking for a data engineer with experience in Databricks, Azure, SQL, Python and Spark to join a well-established organisation that is currently expanding its data team. Our client is partnered with both Databricks and Microsoft, and they deliver data solutions for … data engineers. You will work directly with clients on a variety of projects across an array of industries.
Requirements:
- Strong Databricks experience, as well as Python and SQL
- Azure or AWS experience
Benefits:
- Bonus
- Flexible working
- Annual salary review
- 25 days annual leave plus bank holidays
Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Burnley, Lancashire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… integrity, security, and accessibility.
Key Skills & Experience:
- Strong programming skills in Python (PySpark)
- Hands-on experience with Azure Data Services (Azure Data Factory, Databricks, Synapse, Data Lakes, etc.)
- Experience with CI/CD pipelines for release management
- Knowledge of best practices in data ingestion, transformation, and curation (Dimensional …
… have a strong understanding of various financial products and the trading life cycle.
The role:
- Advanced proficiency in SQL (T-SQL, PL/SQL, Databricks SQL)
- Knowledge of Kimball data modelling methodology
- Experience using scripting languages such as Python, PowerShell, etc.
- Experience with Microsoft Azure
- Strong knowledge of ETL/…
… with the expectation of 2-3 days/week onsite as the standard.
Must Haves:
- Data Engineering/Power BI experience
- Strong competence with Azure Databricks, SQL, Python, Power BI, DAX
- Proven experience leading the design of data models and transformation logic, and the build of all Power BI dashboards, including testing, optimization & integration …
London, South East England, United Kingdom Hybrid / WFH Options
Insight Global
… in a data engineering or data science capacity, ideally within a cloud-based setup. Solid grasp of SQL, ETL workflows, data modelling principles, and Databricks development. Familiarity with DevOps tooling such as Git, CI/CD pipelines, and Azure DevOps. Confident in working with programming languages like SQL and Python …
… in code reviews. Develop CI/CD pipelines using DevOps practices. Monitor and support cloud services.
Essential Tech Skills:
- Cloud: Microsoft Azure
- Data Tools: Databricks, Data Factory, Dataflow, Azure SQL, Synapse
- Languages: SQL, Python
Nice to Have:
- MARTech: MS Dynamics 365, SFMC, Informatica MDM, OneTrust
- App Dev: .NET C# …
Reading, Oxfordshire, United Kingdom Hybrid / WFH Options
Bright Purple
… full Azure data stack. ETL Pipelines. Data Modelling (Logical, Physical, Conceptual). Data Mapping.
The skills we are looking for include:
- Azure Data Factory, Azure Databricks, Blobs, Azure SQL, Synapse, etc.
- Python development for data engineering
- C# experience would be advantageous
- Solid experience with databases
- A curious mindset!
The role is …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
… have expertise in some of the following: Python, SQL, Scala, and Java for data engineering. Strong experience with big data tools (Apache Spark, Hadoop, Databricks, Dask) and cloud platforms (AWS, Azure, GCP). Proficient in data modelling (relational, NoSQL, dimensional) and DevOps automation (Docker, Kubernetes, Terraform, CI/CD).
… teams spread globally.
What we value: these skills will help you succeed in this role:
- Full stack cloud developer skills: Data (Delta Lake/Databricks), PL/SQL, Java/J2EE, React, CI/CD pipeline, and release management
- Strong experience in Python, Scala/PySpark, Perl/scripting
- Experience …
… as AWS. Ability to collaborate with stakeholders to develop clear business requirements. Experience with Big Data technologies, Data Lakes, Data Warehouses, Lakehouses. Proficiency in Databricks and Python, including concurrency and error handling. Experience with ETL tools and data visualization tools.
Preferred qualifications, capabilities, and skills: Experience with AWS services like …
… able to prioritise across several projects and to lead and coordinate larger initiatives. Good Python and SQL skills; experience with the AWS stack, Spark, Databricks and/or Snowflake desirable. Solid understanding of statistical modelling and machine learning algorithms, and experience deploying and managing models in production. Experience with Aviation …
… and SQL for accessing and processing data (PostgreSQL preferred, but general SQL knowledge is more important). Familiarity with the latest data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g. TensorFlow, MXNet, scikit-learn). Knowledge of software engineering practices (applying coding practices to DS, unit testing, version control, code …
Cambourne, Cambridgeshire, United Kingdom Hybrid / WFH Options
Remotestar
… various industries to deliver cutting-edge technology solutions.
Must Have Skills: Solid expertise in data modeling, with considerable experience in database technologies, particularly Azure Databricks, Azure Synapse, and Azure Data Factory, to support business intelligence, analytics, and reporting initiatives. Strong understanding of best practices in data modeling to ensure that …
… architectures, with a focus on Kafka and Confluent platforms. In-depth knowledge of architecting and implementing data lakes and lakehouse platforms, including experience with Databricks and Unity Catalog. Proficiency in conceptualising and applying Data Mesh and Data Fabric architectural patterns. Experience in developing data product strategies, with a strong inclination …
… Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse, etc.). Proficiency in building BI dashboards and self-service capabilities using tools like Tableau and Looker. Experience communicating with business stakeholders, managing …
Blackpool, England, United Kingdom Hybrid / WFH Options
Perch Group
… infrastructure as code (IaC). Experience with containerization and orchestration technologies (Docker, Kubernetes). Experience with data integration technologies (e.g., Azure Data Factory, Azure Databricks). Experience with system monitoring tools. Experience with event-driven architectures.
⌛️ The Application Timeline: A first stage video call with the internal recruitment team …
Exciting Opportunity in Health Tech!
Position: Data Engineer (Python/Databricks)
Location: Remote
Salary: up to £80,000 + benefits
Reporting To: Vice President of Software Development
Are you passionate about health tech and innovation? Do you want to be at the forefront of transforming clinical research with cutting-edge … integrations.
Ensure Data Security: Apply protocols and standards to secure clinical data in motion and at rest.
Shape Data Workflows: Use your expertise with Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure our data workflows are efficient, secure, and reliable.
Key Responsibilities:
- Data Engineering with Databricks: Utilize Databricks to design and maintain scalable data infrastructure.
- Integration with Azure Data Factory: Leverage Azure Data Factory for orchestrating and automating data movement and transformation.
- Python Development: Write clean, efficient code in Python (3.x), using frameworks like FastAPI and Pydantic.
- Database Management: Design and manage relational schemas and …
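To make the "Python Development" bullet concrete: a minimal sketch of the kind of Pydantic model such a role might involve, validating a record before it enters a pipeline. It assumes the Pydantic v2 API, and every field name is hypothetical, not taken from the advert.

```python
# Illustrative only: validate a de-identified clinical record with
# Pydantic (v2) before it flows into the data platform.
from datetime import date

from pydantic import BaseModel, Field


class TrialRecord(BaseModel):
    """One de-identified clinical-trial observation (hypothetical schema)."""

    subject_id: str = Field(min_length=1)  # pseudonymous subject key
    site_code: str                         # trial site identifier
    visit_date: date
    measurement: float


# Validation fails loudly on bad input instead of letting it pass
# silently downstream.
record = TrialRecord(
    subject_id="SUBJ-001",
    site_code="UK-07",
    visit_date=date(2024, 5, 1),
    measurement=4.2,
)
print(record.model_dump())
```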
Data Engineer – Databricks
About the Role: We're looking for a Databricks Champion to design, build, and optimize data pipelines using Databricks. You'll work with clients and internal teams to deliver scalable, efficient data solutions tailored to business needs.
Key Responsibilities:
- Develop ETL/ELT pipelines with Databricks and Delta Lake
- Integrate and process data from diverse sources
- Collaborate with data scientists, architects, and analysts
- Optimize performance and manage Databricks clusters
- Build cloud-native solutions (Azure preferred, AWS/GCP also welcome)
- Implement data governance and quality best practices
- Automate workflows and maintain CI/CD pipelines
- Document architecture and processes
What We're Looking For (Required):
- 5+ years in data engineering with hands-on Databricks experience
- Proficient in Databricks, Delta Lake, Spark, Python, SQL
- Cloud experience (Azure preferred, AWS/GCP a plus)
- Strong problem-solving and communication skills
Databricks Champion …
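As a concrete illustration of the first responsibility, here is a minimal PySpark sketch of an ETL hop into Delta Lake. It assumes a Databricks runtime where Delta Lake is available; the paths, table name, and column names are invented for illustration, not taken from the role.

```python
# Minimal ETL sketch: raw CSV in, typed and deduplicated Delta table out.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` already exists

# Extract: raw files from a hypothetical landing zone.
raw = spark.read.option("header", "true").csv("/mnt/landing/orders/")

# Transform: type the columns, derive a partition key, deduplicate.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
)

# Load: write a partitioned Delta table for downstream consumers.
(
    clean.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders_clean")
)
```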
… about solving business problems.
Key responsibilities:
Data Platform Design and Architecture: Design, develop, and maintain a high-performing, secure, and scalable data platform, leveraging Databricks Corporate Lakehouse and Medallion Architectures. Utilise our metadata-driven data platform framework combined with advanced cluster management techniques to create and optimise scalable, robust, and … SaaS applications using end-to-end dependency-based data pipelines, to establish an enterprise source of truth. Create ETL and ELT processes using Azure Databricks, ensuring audit-ready financial data pipelines and secure data exchange with Databricks Delta Sharing and SQL Warehouse endpoints.
Governance and Compliance: Ensure compliance with information security standards in our highly regulated financial landscape by implementing Databricks Unity Catalog for governance, data quality monitoring, and ADLS Gen2 encryption for audit compliance.
Development and Process Improvement: Evaluate requirements, create technical design documentation, and work within Agile methodologies to deploy and optimise data workflows, adhering to data platform …
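To illustrate the medallion-style pipeline work these responsibilities describe, here is a hedged PySpark sketch of one bronze-to-silver hop under Unity Catalog. The three-level catalog.schema.table names and columns are invented for illustration, not the client's.

```python
# One medallion hop: raw bronze transactions -> validated silver table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: transactions as ingested, kept raw for auditability.
bronze = spark.read.table("finance.bronze.transactions_raw")

# Silver: validated, deduplicated records with ingestion metadata,
# the kind of audit-ready shape the ad calls for.
silver = (
    bronze.filter(F.col("transaction_id").isNotNull())
    .dropDuplicates(["transaction_id"])
    .withColumn("processed_at", F.current_timestamp())
)

(
    silver.write.format("delta")
    .mode("overwrite")
    .saveAsTable("finance.silver.transactions")
)
```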