Remote Databricks Job Vacancies

76 to 100 of 860 Remote Databricks Jobs

Azure Data Engineer - SC Cleared

London, England, United Kingdom
Hybrid / WFH Options
ShareForce, Inc
This role is an opportunity to lead the build of bespoke data systems for our clients. Responsibilities Design and implement scalable data pipelines and ETL processes using Azure and Databricks technologies including Delta Live Tables. Lead technical discussions with clients and stakeholders to gather requirements and propose solutions. Help clients realise the potential of data science, machine learning, and scaled … data processing within the Azure/Databricks ecosystem. Mentor junior team members and support their personal development. Take ownership of the delivery of core solution components. Support with planning, requirements refinement and work estimation. Skills and Experience Design and develop end-to-end data solutions leveraging Azure services for batch, real-time, and streaming workloads (including data ingestion, cleansing, modelling, and … data platform development, concepts, and methods such as data warehouses and data lakehouse, with the ability to adapt and tailor based on requirements. Experience with Azure Synapse Analytics, Azure Databricks, Microsoft Fabric, Data Factory. Expertise in Python, SQL, and developer tooling such as Visual Studio Code, Azure DevOps. Good experience with CI/CD practices and tools for data platforms More ❯

Lead Data Engineer

Leeds, England, United Kingdom
Hybrid / WFH Options
VIQU Limited
team based in Leeds, working mostly remotely with just one day on-site per week. You’ll lead the design and delivery of scalable, cloud-based data solutions using Databricks, Python, and SQL, while mentoring a team and driving engineering best practices. About You You might currently be a Senior Data Engineer ready to grow your leadership skills. You’re … passionate about building robust, efficient data pipelines and shaping cloud data architecture in an agile environment. Key Responsibilities Lead development of data pipelines and solutions using Databricks, Python, and SQL Design and maintain data models supporting analytics and business intelligence Build and optimise ELT/ETL processes on AWS or Azure Collaborate closely with analysts, architects, and stakeholders to deliver … as code Mentor and support your team, taking ownership of technical delivery and decisions Drive continuous improvements in platform performance, cost, and reliability Key Requirements Hands-on experience with Databricks or similar data engineering platforms Strong Python and SQL skills in data engineering contexts Expertise in data modelling and building analytics-ready datasets Experience with AWS or Azure cloud data More ❯

Data Engineer

London, South East, England, United Kingdom
Hybrid / WFH Options
McGregor Boyall
next-generation data platform. Working as part of a growing data team, you will play a critical role in designing and deploying scalable data pipelines and solutions using Azure Databricks and related technologies. This is an opportunity to contribute to a cloud-first, modern data strategy within a collaborative and forward-thinking environment. Key Responsibilities: Design and develop end-to … end data pipelines (batch and streaming) using Azure Databricks, Spark, and Delta Lake. Implement the Medallion Architecture and ensure consistency across raw, enriched, and curated data layers. Build and optimise ETL/ELT processes using Azure Data Factory and PySpark. Enforce data governance through Azure Purview and Unity Catalog. Apply DevOps and CI/CD practices using Git and Azure … analysts and business stakeholders to ensure data quality and usability. Contribute to performance optimisation and cost efficiency across data solutions. Required Skills & Experience: Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse. Strong proficiency in Python, PySpark, and advanced SQL. Understanding of Lakehouse architecture and medallion data patterns. Familiarity with data governance, lineage, and access control More ❯
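The Medallion Architecture referenced in this listing moves data through raw (Bronze), enriched (Silver), and curated (Gold) layers. Below is a minimal sketch of that layering, using plain Python in place of the PySpark and Delta Lake APIs a Databricks implementation would use; all record fields and names are illustrative:

```python
# Minimal sketch of the Medallion pattern (Bronze -> Silver -> Gold).
# In production these layers would be Delta Lake tables manipulated with
# PySpark; plain Python dicts are used here so the layering stays visible.

def bronze_ingest(raw_records):
    """Bronze: land source data as-is, with no cleansing."""
    return list(raw_records)

def silver_clean(bronze):
    """Silver: enforce schema, drop invalid rows, normalise fields."""
    cleaned = []
    for rec in bronze:
        if rec.get("amount") is None:
            continue  # reject records failing a basic quality check
        cleaned.append({
            "customer": rec["customer"].strip().lower(),
            "amount": float(rec["amount"]),
        })
    return cleaned

def gold_aggregate(silver):
    """Gold: business-level aggregates ready for BI consumption."""
    totals = {}
    for rec in silver:
        totals[rec["customer"]] = totals.get(rec["customer"], 0.0) + rec["amount"]
    return totals

raw = [
    {"customer": " Acme ", "amount": "100"},
    {"customer": "acme", "amount": "50"},
    {"customer": "Beta", "amount": None},  # fails the quality check at Silver
]
gold = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(gold)  # {'acme': 150.0}
```

On Databricks, each function would instead read from and write to a Delta table, with the Silver-layer checks expressed as expectations or constraints.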
Employment Type: Contractor
Rate: £400 - £425 per day

Senior Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Freemarket
architectures in a modern cloud environment. You will play a key role in building and optimizing our Medallion architecture (Bronze, Silver, Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities ETL Development: Design and build robust and reusable ETL/… ELT pipelines through the Medallion architecture in Databricks. Data Transformation: Create and manage data models and transformations using dbt, ensuring clear lineage, version control, and modularity. Pipeline Orchestration: Develop and manage workflow orchestration using Azure Data Factory, including setting up triggers, pipelines, and integration runtimes. System Maintenance: Monitor, maintain, and optimize existing data pipelines, including cron job scheduling and … Ensure compliance with data security policies, data retention rules, and privacy regulations. Required Skills and Experience 5+ years of experience in data engineering or similar roles. Strong experience with Databricks, including notebooks, cluster configuration, and Delta Lake. Proficiency in dbt for transformation logic and version-controlled data modeling. Deep knowledge of Azure Data Factory, including pipeline orchestration and integration with More ❯
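The dbt workflow this listing describes treats each transformation as a model that declares its upstream dependencies, so lineage is explicit and models build in dependency order (dbt itself does this in SQL via the ref() macro). A toy Python sketch of that idea, with the dependency resolution hand-rolled and all model names invented for illustration:

```python
# Toy sketch of dbt-style model dependency resolution: each "model"
# declares its upstream models in `refs`, and models are built in
# dependency order so lineage is explicit, as dbt does with ref().

models = {
    "stg_payments": {"refs": [], "build": lambda deps: [10, 20, 30]},
    "int_payments": {"refs": ["stg_payments"],
                     "build": lambda deps: [x * 2 for x in deps["stg_payments"]]},
    "fct_payments": {"refs": ["int_payments"],
                     "build": lambda deps: sum(deps["int_payments"])},
}

def build_all(models):
    built = {}
    def build(name):
        if name in built:
            return built[name]
        # Build every upstream model first, then pass results downstream.
        deps = {ref: build(ref) for ref in models[name]["refs"]}
        built[name] = models[name]["build"](deps)
        return built[name]
    for name in models:
        build(name)
    return built

results = build_all(models)
print(results["fct_payments"])  # 120
```

In real dbt, each model is a SQL file, the graph is inferred from ref() calls, and version control plus lineage come from the project living in Git.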

Senior Data Engineer - London (Hybrid)

London, United Kingdom
Hybrid / WFH Options
freemarketFX Limited
architectures in a modern cloud environment. You will play a key role in building and optimizing our Medallion architecture (Bronze, Silver, Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities ETL Development: Design and build robust and reusable ETL/… ELT pipelines through the Medallion architecture in Databricks. Data Transformation: Create and manage data models and transformations using dbt, ensuring clear lineage, version control, and modularity. Pipeline Orchestration: Develop and manage workflow orchestration using Azure Data Factory, including setting up triggers, pipelines, and integration runtimes. System Maintenance: Monitor, maintain, and optimize existing data pipelines, including cron job scheduling and … Ensure compliance with data security policies, data retention rules, and privacy regulations. Required Skills and Experience 5+ years of experience in data engineering or similar roles. Strong experience with Databricks, including notebooks, cluster configuration, and Delta Lake. Proficiency in dbt for transformation logic and version-controlled data modeling. Deep knowledge of Azure Data Factory, including pipeline orchestration and integration with More ❯
Employment Type: Permanent
Salary: GBP Annual

Data Engineer

Greater Manchester, England, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
forward-thinking organization using data to drive innovation and business performance. They’re expanding their team and are looking for a talented Data Engineer with experience in Azure and Databricks to join the team. Salary and Benefits £55,000 – £65,000 salary depending on experience 10% performance-related bonus Hybrid working model – 2 days in the Greater Manchester office Supportive … do I need to apply for the role: Solid hands-on experience with Azure data tools, including Data Factory, Data Lake, Synapse Analytics, and Azure SQL. Strong proficiency with Databricks, including Spark, Delta Lake, and notebooks. Skilled in Python and SQL for data transformation and processing. Experience with Git and modern CI/CD workflows. Strong analytical mindset and effective More ❯

Data Engineer - FinTech Company - Newcastle

Newcastle upon Tyne, England, United Kingdom
Hybrid / WFH Options
Noir
Data Engineer - FinTech Company - Newcastle (Tech Stack: Data Engineer, Databricks, Python, Azure, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) I'm working with a leading Software House in the FinTech industry, based in Newcastle, who are looking to hire a talented Data Engineer. This is a fantastic opportunity to join a forward-thinking company where you'll play … working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services experience is a plus but not essential. Excellent communication skills, with the ability to explain complex data concepts in a clear and More ❯

Data Engineer - FinTech Company - Newcastle

Bath, England, United Kingdom
Hybrid / WFH Options
Noir
Job Reference: NC/RG/DE_1745802035 | Posted: 28.04.2025 | Expiry Date: 12.06.2025
Data Engineer - FinTech Company - Newcastle (Tech Stack: Data Engineer, Databricks, Python, Azure, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) I'm working with a leading Software House in the FinTech industry, based in Newcastle, who are looking to … working with data and delivering value to stakeholders. * Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. * Experience with Databricks and Microsoft Azure is highly desirable. * Financial Services experience is a plus but not essential. * Excellent communication skills, with the ability to explain complex data concepts in a clear and More ❯

Senior Data Engineer

Wilmslow, England, United Kingdom
Hybrid / WFH Options
The Citation Group
pipelines and infrastructure, who can implement processes on our modern tech stack with robust, pragmatic solutions. Responsibilities Develop and maintain ETL/ELT data pipelines using AWS services, Databricks and dbt. Manage and optimize data storage solutions such as Amazon S3, Redshift, RDS, and DynamoDB. Implement and manage infrastructure-as-code (IaC) using tools like Terraform or AWS CloudFormation. … tools like Terraform or CloudFormation. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Good understanding of Cloud providers – AWS, Microsoft Azure, Google Cloud. Familiarity with dbt, Delta Lake, Databricks. Experience working in Agile environments with tools like Jira and Git. About Us We are Citation. We are far from your average service provider. Our colleagues bring their brilliant selves More ❯

AI Technical Lead with Security Clearance

Falls Church, Virginia, United States
Hybrid / WFH Options
Epsilon Inc
IAT Level II Certification may be required (GSEC, GICSP, CND, CySA+, Security+ CE, SSCP or CCNA-Security). Advanced proficiency in Python, SQL, PySpark, MLOps, computer vision, and NLP, Databricks, AWS, Data Connections Cluster Training and tools such as GitLab, with proven success in architecting complex AI/ML pipelines. Demonstrated ability to write clean, efficient code in Python and … technical teams, providing guidance on project direction, technical challenges, and professional development. One or more of the following certifications are desired: AWS Certified Solutions Architect or Machine Learning Specialty. Databricks Certified Machine Learning Professional. Agile/Scrum Master Certification. Specialized certifications in AI/ML tools or methodologies. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks … preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control More ❯
Employment Type: Permanent
Salary: USD Annual

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Wiraa
solutions that drive organizational efficiency and sustainability goals. As a key member of the technology department, you will work with a variety of modern cloud-based data platforms, including Databricks on AWS, SQL, Power BI, and other data-centric tools. Your technical expertise will contribute to the delivery of projects, day-to-day business-as-usual operations, and continuous improvements … your skills further through training and certifications, with a supportive environment focused on innovation and sustainability. Qualifications Proven experience as a Data Engineer or Data Analyst Strong proficiency in Databricks/Apache Spark Expertise in SQL and Python programming Experience with version control tools such as BitBucket or GitHub Knowledge of data modeling and ETL processes Understanding of cloud platforms More ❯

Data Consultant(s) - Data Engineer

Liverpool, Lancashire, United Kingdom
Hybrid / WFH Options
Intuita - Vacancies
reliable data movement and transformation. • Data Modelling using Kimball, 3NF or Dimensional methodologies • Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions. … are some obvious required Skills and Experience we are going to be seeking out. For this role we'd be expecting to see: • Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server. • Proficiency in SQL and Python languages. • Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF). • Familiarity with building More ❯
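A metadata-driven pipeline, as asked for here, replaces one hand-written pipeline per table with a single generic pipeline that loops over a control table describing each source. A small Python sketch of the pattern, with stub connectors standing in for the real Azure Data Factory activities; every table name and field is illustrative:

```python
# Metadata-driven ingestion sketch: one generic pipeline driven by a
# config list instead of a hand-written pipeline per source table.
# In Azure Data Factory this config would live in a control table and
# drive a ForEach activity; here it is a plain list of dicts.

pipeline_metadata = [
    {"source": "crm.customers", "target": "stg_customers", "incremental": True},
    {"source": "erp.orders",    "target": "stg_orders",    "incremental": False},
]

def run_pipeline(meta, extract, load):
    """Loop over metadata entries and apply the same extract/load steps."""
    results = []
    for entry in meta:
        rows = extract(entry["source"], incremental=entry["incremental"])
        load(entry["target"], rows)
        results.append((entry["target"], len(rows)))
    return results

# Stub connectors so the sketch runs end to end without a real database.
def fake_extract(source, incremental):
    return [{"src": source}] * (2 if incremental else 3)

loaded = {}
def fake_load(target, rows):
    loaded[target] = rows

summary = run_pipeline(pipeline_metadata, fake_extract, fake_load)
print(summary)  # [('stg_customers', 2), ('stg_orders', 3)]
```

Adding a new source then means adding one metadata row, not writing a new pipeline, which is the main payoff of the pattern.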
Employment Type: Permanent
Salary: GBP Annual

Data Consultant(s) - Data Engineer

Newbury, England, United Kingdom
Hybrid / WFH Options
Intuita
reliable data movement and transformation. • Data Modelling using Kimball, 3NF or Dimensional methodologies • Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions. … are some obvious required Skills and Experience we are going to be seeking out. For this role we'd be expecting to see: • Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server. • Proficiency in SQL and Python languages. • Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF). • Familiarity with building More ❯

Database Administrator I with Security Clearance

Falls Church, Virginia, United States
Hybrid / WFH Options
Epsilon Inc
to detail with a willingness to document processes and changes. Effective communication skills and a collaborative mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/ More ❯
Employment Type: Permanent
Salary: USD Annual

Computer Systems Engineer I with Security Clearance

Falls Church, Virginia, United States
Hybrid / WFH Options
Epsilon Inc
abilities, and the capacity to work productively in cross-functional teams while maintaining a continuous improvement mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/ More ❯
Employment Type: Permanent
Salary: USD Annual

Senior Data Engineer

United Kingdom
Hybrid / WFH Options
Riviera Travel
for Azure services. Skills, Experience & Competencies Required Skills: Extensive experience in data engineering with Microsoft Azure. Proficiency in Azure services such as Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database. Strong experience with ETL/ELT pipelines, data modelling, and data integration. Proficiency in SQL and programming languages like Python, Scala, or PowerShell. More ❯

Lead Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
McCabe & Barton
telco. The ideal candidate will have expertise in some of the following: Python, SQL, Scala, and Java for data engineering. Strong experience with big data tools (Apache Spark, Hadoop, Databricks, Dask) and cloud platforms (AWS, Azure, GCP). Proficient in data modelling (relational, NoSQL, dimensional) and DevOps automation (Docker, Kubernetes, Terraform, CI/CD). Skilled in designing scalable, fault More ❯

Data Engineer - Leading Fashion Company - London

London, England, United Kingdom
Hybrid / WFH Options
Noir
Data Engineer - Leading Fashion Company - London (Tech Stack: Data Engineer, Databricks, Python, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) We're recruiting on behalf of a leading fashion brand based in London that's recognised for combining creativity with cutting-edge technology. They're on the lookout for a talented Data Engineer to join their growing data team. More ❯

Microsoft Fabric Consultant

Watford, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Certification: DP-600 – Fabric Analytics Engineer Associate. Responsibilities Daily tasks include, but are not limited to: Designing, building, and optimizing data pipelines and ETL workflows using Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation. Writing clean, efficient Python code for data engineering. Collaborating with data scientists and stakeholders on machine learning and More ❯

Microsoft Fabric Consultant

Bournemouth, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient Python code for data engineering tasks. Collaborating with data scientists More ❯

Microsoft Fabric Consultant

Reading, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯

Microsoft Fabric Consultant

Cheltenham, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯

Microsoft Fabric Consultant

Hemel Hempstead, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Analytics Engineer Associate. Responsibilities: Daily tasks include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with More ❯

Microsoft Fabric Consultant

Derby, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with More ❯
Databricks salary percentiles (advertised roles):
10th Percentile: £45,750
25th Percentile: £57,500
Median: £77,500
75th Percentile: £97,500
90th Percentile: £113,750