Spark SQL Jobs in the UK

23 of 23 Spark SQL Jobs in the UK

Senior Data Engineer

City of London, London, United Kingdom
Mastek
platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure … Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Must be able to program in languages such as SQL, Python, R, YAML, and JavaScript. Data Integration: Integrate data from various sources … practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage …
Posted:

Senior Data Engineer

London Area, United Kingdom
Mastek
platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure … Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Must be able to program in languages such as SQL, Python, R, YAML, and JavaScript. Data Integration: Integrate data from various sources … practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage …
Posted:

Senior Data Engineer

London, South East England, United Kingdom
Mastek
platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure … Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Must be able to program in languages such as SQL, Python, R, YAML, and JavaScript. Data Integration: Integrate data from various sources … practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage …
Posted:

Senior Data Engineer

Slough, South East England, United Kingdom
Mastek
platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure … Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Must be able to program in languages such as SQL, Python, R, YAML, and JavaScript. Data Integration: Integrate data from various sources … practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage …
Posted:

Senior Data Engineer

London (City of London), South East England, United Kingdom
Mastek
platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure … Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Must be able to program in languages such as SQL, Python, R, YAML, and JavaScript. Data Integration: Integrate data from various sources … practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage …
Posted:

Pyspark Developer

London, United Kingdom
Queen Square Recruitment Ltd
analysts and stakeholders to translate business needs into technical solutions. Maintain clear documentation and contribute to internal best practices. Requirements: Strong hands-on experience with PySpark (RDDs, DataFrames, Spark SQL). Proven ability to build and optimise ETL pipelines and dataflows. Familiar with Microsoft Fabric or similar lakehouse/data platform environments. Experience with Git, CI …
Employment Type: Contract
Rate: £400 - £450/day
Posted:

Data Engineer

Leeds, England, United Kingdom
Fruition Group
a business cultivating a robust in-house team, this role could be the next step for you. Data Engineer Responsibilities: Design, build, and optimise data pipelines in Python, PySpark, Spark SQL, and Databricks. Ingest, transform, and enrich structured, semi-structured, and unstructured data. Operate and support production-grade data systems with strong observability and monitoring. Enable real-time and batch data …
Posted:

Data Engineer

Bradford, Yorkshire and the Humber, United Kingdom
Fruition Group
a business cultivating a robust in-house team, this role could be the next step for you. Data Engineer Responsibilities: Design, build, and optimise data pipelines in Python, PySpark, Spark SQL, and Databricks. Ingest, transform, and enrich structured, semi-structured, and unstructured data. Operate and support production-grade data systems with strong observability and monitoring. Enable real-time and batch data …
Posted:

Data Analyst

Reigate, England, United Kingdom
Hybrid / WFH Options
esure Group
Strong understanding of data models and analytics; exposure to predictive modelling and machine learning is a plus. Proficient in SQL and Python, with bonus points for PySpark, Spark SQL, and Git. Skilled in data visualisation with tools such as Tableau or Power BI. Confident writing efficient code and troubleshooting sophisticated queries. Clear and adaptable communicator, able to explain technical …
Posted:

Remote Senior Data Engineer - Contract Opportunity

Central London, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

West London, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

East London, London, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

City of London, London, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

Altrincham, Greater Manchester, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

Bolton, Greater Manchester, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

Leigh, Greater Manchester, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

Leeds, West Yorkshire, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

Central London / West End, London, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

Bury, Greater Manchester, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Remote Senior Data Engineer - Contract Opportunity

Ashton-Under-Lyne, Greater Manchester, United Kingdom
Hybrid / WFH Options
techstaff-in
and delivery to downstream systems. Essential Technical Requirements: 5+ years' proven experience in a Data Engineering role, building production-ready pipelines. Deep hands-on expertise with Databricks (PySpark/Spark SQL) environments (2+ years desirable). Expert proficiency in Python and SQL development. Experience using Databricks. Experience with cloud-based data platforms (Azure, AWS, or GCP). Prior experience …
Posted:

Senior Data Engineer

London, United Kingdom
Hybrid / WFH Options
Cognizant
involves structuring analytical solutions that address business objectives and problem solving. We are looking for hands-on experience in writing code for AWS Glue in Python, PySpark, and Spark SQL. The successful candidate will translate stated or implied client needs into researchable hypotheses, facilitate client working sessions, and be involved in recurring project status meetings. You will develop … relevant data points. Create solution hypotheses and secure client buy-in; discuss and align on end objective, staffing needs, timelines, and budget. Nice to have: Hive, Pig, NoSQL databases …
Employment Type: Permanent, Work From Home
Posted:

Developer (PySpark + Fabric)

London, United Kingdom
Stackstudio Digital Ltd
business stakeholders to translate requirements into technical solutions. Create, maintain, and update documentation and internal knowledge repositories. Your Profile: Essential Skills/Knowledge/Experience: Ability to write Spark code for large-scale data processing, including RDDs, DataFrames, and Spark SQL. Hands-on experience with lakehouses, dataflows, pipelines, and semantic models. Ability to build ETL workflows. …
Employment Type: Contract
Rate: From £475 to £500 per day
Posted:
Spark SQL Salary Percentiles (UK)
10th Percentile: £36,750
25th Percentile: £43,125
Median: £47,500
75th Percentile: £57,500
90th Percentile: £68,000