Leeds, England, United Kingdom Hybrid / WFH Options
Scott Logic
data engineering and reporting, including storage, data pipelines to ingest and transform data, and querying & reporting of analytical data. You've worked with technologies such as Python, Spark, SQL, PySpark, Power BI, etc. You've got a background in software engineering, including front-end technologies like JavaScript. You're a problem-solver, pragmatically exploring options and finding effective solutions. An …
Leeds, England, United Kingdom Hybrid / WFH Options
TieTalent
- Engineering experience with data platforms and analytics support
- Hands-on experience with the Azure Data ecosystem: Databricks, Data Factory, Data Lake, Synapse; certifications are a plus
- Proficiency in Python, with PySpark experience preferred
- Strong SQL skills
- Experience in building and maintaining data pipelines
- Managing DevOps pipelines
- Skills in process optimization, performance tuning, data modeling, and database design
Desirable: Experience in … Zero goal by 2050. We value diversity and inclusion; your anonymized diversity data helps us improve our outreach and inclusion efforts.
Nice-to-Have Skills: Python, Azure, Data Factory, PySpark, SQL, DevOps
Work Experience & Languages: Data Engineer, Data Infrastructure; English
Seniorities and Job Details: Entry level; Contract; IT & Internet industry
/Azure - or Snowflake/Redshift/BigQuery) (Required)
- Experience with infrastructure as code (e.g. Terraform) (Required)
- Proficiency in using Python both for scheduling (e.g. Airflow) and manipulating data (PySpark) (Required)
- Experience building deployment pipelines (e.g. Azure Pipelines) (Required)
- Deployment of web apps using Kubernetes (preferably ArgoCD & Helm) (Preferred)
- Experience working on Analytics and Data Science enablement (dbt, DS …
Databricks environments and developing lakehouse architectures, with a focus on automation, performance tuning, cost optimisation, and system reliability. Proven proficiency in programming languages such as Python, T-SQL, and PySpark, with practical knowledge of test-driven development. Demonstrated capability in building secure, scalable data solutions on Azure with an in-depth understanding of data security and regulatory compliance, using …
transaction processing with maintaining and strengthening the modelling standards and business information.
Key Responsibilities:
- Build and optimize Prophecy data pipelines for large-scale batch and streaming data workloads using PySpark
- Define end-to-end data architecture leveraging Prophecy integrated with Databricks, Spark, or other cloud-native compute engines
- Establish coding standards, reusable components, and naming conventions using Prophecy … exposure to converting legacy ETL tools like DataStage and Informatica into Prophecy pipelines using the Transpiler component of Prophecy
Required skills & experience:
- 2+ years of hands-on experience with the Prophecy (using PySpark) approach
- 5+ years of experience in data engineering with tools such as Spark, Databricks, Scala/PySpark, or SQL
- Strong understanding of ETL/ELT pipelines, distributed data …
to be part of a team that's transforming how data powers retail, this is your opportunity.
Your Role (Key Responsibilities): Design, build, and optimise robust data pipelines using PySpark, SparkSQL, and Databricks to ingest, transform, and enrich data from a variety of sources. Translate business requirements into scalable and performant data engineering solutions, working closely with squad members …
Job Title Data Engineering Manager - Commercial & Supply Location Asda House Employment Type Full time Contract Type Permanent Hours Per Week 37.5 Salary Competitive salary plus benefits Category Data Science Closing Date 27 June 2025 The role requires on-site presence More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
Movera
Specification:
- Strong SQL knowledge
- Proven experience working with Azure
- Git/DevOps Repos experience
- Demonstrated problem-solving ability
- Synapse Analytics or similar experience (desirable)
- Visual Files experience (desirable)
- PySpark/Python experience (desirable)
- PowerShell experience (desirable)
What We Offer: We aim to reward your hard work generously. You'll be greeted in our offices with great coffee, fruit …
Job Title Lead Data Engineer Location Asda House Employment Type Full time Contract Type Permanent Hours Per Week 37.5 Salary Competitive salary plus benefits Category Business Intelligence Closing Date 27 June 2025 This role requires on-site presence at Asda More ❯
and motivated Data Engineer to play a key role in the creation of a brand-new data platform within the Azure ecosystem, including Azure Data Factory (ADF), Synapse, PySpark/Databricks, and Snowflake. You will be a data ingestion and ETL pipeline guru, tackling complex problems at source in order to retrieve the data and ensure it can … unless you have the skills and desire to work on data ingestion and ETL/ELT.
Key Responsibilities:
- Build and develop robust ETL/data ingestion pipelines leveraging Azure Data Factory, Synapse, PySpark, and Python
- Connect APIs, databases, and data streams to the platform, implementing ETL/ELT processes
- Data integrity: embed quality measures, monitoring, and alerting mechanisms
- CI/CD & Automation …
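The ingestion pattern this listing describes — pull data from a source, transform it, and embed quality checks with alerting — can be sketched in plain Python. This is a hypothetical, framework-agnostic illustration only: all function and field names are invented, and a production pipeline would use Azure Data Factory, Synapse, or PySpark as the ad states.

```python
"""Minimal ETL sketch: ingest -> transform -> validate.

Hypothetical example; in practice the extract step would call an API
or database, and transforms would run on Spark/ADF, not in-process.
"""


def extract_orders(raw_rows):
    # Ingest: copy raw records into the pipeline (stand-in for an API call).
    return [dict(r) for r in raw_rows]


def transform(rows):
    # Transform: derive a rounded order total from quantity and unit price.
    return [
        {"order_id": r["order_id"], "total": round(r["qty"] * r["unit_price"], 2)}
        for r in rows
    ]


def validate(rows):
    # Data integrity: embed a quality check; "alert" by raising on failure.
    bad = [r for r in rows if r["total"] < 0]
    if bad:
        raise ValueError(f"{len(bad)} rows failed quality checks")
    return rows


def run_pipeline(raw_rows):
    # Wire the stages together: extract -> transform -> validate.
    return validate(transform(extract_orders(raw_rows)))


if __name__ == "__main__":
    raw = [{"order_id": 1, "qty": 3, "unit_price": 2.5}]
    print(run_pipeline(raw))  # [{'order_id': 1, 'total': 7.5}]
```

The same shape (extract, transform, validate as separate, testable stages) carries over directly to PySpark, where each stage becomes a DataFrame operation.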
Leeds, England, United Kingdom Hybrid / WFH Options
Novuna
We're looking for the following:
- Good technical knowledge of Azure Databricks, Azure Data Factory, Terraform, and Azure DevOps
- Strong grounding in both SQL and Python, with extensive experience using PySpark
- Excellent communication skills and the ability to collaborate with different stakeholders and disciplines
- Ability to mentor and coach other members of the team
What can we offer you? At …
include:
Technical Skills:
- Coding fundamentals (Python, SQL)
- Data modelling and data warehousing
- Extract-Transform-Load (ETL)
- Creating data pipelines
- Ethics in data and AI
- Cloud and big data technologies (PySpark, AWS)
- Working with unstructured data
Consulting Skills:
- Effective teamwork
- Business acumen
- Time management
- Stakeholder management
- Presentations
- Agile methodology
- Being a brand ambassador
- Being a Digital Futures consultant
Consulting roles …
Leeds, England, United Kingdom Hybrid / WFH Options
Skills for Care
You will also work alongside external data engineers and data scientists, who may not be familiar with the datasets, to accelerate development. Our technology stack consists of:
- Python and PySpark
- AWS Glue jobs assembled into Step Functions
- Pydeequ for data quality testing
- Amazon Athena for querying data
- Hosted on AWS, using S3, Glue, Step Functions, EMR, and Athena
- Terraform …
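Pydeequ, mentioned in the stack above, expresses data-quality rules as declarative checks run over a Spark DataFrame. Since Pydeequ requires a live Spark session, the sketch below is a pure-Python analogue of the same idea — named completeness and uniqueness checks collected into a report — not Pydeequ itself; the check names and sample rows are invented for illustration.

```python
"""Declarative data-quality checks, Pydeequ-style, without Spark.

Illustrative analogue only: real Pydeequ checks (isComplete, isUnique)
run on Spark DataFrames via a VerificationSuite.
"""


def is_complete(rows, column):
    # Completeness: every row has a non-null value for the column.
    return all(r.get(column) is not None for r in rows)


def is_unique(rows, column):
    # Uniqueness: no duplicate values appear in the column.
    values = [r.get(column) for r in rows]
    return len(values) == len(set(values))


def verify(rows, checks):
    # Run each named check and collect pass/fail results into a report.
    return {name: fn(rows) for name, fn in checks.items()}


rows = [
    {"worker_id": "A1", "role": "care"},
    {"worker_id": "A2", "role": None},  # missing role -> completeness failure
]
report = verify(rows, {
    "worker_id is complete": lambda r: is_complete(r, "worker_id"),
    "worker_id is unique": lambda r: is_unique(r, "worker_id"),
    "role is complete": lambda r: is_complete(r, "role"),
})
print(report)
```

A failing entry in the report (here, `role is complete`) is the signal a pipeline would use to halt a Step Function or raise an alert.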
for reporting. A solid understanding of Power BI (or similar tools) and a passion for creating great user experiences. Familiarity with SQL, and ideally some exposure to Python or PySpark. Experience with Azure Data Services and platforms like Databricks. A collaborative mindset and experience working in Agile teams. Strong communication skills and a proactive, problem-solving …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
DWP Digital
Data Engineer. Pay up to £38,373, plus 28.97% employer pension contributions, hybrid working, flexible hours, and great work-life balance. As a Data Engineer you will be working within a high-performing team of engineers, developing data-centric solutions