City of London, London, England, United Kingdom Hybrid / WFH Options
Ada Meher
London offices 1-2 days a week, based on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects Strong experience in Python/PySpark, Databricks & Apache Spark Hands-on experience with both batch & streaming pipelines Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform, etc.) Experience designing Data Engineering …
Winchester, Hampshire, England, United Kingdom Hybrid / WFH Options
Ada Meher
London offices 1-2 days a week, based on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects Strong experience in Python/PySpark, Databricks & Apache Spark Hands-on experience with both batch & streaming pipelines Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform, etc.) Experience designing Data Engineering …
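The batch vs streaming pipeline distinction these listings ask for can be sketched in plain Python (an illustrative stand-in only, not actual PySpark or Databricks code; the record fields and function names are made up for the example):

```python
from typing import Iterable, Iterator

def batch_pipeline(records: list[dict]) -> list[dict]:
    """Batch: the full dataset is available up front; transform it in one pass."""
    return [{**r, "amount_gbp": r["amount_pence"] / 100} for r in records]

def streaming_pipeline(source: Iterable[dict]) -> Iterator[dict]:
    """Streaming: records arrive one at a time; yield each result incrementally."""
    for r in source:
        yield {**r, "amount_gbp": r["amount_pence"] / 100}

events = [{"id": 1, "amount_pence": 250}, {"id": 2, "amount_pence": 1999}]
print(batch_pipeline(events)[0]["amount_gbp"])       # 2.5
print([r["id"] for r in streaming_pipeline(events)])  # [1, 2]
```

In a real PySpark job the same split shows up as a DataFrame read versus a Structured Streaming read; the transformation logic is written once and applied in either mode.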
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
and influence decisions. Strong understanding of data models and analytics; exposure to predictive modelling and machine learning is a plus. Proficient in SQL and Python, with bonus points for PySpark, SparkSQL, and Git. Skilled in data visualisation with tools such as Tableau or Power BI. Confident writing efficient code and troubleshooting sophisticated queries. Clear and adaptable communicator, able to …
Erskine, Renfrewshire, Scotland, United Kingdom Hybrid / WFH Options
DXC Technology
and addressing data science opportunities. Required Skills & Experience Proven experience in MLOps or DevOps roles within machine learning environments Strong programming skills in Python, with hands-on experience in PySpark and SQL Deep understanding of ML lifecycle management and CI/CD best practices Familiarity with cloud-native ML platforms and scalable deployment strategies Excellent problem-solving skills and …
personalized digital interactions. Some Technologies We Work With: Python primarily, with bits and pieces of TypeScript and Scala; GCP, AWS, Azure (in this order of relevance); GitHub, Docker, GitHub Actions, Terraform, Kubernetes; Pandas, PySpark and Spark; Vertex AI, Azure OpenAI for LLMs. Job Responsibilities Lead the execution of projects in a high-performing data science team, fostering professional growth and creating an inclusive and …
stack - you'll be empowered to design solutions using the most appropriate technologies, deploying final implementations on AWS. You'll bring hands-on depth and knowledge in: Languages: Python, PySpark, SQL Technologies: Spark, Airflow Cloud: AWS (API Gateway, Lambda, Redshift, Glue, CloudWatch, etc.) Data Pipelines: Designing and building modern, cloud-native pipelines using AWS services In addition, you will …
Solihull, West Midlands, England, United Kingdom Hybrid / WFH Options
MYO Talent
Experience in a Data Engineer/Data Engineering role Large and complex datasets Azure, Azure Databricks Microsoft SQL Server Lakehouse, Delta Lake Data Warehousing ETL Database Design Python/PySpark Azure Blob Storage Azure Data Factory Desirable: Exposure to ML/Machine Learning/AI/Artificial Intelligence …
South East London, London, United Kingdom Hybrid / WFH Options
Certain Advantage
C#, C++, Rust, Java, etc.). Strong background in Azure cloud application development, including security, observability, storage, and database resources. Solid understanding of data engineering tools and technologies (Databricks, PySpark, Lakehouses, Kafka). Advanced mathematics and quantitative analysis skills, ideally with hands-on experience in probabilistic modeling and the valuation of financial derivatives. Domain expertise in derivatives within energy …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
CDC. Knowledge of public/enterprise cloud technologies (AWS EC2, S3 Bucket, GCP, Azure) is advantageous but not required. Some skills/experience with automated testing frameworks (Java, Python, PySpark, Bitbucket, Gitlab, Jenkins) is advantageous but not required. Strong Environment Management skills Carbon60, Lorien & SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to …
Coventry, West Midlands, England, United Kingdom Hybrid / WFH Options
Lorien
/executing tests Requirements Strong experience as a Data Engineer (migrating legacy systems onto AWS, building data pipelines) Strong Python experience Tech stack experience required: AWS Glue, Redshift, Lambda, PySpark, Airflow SSIS or SAS experience (Desirable) Benefits Salary up to £57,500 + up to 20% bonus Hybrid working: 1 to 2 days a week in the office …
within a cross-functional environment. Experience in line management or team leadership, with a track record of developing and supporting engineers. AWS (Lambda, Glue, ECS, S3, etc.) Python and PySpark (data pipelines, APIs, automation) TypeScript and React (frontend development) Excellent communication and stakeholder management skills. Demonstrated expertise in technical design and architecture of distributed systems. Familiarity with fraud detection …
Employment Type: Permanent
Salary: £90000 - £110000/annum Circa £100,000 + Bonus
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
within a cross-functional environment. Experience in line management or team leadership, with a track record of developing and supporting engineers. AWS (Lambda, Glue, ECS, S3, etc.) Python and PySpark (data pipelines, APIs, automation) TypeScript and React (frontend development) Excellent communication and stakeholder management skills. Demonstrated expertise in technical design and architecture of distributed systems. Familiarity with fraud detection …
Azure DevOps. Required Skills & Experience Strong expertise in Power BI, DAX, SQL, and data architecture. Proven experience with Azure Data Factory, Fabric Lakehouse/Warehouse, and Synapse. Proficiency in PySpark, T-SQL, and data transformation using Notebooks. Familiarity with DevOps, ARM/Bicep templates, and semantic modelling. Experience delivering enterprise-scale BI/data warehouse programmes. Exposure to Microsoft …
understanding of AI/ML/DL and Statistics, as well as coding proficiency using related open source libraries and frameworks. Significant proficiency in SQL and languages like Python, PySpark and/or Scala. Can lead and work independently, as well as play a key role in a team. Good communication and interpersonal skills for working in a multicultural work …
results. What we're looking for: Proven experience in Data Architecture and data modelling. Strong skills in Microsoft Azure tools (Fabric, OneLake, Data Factory). Confident with Python/PySpark and relational databases. Hands-on ETL/ELT experience. A problem-solver with a positive, can-do attitude. Bonus points if you bring: Tableau, Power BI, SSAS, SSIS or …
methodologies and key IR metrics Passion for shipping high-quality products and a self-motivated drive to take ownership of tasks Tech Stack Core: Python, FastAPI, asyncio, Airflow, Luigi, PySpark, Docker, LangGraph Data Stores: Vector Databases, DynamoDB, AWS S3, AWS RDS Cloud & MLOps: AWS, Databricks, Ray Unlimited vacation time - we strongly encourage all of our employees to take at least …
positive change through data It would be great if you had: Experience in the energy or retail sector Background in pricing, commercial modelling, credit risk, or debt Exposure to PySpark or other big data tools Experience with NLP, Generative AI, or advanced predictive modelling …
field, or equivalent 2+ years’ experience with software development Good knowledge of a programming language (Python, MATLAB, C#, or similar) An understanding of databases and associated processes (SQL, Kafka, PySpark) Experience with Git, Docker, DevOps and CI/CD processes A track record of independent working, requirements scoping, proof-of-concept implementation, and production release A strong desire to …
build/test/deploy automation). Demonstrable professional experience in SQL. Minimum 2 years' experience in Python for scripting, automation, and data transformation. Minimum 2 years' experience with PySpark for handling distributed data processing workloads. Minimum 1 year's experience with Microsoft Power BI Desirable Skills Solid understanding of SRE principles applied to data platforms (pipeline reliability, monitoring, CI/ …
Lincoln, Lincolnshire, East Midlands, United Kingdom Hybrid / WFH Options
Frontier Agriculture Limited
in the development and maintenance of Data warehouses Developing and deploying Azure Data Factory, Azure Databricks, Informatica Customer 360, SQL, reports, and automation Employ a strong knowledge of PySpark to optimize and develop workbooks Configuring Informatica to integrate data with our EDP platform, as well as to create and monitor data quality rules Collaborating with Architects and Senior Engineers …
Employment Type: Permanent, Work From Home
Salary: Competitive + Benefits + 25 Days Holiday + Employee Assistance Program
of complex engineering tasks, contribute to architectural decisions, and collaborate with stakeholders across engineering, product, and domain teams. Key Responsibilities: Design, implement, and optimise scalable data pipelines using Python, PySpark, and Databricks Drive the delivery of customer-facing data products through APIs, Databricks-based sharing, and event-driven mechanisms (e.g., Kafka or similar) Take ownership of end-to-end … s agile delivery process and provide input during planning and retrospectives Requirements: Good experience in data engineering or backend software engineering with a data focus Strong proficiency in Python, PySpark, and working within Databricks environments Hands-on experience designing and delivering data products via REST APIs, event-driven systems, or data sharing platforms like Databricks Delta Sharing Solid understanding …
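The event-driven delivery mechanism this listing describes can be sketched with a stdlib queue standing in for a broker such as Kafka (a minimal illustration only; the topic, event names, and functions are hypothetical, and no real broker is involved):

```python
import json
import queue

# Toy stand-in for a message broker topic: producers publish serialized
# events, consumers pull and deserialize them independently.
topic = queue.Queue()

def publish(event: dict) -> None:
    """Serialize an event and put it on the topic (producer side)."""
    topic.put(json.dumps(event))

def consume() -> dict:
    """Pull the next event off the topic and deserialize it (consumer side)."""
    return json.loads(topic.get_nowait())

publish({"event": "order_created", "order_id": 42})
evt = consume()
print(evt["event"], evt["order_id"])  # order_created 42
```

The design point the listing is after is the decoupling: the producer never calls the consumer directly, so downstream data products can be added or replayed without changing the pipeline that emits events.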
Manchester Area, United Kingdom Hybrid / WFH Options
Lorien
Engineer Location: Manchester (hybrid) Type: Permanent Must-have skills: Working with AWS cloud services to manage and process data. Using AWS Glue for building and running data pipelines. Writing PySpark code in Python, along with SQL, to handle big data processing. Creating and running SQL queries to analyze and manage data …
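The SQL side of a Glue/PySpark job like this one can be illustrated with the stdlib `sqlite3` module as a stand-in engine (a sketch only, not Glue or Spark itself; the table and column names are made up):

```python
import sqlite3

# Load raw rows, then run an aggregation query - the shape of work a
# Glue job would do with spark.sql() over data staged in S3.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 10.0), ("north", 5.0), ("south", 7.5)],
)
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 15.0), ('south', 7.5)]
```

In PySpark the same query would run against a DataFrame registered as a temp view; the SQL itself transfers almost unchanged, which is why listings pair the two skills.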