on experience or strong interest in working with Foundry as a core platform Forward-Deployed Engineering – delivering real-time solutions alongside users and stakeholders Broader Skillsets of Interest: Python & PySpark – for data engineering and workflow automation Platform Engineering – building and maintaining scalable, resilient infrastructure Cloud (AWS preferred) – deploying and managing services in secure environments Security Engineering & Access Control – designing …
South Yorkshire, England, United Kingdom Hybrid / WFH Options
Erin Associates
SQL Server. Understanding of applying master data management principles, data quality frameworks and data governance best practices. Understanding of Azure Data Factory, Fabric and similar technologies. Tech Stack – Python, PySpark, SQL, XPath, XML, Azure-based Data Science tools, BI tools, Data Visualisation, Agile. The company have an excellent reputation within their sector and have shown consistent growth year-on-year …
Data Scientist, you will work using data engineering, statistical, and ML/AI approaches to uncover data patterns and build models. We use the Microsoft tech stack, including Azure Databricks (PySpark, Python), and we are expanding our data science capabilities. To be successful in the role, you will need to have extensive experience in data science projects and have built …
tooling, and engineering standards. We'd love to talk to you if you have: Proven experience delivering scalable, cloud-based data platforms using tools such as Databricks, Spark or PySpark, and services from AWS, Azure, or GCP. Experience in line management and people development. You've supported engineers with regular 1:1s, development planning, and performance conversations, and are …
experience in data science, machine learning, and business analytics. Extensive experience with generative AI and machine learning frameworks (GANs, VAEs, transformer-based architectures, LLMs). Programming experience with Python, PySpark, microservices, and data modeling. Proficient in LLM orchestration on platforms such as OpenAI on Azure, AWS Bedrock, GCP Vertex AI, or Gemini AI. Architecture & Deployment: Designing, deploying, and scaling …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
Days per Week in Office) I am currently seeking a Contract AWS Data Engineer for a scale-up company with several upcoming greenfield projects. Tech Stack: AWS, Databricks Lakehouse, PySpark, SQL, ClickHouse/MySQL/DynamoDB. If you are interested, please click "Apply" with an updated copy of your CV, and I will contact you to discuss further.
great if you have: Built search-related products, e.g. chatbots Exposure to building data products that use generative AI and LLMs Previous experience using Spark (either via Scala or PySpark) Experience with statistical methods like regression, GLMs or experiment design and analysis; shipping productionised machine learning systems or other advanced techniques are also welcome All your information will be …
Inside IR35 Start Date: ASAP Key Skills Required: Azure Data Factory Azure Functions SQL Python Desirable: Experience with Copilot Studio Experience designing, developing, and deploying AI solutions Familiarity with PySpark, PyTorch, or other ML frameworks Exposure to M365, D365, and low-code/no-code Azure AI tools If interested, please send a copy of your most recent CV …
prompt engineering, vector databases, or RAG pipelines Proven experience with A/B testing, experimentation design, or causal inference to guide product decisions Exposure to Databricks, MLflow, AWS, and PySpark (or similar technologies) is a plus Excitement about Ophelos' mission to support households and businesses in breaking the vicious debt cycle About Our Team Ophelos launched in June of …
a bonus (but not essential) if you bring experience in some of the following areas: Cloud analytics: Exposure to cloud-based data platforms such as Microsoft Azure or Databricks. PySpark: Any hands-on experience using PySpark for data processing. Azure services: Familiarity with Azure tools like Data Factory. Data fundamentals: Awareness of data structures, algorithms, data quality, governance … skills as you advance in your career with us. What success would look like: Building Reliable Data Pipelines: Consistently delivering well-tested and robust data pipelines using Python and PySpark on Databricks, adhering to established coding standards and software engineering best practices. Growing Technical Proficiency: Rapidly developing your skills in our core technologies (Python, PySpark, Databricks, SQL, Git …
related field. • Demonstrated experience with intermediate to advanced full stack development. • Demonstrated experience with the ability to consume complex REST APIs. • Demonstrated experience with tools, languages, and frameworks: Python, PySpark, JavaScript, Vue, Nuxt.js, and Viz.js • Demonstrated experience with AWS services: S3 and EC2. • Demonstrated experience with databases: relational, graph (Neo4J/Graph-Tool), and NoSQL/document (MongoDB).
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
Knowledge of the technical differences between different packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL) A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science) Experience of WTW's Radar software is preferred Proficient at communicating results in a concise …
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
Senior Data Engineer | AWS/Databricks/PySpark | London/Glasgow (Hybrid) | August Start Role: Senior Data Engineer Location: This is a hybrid engagement represented by 2 days/week onsite, either in Central London or Glasgow. Start Date: Must be able to start mid-August. Salary: £80k-£90k (Senior) | £90k-£95k (Lead) About The Role Our partner is … decisions, peer reviews and solution design. Requirements Proven experience as a Data Engineer in cloud-first environments. Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift). Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs etc). Proficient in SQL (T-SQL/SparkSQL) and Python for data transformation and scripting. Hands-on experience with … engagement represented by 2 days/week onsite, either in Central London or Glasgow. You must be able to start in August.
teams as part of a wider trading project. The initial work on the project will involve abstracting code from these product teams into a shared, common Python library leveraging PySpark/DataFrames. You will then be serving as an extension of these product teams, building microservices and libraries to solve the common needs. Skills: • Experience with Unit Testing • Preferably …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
ideally with a focus in Motor Experience and detailed technical knowledge of GLMs/Elastic Nets, GBMs, GAMs, Random Forests, and clustering techniques Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL) Proficient at communicating results in a concise manner both verbally and written Behaviours: Motivated by technical excellence Team player Self-motivated with a drive to learn …
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
ideally with a focus in Motor Experience and detailed technical knowledge of GLMs/Elastic Nets, GBMs, GAMs, Random Forests, and clustering techniques Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL) Proficient at communicating results in a concise manner both verbally and written Behaviours: Motivated by technical excellence Team player Self-motivated with a drive to learn …
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
Data Software Engineer (Python/PySpark) Remote UK to £95k Are you a data-savvy Software Engineer with strong Python coding skills? You could be progressing your career in a senior, hands-on Data Software Engineer role as part of a friendly and supportive international team at a growing and hugely successful European car insurance tech company as they expand … on your location/preferences. About you: You are degree educated in a relevant discipline, e.g. Computer Science, Mathematics. You have a software engineering background with advanced Python and PySpark coding skills. You have experience in batch, distributed data processing and near real-time streaming data pipelines with technologies such as Kafka. You have experience of Big Data Analytics …
In detail, the position encompasses duties and responsibilities as follows: An experienced Data Engineer is required for the Surveillance IT team to develop ingestion pipelines and frameworks across the application portfolio, supporting Trade Surveillance analysts with strategy and decision-making.
mathematical models, methods, and/or techniques (e.g. algorithm development) to study issues and solve problems, engineering (electrical or computer), and/or high-performance computing Preferred: Python & PySpark experience