Slough / London (City of London), South East England, United Kingdom, Hybrid / WFH Options
Ascentia Partners
… proactive problem-solver with strong analytical and communication skills, capable of working both independently and collaboratively. Desirable: Experience working with Azure (or other major cloud platforms). Familiarity with PySpark or other big data technologies. Understanding of version control systems (e.g., Git). Knowledge of pricing or modelling workflows and the impact of engineering decisions on model performance. Why …
London / Slough / London (City of London), South East England, United Kingdom, Hybrid / WFH Options
Fuse Group
… Azure DevOps. Required Skills & Experience: Strong expertise in Power BI, DAX, SQL, and data architecture. Proven experience with Azure Data Factory, Fabric Lakehouse/Warehouse, and Synapse. Proficiency in PySpark, T-SQL, and data transformation using Notebooks. Familiarity with DevOps, ARM/Bicep templates, and semantic modelling. Experience delivering enterprise-scale BI/data warehouse programmes. Exposure to Microsoft …
… results. What we're looking for: Proven experience in Data Architecture and data modelling. Strong skills in Microsoft Azure tools (Fabric, OneLake, Data Factory). Confident with Python/PySpark and relational databases. Hands-on ETL/ELT experience. A problem-solver with a positive, can-do attitude. Bonus points if you bring: Tableau, Power BI, SSAS, SSIS or …
… positive change through data. It would be great if you had: Experience in the energy or retail sector. Background in pricing, commercial modelling, credit risk, or debt. Exposure to PySpark or other big data tools. Experience with NLP, Generative AI, or advanced predictive modelling …
London / Slough / London (City of London), South East England, United Kingdom, Hybrid / WFH Options
Intelix.AI
… Text-to-Cypher/SPARQL with safety filters and small eval sets. MCP-style tool contracts for safe agent access. Streaming/ELT at scale (Kafka/Databricks/PySpark) …
London, South East England, United Kingdom, Hybrid / WFH Options
Sanderson
Team Leading experience - REQUIRED/demonstrable on CV (full support from the Engineering Manager is also available). Hands-on development/engineering background. Machine Learning or data background. Technical Experience: PySpark, Python, SQL, Jupyter. Cloud: AWS, Azure (cloud environment), moving towards Azure. Nice to Have: Astro/Airflow, Notebook. Reasonable Adjustments: Respect and equality are core values to us. We …
Portsmouth, Yorkshire and the Humber, United Kingdom
TalentHawk
… for senior leadership as needed. 2. Technical Leadership. AWS Expertise: Hands-on experience with AWS services, scalable data solutions, and pipeline design. Strong coding skills in Python, SQL, and PySpark. Optimize data platforms and enhance operational efficiency through innovative solutions. Nice to Have: Background in software delivery, with a solid grasp of CI/CD pipelines and DataOps …
… analytics efforts. Required Skills & Experience: 4–5 years of commercial experience in data science, preferably in eCommerce or marketing analytics. Strong hands-on experience with Databricks, SQL, Python, and PySpark; knowledge of R and dashboarding tools is a plus. Proven experience with causal inference, MMM modelling, and experimentation. Strong analytical and problem-solving skills with the ability to …
London, South East England, United Kingdom, Hybrid / WFH Options
myGwork - LGBTQ+ Business Community
… context. Proven success in leading analysts, including development, prioritisation, and delivery. Deep comfort with tools like SQL, Tableau, and large-scale data platforms (e.g., Databricks); bonus for Python or PySpark skills. Strong grasp of A/B testing, experimentation design, and statistical rigour. Exceptional communicator - able to distil complex data into clear, actionable narratives for senior audiences. Strategic thinker …
London / Slough / London (City of London), South East England, United Kingdom, Hybrid / WFH Options
TechShack
… basis for an initial 6-month assignment. The role involves very occasional travel to London, with flexibility to work remotely. Skills required: Azure platform experience. Databricks, Azure Functions, and PySpark. Strong data engineering and development background. Active SC Clearance. This role has a quick 1-hour interview process with a start date of 14th November. Azure Databricks Developer …
London (City of London), South East England, United Kingdom
Luxoft
… You'll architect and maintain robust, scalable data pipelines and infrastructure that power our analytics, machine learning, and business intelligence initiatives. You'll work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to … reviews, and foster a culture of continuous learning and knowledge-sharing. Mandatory Skills Description: - 5+ years of experience in data engineering or software development. - Strong proficiency in Python and PySpark. - Hands-on experience with AWS services, especially EMR, S3, Lambda, and Glue. - Deep understanding of Snowflake architecture and performance tuning. - Solid grasp of data modeling, warehousing concepts, and SQL …
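Several of the roles above centre on building ETL pipelines in Python and PySpark over AWS storage and Snowflake. As a rough illustration of that kind of work, the minimal sketch below reads raw events from object storage, cleans them, and writes a curated, partitioned dataset; the bucket paths, column names, and app name are hypothetical placeholders rather than anything taken from these listings.

```python
# Minimal PySpark ETL sketch: extract raw events, clean them, load a curated table.
# All paths, column names, and the app name are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: raw JSON events from object storage (placeholder path).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: drop rows missing key fields, normalise the timestamp, deduplicate.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropDuplicates(["event_id"])
)

# Load: write the curated dataset partitioned by event date (placeholder path).
(
    clean.withColumn("event_date", F.to_date("event_ts"))
         .write.mode("overwrite")
         .partitionBy("event_date")
         .parquet("s3://example-bucket/curated/events/")
)
```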
Contract duration: 6 months (can be extended). Location: London. Must-have skills: Primary Skills - SAS Admin, Enterprise Guide, basic SAS coding skills. Secondary Skills - Unix scripting, GitLab, YAML, Autosys, PySpark, Snowflake, AWS, Agile practice, SQL. The candidate should have strong experience in SAS administration and expert SAS coding skills, with more than 6 years' experience. Should have very good …
… platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks … practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3 years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage, and …
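The listing above calls for data quality checks and validation rules inside Spark pipelines, alongside cleaning, enriching, and aggregating data. The sketch below shows one minimal way such a gate might look in PySpark; the table names, columns, and 1% failure threshold are illustrative assumptions, not anything specified by the role.

```python
# Minimal sketch of an in-pipeline data quality gate plus a simple aggregation in PySpark.
# Table names, columns, and the 1% threshold are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-quality-check").getOrCreate()

orders = spark.read.table("curated.orders")  # hypothetical curated table

# Validation rules: order_id must be present and amount must be non-negative.
failed = orders.filter(F.col("order_id").isNull() | (F.col("amount") < 0))

failed_count = failed.count()
total_count = orders.count()

# Stop the run if more than 1% of rows violate the rules (illustrative threshold).
if total_count > 0 and failed_count / total_count > 0.01:
    raise ValueError(f"Data quality check failed: {failed_count} of {total_count} rows invalid")

# Otherwise aggregate the validated data into a daily summary table.
daily = (
    orders.withColumn("order_date", F.to_date("order_ts"))
          .groupBy("order_date")
          .agg(
              F.sum("amount").alias("total_amount"),
              F.countDistinct("order_id").alias("order_count"),
          )
)
daily.write.mode("overwrite").saveAsTable("curated.daily_order_summary")
```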