data engineering and reporting, including storage, data pipelines to ingest and transform data, and querying and reporting of analytical data. You've worked with technologies such as Python, Spark, SQL, PySpark, and Power BI. You're a problem-solver, pragmatically exploring options and finding effective solutions. You have an understanding of how to design and build well-structured, maintainable systems. Strong communication skills. …
cooperation with our data science team. Experiment in your domain to improve precision, recall, or cost savings. Requirements: Expert skills in Java or Python. Experience with Apache Spark or PySpark. Experience writing software for the cloud (AWS or GCP). Speaking and writing in English enables you to take part in day-to-day conversations in the team and contribute …
Head of Data Platform and Services, you'll not only maintain and optimize our data infrastructure but also spearhead its evolution. Built predominantly on Databricks, and utilizing technologies like PySpark and Delta Lake, our infrastructure is designed for scalability, robustness, and efficiency. You'll take charge of developing sophisticated data integrations with various advertising platforms, empowering our teams with … and informed decision-making. What you'll be doing for us: Leadership in Design and Development: Lead the architecture, development, and upkeep of our Databricks-based infrastructure, harnessing PySpark and Delta Lake. CI/CD Pipeline Mastery: Create and manage CI/CD pipelines, ensuring automated deployments and system health monitoring. Advanced Data Integration: Develop sophisticated strategies for … standards. Data-Driven Culture Champion: Advocate for the strategic use of data across the organization. Skills-wise, you'll definitely have: Expertise in Apache Spark. Advanced proficiency in Python and PySpark. Extensive experience with Databricks. Advanced SQL knowledge. Proven leadership abilities in data engineering. Strong experience in building and managing CI/CD pipelines. Experience in implementing data integrations with …
City of London, London, United Kingdom Hybrid / WFH Options
Client Server
Data Software Engineer (Python/PySpark) Remote UK to £95k. Are you a data-savvy Software Engineer with strong Python coding skills? You could be progressing your career in a senior, hands-on Data Software Engineer role as part of a friendly and supportive international team at a growing and hugely successful European car insurance tech company as they expand … on your location/preferences. About you: You are degree educated in a relevant discipline, e.g. Computer Science or Mathematics. You have a software engineering background with advanced Python and PySpark coding skills. You have experience in batch, distributed data processing and near real-time streaming data pipelines with technologies such as Kafka. You have experience of Big Data Analytics …
London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
to performance optimisation and cost efficiency across data solutions. Required Skills & Experience: Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse. Strong proficiency in Python, PySpark, and advanced SQL. Understanding of Lakehouse architecture and medallion data patterns. Familiarity with data governance, lineage, and access control tools. Experience in Agile environments, with solid CI/CD …
infrastructure. Excellent communication and collaboration skills. Experience working with Git, practicing code reviews and branching strategies, CI/CD and testing in software solutions. Proficiency in SQL, Python, and PySpark. Ability to translate marketing needs into well-structured data products. Deep understanding of data modeling concepts and building scalable data marts. Basic experience with frontend technologies is a plus. …
and root cause analysis. Following agreed architectural standards and contributing to their continuous improvement. What do I need? Proficiency in Azure and its data-related services. Strong SQL and PySpark skills, with a focus on writing efficient, readable, modular code. Experience of development on modern cloud data platforms (e.g. Databricks, Snowflake, Redshift). Familiarity with Data Lakehouse principles, standards …
As a Data Engineer, you will play a crucial role in designing, developing, and maintaining data architecture and infrastructure. The successful candidate should possess a strong foundation in Python, PySpark, SQL, and ETL processes, with a demonstrated ability to implement solutions in a cloud environment. Position - Senior Data Engineer. Experience - 6+ years. Location - London. Job Type - Hybrid, Permanent. Mandatory … Skills: Design, build, and maintain data pipelines using Python, PySpark, and SQL. Develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS/Azure/GCP. Collaborate with data scientists and business analysts to understand their data needs and develop solutions that meet their requirements. Develop and maintain data models and data dictionaries for … improve the performance and scalability of our data solutions. Qualifications: Minimum 6+ years of total experience. At least 4 years of hands-on experience with the mandatory skills: Python, PySpark, SQL. …
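The pipeline work described in listings like the one above follows the classic extract-transform-load shape: pull raw records from a source, clean and type them, and write them to a warehouse table. A minimal self-contained sketch in plain Python (the feed, schema, and table names are illustrative assumptions, not taken from any listing; in production the same steps would typically run as PySpark jobs against cloud storage):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: in a real pipeline this would arrive from S3/ADLS/GCS.
RAW_CSV = """order_id,amount,currency
1,100.50,GBP
2,,GBP
3,75.00,gbp
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw source into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop incomplete rows, normalise types and casing."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # skip records with a missing amount
        out.append({
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"]),
            "currency": r["currency"].upper(),
        })
    return out

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned records to the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :currency)", rows
    )

conn = sqlite3.connect(":memory:")
clean = transform(extract(RAW_CSV))
load(clean, conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 175.5)
```

The separation into three small functions mirrors how such pipelines are usually structured so that each stage can be tested and re-run independently.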
our machine learning and analytics workloads to support the company's growth. Our data stack: We work with a modern data stack built on Databricks and AWS, with Python and PySpark as our primary tools. In this role, you'll get to: Own business-critical components and perform meaningful work with an impact on our company and our customers. Design … expand your skillset. About you: We believe that no one is the finished article; however, some experience in the following is important for this role: Proficient with Python and PySpark. Experience working with a modern data stack is beneficial but not required. Experience with AWS is beneficial but not required. You enjoy learning new technologies and are passionate about …
London, South East, England, United Kingdom Hybrid / WFH Options
INTEC SELECT LIMITED
complex ideas. Proven ability to manage multiple projects and meet deadlines in dynamic environments. Proficiency with SQL Server in high-transaction settings. Experience with either C# or Python/PySpark for data tasks. Hands-on knowledge of Azure cloud services, such as Databricks, Event Hubs, and Function Apps. Solid understanding of DevOps principles and tools like Git, Azure DevOps …
Location: London (Hybrid, 2-3 days/week in office - located in East India Dock, next to Canary Wharf). Duration: 6-9 months - Outside IR35. Start date: ASAP. Budget: £450-£500 per day. Elsewhen is a London-based consultancy …
months. Location: London. JOB DETAILS Role Title: Senior Data Engineer. Note: (Please do not submit the same profiles as for 111721-1.) Required Core Skills: Databricks, AWS, Python, PySpark, data modelling. Minimum years of experience: 7 years. Job Description: Must have hands-on experience in designing, developing, and maintaining data pipelines and data streams. Must have a strong working … knowledge of moving/transforming data across layers (Bronze, Silver, Gold) using ADF, Python, and PySpark. Must have hands-on experience with PySpark, Python, AWS, and data modelling. Must have experience in ETL processes. Must have hands-on experience in Databricks development. Good to have experience in developing and maintaining data integrity and accuracy, data governance, and data security policies …
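The Bronze/Silver/Gold layering that several of these listings reference is the medallion pattern: raw data lands untouched in Bronze, is validated and typed into Silver, and is aggregated into business-ready Gold tables. A plain-Python sketch of the idea (in Databricks each layer would be a Delta table written via PySpark; all records and field names here are illustrative assumptions):

```python
# Bronze: raw events exactly as ingested, bad records and all, kept for audit.
bronze = [
    {"user": "a", "clicks": "3"},
    {"user": "b", "clicks": None},  # malformed record stays in Bronze
    {"user": "a", "clicks": "2"},
]

# Silver: validated and typed. Bad records are filtered out here,
# never mutated or deleted in Bronze.
silver = [
    {"user": r["user"], "clicks": int(r["clicks"])}
    for r in bronze
    if r["clicks"] is not None
]

# Gold: business-level aggregate ready for reporting (e.g. a Power BI source).
gold: dict[str, int] = {}
for r in silver:
    gold[r["user"]] = gold.get(r["user"], 0) + r["clicks"]

print(gold)  # {'a': 5}
```

Keeping Bronze immutable is the key design choice: Silver and Gold can always be rebuilt from it when cleaning rules change.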
Strong analytical and troubleshooting skills. Desirable Skills: Familiarity with state management libraries (MobX, Redux). Exposure to financial data or market analytics projects. Experience with data engineering tools (DuckDB, PySpark, etc.). Knowledge of automated testing frameworks (Playwright, Cypress). Experience of WebAssembly. Python programming experience for data manipulation or API development. Use of AI for creating visualisations. Soft …
effective knowledge transfer. Translate business needs into technical solutions through effective stakeholder engagement. Document data architecture, processes, and reporting logic to ensure repeatability and transparency. Work with SQL and PySpark to transform and load data. Support Power BI reporting needs where required. What We’re Looking For: Previous experience in data engineering. Strong hands-on experience with Azure data … tools (Data Factory, Synapse, Databricks). Advanced SQL and PySpark knowledge. Strong stakeholder engagement skills with experience in requirement gathering and documentation. Microsoft certification and Power BI experience is desirable. Background in mid-to-large-scale businesses preferred – complexity and data maturity essential. A proactive, solutions-oriented personality who thrives in fast-paced, evolving environments. Interested? Click “Apply” or email …
Wandsworth, Greater London, UK Hybrid / WFH Options
Houseful Limited
re used daily by home builders, mortgage brokers, local councils, and more to make informed property purchasing decisions. We’ve migrated key legacy SQL Server/SSIS pipelines to PySpark and Databricks, and we’re in the home stretch of our modernisation programme. Now we’re looking to unlock the power of our disparate data and make it accessible … working to high standards of compliance (inc. ISO 27001, GDPR), Data Governance, and Information Security. Experienced in migrating from SQL-based data architectures to modern data engineering technologies, using PySpark, Databricks, Terraform, and Pandas. Someone able to explore, analyse, and understand our data and its uses. Ideally experienced in a multi-cloud environment (Databricks across Azure and AWS), solving …
East London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
s/PhD in Computer Science, Data Science, Mathematics, or a related field. 5+ years of experience in ML modeling, ranking, or recommendation systems. Proficiency in Python, SQL, Spark, PySpark, TensorFlow. Strong knowledge of LLM algorithms and training techniques. Experience deploying models in production environments. Nice to Have: Experience in GenAI/LLMs. Familiarity with distributed computing …
and value creation from data curation activities. Agile mindset with the ability to deliver prototypes quickly and iterate improvements based on stakeholder feedback. Experience in Python, Databricks, Delta Lake, PySpark, Pandas, and other data engineering frameworks, and applying them to achieve industry-standards-compliant datasets. Strong communication skills and expertise to translate business needs into technical data requirements and processes …
ETL) Future Talent Pool: GCP Data Engineer, London, hybrid role; new workstreams on a digital Google Cloud transformation programme. Proficiency in programming languages such as Python and Java. Programming languages: PySpark and Java to develop ETL processes for data ingestion and preparation. Spark SQL, Cloud Run, Dataflow, Cloud Storage, GCP BigQuery, Google Cloud Platform, Data Studio, Unix/Linux platform, version control tools (Git, GitHub), automated …
using CI/CD, along with proficiency in designing and implementing CI/CD pipelines in cloud environments. Excellent practical expertise in performance tuning and system optimisation. Experience with PySpark and Azure Databricks for distributed data processing and large-scale data analysis. Proven experience with web frameworks, including knowledge of Django and experience with Flask, along with a solid …
Data Engineering Manager. £110,000-£115,000 + 10% bonus. Databricks, Snowflake, Terraform, PySpark, Azure. London, hybrid working (2 days in office). Leading property data & risk software company. We are partnered with a leading property data & risk software company that contributes valuations, insights, and decisioning technology to over 1 million mortgage approvals each year. They are looking for a … visualization, and data modeling. Engage in projects that influence the company's bottom line. Drive the business forward by enabling better decision-making processes. Tech Stack: Databricks, Azure, Python, PySpark, Terraform. What's in it for you: 7.5% pension contribution by the company. Discretionary annual bonus up to 10% of base salary. 25 days annual leave plus extra days …
Data Scientist, you will work using data engineering, statistical, and ML/AI approaches to uncover data patterns and build models. We use the Microsoft tech stack, including Azure Databricks (PySpark, Python), and we are expanding our data science capabilities. To be successful in the role, you will need to have extensive experience in data science projects and have built …
City of London, London, England, United Kingdom Hybrid / WFH Options
Fusion People Ltd
Fabric - UK role (WFH) - 6 months initial contract - top rates - Outside IR35. Major consultancy urgently requires a Data Engineer with experience of MS Fabric (tech stack is Microsoft Fabric, PySpark/RSpark, and GitHub) for an initial 6-month contract (WFH and Outside IR35) who is passionate about building new capabilities from the ground up and wants to help …
someone with a strong analytical mindset and experience working with data in numerical or technical environments. You'll need a working knowledge of tools like Python, SQL, and PySpark, along with the ability to manipulate and extract insights from large datasets. Familiarity with VBA functions and core analytical techniques is expected, and you should be comfortable working independently …