both greenfield initiatives and enhancing high-traffic financial applications.

Key Skills & Experience:
- Strong hands-on experience with Databricks, Delta Lake, Spark Structured Streaming, and Unity Catalog
- Advanced Python/PySpark and big data pipeline development
- Familiarity with event streaming tools (Kafka, Azure Event Hubs)
- Solid understanding of SQL, data modelling, and lakehouse architecture
- Experience deploying via CI/CD …
business requirements into data solutions
- Monitor and improve pipeline performance and reliability
- Maintain documentation of systems, workflows, and configs

Tech environment:
- Python, SQL/PL/SQL (MS SQL + Oracle), PySpark
- Apache Airflow (MWAA), AWS Glue, Athena
- AWS services (CDK, S3, data lake architectures)
- Git, JIRA

You should apply if you have:
- Strong Python and SQL skills
- Proven experience designing …
across varied solutions.
- Extensive experience of using the Databricks platform for developing and deploying data solutions/data products (including ingestion, transformation and modelling), with high proficiency in Python, PySpark and SQL.
- Leadership experience in other facets necessary for solution development, such as testing, the wider scope of quality assurance, CI/CD etc.
- Experience in related areas of …
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
practice

Essential Experience:
- Proven expertise in building data warehouses and ensuring data quality on GCP
- Strong hands-on experience with BigQuery, Dataproc, Dataform, Composer, Pub/Sub
- Skilled in PySpark, Python and SQL
- Solid understanding of ETL/ELT processes
- Clear communication skills and ability to document processes effectively

Desirable Skills:
- GCP Professional Data Engineer certification
- Exposure to Agentic …
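The ETL/ELT distinction the listing above asks for can be illustrated with a minimal sketch: in ELT, raw data is landed first and the transformation happens inside the warehouse via SQL. This is a hedged, plain-Python stand-in using sqlite3 in place of a real warehouse such as BigQuery; the table and column names are invented for the example.

```python
# Minimal ELT illustration: load raw data first, then transform inside
# the "warehouse" with SQL. sqlite3 stands in for a real warehouse here;
# table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")

# Load: land the raw rows untransformed (the "L" before the "T" in ELT).
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 2.5)],
)

# Transform: derive a curated table with SQL inside the warehouse itself,
# rather than transforming in an external tool before loading (ETL).
conn.execute(
    """CREATE TABLE user_totals AS
       SELECT user_id, SUM(amount) AS total
       FROM raw_events
       GROUP BY user_id"""
)
totals = dict(conn.execute("SELECT user_id, total FROM user_totals ORDER BY user_id"))
```

The same load-then-transform shape applies on GCP, where tools like Dataform or scheduled BigQuery SQL own the transform step.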
change, including optimisation of production processes
- Great knowledge of Python, and in particular the classic Python data science stack (NumPy, pandas, PyTorch, scikit-learn, etc.), is required; familiarity with PySpark is also desirable.
- Cloud platform experience (e.g. Azure, AWS, GCP); we're using Azure in the team.
- Good SQL understanding in practice
- Capacity and enthusiasm for coaching and mentoring …
south west london, south east england, united kingdom
Mars
change, including optimisation of production processes
- Great knowledge of Python, and in particular the classic Python data science stack (NumPy, pandas, PyTorch, scikit-learn, etc.), is required; familiarity with PySpark is also desirable.
- Cloud platform experience (e.g. Azure, AWS, GCP); we're using Azure in the team.
- Good SQL understanding in practice
- Capacity and enthusiasm for coaching and mentoring …
Eastleigh, Hampshire, England, United Kingdom Hybrid / WFH Options
Spectrum IT Recruitment
operations and performance
- Work with Azure DevOps to manage and track project work

About You
Essential Skills & Experience:
- Proficiency in cloud-based tools (ADF, Synapse, S3, Lambda)
- Experience using PySpark for ELT pipelines
- Strong analytical and problem-solving mindset
- Able to work independently and collaboratively across teams
- Confident communicator with strong documentation skills
- Experience in a data engineering role …
environment.
- Experience in the Utilities sector.
- Experience leading technical projects.

Skills & Technologies required:
- Proficiency in cloud-based data engineering tools (ADF, Synapse Analytics, S3, Lambda).
- Proficiency in using PySpark notebooks for ELT processes.
- Ability to foster and cultivate a culture of best practices.
- Strong analytical and problem-solving skills.
- Ability to work independently and within cross-functional teams. …
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
Senior Data Engineer | AWS/Databricks/PySpark | London/Glasgow (Hybrid) | August Start

Role: Senior Data Engineer
Location: This is a hybrid engagement represented by 2 days/week onsite, either in Central London or Glasgow.
Start Date: Must be able to start mid-August.
Salary: £80k-£90k (Senior) | £90k-£95k (Lead)

About The Role
Our partner is … decisions, peer reviews and solution design.

Requirements
- Proven experience as a Data Engineer in cloud-first environments.
- Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift).
- Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs etc.).
- Proficient in SQL (T-SQL/SparkSQL) and Python for data transformation and scripting.
- Hands-on experience with … engagement represented by 2 days/week onsite, either in Central London or Glasgow. You must be able to start in August.

Senior Data Engineer | AWS/Databricks/PySpark | London/Glasgow (Hybrid) | August Start …
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Bowerford Associates
technical concepts to a range of audiences. Able to provide coaching and training to less experienced members of the team.

Essential Skills:
- Programming languages such as Spark, Java, Python, PySpark, Scala or similar (minimum of 2).
- Extensive big data hands-on experience across coding/configuration/automation/monitoring/security is necessary.
- Significant AWS or Azure … the Right to Work in the UK long-term, as our client is NOT offering sponsorship for this role.

KEYWORDS: Lead Data Engineer, Senior Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, Cloud, On-Prem, ETL, Azure Data Fabric, ADF, Hadoop, HDFS, Azure Data, Delta Lake, Data Lake

Please note that due to a high level …
Employment Type: Permanent
Salary: £75000 - £80000/annum Pension, Good Holiday, Healthcare
real urgency, and real interest in doing this properly - not endless meetings and PowerPoints.

What you'll be doing:
- Designing, building, and optimising Azure-based data pipelines using Databricks, PySpark, ADF, and Delta Lake
- Implementing a medallion architecture - from raw to curated
- Collaborating with analysts to make data business-ready
- Applying CI/CD and DevOps best practices (Git … time logistics datasets

What they're looking for:
- A strong communicator - someone who can build relationships and help connect silos
- Experience building pipelines in Azure using Databricks, ADF, and PySpark
- Strong SQL and Python skills
- Bonus points if you've worked with Power BI, Azure Purview, or streaming tools
- You're versatile - happy to support analysts and wear multiple …
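The "medallion architecture - from raw to curated" mentioned above can be sketched as a bronze → silver → gold promotion: land raw data untouched, clean and conform it, then publish business-ready aggregates. This is a plain-Python stand-in for brevity (the field names are hypothetical); in Databricks each layer would be a PySpark DataFrame written to a Delta table rather than a Python list.

```python
# Medallion sketch: bronze (raw) -> silver (cleaned) -> gold (curated).
# Hypothetical order data; in Databricks these would be Delta tables.
from collections import defaultdict

def to_silver(bronze_rows):
    """Silver layer: drop records missing the key, normalise types/casing."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None:
            continue  # a real pipeline would quarantine bad records instead
        silver.append({
            "order_id": row["order_id"],
            "region": (row.get("region") or "unknown").strip().lower(),
            "amount": float(row.get("amount", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Gold layer: business-ready aggregate (revenue per region)."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

bronze = [
    {"order_id": 1, "region": " UK ", "amount": "10.5"},
    {"order_id": None, "region": "UK", "amount": "99"},  # malformed: no key
    {"order_id": 2, "region": "uk", "amount": 4.5},
]
gold = to_gold(to_silver(bronze))
```

Each layer is materialised separately so analysts can query curated gold tables without touching raw data, which is the point of the raw-to-curated split the listing describes.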
Data Engineer | Data Consultant | Azure | Fabric | Python | SQL | PySpark
Senior Data Engineer - Up to £70,000
London - 3 days in-office

Method Resourcing are thrilled to be partnering with a Microsoft Solutions Partner to support them in hiring a Data Consultant to focus on and specialise in their current and upcoming Fabric projects. This is a fantastic time to … is offering a salary of up to £70,000 dependent on experience + Bonus & Benefits. Please apply now for immediate consideration.

RSG Plc is acting as an Employment Agency in relation to this vacancy. …