Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
… practice
Essential Experience:
- Proven expertise in building data warehouses and ensuring data quality on GCP (see the sketch after this listing)
- Strong hands-on experience with BigQuery, Dataproc, Dataform, Composer, Pub/Sub
- Skilled in PySpark, Python and SQL
- Solid understanding of ETL/ELT processes
- Clear communication skills and ability to document processes effectively
Desirable Skills:
- GCP Professional Data Engineer certification
- Exposure to Agentic …
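For illustration only, here is a minimal sketch of the kind of data-quality check on GCP that the listing above describes, using the google-cloud-bigquery client. The project, dataset, table and column names are invented for the example, and credentials are assumed to come from Application Default Credentials.

```python
# Minimal data-quality check against a BigQuery table.
# Project, dataset, table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project ID

query = """
    SELECT
        COUNT(*) AS total_rows,
        COUNTIF(customer_id IS NULL) AS missing_customer_ids,
        COUNT(DISTINCT order_id) AS distinct_orders
    FROM `my-analytics-project.warehouse.orders`
"""

row = list(client.query(query).result())[0]

# Fail fast if a basic quality expectation is not met.
if row.missing_customer_ids > 0:
    raise ValueError(f"{row.missing_customer_ids} rows are missing customer_id")

print(f"Checked {row.total_rows} rows, {row.distinct_orders} distinct orders")
```

In practice a check like this would more likely run inside a Composer (Airflow) DAG or as a Dataform assertion than as a standalone script.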
Chipstead, Kent, United Kingdom Hybrid / WFH Options
Vermelo RPO
… predictive modelling techniques: Logistic Regression, GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and Clustering
- Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
- A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
- Experience of WTW's Radar and Emblem software is preferred
- Proficient at communicating results in …
Sevenoaks, Kent, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
… predictive modelling techniques: Logistic Regression, GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and Clustering
- Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
- A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
- Experience of WTW’s Radar and Emblem software is preferred
- Proficient at communicating results in …
Haywards Heath, West Sussex, United Kingdom, Chipstead, Kent Hybrid / WFH Options
Vermelo RPO
… predictive modelling techniques: Logistic Regression, GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and Clustering (see the sketch after this listing)
- Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
- A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
- Experience of WTW’s Radar and Emblem software is preferred
- Proficient at communicating results in …
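As a rough illustration of the modelling techniques these listings name, the sketch below fits two of them, a logistic regression and a gradient-boosted model, with scikit-learn on synthetic data. The dataset, features and hyperparameters are placeholders standing in for a real pricing or claims dataset, not part of the role description.

```python
# Illustrative comparison of a logistic regression and a GBM on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real modelling dataset.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1_000),
    "gbm": GradientBoostingClassifier(n_estimators=200, learning_rate=0.05),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name}: test AUC = {auc:.3f}")
```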
… team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply!
Job Responsibilities
ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow (see the sketch after this listing). Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle.
Cloud Data Engineering: Manage and … and documentation.
Required profile:
Requirements
- Client-facing role, so strong communication and collaboration skills are vital
- Proven experience in data engineering, with hands-on expertise in Azure Data Services, PySpark, Apache Spark, and Apache Airflow
- Strong programming skills in Python and SQL, with the ability to write efficient and maintainable code
- Deep understanding of Spark internals, including RDDs, DataFrames …
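A minimal sketch of the kind of pipeline the responsibilities above describe: an Airflow DAG whose single task runs a PySpark batch transform. The DAG id, storage paths and column names are assumptions made for the example, and the `schedule` argument presumes Airflow 2.4 or later.

```python
# Sketch of a daily batch ETL step: an Airflow DAG running a PySpark transform.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_orders():
    # Import inside the task so the DAG file parses without PySpark installed.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()
    raw = spark.read.json(
        "abfss://raw@mystorageaccount.dfs.core.windows.net/orders/"  # placeholder path
    )
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("amount") > 0)              # basic data-quality rule
           .withColumn("ingest_date", F.current_date())
    )
    cleaned.write.mode("overwrite").parquet(
        "abfss://curated@mystorageaccount.dfs.core.windows.net/orders/"
    )
    spark.stop()


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_orders", python_callable=transform_orders)
```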
… UK. In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply!
Job Responsibilities
Data Engineering & Data Pipeline Development:
- Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow
- Implement real-time and batch data processing using Spark
- Enforce best practices for data quality, governance, and security throughout the data lifecycle
- Ensure data availability, reliability and performance through …
- Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments
Big Data & Analytics:
- Build and optimize large-scale data processing pipelines using Apache Spark and PySpark
- Implement data partitioning, caching, and performance tuning for Spark-based workloads (see the sketch after this listing)
- Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning initiatives
Workflow Orchestration …
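To make the partitioning and caching point concrete, here is a small PySpark sketch. The paths, column names and partition counts are illustrative assumptions, not recommended values for any particular workload.

```python
# Sketch of partitioning and caching a reused DataFrame in PySpark.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("spark_tuning_sketch")
    .config("spark.sql.shuffle.partitions", "200")  # tune shuffle parallelism
    .getOrCreate()
)

events = spark.read.parquet("/data/raw/events")  # placeholder path

# Repartition by a high-cardinality key ahead of wide aggregations,
# then cache because the result feeds two downstream queries.
by_user = events.repartition(200, "user_id").cache()

daily_counts = by_user.groupBy("user_id", F.to_date("ts").alias("day")).count()
totals = by_user.groupBy("user_id").agg(F.sum("amount").alias("total_amount"))

# Write partitioned by date so downstream reads can prune partitions.
daily_counts.write.mode("overwrite").partitionBy("day").parquet("/data/curated/daily_counts")
totals.write.mode("overwrite").parquet("/data/curated/user_totals")
```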
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
Senior Data Engineer | AWS/Databricks/PySpark | London/Glasgow (Hybrid) | August Start
Role: Senior Data Engineer
Location: This is a hybrid engagement represented by 2 days/week onsite, either in Central London or Glasgow.
Start Date: Must be able to start mid-August.
Salary: £80k-£90k (Senior) | £90k-£95k (Lead)
About The Role
Our partner is … decisions, peer reviews and solution design.
Requirements
- Proven experience as a Data Engineer in cloud-first environments
- Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift)
- Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs etc.) - see the sketch after this listing
- Proficient in SQL (T-SQL/SparkSQL) and Python for data transformation and scripting
- Hands-on experience with …
This is a hybrid engagement represented by 2 days/week onsite, either in Central London or Glasgow. You must be able to start in August.
Senior Data Engineer | AWS/Databricks/PySpark | London/Glasgow (Hybrid) | August Start
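The Delta Lake / Unity Catalog requirement above might look something like the following in a Databricks notebook, where a `spark` session is already provided. The catalog, schema, table and bucket names are placeholders.

```python
# Sketch of curating raw data into a Unity Catalog-registered Delta table
# from a Databricks notebook (the `spark` session is provided by the runtime).
from pyspark.sql import functions as F

raw = spark.read.format("json").load("s3://example-bucket/raw/orders/")  # placeholder bucket

curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("load_ts", F.current_timestamp())
)

# Write as a managed Delta table using a three-part Unity Catalog name.
(
    curated.write.format("delta")
           .mode("overwrite")
           .saveAsTable("main.sales.orders_curated")
)
```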
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
… technical concepts to a range of audiences. Able to provide coaching and training to less experienced members of the team.
Essential Skills:
- Programming Languages such as Spark, Java, Python, PySpark, Scala or similar (minimum of 2)
- Extensive Big Data hands-on experience across coding/configuration/automation/monitoring/security is necessary
- Significant AWS or Azure …
… the Right to Work in the UK long-term as our client is NOT offering sponsorship for this role.
KEYWORDS: Lead Data Engineer, Senior Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, Cloud, On-Prem, ETL, Azure Data Fabric, ADF, Hadoop, HDFS, Azure Data, Delta Lake, Data Lake
Please note that due to a high level …
… real urgency, and real interest in doing this properly - not endless meetings and PowerPoints.
What you'll be doing:
- Designing, building, and optimising Azure-based data pipelines using Databricks, PySpark, ADF, and Delta Lake
- Implementing a medallion architecture - from raw to curated (see the sketch after this listing)
- Collaborating with analysts to make data business-ready
- Applying CI/CD and DevOps best practices (Git …
… time logistics datasets
What they're looking for:
- A strong communicator - someone who can build relationships and help connect silos
- Experience building pipelines in Azure using Databricks, ADF, and PySpark
- Strong SQL and Python skills
- Bonus points if you've worked with Power BI, Azure Purview, or streaming tools
- You're versatile - happy to support analysts and wear multiple …
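As a sketch of the medallion (bronze/silver/gold) flow mentioned above, assuming an environment where Delta Lake is available (as on Databricks), with made-up storage paths and columns:

```python
# Rough medallion-style flow: raw landing (bronze) -> cleaned (silver) -> business-ready (gold).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: land the raw logistics feed as-is.
bronze = spark.read.json("abfss://raw@exampleaccount.dfs.core.windows.net/shipments/")
bronze.write.format("delta").mode("append").save("/mnt/lake/bronze/shipments")

# Silver: deduplicate and apply basic quality rules.
silver = (
    spark.read.format("delta").load("/mnt/lake/bronze/shipments")
    .dropDuplicates(["shipment_id"])
    .filter(F.col("status").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/shipments")

# Gold: aggregate the curated data for analysts.
gold = silver.groupBy("route", F.to_date("delivered_at").alias("day")).count()
gold.write.format("delta").mode("overwrite").save("/mnt/lake/gold/shipments_by_route")
```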
Data Engineer | Data Consultant | Azure | Fabric | Python | SQL | PySpark
Senior Data Engineer - Up to £70,000
London - 3 days in-office
Method Resourcing are thrilled to be partnering with a Microsoft Solutions Partner to support them in hiring a Data Consultant to focus on and specialise in their current and upcoming Fabric projects. This is a fantastic time to …
… is offering a salary of up to £70,000 dependent on experience + Bonus & Benefits. Please apply now for immediate consideration.
Data Engineer | Data Consultant | Azure | Fabric | Python | SQL | PySpark
Senior Data Engineer - Up to £70,000
London - 3 days in-office
RSG Plc is acting as an Employment Agency in relation to this vacancy.
Data Engineer | Data Consultant | Azure | Fabric | Python | SQL | PySpark
Fabric Data Consultant - Up to £70,000
London - 3 days in-office
Method Resourcing are thrilled to be partnering with a Microsoft Solutions Partner to support them in hiring a Data Consultant to focus on and specialise in their current and upcoming Fabric projects. This is a fantastic time to …
… is offering a salary of up to £70,000 dependent on experience + Bonus & Benefits. Please apply now for immediate consideration.
Data Engineer | Data Consultant | Azure | Fabric | Python | SQL | PySpark
Fabric Data Consultant - Up to £70,000
London - 3 days in-office
RSG Plc is acting as an Employment Agency in relation to this vacancy.