City of London, London, United Kingdom Hybrid / WFH Options
E.ON Next
segmentation, and AI, we want to hear from you! At E.ON Next, you'll have the opportunity to leverage your skills in Databricks and PySpark to tackle operational and customer experience-related challenges, driving impactful solutions in a dynamic and collaborative environment. Join us in revolutionising the energy sector … stakeholders and fostering strong business partnerships. Leveraging predictive modelling, segmentation techniques, and advanced AI algorithms to unlock valuable insights. Demonstrating proficiency in Databricks and PySpark to streamline data processing and analysis. A taste of what you'll be doing: ● Consultative Leadership: Build a strategic understanding of the business, employ … Strong communication skills with the ability to engage with non-technical stakeholders. ● Expertise in predictive modelling, segmentation, and AI techniques. ● Proficiency in Databricks and PySpark for data manipulation and analysis. ● Experience solving operational or customer experience-related problems such as workforce management, demand forecasting, or root cause analysis. ● BSc …
STEM subject e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Management experience. HOW TO APPLY: Please register your interest by sending your CV to Kiran Ramasamy …
Greater Manchester, England, United Kingdom Hybrid / WFH Options
Blue Wolf Digital
to join their team. The primary focus of the role is Databricks data engineering. You will be building data pipelines using Databricks, coding in PySpark, and supporting internal applications. You will also be using Python for data transformations and working across the Azure Data Platform. Must have: strong Databricks …
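The pipeline-building work described above centres on writing transformation steps. A minimal sketch follows, in plain Python so it runs anywhere; in Databricks the same logic would typically be expressed as a PySpark DataFrame transformation. The field names (`customer_id`, `usage_kwh`) are invented for illustration.

```python
def clean_readings(rows):
    """Drop incomplete records and cast the usage field to float."""
    cleaned = []
    for row in rows:
        # Skip records missing a key or carrying an empty reading.
        if row.get("customer_id") is None or row.get("usage_kwh") in (None, ""):
            continue
        cleaned.append({"customer_id": row["customer_id"],
                        "usage_kwh": float(row["usage_kwh"])})
    return cleaned

raw = [
    {"customer_id": "c1", "usage_kwh": "12.5"},
    {"customer_id": None, "usage_kwh": "3.0"},   # dropped: no id
    {"customer_id": "c2", "usage_kwh": ""},      # dropped: empty reading
]
print(clean_readings(raw))  # one surviving, typed record
```

The same drop-and-cast shape maps directly onto PySpark's `filter`/`withColumn` calls once the data no longer fits in memory.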
Nottingham, England, United Kingdom Hybrid / WFH Options
Harnham
STEM subject e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Management experience. BENEFITS: Pension scheme, gym membership, share options, bonus, hybrid working. HOW TO APPLY: Register …
Role: Graduate Data Engineer Type: 12 months fixed-term Location: Peterborough Ready to utilise your skills to process and extract value from large datasets? Are you passionate about performing root cause analysis on various data? We have an exciting role …
snowflake schemas. Knowledge of DevOps practices within a Power BI environment. Familiarity with Microsoft Fabric & Databricks. SQL databases expertise, data engineering with Python and PySpark, and knowledge of geospatial concepts and tools. As part of this engagement, you will work on initiatives that redefine business efficiency through AI. You …
Principal Data Engineer – National salary circa £80,000; London salary circa £100,000. Do you like working with the latest technology, and are you interested in enhancing your tech abilities? We have an exciting opportunity for an Engineering Manager with significant …
and ensuring best practices are understood and followed. Technical Skills and Qualifications: Expert knowledge of Python, including libraries/frameworks such as pandas, numpy and pyspark. Good understanding of OOP, software design patterns, and SOLID principles. Good experience in Docker. Good experience in Linux. Good experience in Airflow. Good knowledge …
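The OOP/SOLID emphasis in the listing above usually shows up as pipeline steps that depend on abstractions rather than concrete services (dependency inversion). A small sketch, with class and function names invented for the example:

```python
from abc import ABC, abstractmethod

class Sink(ABC):
    """Abstraction for anywhere a pipeline step can write records."""
    @abstractmethod
    def write(self, record: dict) -> None: ...

class ListSink(Sink):
    """In-memory implementation, handy for unit tests."""
    def __init__(self):
        self.records = []
    def write(self, record: dict) -> None:
        self.records.append(record)

def run_step(records, sink: Sink) -> int:
    """Write each record to whatever sink was injected; return the count."""
    count = 0
    for record in records:
        sink.write(record)
        count += 1
    return count

sink = ListSink()
count = run_step([{"a": 1}, {"a": 2}], sink)
print(count, sink.records)
```

In production the injected `Sink` might wrap a Delta table or an Airflow XCom push; the step itself never changes, which is the point of the pattern.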
building a massively distributed cloud-hosted data platform that would be used by the entire firm. Tech stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including: End …
requirements and design scalable and efficient solutions to meet business needs. Implement data pipelines, ETL processes, and data transformations. Strong experience with Azure Databricks, PySpark, and Python. Integrate diverse data sources and formats, including structured and unstructured data, streaming data, and APIs. Technical skills: strong proficiency in Python; extensive …
a complete greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, dbt. What you need to bring: 1-3 years' experience in building data pipelines; SQL experience in data warehousing; Python experience would …
to the ideas and delivery of the strategy; Support data queries in SQL (T-SQL/ANSI-SQL) and support data pipelines using PySpark/Python, Databricks and AWS (Athena, Glue, S3); Analyse data needs and coordinate new data requests and data change requests. Work with clients to … Pivot Charts). Experience of supporting Data Warehousing. Basic SQL experience and understanding of XML/JSON files. Basic knowledge/experience of either Python, PySpark, R, Scala etc. Experience using SQL, PowerBI, Tableau or similar tools. Preferred: Knowledge and experience of Financial Systems Support (Access Dimensions) or ServiceNow Support … and Administration. Strong knowledge of using SQL, PowerBI, Tableau etc. Strong knowledge of using Python, PySpark, R, Scala etc. Experience of supporting IT Applications and/or Platforms. Experience of cloud data solutions (AWS, Google, Microsoft Azure), AWS preferred. Degree in Business Analytics or Technology, Computer Science, Math, Statistics …
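The "support data queries in SQL" half of the role above comes down to ANSI-SQL aggregations like the one below, shown with the stdlib `sqlite3` module as a runnable stand-in for Athena or T-SQL. The table and column names are hypothetical.

```python
import sqlite3

# In-memory database standing in for the real warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (client TEXT, amount REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?)",
                 [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)])

# Per-client totals: the bread and butter of finance-data support queries.
totals = dict(conn.execute(
    "SELECT client, SUM(amount) FROM invoices GROUP BY client ORDER BY client"))
print(totals)  # {'acme': 150.0, 'globex': 75.0}
```

The same `GROUP BY` query runs unchanged on Athena; only the connection object differs.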
requirements and deliver solutions that drive business value. Requirements: 7+ years in a data engineering role; excellent proficiency in SQL, Python, Microsoft Azure, Databricks and PySpark; experience managing a team. Details: Start date: ASAP. Duration: 3 months, with an option for permanent extension. Day rate: up to £400 (Ltd company), depending on experience. Annual …
Leicestershire, England, United Kingdom Hybrid / WFH Options
Harnham
continual improvement of their performance. REQUIRED SKILLS AND EXPERIENCE: Proficiency in Python, including its associated data and machine learning packages such as numpy, pandas, pyspark, matplotlib, scikit-learn, Keras, TensorFlow, and more. Experience working with object-oriented programming languages. Experience in forecasting & A/B testing. Leadership. Expertise in utilising SQL …
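The A/B-testing requirement above typically reduces to comparisons like a two-proportion z-test. A stdlib-only sketch (the conversion numbers are invented):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for H0: the two conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# 12% control conversion vs 15% variant conversion, 1,000 users each arm.
z = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 2))  # negative z: the variant converts better
```

In practice a library such as `scipy.stats` or `statsmodels` would supply the p-value, but the statistic itself is exactly this calculation.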
London (hybrid working) and is paying up to £100,000 per annum. Key skills required: Tools – Data Factory/Databricks/Python/PySpark/SQL/Power BI. Commercial experience of Kimball/Inmon data modelling. Knowledge of London Market insurance is highly desirable. Experience of Synapse beneficial …
data experience. Data inventory and data familiarisation. Efficient data ingestion and ingestion pipelines. Data cleaning and transformation. Databricks (ideally with Unity Catalog). Python and PySpark. CI/CD (ideally with Azure DevOps). Unit testing (pytest). If you have the above experience and are looking for a new contract role …
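"Unit testing (PyTest)" in roles like the one above usually means plain assert-based tests on the transformation logic. A sketch, with a hypothetical cleaning function; the test function is exactly the shape pytest discovers and runs:

```python
def strip_and_dedupe(values):
    """Trim whitespace and drop blanks/duplicates, preserving first-seen order."""
    seen, out = set(), []
    for v in values:
        v = v.strip()
        if v and v not in seen:
            seen.add(v)
            out.append(v)
    return out

def test_strip_and_dedupe():
    # Whitespace variants collapse to one entry; blanks are dropped.
    assert strip_and_dedupe([" a", "a ", "", "b"]) == ["a", "b"]

test_strip_and_dedupe()  # pytest would collect and run this automatically
```

Keeping the transformation a pure function is what makes it testable without spinning up a Spark session or a Databricks cluster.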
Python, PySpark, AWS, Oracle, Kafka, Banking. Become a key member of an agile team designing and delivering a market-leading, secure and scalable reporting product. The team is migrating from Oracle to AWS cloud, so experience with either is nice to have. You will serve as a seasoned member of … candidates able to work under these requirements without visa sponsorship can be considered at this time. Required skills: Either Python or PySpark experience. Ideally, you will be working as either a Python Developer (3+ years) or a Data Engineer using PySpark. Either AWS or Oracle. Candidates …
South East London, London, United Kingdom Hybrid / WFH Options
The Bridge (IT Recruitment) Limited
inside IR35 on a remote basis for an energy provider. The key skills required for this Senior Data Engineer role are: Azure, Python, Databricks, PySpark. If you have the required skills for this remote Senior Data Engineer contract, please apply.
understand consumers. Hands-on data engineering/development experience, preferably in a cloud/big data environment. Skilled in at least one of Python, PySpark, SQL or similar. Experience in guiding or managing roles in insight or data functions, delivering data projects and insight to inspire action and drive …
month contract. Essential skills: Insurance/Financial Services experience is highly desirable. Design efficient and scalable data models. Azure Databricks (preferably), SQL, Python, PySpark, dimensional/star-schema data modelling. Understanding of conceptual, logical and physical data models. Experience in ETL/ELT as well as Entity Relationship (ER …
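The dimensional/star-schema modelling asked for above means splitting flat records into dimension and fact tables joined by surrogate keys. A toy version in plain Python, with all table and column names invented for illustration:

```python
def to_star(flat_rows):
    """Split flat policy records into a client dimension and a fact table."""
    dim_client, facts = {}, []
    for row in flat_rows:
        name = row["client"]
        if name not in dim_client:
            # Assign surrogate keys in first-seen order.
            dim_client[name] = len(dim_client) + 1
        facts.append({"client_key": dim_client[name],
                      "premium": row["premium"]})
    return dim_client, facts

dim, facts = to_star([
    {"client": "acme", "premium": 900.0},
    {"client": "globex", "premium": 450.0},
    {"client": "acme", "premium": 300.0},
])
print(dim)    # two dimension rows with surrogate keys
print(facts)  # three fact rows referencing them
```

The fact table stores only the surrogate key, so client attributes live in one place; in SQL or PySpark this becomes a dimension table plus a keyed join.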
the Financial Services sector with experience in … statistical models, computer vision, predictive analytics, data visualisation, Large Language Models (LLMs), NLP, AI, machine learning, MLOps, Python, PySpark, Azure, Agile, MetaBase, then please apply. You can email your CV to matt@hawksworthuk.com or message me on LinkedIn. Ideally you'll have plenty of …
Must have experience of leading a team; must also be hands-on. Apache JMeter with Groovy scripting. TestCafe (Node Package Manager) – JavaScript/TypeScript. PySpark – Python and Spark. AWS Elastic Kubernetes Service (EKS) and Docker setup, administration and configuration. AWS Postgres & Grafana setup, administration and configuration to store and visualise …