Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
LV= General Insurance
experience of writing clean, well-documented, and unit-tested Python ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, Data Factory. Azure is preferable, but AWS and GCP experience more »
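The read, transform, and verify shape described above can be sketched minimally in plain Python, with the stdlib sqlite3 module standing in for the SQL source; all table and column names here are hypothetical, not taken from the listing:

```python
import sqlite3

# -- Extract: read raw rows from a SQL source (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_policies (id INTEGER, premium_pence INTEGER)")
conn.executemany(
    "INSERT INTO raw_policies VALUES (?, ?)",
    [(1, 10000), (2, 25050), (3, None)],
)
rows = conn.execute("SELECT id, premium_pence FROM raw_policies").fetchall()

# -- Transform: convert pence to pounds, dropping incomplete records.
transformed = [(pid, pence / 100) for pid, pence in rows if pence is not None]

# -- Verify: basic data-quality checks before loading onward.
assert all(premium >= 0 for _, premium in transformed), "negative premium"
assert len(transformed) == 2  # exactly one incomplete record was dropped

conn.close()
```

In a real PySpark pipeline the extract and transform steps would be DataFrame operations, but the read/transform/verify structure is the same.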
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
and experience of this is a strong preference. However other Cloud platforms like AWS/GCP are acceptable. Coding - Experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. What’s next? If you believe you have the desired more »
candidate needs experience and confidence to speak up when several teams can be involved in delivering the project. There is a massive emphasis on PySpark and Databricks for this particular role. Technical Skills Required: Azure (ADF, Functions, Blob Storage, Data Lake Storage, Azure Databricks), Databricks, Spark, Delta Lake, SQL, Python, PySpark, ADLS. Day To Day Responsibilities more »
cloud platforms. Expertise on the Azure Cloud platform. Knowledge of orchestrating workloads on cloud. Ability to set and lead the technical vision while balancing business drivers. Strong experience with PySpark and Python programming. Proficiency with APIs, containerization and orchestration is a plus. Qualifications: Bachelor's and/or master’s degree. About you: You are self-motivated more »
the Financial Services with experience in… Statistical Models, Computer Vision, Predictive Analytics, Data Visualization, Large Language Models (LLM), NLP, AI, Machine Learning, MLOps, Python, PySpark, Azure, Agile, MetaBase, then please apply. You can email your CV to matt@hawksworthuk.com or message me on LinkedIn. Ideally you'll have plenty of more »
Leicester, England, United Kingdom Hybrid / WFH Options
Harnham
STEM subject e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. BENEFITS: Pension scheme, Gym Membership, Share options, Bonus, Hybrid working. HOW TO APPLY: Register more »
Nottingham, England, United Kingdom Hybrid / WFH Options
Harnham
STEM subject e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. BENEFITS: Pension scheme, Gym Membership, Share options, Bonus, Hybrid working. HOW TO APPLY: Register more »
South East London, England, United Kingdom Hybrid / WFH Options
Durlston Partners
on building a massively distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including: End to End more »
or customer experience-related problems such as workforce management, demand forecasting, or root cause analysis. Strong visualisation skills including experience with Tableau. Familiarity with Databricks and PySpark for data manipulation and analysis. Familiarity with Git-based source control methodologies, including branching and pull requests. A self-starter, passionate about converting data into actionable more »
following: Minimum of 4 years commercial experience. Strong proficiency in AWS and relevant technologies. Good communication skills. Preferably experience with some or all of the following: PySpark, Kubernetes, Terraform. Experience in a start-up/scale-up or fast-paced environment is desirable. HOW TO APPLY: Please register your interest by sending your CV more »
not received on time. Communicating outages with the end users of a data pipeline. What We Value: Comfortable reading and writing code in Python, PySpark and Java. Basic understanding of Spark and interested in learning the basics of tuning Spark jobs. Data pipeline monitoring team members should be able more »
Job Description: Lead Data Engineer: We need some strong Data Engineer profiles… they need good experience with PySpark, Python, SQL, ADF and preferably Databricks experience. Job description: Building new data pipelines and optimizing data flows using the Azure cloud stack. Building data products more »
in Azure Cloud technologies e.g., MLOps, ML Flow, Azure Data Factory, Azure Function, Databricks, Event Hub, Microservices/API, Python/PySpark/R, or SQL. 2+ years of experience designing, developing, and implementing Big Data platforms using Azure Cloud architecture with structured and unstructured data more »
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
experience of this is a strong preference. However other Cloud platforms like AWS/GCP are acceptable. Coding - Experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. What’s next? If you believe you have the more »
Cycle · Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or Relational DBs · Experience in the Financial Service Industry is more »
Surrey, England, United Kingdom Hybrid / WFH Options
Hawksworth
that are comfortable with terms like: Statistical Models, Computer Vision, Predictive Analytics, Data Visualization, Large Language Models (LLM), NLP, AI, Machine Learning, MLOps, Python, PySpark, and Azure. Flexible Working: The role is 60% work from home. Sponsorship: Sadly sponsorship isn't available for these roles. About You: Bachelor's Degree more »
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
of this is a strong preference. However other Cloud platforms like AWS/GCP are acceptable. • Coding Languages - Experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. Their lovely offices are based in the West Midlands more »
Leicestershire, England, United Kingdom Hybrid / WFH Options
Harnham
continual improvement of their performance. REQUIRED SKILLS AND EXPERIENCE: Proficiency in Python, including its associated data and machine learning packages such as NumPy, pandas, PySpark, Matplotlib, scikit-learn, Keras, TensorFlow, and more. Experience working with object-oriented programming languages. Experience in Forecasting & A/B testing. Leadership. Expertise in utilizing SQL more »
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
Senior Data Engineer • Location: Belfast based – Hybrid - flexible working • Salary: £50,000 - £60,000 • Package: 10% bonus + 11% pension Overview One of the UK’s leading digital solution providers is searching for a Senior Data Engineer to join more »
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
Data Engineer • Location: Belfast based – Hybrid - flexible working • Salary: £32,500 - £40,000 • Package: 10% bonus + 11% pension Overview One of the UK’s leading digital solution providers is searching for a Data Engineer to join their practice more »
months to begin with & it's extendable. Location: Leeds, UK (min 3 days onsite) Context: Legacy ETL code, for example DataStage, is being refactored into PySpark using Prophecy low-code/no-code and available converters. Converted code is causing failures/performance issues. Skills: Spark Architecture – component understanding around Spark … Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, Spark Explain plans. Spark SME – Be able to analyse Spark code failures through Spark Plans and make correcting recommendations. Spark SME – Be able to review PySpark and Spark SQL jobs and make performance improvement recommendations. Spark SME – Be able … are Cluster level failures. Cloudera (CDP) – Knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Prophecy – High-level understanding of the low-code/no-code Prophecy setup and its use to generate PySpark code. more »
Bournemouth, Dorset, South West, United Kingdom Hybrid / WFH Options
LV= General Insurance
experience of writing clean, well-documented, and unit-tested Python ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, Data Factory. Azure is preferable, but AWS and GCP experience more »
V2, Azure Databricks, Azure Function Apps & Logic Apps, Azure Stream Analytics, Azure Resource Manager skills (Terraform, Azure Portal, Az CLI and Az PowerShell). Strong PySpark, Delta Lake, Unity Catalog and Python skills. Includes ability to write unit and integration tests in Python with unittest, pytest, etc. Strong understanding of more »
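On the unit-testing point above, a common pattern is to factor row-level transformation logic into plain Python functions so it can be covered by unittest without a Spark session; the function and schema below are hypothetical sketches, not the employer's codebase:

```python
import unittest

def normalise_premium(row: dict) -> dict:
    """Hypothetical transform: tidy the product code and
    convert the premium from pence to pounds."""
    return {
        "product": row["product"].strip().upper(),
        "premium_gbp": row["premium_pence"] / 100,
    }

class NormalisePremiumTest(unittest.TestCase):
    def test_tidies_and_converts(self):
        out = normalise_premium({"product": " motor ", "premium_pence": 12345})
        self.assertEqual(out, {"product": "MOTOR", "premium_gbp": 123.45})

# exit=False so the block can run inside a larger script or notebook.
unittest.main(argv=["normalise"], exit=False)
```

The same function can then be applied inside a PySpark job (e.g. via a UDF or mapped over rows), while integration tests exercise the full pipeline against a small local DataFrame.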
Lead Data Engineer: We need some strong Data Engineer profiles… they need good experience with PySpark, Python, SQL, ADF and preferably Databricks experience. Job description: Building new data pipelines and optimizing data flows using the Azure cloud stack. Building data products from scratch. Support Business Analysts and Data Architects more »