A week in the Liverpool office, the rest remote. Senior Data Engineer: Data Modelling, Migration, ETL, ETL Tooling (Informatica IICS & IDMC), SQL, Python or PySpark, Agile, migrating data from on-prem to cloud. Informatica Cloud (IICS & IDMC) experience is essential, NOT PowerCenter. A top insurance firm is looking for a … e.g. Scrum, SAFe) and tools (e.g. Jira, Azure DevOps). Seniority Level: Mid-Senior level. Industry: Insurance, Financial Services. Employment Type: Full-time. Job Functions: Information …
Manchester Area, United Kingdom Hybrid / WFH Options
Vermelo RPO
between different packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW's Radar software is preferred. Proficient at …
West Midlands, Dudley, West Midlands (County), United Kingdom Hybrid / WFH Options
Concept Resourcing
standardise the Data Warehouse environment and solutions. Required Skills, Knowledge, and Experience: Experience designing, developing, and testing Azure Data Factory/Fabric pipelines, including PySpark Notebooks and Dataflow Gen2 workflows. Previous experience with Informatica PowerCenter is desired. Significant experience in designing, writing, editing, debugging, and testing SQL code …
and ensuring delivery. Required Skills/Experience: The ideal candidate will have the following: 1. Azure Synapse 2. Azure Data Factory 3. Microsoft Fabric 4. PySpark/Python. If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format. Disclaimer: …
a complete greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, DBT. What you need to bring: 1-3 years' experience in building data pipelines; SQL experience in data warehousing; Python experience would …
Greater Manchester, England, United Kingdom Hybrid / WFH Options
Blue Wolf Digital
to join their team. The primary focus of the role is Databricks data engineering. You will be building data pipelines using Databricks, coding using PySpark, and supporting internal applications. You will also be using Python for data transformations and working across the Azure Data Platform. Must have: strong Databricks …
STEM subject, e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. NEXT STEPS: If this role looks of interest, please reach out to Joseph Gregory. …
STEM subject, e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. HOW TO APPLY: Please register your interest by sending your CV to Kiran Ramasamy …
Nottingham, England, United Kingdom Hybrid / WFH Options
Harnham
STEM subject, e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. BENEFITS: Pension scheme, Gym Membership, Share options, Bonus, Hybrid working. HOW TO APPLY: Register …
Leicester, England, United Kingdom Hybrid / WFH Options
Harnham
STEM subject, e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. BENEFITS: Pension scheme, Gym Membership, Share options, Bonus, Hybrid working. HOW TO APPLY: Register …
Leicestershire, England, United Kingdom Hybrid / WFH Options
Harnham
continual improvement of their performance. REQUIRED SKILLS AND EXPERIENCE: Proficiency in Python, including its associated data and machine learning packages such as numpy, pandas, pyspark, matplotlib, scikit-learn, Keras, TensorFlow, and more. Experience working with object-oriented programming languages. Experience in forecasting and A/B testing. Leadership. Expertise in utilizing SQL …
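Several of these roles pair the Python data stack with A/B testing. As an illustrative sketch only (the function and all figures are invented, not from any listing), a two-proportion z-test of the kind used to read out an A/B experiment can be written with the standard library alone:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """z statistic for comparing two conversion rates (pooled standard error).
    |z| > 1.96 corresponds to significance at roughly the 5% level."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 120/2400 conversions in control, 150/2400 in variant
z = two_proportion_ztest(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(round(z, 2))  # ~1.88: suggestive, but short of the 1.96 threshold
```

In practice teams typically reach for scipy or statsmodels rather than hand-rolling this, but the arithmetic underneath is exactly the above.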
to the ideas and delivery of the strategy; Support data queries in SQL (T-SQL/ANSI SQL) and support data pipelines using PySpark/Python, Databricks and AWS (Athena, Glue, S3); Analyse data needs and coordinate new data requests and data change requests. Work with clients to … Pivot Charts). Experience of supporting Data Warehousing. Basic SQL experience and understanding of XML/JSON files. Basic knowledge/experience of either Python, PySpark, R, Scala etc. Experience using SQL, Power BI, Tableau or similar tools. Preferred: Knowledge and experience of Financial Systems Support (Access Dimensions) or ServiceNow Support … and Administration. Strong knowledge of SQL, Power BI, Tableau etc. Strong knowledge of Python, PySpark, R, Scala etc. Experience of supporting IT Applications and/or Platforms. Experience of cloud data solutions (AWS, Google, Microsoft Azure), AWS preferred. Degree in Business Analytics or Technology, Computer Science, Math, Statistics …
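As a hedged illustration of the ANSI-SQL query support this role describes (the table, columns, and figures are invented, and sqlite3 merely stands in for the warehouse engine):

```python
import sqlite3

# In-memory database standing in for the warehouse; schema is hypothetical
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (policy_id TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("P1", "North", 500.0), ("P2", "North", 300.0), ("P3", "South", 700.0)],
)

# An ANSI-SQL aggregate of the kind such data queries involve
rows = conn.execute(
    "SELECT region, SUM(amount) AS total, COUNT(*) AS n "
    "FROM claims GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 800.0, 2), ('South', 700.0, 1)]
```

The same GROUP BY query runs unchanged on T-SQL or Athena, which is why the ad lists the dialects together.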
Role: Graduate Data Engineer. Type: 12 months fixed-term. Location: Peterborough. Ready to utilise your skills to process and extract value from large datasets? Are you passionate about performing root cause analysis on various data? We have an exciting role …
the Financial Services sector with experience in … Statistical Models, Computer Vision, Predictive Analytics, Data Visualization, Large Language Models (LLMs), NLP, AI, Machine Learning, MLOps, Python, PySpark, Azure, Agile, MetaBase, then please apply. You can email your CV to matt@hawksworthuk.com or message me on LinkedIn. Ideally you'll have plenty of …
the insurance domain is advantageous. Education: A degree in Computer Science, Data Science, Engineering, or a related discipline. Technical Skills: Proficient in Python, SQL, PySpark, and Databricks. Demonstrated proficiency in modern NLP techniques and tools. Proven track record in developing and managing data quality metrics and dashboards. Experience collaborating …
applied machine learning, probability, statistics, and quantitative risk modelling. High proficiency in Python and SQL. Experience with big data technologies and tools, particularly Databricks and PySpark, is highly desirable. Experience in agile software development processes is a plus. Experience in insurance, cyber, or a related domain is ideal. Understanding of …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
experience of this is a strong preference; however, other cloud platforms like AWS/GCP are acceptable. Coding - Experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. What's next? If you believe you have the …
of this is a strong preference; however, other cloud platforms like AWS/GCP are acceptable. · Coding Languages - Experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. · Machine Learning and Data Science Tools - Any experience in …
Surrey, England, United Kingdom Hybrid / WFH Options
Hawksworth
that are comfortable with terms like: Statistical Models, Computer Vision, Predictive Analytics, Data Visualization, Large Language Models (LLMs), NLP, AI, Machine Learning, MLOps, Python, PySpark, and Azure. Flexible Working: The role is 60% work from home. Sponsorship: Sadly, sponsorship isn't available for these roles. About You: Bachelor's degree …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
of this is a strong preference; however, other cloud platforms like AWS/GCP are acceptable. • Coding Languages - Experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. Their lovely offices are based in the West Midlands …
West Sussex, England, United Kingdom Hybrid / WFH Options
Eden Smith Group
the insurance and technology sector, renowned for its outstanding working culture, in a Data Engineering role. The role will require someone who has strong PySpark and SQL experience alongside an Azure background. Key responsibilities: Develop low- to high-complexity, secure, governed, high-quality, efficient data pipelines from a variety of … and reporting & visualisation solutions in recognised BI tools such as Power BI. Skills required: Strong experience with SQL Server and/or Azure (Synapse, Data Factory). PySpark. Experience of designing and building end-to-end data solutions. A good understanding of the full data lifecycle. The ability to coach/mentor …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
Senior Data Engineer • Location: Belfast based – Hybrid, flexible working • Salary: £50,000 - £60,000 • Package: 10% bonus + 11% pension. Overview: One of the UK's leading digital solution providers is searching for a Senior Data Engineer to join …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
Data Engineer • Location: Belfast based – Hybrid, flexible working • Salary: £32,500 - £40,000 • Package: 10% bonus + 11% pension. Overview: One of the UK's leading digital solution providers is searching for a Data Engineer to join their practice …
Lead Data Engineer: We need some strong Data Engineer profiles… they need good experience with PySpark, Python, SQL, ADF and preferably Databricks experience. Job description: Building new data pipelines and optimizing data flows using the Azure cloud stack. Building data products from scratch. Supporting Business Analysts and Data Architects …