Knowledge of Spark architecture and modern Data Warehouse/Data Lake/Lakehouse techniques. Build transformation tables using SQL. Moderate-level knowledge of Python/PySpark or an equivalent programming language. Power BI Data Gateways and Dataflows, permissions. Creation, utilisation, optimisation and maintenance of relational SQL and NoSQL databases. Experienced working with more »
related field. Certifications such as Azure Data Engineer Associate are desirable. Knowledge of data ingestion methods for real-time and batch processing. Proficiency in PySpark and debugging Apache Spark workloads. What’s in it for you? Annual bonus scheme – up to 10%. Excellent pension scheme. Flexible working. Enhanced family more »
Basingstoke, England, United Kingdom Hybrid / WFH Options
Blatchford
with various databases, e.g. MS SQL, Azure Cosmos DB. Skilled at optimizing large and complex SQL statements. Proficiency in Python and experience with PySpark. Experience using CI/CD, Microsoft Azure, and Azure DevOps in an agile environment. Knowledge of Azure ETL services, e.g. Data Factory, Synapse more »
the insurance domain is advantageous. Education: A degree in Computer Science, Data Science, Engineering, or a related discipline. Technical Skills: Proficient in Python, SQL, PySpark, and Databricks. Demonstrated proficiency in modern NLP techniques and tools. Proven track record in developing and managing data quality metrics and dashboards. Experience collaborating more »
West Midlands, Dudley, West Midlands (County), United Kingdom Hybrid / WFH Options
Concept Resourcing
standardise the Data Warehouse environment and solutions. Required Skills, Knowledge, and Experience: Experience designing, developing, and testing Azure Data Factory/Fabric pipelines, including PySpark Notebooks and Dataflow Gen2 workflows. Previous experience with Informatica PowerCenter is desired. Significant experience in designing, writing, editing, debugging, and testing SQL code more »
Cycle · Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational databases · Experience in the Financial Services industry is more »
applied machine learning, probability, statistics, and quantitative risk modelling. High proficiency in Python & SQL. Experience with big data technologies and tools, particularly Databricks and PySpark, is highly desirable. Experience in agile software development processes is a plus. Experience in insurance, cyber, or a related domain is ideal. Understanding of more »
Exeter, Devon, South West, United Kingdom Hybrid / WFH Options
Staffworx Limited
Ability to optimise workflows and analysis for MapReduce processing. Experience with BI software (Power BI, Tableau, Qlik Sense). Any experience with data engineering, PySpark, Databricks, Delta Lakes beneficial. Confident presenting complex problems in ways suitable to the target audience. Experience leading or managing a small analysis team. Familiarity working more »
Saffron Walden, Essex, South East, United Kingdom Hybrid / WFH Options
EMBL-EBI
expertise and requirements. You have: A BSc or MSc in computer science or a related field. Expertise in Python, including popular libraries (NumPy, Pandas, PySpark) and frameworks (Django, Django REST Framework, FastAPI). Hands-on experience with both relational (e.g. PostgreSQL) and non-relational databases (e.g. Elasticsearch, Redis). Strong more »
Surrey, England, United Kingdom Hybrid / WFH Options
Hawksworth
that are comfortable with terms like: Statistical Models, Computer Vision, Predictive Analytics, Data Visualization, Large Language Models (LLM), NLP, AI, Machine Learning, MLOps, Python, PySpark, and Azure. Flexible Working: The role is 60% work from home. Sponsorship: Sadly, sponsorship isn't available for these roles. About You: Bachelor's degree more »
Proficiency in modern programming languages and database querying languages. Comprehensive understanding of the Software Development Life Cycle and Agile methodologies. Familiarity with Python or PySpark and cloud technologies like AWS, Kubernetes, and Spark is highly desirable. Ideal Candidate: Someone with a knack for innovative solutions and a commitment to more »
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Experian Ltd
Glue and SageMaker. Infrastructure-as-Code tools and approaches (we use the AWS CDK with CloudFormation). Data processing frameworks such as pandas, Spark and PySpark. Machine learning concepts like model training, model registry, model deployment and monitoring. Development and CI/CD tools (we use GitHub, CodePipeline and CodeBuild more »
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
of this is a strong preference. However, other cloud platforms like AWS/GCP are acceptable. • Coding Languages - Experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. Their lovely offices are based in the West Midlands more »
STEM subject, e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. NEXT STEPS: If this role looks of interest, please reach out to Joseph Gregory. more »
Nottingham, England, United Kingdom Hybrid / WFH Options
Harnham
STEM subject, e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. BENEFITS: Pension scheme, Gym Membership, Share options, Bonus, Hybrid working. HOW TO APPLY: Register more »
Leicester, England, United Kingdom Hybrid / WFH Options
Harnham
STEM subject, e.g. Mathematics, Statistics or Computer Science. Experience in personalisation and segmentation with a focus on CRM. Retail experience is a bonus. Experience with PySpark/Azure/Databricks is a bonus. Experience of management. BENEFITS: Pension scheme, Gym Membership, Share options, Bonus, Hybrid working. HOW TO APPLY: Register more »
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
contract position. If you possess a solid background in software application development, with experience in cloud or microservice architecture, and proficiency in Python or PySpark, we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS more »
Bristol, England, United Kingdom Hybrid / WFH Options
Adecco
experience in developing data ingestion pipelines with advanced ML elements. Degree in Computer Science, Data Science, Engineering, or a related field. Proficient in Python, SQL, PySpark, and Databricks. Experience with modern NLP techniques and tools. Familiarity with Git for version control. Knowledge of the insurance sector is advantageous but not … Expertise in applied machine learning, probability, statistics, and quantitative risk modelling. High proficiency in Python and SQL, with experience in big data technologies (Databricks, PySpark). Experience in the insurance, cyber, or a related domain preferred. Strong problem-solving and communication skills. Preferred Skills (Nice to Have): Knowledge of Monte Carlo more »
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
Senior Data Engineer • Location: Belfast based – Hybrid, flexible working • Salary: £50,000 - £60,000 • Package: 10% bonus + 11% pension. Overview: One of the UK’s leading digital solution providers is searching for a Senior Data Engineer to join more »
Guildford, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
Data Engineer. We're partnering with a global financial services institution that is embarking on an ambitious project: to build an industry-leading data platform. The long-term vision is to migrate all data within the coming years more »
Job Description for Data Engineer - Coding skills: PySpark or Scala (at a level sufficient to troubleshoot issues) and SQL. Must have good PySpark knowledge to build data transformations. Hands-on experience with Azure data platforms, e.g. Azure Data Factory, Databricks, Synapse, SQL, Data Lake Storage. Experience building and optimizing ‘big data’ data pipelines, architectures more »
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
Data Engineer • Location: Belfast based – Hybrid, flexible working • Salary: £42,500 - £50,000 • Package: 10% bonus + 11% pension. Overview: One of the UK’s leading digital solution providers is searching for a Data Engineer to join their practice more »
worked as a senior data engineer on a data lake/lakehouse architecture platform and has hands-on experience of developing data pipelines using PySpark and SQL. The successful individual will have working experience of the AWS data ecosystem and toolsets, with a passion to learn and explore new technology … with hands-on delivery. Working experience of cloud technologies and data ecosystems; AWS is preferable. Expert in developing data pipelines based on PySpark and SQL with orchestration tools such as Airflow, Tivoli, etc. Knowledge of OLAP implementation on any database technology (Snowflake preferred). Excellent understanding of Data more »
I have placed quite a few candidates with this organisation now; all have given glowing reviews about it. They're the leader in legal representation comparison and have rather interesting, unique data to work with. They are using modern Azure more »