z2bz0 years of total experience in DWBI, Big Data, and Cloud Technologies. Implementation experience and hands-on experience in at least two of the following cloud technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have: hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth … Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based solutions for Azure/AWS; Jenkins, Terraform, Stackdriver or any other DevOps tools.
City of London, London, United Kingdom Hybrid / WFH Options
Recruit with Purpose
which they modernise the use of their data. Overview of responsibilities in the role: Design and maintain scalable, high-performance data pipelines using Azure Data Platform tools such as Databricks (PySpark), Data Factory, and Data Lake Gen2. Develop curated data layers (bronze, silver, gold) optimised for analytics, reporting, and AI/ML, ensuring they meet performance, governance, and reuse standards. … clear technical documentation, including data dictionaries, architecture diagrams, lineage, and testing protocols. What they are looking for from you: a high level of experience with Microsoft Azure (Azure Data Factory, Databricks, Data Lake Gen2); experience building stable and accurate data pipelines; expertise with SQL and Python; familiarity with medallion architecture; 4+ years working in a fast-paced data engineering position …
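As a rough illustration of the bronze/silver/gold (medallion) pattern described above, here is a minimal sketch of a bronze-to-silver curation step on Databricks with PySpark and Delta Lake. The storage path, table and column names are assumptions for the example, not details from the role.

```python
# Hypothetical bronze -> silver curation step; paths, table and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied automatically in a Databricks notebook

# Read raw landed records from the bronze layer (Delta files in Data Lake Gen2).
bronze = spark.read.format("delta").load(
    "abfss://lake@exampleaccount.dfs.core.windows.net/bronze/orders"
)

# Basic cleansing and typing rules produce the curated silver layer.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Persist the curated layer as a Delta table ready for analytics, reporting and AI/ML.
(silver.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.orders"))
```

A gold layer would typically be built the same way, aggregating silver tables into reporting-ready models.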
capacity of AXA XL's Data and AI Platforms and cloud technologies. Our tech stack continues to evolve together with the Azure data and AI offering and relies on the Azure Databricks and Azure AI pillars. Additionally, AXA XL consumes the wider technology offering from AXA Group, such as managed OpenShift, VM and DevOps platforms. We use Scrum methodology. What will your … with a tangible track record of instituting change. Programming experience – ideally in Python, or open to using Python. Familiarity with all, and expertise in some, of the below: SQL, Databricks or Spark, MPP databases, data warehouse design, feature store design, Kubernetes, orchestration tools, monitoring tools, IaC, Docker, streaming technologies. Well-established experience as a Data Engineer/Software Engineer/…
Morley, Leeds, United Kingdom Hybrid / WFH Options
Evri, Inc
a numerate subject such as Analysis, Data Science, or Maths with specific analytical qualifications is essential. Previous experience as a Data Analyst or in a similar role is preferred. Experience in Databricks would be highly desirable but is not required. The right candidate will likely only have some of the following technical skills & experience (listed is our current software solution): … data visualisation tools (Power BI) and manipulating data (Python/Excel/VBA/SQL/Databricks/Spark); analytical models (Python: scikit-learn, imblearn, Prophet, XGBoost, ExtraTrees, KMeans clustering); web scraping (Python: Selenium, ChromeDriver, BeautifulSoup); and an understanding of databases (Oracle/SAP HANA/AWS). At Evri, we know we only grow if our people do too. That's why we …
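Purely as an illustration of one of the analytical models named in this listing (K-means clustering), a minimal scikit-learn sketch; the features and figures are invented for the example, not taken from the role.

```python
# Illustrative K-means segmentation; the feature table is made up for the sketch.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Example features; in practice these would come from SQL/Databricks extracts.
df = pd.DataFrame({
    "parcels_per_week": [12, 3, 45, 7, 30, 2],
    "avg_delivery_hours": [24, 48, 12, 36, 18, 60],
})

# Standardise so neither column dominates the distance metric.
X = StandardScaler().fit_transform(df)

# Fit a small K-means model and attach cluster labels back to the rows.
df["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(X)
print(df)
```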
departments, you will need a strong understanding of various financial products and the trading life cycle. The role: Advanced proficiency in SQL (T-SQL, PL/SQL, Databricks SQL); knowledge of the Kimball data modelling methodology; experience using scripting languages such as Python, PowerShell, etc.; experience with Microsoft Azure; strong knowledge of ETL/ELT tools and experience navigating …
will be a hybrid set-up, requiring 3 days a week in the London HQ. What are we looking for? Advanced proficiency in SQL (T-SQL, PL/SQL, Databricks SQL); experience using scripting languages such as Python, PowerShell, etc.; data modelling, cleansing and enrichment; experience with Microsoft Azure; strong knowledge of ETL/ELT tools and experience navigating data …
London, England, United Kingdom Hybrid / WFH Options
Maxwell Bond
Power BI (or similar tools); some Python knowledge; strong attention to detail and problem-solving skills; comfortable explaining data to non-technical stakeholders. Nice to have: experience with Azure, Databricks or ETL processes; understanding of data modelling concepts. What’s on offer: up to £40,000 salary; hybrid working – 3 days a week in the Manchester office; 25 days holiday …
methodologies (SAFe, Scrum, JIRA). Languages and frameworks: JSON, YAML, Python (advanced proficiency; Pydantic a bonus), SQL, PySpark, Delta Lake, Bash, Git, Markdown, Scala (bonus), Azure SQL Server (bonus). Technologies: Azure Databricks, Apache Spark, Delta Tables, data processing with Python, Power BI (data ingestion and integration), JIRA. Additional notes: candidates with current or past high-level security clearance are encouraged to apply. Successful …
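As a hedged sketch of the Delta Tables / Azure Databricks stack listed above, a minimal PySpark upsert (MERGE) into a Delta table; the table name and schema are assumptions for the example, not details from the role.

```python
# Hypothetical Delta table upsert; table name and columns are illustrative only.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incremental batch of changed records from an upstream source.
updates = spark.createDataFrame(
    [(1, "open"), (2, "closed")],
    ["ticket_id", "status"],
)

# Target Delta table registered in the metastore (name is an assumption).
target = DeltaTable.forName(spark, "ops.tickets")

# Upsert in a single ACID transaction: update matching rows, insert new ones.
(target.alias("t")
    .merge(updates.alias("u"), "t.ticket_id = u.ticket_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```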
London, England, United Kingdom Hybrid / WFH Options
Harnham
work end-to-end, making meaningful contributions within a small, agile team. Experience: We're looking for candidates with extensive experience in Data Engineering with a focus on Azure, Databricks, and Delta Lake; proficiency in Kubernetes, Infrastructure as Code, and Terraform; expertise in Azure DevOps and a commitment to best practices; a preference for simple, transparent solutions; and a drive …
Qualifications: Proven experience in a Data Engineer role. Expertise in Microsoft and Azure technologies: SQL Server, ADF, Logic Apps, Data Lake, Power Query, Fabric, Function Apps, Power Automate, Spark, Databricks, SSIS. Strong understanding of data modelling techniques, including normalisation and Kimball methodologies. Proficiency in SQL, and experience with programming languages such as Python or C#. Familiarity with low-code platforms …
City of London, London, United Kingdom Hybrid / WFH Options
Harnham
science or analytics projects with real business impact. • Strong background in data engineering, with solid experience working with cloud platforms and data tools. • Proficient in Python, SQL, and ideally Databricks or similar tools. • Willing to learn and work with tools like Power BI as needed. • Confident working directly with stakeholders across the business and translating complex concepts into practical solutions.
and/or data projects. Financial services industry and insurance knowledge/experience required; London Market knowledge is advantageous. Proficiency in testing and analysing pipelines using Azure Data Factory and Databricks to ensure accurate transformations. Experience creating a traceability matrix mapping test scenarios/test cases to requirements. Experience using test management tools such as Quality Center and VSTS. Excellent working knowledge …
Nuneaton, England, United Kingdom Hybrid / WFH Options
Hays
strong technical expertise and a passion for solving complex business problems. You'll bring: strong experience with SQL, SQL Server databases, Python, and PySpark; proficiency in Azure Data Factory, Databricks (a must) and Cloudsmith; a background in data warehousing and data engineering; solid project management capabilities; outstanding communication skills, translating technical concepts into clear business value; a collaborative, solution-oriented …
City of London, London, United Kingdom Hybrid / WFH Options
Intec Select
environments. Proficiency with SQL Server in high-transaction settings. Experience with either C# or Python/PySpark for data tasks. Hands-on knowledge of Azure cloud services, such as Databricks, Event Hubs, and Function Apps. Solid understanding of DevOps principles and tools like Git, Azure DevOps, and Terraform. Versatility in working across a wide range of data technologies and solving …