Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such as Spark, Hadoop, or Kafka, is a plus. Strong problem-solving skills and the ability to work in a collaborative team environment. Excellent verbal and written …
platform management roles, with 5+ years in leadership positions. Expertise in modern data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies (e.g., Spark, Kafka, Hadoop). Strong knowledge of data governance frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level …
Purview, or Informatica, including projects around lineage, cataloging, and quality rules. Strong hands-on development experience in SQL and Python, with working knowledge of Spark or other distributed data processing frameworks. Design, development, and implementation of distributed data solutions using API- and microservice-based architecture. Deep understanding of ETL …
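The "quality rules" this listing refers to can be illustrated with a minimal sketch. This is a plain-Python illustration of column-level completeness and uniqueness checks, not the API of any particular governance tool (Purview, Informatica, etc.); the function names, record shape, and thresholds are all assumptions for the example.

```python
# Minimal sketch of column-level data quality rules over plain dict
# records. Rule names and thresholds are illustrative only.

def completeness(rows, column):
    """Fraction of rows where `column` is present and non-null."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(column) is not None)
    return filled / len(rows)

def uniqueness(rows, column):
    """Fraction of non-null values in `column` that are distinct."""
    values = [r.get(column) for r in rows if r.get(column) is not None]
    if not values:
        return 0.0
    return len(set(values)) / len(values)

def run_rules(rows, rules):
    """Evaluate (column, metric, threshold) rules; return the failures."""
    failures = []
    for column, metric, threshold in rules:
        score = metric(rows, column)
        if score < threshold:
            failures.append((column, metric.__name__, score))
    return failures

if __name__ == "__main__":
    records = [
        {"id": 1, "email": "a@example.com"},
        {"id": 2, "email": None},
        {"id": 2, "email": "c@example.com"},  # duplicate id, missing email above
    ]
    print(run_rules(records, [
        ("id", uniqueness, 1.0),       # ids must all be distinct
        ("email", completeness, 0.9),  # at most 10% missing emails
    ]))
```

A real deployment would run such rules inside the pipeline (e.g., as a Spark or dbt step) and publish the results to the catalog alongside lineage metadata.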
architecture, data modelling, ETL/ELT processes, and data pipeline development. Competency with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark, Kafka). Excellent communication and leadership skills, with the ability to engage and influence stakeholders at all levels. Insightful problem-solving skills and …
London, South East England, United Kingdom Hybrid / WFH Options
Aventum Group
Python, SQL, T-SQL, SSIS. DB: Azure SQL Database, Cosmos DB, NoSQL. Methodologies: Agile, DevOps. Must-have Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure), and integration testing. Management Duties: Yes. We are an equal opportunity employer, and we are …
London, South East England, United Kingdom Hybrid / WFH Options
Methods
and lakehouse architectures. - Knowledge of DevOps practices, including CI/CD pipelines and version control (e.g., Git). - Understanding of big data technologies (e.g., Spark, Hadoop) is a plus.
Familiarity with SQL and database management systems (e.g., PostgreSQL, MySQL). Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data tools (e.g., Spark, Hadoop) is a plus. Prior experience in financial data analysis is highly preferred. Understanding of financial datasets, metrics, and industry trends. Preferred Qualifications: Experience with …
including at least 3 years in a leadership position. Deep hands-on expertise in modern data architecture, pipelines, and tooling (e.g., Airflow, DBT, Kafka, Spark, Python, SQL). Strong understanding of cloud infrastructure (AWS, GCP, or Azure) and scalable data systems. Familiarity with analytics and ML workflows, including model …
including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service, etc. Good knowledge of Python and Spark is required. Experience in ETL & ELT. Good understanding of one scripting language. Good understanding of how to enable analytics using cloud technology and ML …
London, South East England, United Kingdom Hybrid / WFH Options
Aventis Solutions
/Kubernetes), and Infrastructure-as-Code (Terraform). Cursor AI is widely used too. Data-driven: Experience working with big data systems like Databricks, Spark, Kafka, Snowflake, or BigQuery. Familiarity with batch and streaming ML pipelines is ideal. LLM-curious: Hands-on exposure to LLMs or interest in working …
London, South East England, United Kingdom Hybrid / WFH Options
IDEXX
environments with AI/ML components, or interest in learning data workflows for ML applications. Bonus if you have exposure to Kafka, Spark, or Flink. Experience with data compliance regulations (GDPR). What You Can Expect From Us Opportunity for annual bonuses Medical Insurance Cycle to …
London, South East England, United Kingdom Hybrid / WFH Options
Intec Select
data services. DataOps Knowledge: Experienced with CI/CD for data workflows, version control (e.g., Git), and automation in data engineering. Desirable: experience with Apache Spark; familiarity with machine learning frameworks and libraries; understanding of data governance and compliance; strong problem-solving and analytical skills; excellent communication and …
a focus on Snowflake and Databricks. Deep understanding of data warehousing, data lakes, and lakehouse architectures. Strong proficiency in SQL, Python, and Spark. Experience with ETL/ELT pipelines, data modeling, and data integration. Familiarity with cloud platforms such as AWS, Azure, or GCP. Knowledge …
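The ETL/ELT pipelines this listing asks for all share the same three stages. The sketch below shows the pattern in plain Python with SQLite for portability; the source records, table, and column names are invented for the example, and a production pipeline on Snowflake or Databricks would implement the same stages with that platform's tooling rather than this code.

```python
# Minimal ETL sketch using only the standard library: extract rows from
# a stand-in source, transform them, and load into SQLite.
import sqlite3

def extract():
    # Stand-in for reading from a source system (API, file, upstream DB).
    return [
        {"order_id": 1, "amount_pence": 1250, "country": "gb"},
        {"order_id": 2, "amount_pence": 990,  "country": "GB"},
    ]

def transform(rows):
    # Normalise units (pence -> pounds) and casing before loading.
    return [
        (r["order_id"], r["amount_pence"] / 100, r["country"].upper())
        for r in rows
    ]

def load(conn, rows):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(conn, transform(extract()))

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    run_pipeline(conn)
    print(conn.execute("SELECT order_id, amount, country FROM orders").fetchall())
    # → [(1, 12.5, 'GB'), (2, 9.9, 'GB')]
```

In an ELT variant, the raw rows would be loaded first and the `transform` step expressed as SQL inside the warehouse.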
strong degree in computer science or a relevant area. Excellent coding skills, specifically in Python. Commercial technical experience with tools such as Spark, Databricks, Airflow, and Docker is very desirable. Commercial containerisation and Infrastructure-as-Code experience. Previous work in a CI/CD environment. AWS is the preferred cloud platform …
London, South East England, United Kingdom Hybrid / WFH Options
Harnham
and relational database systems (e.g., PostgreSQL, MySQL) Exposure to cloud platforms such as AWS, Azure, or GCP Experience with big data tools such as Spark and Hadoop Previous experience working with financial data, including understanding of financial metrics and industry trends …
concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of regulatory data …
London, South East England, United Kingdom Hybrid / WFH Options
Harnham
Synapse, etc.). Advanced SQL skills, including performance tuning and query optimization. Strong Python programming skills. Experience with big data tools such as Hadoop, Spark, and Kafka. Proficiency in CI/CD processes and version control. Solid experience with Terraform and Infrastructure as Code (IaC). Experience with cloud …
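"Performance tuning and query optimization" in practice usually starts with reading the query plan. The sketch below demonstrates the idea with SQLite purely because it is portable and built into Python; the table, data, and index names are invented, and warehouses like Synapse expose analogous (but differently formatted) EXPLAIN output.

```python
# Query-plan-driven tuning sketch: the same query goes from a full
# table scan to an index lookup after the index is created.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable step in column 3.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT * FROM events WHERE user_id = 42"
print(plan(query))   # before: a full scan ("SCAN ...")

conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
print(plan(query))   # after: an index lookup ("SEARCH ... USING INDEX ...")
```

The exact plan strings vary by SQLite version, but the scan-to-search transition is the signal a tuner looks for.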
data technologies. Technical Skills: advanced machine learning and deep learning techniques; natural language processing; time series analysis and forecasting; reinforcement learning; big data technologies (Spark, Hadoop); cloud infrastructure and containerization (Docker, Kubernetes); version control and CI/CD practices …
compliance (GDPR, HIPAA, SOC 2). Expertise in AWS, Azure, GCP, Snowflake, Databricks, and big data processing frameworks. Proficiency in SQL, Python, Scala, Java, Spark, and data modelling. Client Engagement: Experience in agile project management, stakeholder engagement, and commercial negotiations. Leadership & Collaboration: Ability to scale consulting teams and …
with Unix/Linux environments and scripting • Familiar with data visualisation tools (e.g. QuickSight, Tableau, Looker, QlikSense) Desirable: • Experience with large-scale data technologies (Spark, Hadoop) • Exposure to microservices/APIs for data delivery • AWS certifications (e.g. Solutions Architect, Big Data Specialty) • Interest or background in Machine Learning This …
experience in machine learning frameworks, including architectural design and data platforms. Knowledge of cloud platforms (AWS, Azure, or GCP) and data engineering tools (e.g., Spark, Kafka). Exceptional communication skills, with the ability to influence technical and non-technical stakeholders alike. …
London, South East England, United Kingdom Hybrid / WFH Options
Undisclosed
concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of regulatory data …
London, South East England, United Kingdom Hybrid / WFH Options
Fortice
other systems. Create deployable data pipelines that are tested and robust, using a variety of techniques depending on the available technologies (NiFi, Spark). Build analytics tools that utilise the data pipeline to provide actionable insights into client requirements, operational efficiency, and other key business performance metrics. Complete …