Skills & Experience:
· Proven experience in senior data engineering roles, preferably within regulated industries
· Expertise in SQL, Snowflake, DBT Cloud, and CI/CD pipelines (Azure DevOps)
· Hands-on with ETL tools (e.g. Matillion, SNP Glue, or similar)
· Experience with AWS and/or Azure platforms
· Solid understanding of data modelling, orchestration, and warehousing techniques
· Strong communication, mentoring, and stakeholder engagement
South West London, London, United Kingdom Hybrid / WFH Options
JAM Recruitment Ltd
working with modern orchestration tools, and applying best practices in security and compliance, this role offers both technical depth and impact.
Key Responsibilities
Design & Optimise Pipelines - Build and refine ETL/ELT workflows using Apache Airflow for orchestration.
Data Ingestion - Create reliable ingestion processes from APIs and internal systems, leveraging tools such as Kafka, Spark, or AWS-native services.
Cloud …
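For flavour, the orchestration pattern this listing describes — Airflow pulling from an API and handing the result to a downstream loader — can be sketched in a few lines. Everything below (the endpoint, schedule, and task names) is an assumption for illustration, not detail from the advert.

```python
# Minimal sketch of an Airflow DAG: extract records from a hypothetical
# internal API, then hand them to a load step. All names are placeholders.
from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def api_ingestion():
    @task
    def extract() -> list[dict]:
        # Pull one page of records from an assumed internal endpoint.
        resp = requests.get("https://internal.example.com/api/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load(records: list[dict]) -> None:
        # A real pipeline would publish to Kafka, S3, or a warehouse here.
        print(f"Fetched {len(records)} records")

    load(extract())


api_ingestion()
```

In practice the load task would be replaced by a Kafka producer, an S3 write, or a warehouse COPY, and the extract would page through the API with retries.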
and star schema design
· Hands-on experience with SQL Server and Snowflake, including their architecture, features and best practices
· Familiarity with data integration tools (SSIS, ADF) and techniques (ELT, ETL)
· Experience with reporting and analytical tools such as SSAS, SSRS or Power BI
· Experience of working with risk systems within a financial institution
· Track record of application ownership including familiarity …
Strong understanding of relational databases and SQL (PostgreSQL, Snowflake, BigQuery, etc.).
Familiarity with cloud platforms (AWS, GCP, or Azure).
Experience working with large-scale datasets and complex ETL/ELT processes.
Excellent problem-solving and communication skills.
Preferred Qualifications:
Telecom domain experience is highly desirable.
Experience with version control systems (e.g., Git) and CI/CD tools.
Knowledge …
London, South East, England, United Kingdom Hybrid / WFH Options
FDM Group
and ensure technical delivery aligns with business needs
Translate business requirements into logical and physical data models that support scalable, high-performance analytics and reporting solutions
Build and automate ETL/ELT pipelines using tools such as Azure Data Factory, as well as scripting with Python and PowerShell
Use Power BI to create dashboards and data visualisations, focusing on usability …
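As a hedged sketch of the Python scripting side of this kind of work (the azure-identity and azure-mgmt-datafactory packages are assumed, and the subscription, resource group, factory, and pipeline names are invented), triggering and polling an Azure Data Factory pipeline run might look like:

```python
# Sketch: trigger an Azure Data Factory pipeline run from Python and poll its
# status. Subscription, resource group, factory, and pipeline names are
# placeholders, not details from the role.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")

run = adf.pipelines.create_run(
    resource_group_name="rg-analytics",   # hypothetical resource group
    factory_name="adf-analytics",         # hypothetical data factory
    pipeline_name="load_sales_data",      # hypothetical pipeline
    parameters={"load_date": "2024-01-01"},
)

# Poll until the run finishes (simplified; production code would add timeouts).
while True:
    status = adf.pipeline_runs.get("rg-analytics", "adf-analytics", run.run_id).status
    if status in ("Succeeded", "Failed", "Cancelled"):
        print(f"Pipeline finished with status: {status}")
        break
    time.sleep(30)
```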
best practices.
* Proficiency in SAP Data Services, SAP HANA, SAP BW, and SAP Analytics Cloud.
* Experience in SAP Cloud Platform.
* Expertise in SAC architecture for data integration.
* Skilled in ETL processes, data modelling, and data validation.
* Programming knowledge in any one of Python, SQL, or Java.
* Hands-on and 'can do' attitude.
* Excellent leadership, communication, and stakeholder management skills.
* Good …
SAP Data Services, SAP HANA, SAP BW, Datasphere and SAP Analytics Cloud.
* Experience in SAP Cloud Platform.
* Expertise in DSP frameworks and SAC architecture for data integration.
* Skilled in ETL processes, data modelling, and data validation.
* Programming knowledge in any one of Python, SQL, or Java.
* Hands-on and 'can do' attitude.
* Excellent leadership, communication, and stakeholder management skills.
* Good …
Databricks. Your focus will be on enabling analytics and machine learning at scale, using best-in-class tools across the Azure stack. Your responsibilities will include:
Designing and deploying ETL pipelines using PySpark and Delta Lake on Databricks.
Supporting the deployment and operationalisation of ML models with MLflow and Databricks Workflows.
Building out reusable data products and feature stores for …
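A pared-down example of the PySpark/Delta Lake step named here might look like the following; the paths, columns, and partitioning are assumptions made purely for illustration.

```python
# Sketch of a small ETL step on Databricks: read raw JSON, apply a light
# transformation, and write the result as a Delta table. Paths and column
# names are placeholders only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw = spark.read.json("/mnt/raw/events/")   # hypothetical landing zone
cleaned = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

(cleaned.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("event_date")
        .save("/mnt/curated/events/"))      # hypothetical curated layer
```

Writing the curated layer as a Delta table keeps downstream analytics and feature-store reads transactional and leaves room for MERGE-based incremental loads later.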
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
understanding of query optimisation and performance tuning.
Strong understanding of data warehousing concepts and dimensional data modelling.
Experience integrating OBIEE with Oracle and non-Oracle data sources.
Familiarity with ETL processes and tools (e.g., ODI, Informatica) is a plus.
Experience with Oracle BI Publisher is a bonus.
Excellent analytical, problem-solving, and communication skills.
Company
Market-leading financial services organisation
data pipeline development, resource deployment, and ongoing optimisation of cloud-native systems. Your responsibilities will include:
Designing and implementing scalable data lakehouse architectures using Databricks on Azure.
Building efficient ETL/ELT pipelines for structured and unstructured data.
Working with stakeholders to ensure high-quality, accessible data delivery.
Optimising SQL workloads and data flows for analytics performance.
Automating infrastructure deployment
London, South East, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
Ensure security and compliance with DV-level clearance standards
Skills & Experience:
Current DV clearance (essential)
Proven experience working with Palantir Foundry in complex environments
Strong skills in data engineering, ETL processes, and data modelling
Proficiency in relevant programming/scripting languages (e.g. Python, SQL)
Experience working with large-scale datasets in secure environments
Strong problem-solving skills and stakeholder engagement
and business rules
Experience working with complex datasets from multiple sources (e.g., SQL, Excel, cloud platforms)
Ability to communicate insights effectively to technical and non-technical stakeholders
Familiarity with ETL processes, data governance, and data quality frameworks
Excellent problem-solving skills and attention to detail
Reasonable Adjustments: Respect and equality are core values to us. We are proud of the …
GenAI technologies (e.g., OpenAI, Hugging Face, LangChain, RAG pipelines).
Strong programming skills in Python and SQL, and familiarity with cloud platforms (Azure, AWS, GCP).
Expertise in data architecture, ETL/ELT pipelines, and distributed computing frameworks (e.g., Spark, Databricks).
Excellent communication, stakeholder management, and team leadership skills.
Nice to have (advantageous):
Experience with MLOps, CI/CD for …
in troubleshooting and supporting B2B enterprise applications.
Advanced knowledge and hands-on experience with Spark.
Experience building, maintaining, and debugging DBT pipelines.
Strong proficiency in developing, monitoring, and debugging ETL jobs.
Deep understanding of SQL and experience with Databricks, Snowflake, BigQuery, Azure, Hadoop, or CDP environments.
Hands-on technical support experience, including escalation management and adherence to SLAs.
Familiarity with …
ownership. Work with data infrastructure to triage infra issues and drive to resolution.
Skills/Experience:
5+ years' experience in the data warehouse space.
5+ years' experience in custom ETL design, implementation and maintenance.
5+ years' experience with programming languages (Python or Java), Python preferred.
5+ years' experience in writing efficient SQL statements.
Experience working for a large technology company
Central London, London, England, United Kingdom Hybrid / WFH Options
Red Personnel
PowerPoint) and familiarity with Microsoft 365 tools such as SharePoint, Teams, and OneDrive.
Hands-on experience administering Dynamics 365 and developing with Power Apps and Power Automate.
Knowledge of ETL tools like SSIS and strong MS SQL skills.
Ability to write clear User Acceptance Test scripts and produce technical documentation that’s easy to understand.
Comfortable working within Agile project …
be considered for this role you will need to have -
• Experience with data management and data architectures (SQL/NoSQL), database systems, modern data warehouse platforms (Snowflake, Databricks, BigQuery), ETL/ELT pipeline development, and data lake/warehouse implementations for integrating structured and unstructured laboratory and sequencing data sources.
• Mastery of full-stack technologies, such as Python, JavaScript/TypeScript …
London, South East, England, United Kingdom Hybrid / WFH Options
Next Best Move
and related technologies.
Skills and Experience
Proven experience administering Microsoft Dynamics 365 and Power Platform.
Strong skills in Power Apps, Power Automate, MS SQL, and Business Central.
Familiarity with ETL tools (e.g. SSIS), C#, .NET, and VBA.
Experience in Agile project environments and technical documentation.
Excellent organisational, communication, and stakeholder engagement skills.
Qualifications
Degree in a relevant field or equivalent …
product). Mentor engineers and contribute to institutional data strategy and standards.
Tech Requirements
Strong Python skills; experience with Airflow, Luigi, or Spark.
Proven track record building robust, scalable ETL/ELT systems.
Experience with scientific, semi-structured, or biodiversity data.
Familiarity with semantic web standards (e.g. Darwin Core, JSON-LD, Linked Data).
Knowledge of NoSQL/graph databases
reduce cloud compute and storage costs.
· Guide engineering teams on choosing the right execution strategies across AWS, GCP, and Azure.
· Provide subject matter expertise on using AWS Glue for ETL workloads and integration with S3 and other AWS-native services.
· Implement observability tooling for logs, metrics, and error handling to support monitoring and incident response.
· Align implementations with InfoSum's …
reports, and visualisations to communicate findings clearly and support data-driven decision-making.
Leverage CRM data (via Salesforce) for segmentation, campaign performance tracking, and reporting use cases.
Contribute to ETL/ELT workflows and data transformation processes, ensuring robust and efficient pipelines.
Ensure data integrity through rigorous validation to maintain accuracy, completeness, and reliability in reporting.
Support data governance and …
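To illustrate the validation step mentioned above, a minimal completeness and integrity check over a CRM extract might look like this; the column names, rules, and source file are assumptions rather than details from the role.

```python
# Minimal sketch of a pre-reporting validation step: check completeness and
# basic integrity of a CRM extract. Column names and rules are assumptions.
import pandas as pd


def validate_extract(df: pd.DataFrame) -> list[str]:
    issues = []
    if df["account_id"].isna().any():
        issues.append("null account_id values found")
    if df.duplicated(subset=["account_id", "campaign_id"]).any():
        issues.append("duplicate account/campaign rows found")
    if (df["revenue"] < 0).any():
        issues.append("negative revenue values found")
    return issues


# Example usage (hypothetical source file):
# extract = pd.read_csv("salesforce_export.csv")
# problems = validate_extract(extract)
# if problems:
#     raise ValueError("; ".join(problems))
```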
Alexander Mann Solutions - Public Sector Resourcing
and integration best practices.
· Collaborate with cross-functional teams to map data flows and ensure accurate data transformation.
· Evaluate and implement integration tools (API gateways, message brokers, ETL platforms).
· Ensure secure data exchange through robust authentication, encryption, and access control mechanisms.
· Oversee integration testing, deployment, and post-implementation support.
· Maintain documentation for integration architecture