Greater London, England, United Kingdom Hybrid/Remote Options
Understanding Recruitment NFP
management Solid hands-on C#/.NET skills for APIs, automation, and custom tooling Understanding of data flows, system integrations, and automation best practices Experience with SSIS or similar ETL tools (desirable, not essential) Contract: Permanent Salary: £44,000 – £48,000 per annum Location: Hybrid – 2 days per week in London
City of London, London, United Kingdom Hybrid/Remote Options
Tact
days holiday plus bank holiday Flexible, hybrid working from central London And much more What do we need from you? PyTorch, Python, understanding of the machine learning ecosystem AWS ETL processes Good understanding of vector databases (they use OpenSearch) Proven experience using LLMs through APIs Sound like you? No CV is needed at this stage - we can cross that …
Data/Commercial Analyst - large volumes of data & ETL London Hybrid (2-3 days per week in office) Salary £55,000 to £65,000 negotiable dependent on skills & experience Job Reference J13022 Founded over a decade ago, this organisation is an International Digital Solutions provider that offers a range of Hosting related services, including web hosting, domain registration, VPS, complex …
takes pride in engineering fundamentals, thrives in a small team, and wants to see the commercial impact of their work. Key Responsibilities Develop, maintain, and enhance data pipelines and ETL processes using Python and SQL. Manage and integrate API connections and FTP data feeds into internal systems. Build and support dashboards and reports to provide visibility across trading and operations. … Maths, Physics). 2–4 years’ experience in a technical, data, or engineering-focused role. Strong skills in Python, SQL, and Excel/VBA. Experience building or maintaining ETL/data pipelines, particularly around APIs or FTP processes. Working knowledge of Microsoft Azure and Git. Excellent analytical, communication, and problem-solving skills. A proactive, curious mindset and a …
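The role above centres on Python/SQL ETL pipelines fed by API connections and FTP drops. As a minimal illustrative sketch of that pattern (all field names, the dedup rule, and the in-memory SQLite target are hypothetical, not taken from the listing):

```python
import sqlite3

# Hypothetical records as they might arrive from an API or FTP feed
# (field names are illustrative only).
raw_rows = [
    {"trade_id": "T1", "qty": "10", "price": "2.50"},
    {"trade_id": "T2", "qty": "4", "price": "3.00"},
    {"trade_id": "T2", "qty": "4", "price": "3.00"},  # duplicate to be dropped
]

def transform(rows):
    """Deduplicate on trade_id and cast numeric fields from strings."""
    seen, out = set(), []
    for r in rows:
        if r["trade_id"] in seen:
            continue
        seen.add(r["trade_id"])
        out.append((r["trade_id"], int(r["qty"]), float(r["price"])))
    return out

def load(rows, conn):
    """Load transformed rows into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades "
        "(trade_id TEXT PRIMARY KEY, qty INTEGER, price REAL)"
    )
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_rows), conn)
total_qty = conn.execute("SELECT SUM(qty) FROM trades").fetchone()[0]  # 14
```

In a production version of this pattern, the hard-coded list would be replaced by an HTTP or FTP fetch and the SQLite target by the warehouse the role describes.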
and other relevant standards. Evaluate, select, and implement data governance tools (e.g. data catalogues, data quality tools). Lead the deployment of modern data platform technologies including Data Warehouses, ETL services, and cloud-based solutions. Promote a data-driven culture through organization-wide training and awareness initiatives. Develop and deliver role-specific training for data owners, custodians, and other key … and physical data modelling, as well as expertise in data warehousing, data lake architecture, and Master Data Management (MDM). The role also demands proficiency in data integration and ETL processes, including the design and orchestration of data pipelines and the use of ETL/ELT tools. Candidates should be experienced with a range of database technologies, particularly cloud-based …
client is looking for a Senior Data Engineer to join their Finance and Operations team, responsible for designing and maintaining Azure-based data pipelines and APIs, building and optimizing ETL processes, managing large datasets, troubleshooting data issues, and documenting technical solutions. The ideal candidate will have strong coding skills in Python and SQL, experience with dbt, Azure DevOps, and CI … be on-site 5 days a week out of the London office. Day to Day: Develop and maintain Azure-based data pipelines for Finance and Operations. Build and optimize ETL workflows using SQL and dbt. Write Python scripts for data transformation and automation. Deploy infrastructure as code and manage cloud data solutions. Collaborate with project managers and contractors across global … practices. Troubleshoot and resolve data-related issues promptly. Document technical solutions and maintain test scripts. Must Haves: Strong experience in data engineering. Expertise in Azure (cloud platform) SQL (advanced ETL and query optimization) dbt (data transformation pipelines) Python (data transformation and automation scripts) Azure DevOps/GitHub (CI/CD pipelines, source control) Data warehousing and ETL best practices Plusses …
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
Pay: £450 - £550 per day Location - London (Hybrid) | Python | ETL | Impact-Driven Team We are partnering with a leading client seeking an Azure Data Engineer to join their team on an initial 6-month contract. This hybrid role requires 2 days per week onsite, offering the opportunity to design, build, and maintain cutting-edge data platforms. Data Engineering & Architecture: Design, build, and maintain scalable data pipelines and ETL/ELT processes using Azure services. Develop and optimize data lake and data warehouse solutions (e.g. Azure Data Lake Storage, Azure Synapse Analytics). Implement best practices in data modelling, partitioning, and performance optimization. Support real-time and batch data processing workloads. Data Quality, Governance & Security Implement data validation, monitoring, and quality … role. Strong experience with Azure data services (ADF, Databricks, ADLS, Synapse, Event Hub, etc.). Proficiency in SQL and experience with Python/PySpark. Hands-on experience building ETL/ELT pipelines in cloud environments. Solid understanding of data modelling, warehousing concepts, and distributed data systems. Experience with version control (Git) and CI/CD for data pipelines. We …
a data engineering strategy that aligns with organisational goals and technological advancements. Design and implement a scalable, reliable, and cost-efficient modern cloud data platform. Build and maintain robust ETL/ELT pipelines for processing and managing large volumes of structured and unstructured data. Create and manage Power BI dashboards, reports, and data models to provide strategic insights. Integrate cutting … GCP) and data processing services. Advanced skills in Power BI, including DAX, Power Query, and data modelling. Strong programming abilities in Python, SQL, and/or Scala. Expertise in ETL/ELT processes, data warehousing, and data mesh architectures. Familiarity with AI/ML concepts and their application in data analytics. Experience with metadata management, data lineage tracking, and data …
known London Market Insurance business is seeking an experienced Data Engineer to join its data team on an initial 6-month contract. Key Responsibilities Design, develop and maintain ETL pipelines to support enterprise reporting and analytics. Work across the Azure data stack (Azure Data Factory, Azure SQL, Synapse, Data Lake) to build scalable, reliable data solutions. Develop, enhance and … data quality, governance, and documentation best practices. Required Experience Proven experience as a Data Engineer within complex corporate environments (insurance experience highly desirable). Strong hands-on expertise with ETL development and data integration workflows. Solid commercial experience across the Azure data platform particularly Data Factory, Synapse, Databricks (nice to have) and Azure SQL. Strong understanding of data warehousing concepts …
deliver high-quality solutions in a fast-paced environment. Key Responsibilities Design and implement scalable data pipelines using Azure Data Factory, Databricks, and Synapse Analytics. Develop and optimise ETL processes for structured and semi-structured data. Work with SQL and Python for data transformation and modelling. Integrate data from multiple sources, ensuring accuracy, consistency, and performance. Collaborate with stakeholders … in enterprise environments. Strong hands-on expertise with Azure Data Factory, Databricks, Synapse, and Azure Data Lake. Proficiency in SQL, Python, and PySpark. Experience with data modelling, ETL optimisation, and cloud migration projects. Familiarity with Agile delivery and CI/CD pipelines. Excellent communication skills for working with technical and non-technical teams. Interested? Apply now or …
work on large-scale data projects, build modern data infrastructure, and help shape the future of data-driven decision-making. Responsibilities Design, build, and maintain scalable data pipelines and ETL processes. Collaborate with data scientists, analysts, and engineers to ensure high-quality data delivery. Develop and optimise data models for analytics and reporting. Implement best practices for data governance, security … BigQuery, Dataflow, Pub/Sub, Composer, etc. Expertise in SQL and programming with Python or Scala. Experience with Airflow, dbt, or other orchestration frameworks. Solid understanding of data warehousing, ETL, and data modelling principles. Hands-on experience with CI/CD, version control (Git), and Infrastructure as Code (Terraform is a plus). Strong problem-solving skills, attention to detail …
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
as a skilled Salesforce Data Cloud specialist. Delivered two successful end-to-end Salesforce Data Cloud implementations. Strong expertise in designing scalable enterprise-level data architecture solutions. Experienced in ETL tools, data migration, and data cleansing practices. Proficient in writing and optimizing moderate to advanced SQL queries. Preferably a Salesforce Data Cloud Consultant certification holder. What to do next If …
Consultant Permanent Location: UK - Fully Remote Salary: £65,000 - £80,000 (+ bonus) Skills: Unit4 FP&A, Database Administration (SQL Server or Oracle), T-SQL or PL/SQL, ETL Tools (e.g., SSIS), Microsoft Server Operating Systems We're looking to recruit an FP&A Systems Consultant for a specialist IT consultancy to lead my client's FP&A Practice …
onboarding from various sources to both on-prem and cloud environments. Develop and optimize SQL queries for data validation, transformation, and integration. Data engineering capability using at least one ETL tool such as Fivetran, Snowflake, ADF, or SSIS. Troubleshoot and resolve issues related to data ingestion and streaming performance. Serve as the go-to expert for all Kafka … Excellent problem-solving, communication, and stakeholder management skills. Familiarity with event-driven architecture and streaming best practices. Preferred Experience with Fivetran, SSIS, ADF or Snowflake technologies. Knowledge of ETL processes, data warehousing, and cloud data platforms. Exposure to Azure for cloud-based data solutions. Personal Besides the professional qualifications of the candidates, we place great importance in addition to …
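The listing above calls for SQL queries for data validation. A minimal sketch of what such checks can look like, using a hypothetical `customers` table and an in-memory SQLite database for illustration (the real role would run these against Snowflake or Azure SQL):

```python
import sqlite3

# Hypothetical source table with deliberately dirty data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER, email TEXT);
INSERT INTO customers VALUES
    (1, 'a@example.com'),
    (2, NULL),                -- missing email
    (2, 'b@example.com');     -- duplicate id
""")

# Validation check 1: count rows with a missing email.
null_emails = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL"
).fetchone()[0]

# Validation check 2: count ids that appear more than once.
dup_ids = conn.execute(
    "SELECT COUNT(*) FROM "
    "(SELECT id FROM customers GROUP BY id HAVING COUNT(*) > 1)"
).fetchone()[0]
```

Checks like these are typically wired into an ingestion pipeline so that non-zero counts fail or flag the load before bad rows reach downstream consumers.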
City of London, London, United Kingdom Hybrid/Remote Options
OTA Recruitment
BI and Plotly/Dash (or similar). Key responsibilities Design and implement scalable data architectures and systems to support business intelligence and analytics needs. Develop, optimize, and maintain ETL pipelines for efficient data integration and transformation. Oversee data storage solutions, including backup and recovery strategies to ensure data integrity and availability. Write and manage SQL queries to extract, manipulate …