our data solutions are secure, efficient, and optimized. Key Responsibilities: Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage. Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources. Automate loads using Databricks Workflows and Jobs. Develop, test, and build CI/CD pipelines using Azure DevOps. …
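To make the "automate loads using Databricks Workflows and Jobs" duty concrete, here is a minimal PySpark sketch of the kind of scheduled load task such a Workflow would run. The landing path, table names, and columns are hypothetical, not taken from the listing.

```python
# Minimal sketch of an automated Databricks load task (paths/tables are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

# Read raw landing-zone files from Data Lake Storage (placeholder path).
raw = spark.read.format("json").load("abfss://landing@example.dfs.core.windows.net/orders/")

# Light cleansing before the load.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Append into a Delta table; a Databricks Job/Workflow would schedule this script.
cleaned.write.format("delta").mode("append").saveAsTable("bronze.orders")
```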
Advanced SQL skills and experience architecting solutions on modern data warehouses (e.g., Snowflake, BigQuery, Redshift). Hands-on experience with advanced modelling techniques in dbt. A deep understanding of ETL/ELT processes and tools (e.g., Fivetran, Airbyte, Stitch). Experience with data visualisation tools (e.g., Mode, Looker, Tableau, Power BI) and designing robust BI semantic layers. Exceptional understanding of …
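As a flavour of the dbt-on-Snowflake modelling this listing asks for: since dbt 1.3, warehouses with Snowpark support allow Python models alongside SQL ones. The sketch below assumes a hypothetical upstream model named stg_orders with order_date and amount columns.

```python
# models/orders_daily.py — a minimal dbt Python model (dbt 1.3+ on Snowflake/Snowpark).
# The upstream model "stg_orders" and its columns are hypothetical examples.
import snowflake.snowpark.functions as F

def model(dbt, session):
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders")  # a Snowpark DataFrame on Snowflake

    # Daily revenue rollup — the kind of modelling layer dbt typically hosts.
    return (
        orders.group_by("order_date")
              .agg(F.sum("amount").alias("total_amount"))
    )
```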
CI/CD pipelines. Data Modelling: Apply deep hands-on expertise in enterprise data modelling using tools like ERwin, ER/Studio, or PowerDesigner, ensuring scalability, performance, and maintainability. ETL/ELT Frameworks: Design and build robust data pipelines with Cloud Composer, Dataproc, Dataflow, Informatica, or IBM DataStage, supporting both batch and streaming data ingestion. Data Governance & Quality: Implement data …
London, England, United Kingdom Hybrid / WFH Options
PACE Global
BI to communicate findings effectively to stakeholders. Automation & Scripting: Utilize Python scripting to automate data processing, cleaning, and analysis tasks. Database Management: Write, optimize, and maintain SQL queries to extract, manipulate, and analyze data from various databases. App Development: Develop and maintain low-code/no-code applications using Power Apps to streamline business processes and improve data accessibility. Data … workflows. Proficiency in Python scripting for data analysis and automation tasks. Strong knowledge of SQL for querying, data manipulation, and database management. Experience with data modeling, data warehousing, and ETL processes. Familiarity with cloud-based data platforms (e.g., Azure, AWS) is a plus. Excellent analytical and problem-solving skills with attention to detail. Strong communication skills and the ability to … developing and optimizing Python scripts for data analysis, automation, and workflow improvements. Strong background in writing and optimizing SQL queries for complex data retrieval, manipulation, and analysis. Experience with ETL (Extract, Transform, Load) processes and data integration. Familiarity with data visualization best practices to ensure clear and effective communication of insights. Understanding of data modeling techniques to support the creation …
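As an illustration of the Python-plus-SQL automation this listing describes, the sketch below pulls rows from a database, cleans them with pandas, and writes a summary back for reporting. The database file, table, and columns are invented for the example.

```python
# Sketch: automate a recurring clean-and-summarise task (hypothetical schema).
import sqlite3
import pandas as pd

conn = sqlite3.connect("sales.db")  # placeholder; any DB-API connection works

# Extract with SQL, then clean in pandas.
df = pd.read_sql("SELECT region, amount, sold_at FROM sales", conn)
df = df.dropna(subset=["amount"])
df["sold_at"] = pd.to_datetime(df["sold_at"])

# Summarise and load the result back as a reporting table (e.g. a Power BI source).
summary = df.groupby("region", as_index=False)["amount"].sum()
summary.to_sql("sales_by_region", conn, if_exists="replace", index=False)
conn.close()
```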
data models and user behavioral data accessible and reusable for data professionals throughout the company. In this role, your main responsibilities and deliverables will be to build and maintain ETL pipelines, create foundational data models in our data warehouse, and integrate new data sources using DataOps best practices, including documentation and testing. You will also support other data specialists in …
Engineer, your role involves designing, implementing, and managing data solutions on the Microsoft Azure cloud platform. Your responsibilities will include, but are not limited to: Data Integration and ETL (Extract, Transform, Load): You'll develop and implement data integration processes to extract data from diverse sources and load it into data warehouses or data lakes. Azure Data Factory is … a key tool for building efficient ETL pipelines. Your goal is to ensure that data is effectively cleaned, transformed, and loaded. Your sources will include REST APIs from various systems. Being able to onboard data from APIs is essential. Use of Postman for testing outside of ADF is a preferred skill. Data Engineering: As an Azure Data Engineer, you …
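Since the listing calls out onboarding data from REST APIs (normally orchestrated in ADF, with Postman for ad-hoc testing), the sketch below shows the underlying pattern in plain Python: paginated GET requests landed as raw JSON. The endpoint, parameters, and pagination scheme are hypothetical.

```python
# Sketch: paginated REST API ingestion (endpoint and paging scheme are made up).
import json
import requests

BASE_URL = "https://api.example.com/v1/orders"  # placeholder endpoint
records = []
page = 1

while True:
    resp = requests.get(BASE_URL, params={"page": page, "page_size": 100}, timeout=30)
    resp.raise_for_status()
    batch = resp.json().get("results", [])
    if not batch:
        break  # no more pages
    records.extend(batch)
    page += 1

# Land the raw payload; ADF or Databricks would pick this up downstream.
with open("orders_raw.json", "w") as f:
    json.dump(records, f)
```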
Required Skills: Proficiency in BI tools (Power BI, Tableau, QlikView) and data visualization techniques Strong SQL skills and experience with relational databases (e.g., SQL Server, Oracle, MySQL) Familiarity with ETL processes and data warehousing concepts Experience in data modeling and designing effective reporting structures Knowledge of programming languages such as Python or R for data manipulation and analysis Strong analytical …
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
Skills & Experience: Proven experience in senior data engineering roles, preferably within regulated industries Expertise in SQL, Snowflake, DBT Cloud, and CI/CD pipelines (Azure DevOps) Hands-on with ETL tools (e.g. Matillion, SNP Glue, or similar) Experience with AWS and/or Azure platforms Solid understanding of data modelling, orchestration, and warehousing techniques Strong communication, mentoring, and stakeholder engagement …
various sources and supports diverse analytical needs, while optimizing costs and meeting business requirements. Implementing Data Engineering Pipelines: Design and develop data pipelines for data extraction, transformation, and loading (ETL) processes, ensuring data quality and consistency. Enabling Data Intelligence and Analytics: Build and maintain data warehouses, data marts, and data lakes to support business intelligence and data analytics initiatives. Supporting …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
data products using technologies like Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models for BI tools Define and implement data quality, ownership, and security standards Empower business teams with intuitive, self-serve data models Own data products …
model development and deployment by providing data engineering expertise, ensuring data scientists have access to the data they need in the required format. Implement and optimize data transformations and ETL/ELT processes using appropriate tools. Work with various databases and data warehousing solutions to store and retrieve data efficiently. Monitor, troubleshoot, and maintain data pipelines to ensure high data …
Essential Core Technical Experience 5 to 10+ years of experience in SQL Server data warehouse or data provisioning architectures. Advanced SQL query writing and stored procedure experience. Experience developing ETL solutions in SQL Server, including SSIS & T-SQL. Experience with Microsoft BI technologies (SQL Server Management Studio, SSIS, SSAS, SSRS). Knowledge of data/system integration and dependency identification.
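For the SQL Server stack this listing describes, loads are typically wrapped in T-SQL stored procedures invoked from SSIS or a scheduler. A minimal sketch of triggering one from Python via pyodbc follows; the server, database, procedure name, and parameter are placeholders.

```python
# Sketch: invoke a warehouse load stored procedure (all names are hypothetical).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver.example.com;DATABASE=DW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# The T-SQL stored procedure doing the actual ETL work lives in the database.
cursor.execute("EXEC dbo.usp_LoadFactSales @LoadDate = ?", "2024-01-31")
conn.commit()
conn.close()
```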
team member - you take ownership and get stuck in, and you're able to balance multiple projects and priorities. Some experience working with data pipelines (ELT/ETL concepts) and data warehouse knowledge. Nice to haves Experience utilising Python for data manipulation. Experience using DBT. Experience using AI code assistants like GitHub Copilot to write and debug SQL and Python.
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
build and maintain scalable and secure data pipelines on the Azure platform. Develop and deploy data ingestion processes using Azure Data Factory, Databricks (PySpark), and Azure Synapse Analytics. Optimise ETL/ELT processes to improve performance, reliability and efficiency. Integrate multiple data sources including Azure Data Lake (Gen2), SQL-based systems and APIs. Collaborate with Data Architects, Analysts, and stakeholders …
South West London, London, United Kingdom Hybrid / WFH Options
JAM Recruitment Ltd
working with modern orchestration tools, and applying best practices in security and compliance, this role offers both technical depth and impact. Key Responsibilities Design & Optimise Pipelines - Build and refine ETL/ELT workflows using Apache Airflow for orchestration. Data Ingestion - Create reliable ingestion processes from APIs and internal systems, leveraging tools such as Kafka, Spark, or AWS-native services. Cloud …
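To illustrate the Airflow orchestration this role centres on, here is a minimal TaskFlow-style DAG (Airflow 2.4+ syntax) with an extract step feeding a load step. Task bodies and names are placeholders, not the employer's actual pipeline.

```python
# Sketch: minimal Airflow DAG orchestrating an extract -> load flow (hypothetical tasks).
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_ingestion():
    @task
    def extract() -> list[dict]:
        # Placeholder for an API or Kafka read.
        return [{"id": 1, "value": 42}]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for a warehouse write.
        print(f"loaded {len(rows)} rows")

    load(extract())

example_ingestion()
```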
London, South East, England, United Kingdom Hybrid / WFH Options
Oscar Technology
that support decision-making. 2+ years using Business Intelligence tools (Power BI essential; Looker is a plus but not required). Strong SQL skills and a solid understanding of ETL processes and data pipelines. Experience working with data platforms (e.g., Snowflake, Azure, AWS) is highly desirable. Exposure to or interest in the financial services or purpose-driven sectors is a …
data architecture principles, big data technologies (e.g., Hadoop, Spark), and cloud platforms like AWS, Azure, or GCP. Data Management Skills: Advanced proficiency in data modelling, SQL/NoSQL databases, ETL processes, and data integration techniques. Programming & Tools: Strong skills in Python or Java, with experience in data visualization tools and relevant programming frameworks. Governance & Compliance: Solid understanding of data governance …
Fabric or strong expertise in related technologies such as Power BI, Azure Synapse, Data Factory, Azure Data Lake etc. A solid understanding of data engineering principles, including data modelling, ETL/ELT processes, and data warehousing. Hands-on experience with Power BI and DAX, and ideally some exposure to Notebooks, Pipelines, or Lakehouses within Fabric. Strong communication and stakeholder management …
testing frameworks, leveraging automation and best practices to enhance efficiency. Requirements 3+ years of experience in quality assurance, test automation, or data validation. Experience in testing data pipelines, ETL/ELT workflows, and big data environments. Familiarity with Azure data platforms, such as Databricks, Azure Data Factory, Synapse Analytics, or ADLS. Proficiency in SQL and scripting languages …
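As a flavour of the automated data-pipeline testing this role asks for, the sketch below expresses basic quality rules (no null keys, no duplicates, valid values) as pytest cases over a pandas extract. The fixture, table shape, and rules are illustrative only; a real suite would read the pipeline's actual output.

```python
# Sketch: pipeline output checks as pytest cases (schema and rules are examples).
import pandas as pd
import pytest

@pytest.fixture
def output() -> pd.DataFrame:
    # In a real suite this would read the pipeline's actual output table.
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 5.5, 7.25]})

def test_primary_key_not_null(output):
    assert output["order_id"].notna().all()

def test_primary_key_unique(output):
    assert not output["order_id"].duplicated().any()

def test_amounts_non_negative(output):
    assert (output["amount"] >= 0).all()
```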
London, South East, England, United Kingdom Hybrid / WFH Options
FDM Group
and ensure technical delivery aligns with business needs Translate business requirements into logical and physical data models that support scalable, high-performance analytics and reporting solutions Build and automate ETL/ELT pipelines using tools such as Azure Data Factory, as well as scripting with Python and PowerShell Use Power BI to create dashboards and data visualisations, focusing on usability …
Your contributions will be vital in enhancing our data solutions and reporting capabilities to support the agency's diverse clients. Key Responsibilities Maintain and enhance our data architecture and ETL processes, ensuring the accuracy of media, campaign, and performance data. Design, develop, and maintain compelling Power BI dashboards and reports that visualise campaign performance, media spend, audience behaviour, and KPIs.
marketing APIs). Ensure data integrity, completeness, and consistency via automated monitoring and validation tooling. Optimize and extend our cloud data warehouse (Google BigQuery) for performance and cost. Develop ETL/ELT jobs and data models to enable fast analytics and experimentation. Partner with analysts and product managers to define data tracking specifications and ensure implementation alignment. Contribute to infrastructure …
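For the BigQuery cost-and-performance work this listing mentions, restricting scans to a partition and using parameterised queries through the official client is a typical lever. The project, dataset, table, and partition column below are placeholders.

```python
# Sketch: cost-aware BigQuery query via the official client (names are hypothetical).
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Filtering on the partition column keeps bytes scanned (and cost) down.
sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `my_project.analytics.events`
    WHERE event_date = @day
    GROUP BY user_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", "2024-01-31")]
)
for row in client.query(sql, job_config=job_config).result():
    print(row.user_id, row.events)
```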