We're looking for: Proven experience in data modelling and data engineering with a strong grasp of modern cloud architectures. Expertise in data pipelines, ETL/ELT, and working with structured & unstructured data. Strong skills in SQL, Python, Spark, or other relevant technologies. Prior exposure to Microsoft Fabric is a … datasets. 5. Collaborate with Data Analysts and other stakeholders to understand data requirements and translate them into technical solutions. 6. Develop and implement efficient ETL (Extract, Transform, Load) processes to integrate data from various sources into centralised data repositories. 7. Document data architecture, processes, and workflows for reference and knowledge … within the healthcare sector, with a focus on NHS data systems. Proven track record of designing, developing, and maintaining large-scale data pipelines and ETL processes. In-depth knowledge of data modelling, database design, and data warehousing principles. Familiarity with healthcare data standards and compliance regulations. Significant experience of extracting …
experience in data engineering and/or platform development, ideally with experience working with complex or large datasets such as the census. Experience with ETL tools and processes. Exhibit proficiency in programming and database query languages, in particular Python and SQL. Possess a good understanding of data engineering principles, data …
and data-driven decisions. Key Responsibilities: Develop, maintain, and optimize dynamic reports and dashboards using SSRS and Power BI to aid business decision making. Extract, transform, and load (ETL) data from diverse sources with T-SQL, SSIS, and Azure Data Factory Pipelines. Collaborate with business stakeholders to gather requirements and …
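As a rough, purely illustrative sketch of the extract-transform-load pattern this role describes: the Python snippet below uses pandas and SQLAlchemy as a generic stand-in for SSIS packages or Data Factory pipelines. The connection strings, table names, and columns (dbo.Orders, stg_Orders) are hypothetical placeholders, not details taken from the posting.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings -- replace with real servers and credentials.
SOURCE_URL = "mssql+pyodbc://user:pass@source-server/SalesDB?driver=ODBC+Driver+17+for+SQL+Server"
TARGET_URL = "mssql+pyodbc://user:pass@warehouse-server/ReportingDW?driver=ODBC+Driver+17+for+SQL+Server"

def run_daily_orders_load() -> None:
    source = create_engine(SOURCE_URL)
    target = create_engine(TARGET_URL)

    # Extract: pull yesterday's orders from the operational database.
    orders = pd.read_sql(
        "SELECT OrderID, CustomerID, OrderDate, Amount "
        "FROM dbo.Orders WHERE OrderDate >= CAST(GETDATE() - 1 AS DATE)",
        source,
    )

    # Transform: basic cleaning plus an audit column.
    orders = orders.dropna(subset=["CustomerID"])
    orders["LoadTimestamp"] = pd.Timestamp.now(tz="UTC")

    # Load: append into a staging table in the warehouse.
    orders.to_sql("stg_Orders", target, schema="dbo", if_exists="append", index=False)

if __name__ == "__main__":
    run_daily_orders_load()
```

In a role like this, the same extract/transform/load stages would more likely be expressed as SSIS packages or Data Factory activities rather than a standalone script; the sketch only shows the shape of the workflow.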
enterprise integration tools. A developed understanding of TOGAF or equivalent enterprise architecture frameworks. Hands-on experience in data warehousing, data lakes, data modelling, and ETL processes. Excellent stakeholder engagement skills with the ability to influence and inspire across all levels. A proactive, forward-thinking mindset with a passion for innovation …
Azure D&A stack, Databricks and Azure OpenAI solution. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tooling (such as Informatica), and scalable data platforms. Proficiency in the Azure Data and Analytics stack; working knowledge of AWS and GCP data solutions. Good understanding of …
London, South East England, United Kingdom Hybrid / WFH Options
Careerwise
large datasets efficiently. Collaborate with cross-functional teams to understand business requirements and deliver data-driven insights. Design and build scalable data pipelines and ETL processes. Perform data exploration, preprocessing, and feature engineering. Conduct statistical analysis and machine learning model development. Communicate findings and insights to stakeholders through data visualization …
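To make the exploration, feature-engineering, and modelling workflow above concrete, here is a minimal hypothetical sketch in pandas and scikit-learn; the file name, columns, and churn target are invented for illustration and do not come from the listing.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical dataset and target column, for illustration only.
df = pd.read_csv("customer_events.csv")

# Preprocessing: drop rows with missing values in the fields we rely on.
df = df.dropna(subset=["age", "tenure_months", "monthly_spend", "churned"])

# Feature engineering: one simple derived feature.
df["spend_per_month_of_tenure"] = df["monthly_spend"] / (df["tenure_months"] + 1)

features = ["age", "tenure_months", "monthly_spend", "spend_per_month_of_tenure"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42
)

# Model development: fit a baseline classifier and report holdout metrics.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```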
migration projects to enhance system performance and integrity. Supporting the deployment of machine learning models using Databricks and PySpark. Managing and optimising cross-functional ETL processes across 80 databases daily. Working within a secure private cloud environment that includes Azure, SQL 2016, and SQL Server. About You: The ideal candidate …
London, South East England, United Kingdom Hybrid / WFH Options
Eden Smith Group
such as Power BI, Tableau, QlikView etc. Familiarity with technical data structures, data pipelines and data science techniques, understanding complex systems. Extensive knowledge of ETL tools essential for building data pipelines effectively. Strong experience with cloud platforms like AWS, Azure, or Google Cloud. The ability to build collaborative relationships with …
London, South East England, United Kingdom Hybrid / WFH Options
Radley James
Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation). Solid understanding of data modeling, ETL frameworks, and big data technologies. Experience working in financial services or regulated industries is a plus. What's on Offer: A collaborative and inclusive work …
This role is an opportunity to lead the build of bespoke data systems for our clients. Responsibilities: Design and implement scalable data pipelines and ETL processes using Azure and Databricks technologies, including Delta Live Tables. Lead technical discussions with clients and stakeholders to gather requirements and propose solutions. Help clients …
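As a loose illustration of what a Delta Live Tables pipeline definition can look like (a generic sketch, not this team's implementation): the table names, landing path, and quality expectation below are hypothetical, and the code only runs inside a Databricks DLT pipeline, where the dlt module and the spark session are provided by the runtime.

```python
# Delta Live Tables pipeline definition (executed by a Databricks DLT pipeline).
import dlt
from pyspark.sql import functions as F

# NOTE: `spark` is injected by the Databricks runtime; it is not imported here.

@dlt.table(comment="Raw orders landed from cloud storage (path is a placeholder).")
def raw_orders():
    # Batch read of JSON files from a hypothetical landing area.
    return spark.read.format("json").load("/mnt/landing/orders/")

@dlt.table(comment="Orders with basic quality filters applied.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def clean_orders():
    # Read the upstream DLT table and stamp each row with an ingestion time.
    return dlt.read("raw_orders").withColumn("ingested_at", F.current_timestamp())
```

Declaring tables this way lets the DLT runtime manage dependencies, retries, and data-quality enforcement rather than hand-rolled orchestration code.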
will be essential in aligning BI initiatives with strategic goals and identifying opportunities for business value through data. Key Responsibilities: Develop, maintain, and optimise ETL processes using SQL Server Integration Services (SSIS). Design and implement analytical models using SQL Server Analysis Services (SSAS). Create intuitive dashboards and reports …
based platforms, ideally Microsoft Azure. Familiarity with MS Dynamics, Salesforce, and associated integration points. Desirable: Experience in the energy/utilities sector. Exposure to ETL processes, data warehousing, and data model design. Knowledge of Python or other scripting languages. Familiarity with Lean or Agile project methodologies.
scalable data systems (data warehouses, data lakes, and data pipelines). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud. Experience with ETL/ELT tools and real-time data processing. Strong knowledge of SQL, NoSQL databases, and data modeling techniques. Familiarity with BI tools like Tableau, Power …
to 10+ years' extensive experience in SQL Server data warehouse or data provisioning architectures. Advanced SQL query writing & SQL procedure experience. Experience developing ETL solutions in SQL Server including SSIS & T-SQL. Experience in Microsoft BI technologies (SQL Server Management Studio, SSIS, SSAS, SSRS). Experience of data/system …
Provide technical guidance and mentorship, and lead activities within the Data Programme. Qualifications & Experience: Relevant professional qualification or equivalent experience. Proven experience with data warehousing, ETL/ELT, integration tools, and BI solutions. Expertise with Synapse (notebooks and data flows) is essential. Working knowledge of PySpark is essential. Experience with Azure …
City of London, London, United Kingdom Hybrid / WFH Options
Robert Half
depth knowledge of cloud-native data services (e.g., Databricks, Snowflake, Microsoft Fabric, AWS Redshift). Strong understanding of data modelling (relational, dimensional, NoSQL) and ETL/ELT processes. Experience with data integration tools (e.g., Apache Kafka, Talend, Informatica) and APIs. Familiarity with big data technologies (e.g., Hadoop, Spark …
Edinburgh, Central Scotland, United Kingdom Hybrid / WFH Options
Wyoming Interactive
Azure Data Factory and Snowflake. Experience with Looker Studio for visualization. Proficiency with RudderStack or similar customer data platforms. Data Integration Skills: Skilled in ETL/ELT processes, with direct experience integrating platforms such as HubSpot and Shopify. Programming Skills: Proficient in SQL, Python, or similar languages. Client-Facing Experience …
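For illustration only, a minimal sketch of the extract-and-land half of an ELT flow against a CRM source such as HubSpot: the access token, warehouse URL, and target table are placeholders, the endpoint is HubSpot's publicly documented CRM v3 contacts API, and pagination, error handling, and the downstream SQL transformations are deliberately omitted.

```python
import pandas as pd
import requests
from sqlalchemy import create_engine

# Placeholders -- supply a real private-app token and warehouse connection
# (the Snowflake URL form assumes the snowflake-sqlalchemy package).
HUBSPOT_CONTACTS_URL = "https://api.hubapi.com/crm/v3/objects/contacts"
ACCESS_TOKEN = "<private-app-token>"
WAREHOUSE_URL = "snowflake://user:pass@account/ANALYTICS/RAW?warehouse=LOAD_WH"

def extract_contacts() -> pd.DataFrame:
    """Pull one page of contacts from the CRM API (pagination omitted)."""
    resp = requests.get(
        HUBSPOT_CONTACTS_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"limit": 100},
        timeout=30,
    )
    resp.raise_for_status()
    return pd.json_normalize(resp.json().get("results", []))

def load_raw(df: pd.DataFrame) -> None:
    """The E and L of ELT: land raw records; transformation happens later in SQL."""
    engine = create_engine(WAREHOUSE_URL)
    df.to_sql("hubspot_contacts_raw", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load_raw(extract_contacts())
```

Landing the raw payload first and transforming it afterwards in the warehouse is what distinguishes this ELT shape from the classic ETL flow described in the earlier listings.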