Bachelor’s degree in a STEM field (Computer Science, Engineering, Maths, Physics). 2–4 years’ experience in a technical, data, or engineering-focused role. Strong skills in Python, SQL, and Excel/VBA. Experience building or maintaining ETL/data pipelines, particularly around APIs or FTP processes. Working knowledge of Microsoft Azure and Git. Excellent analytical and communication skills …
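The role above asks for Python and SQL skills and experience building ETL pipelines around APIs or FTP processes. As context, here is a minimal, self-contained sketch of that extract-transform-load pattern in Python; the payload, table name, and fields are illustrative only, not taken from any employer's actual stack.

```python
import json
import sqlite3

# Illustrative stand-in for an API/FTP payload; real pipelines would fetch
# this over the network rather than hold it inline.
RAW_PAYLOAD = json.dumps([
    {"id": 1, "name": "  Alice ", "amount": "120.50"},
    {"id": 2, "name": "Bob", "amount": "80"},
    {"id": 2, "name": "Bob", "amount": "80"},  # duplicate record to be removed
])

def extract(payload: str) -> list[dict]:
    """Parse the raw JSON payload (stand-in for the API/FTP fetch step)."""
    return json.loads(payload)

def transform(rows: list[dict]) -> list[tuple]:
    """Trim strings, cast amounts to float, and de-duplicate on id."""
    seen, out = set(), []
    for row in rows:
        if row["id"] in seen:
            continue
        seen.add(row["id"])
        out.append((row["id"], row["name"].strip(), float(row["amount"])))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Write cleaned rows to a SQL table and return the loaded row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_PAYLOAD)), conn)
print(loaded)  # 2 rows after de-duplication
```

In a real pipeline the same three stages would typically be orchestrated by a scheduler, with the load step pointed at a warehouse rather than an in-memory database.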
Warrington, Cheshire, England, United Kingdom Hybrid / WFH Options
Brookson
understanding of the data, thus promoting a self-serve analytical culture. What are the qualities that can help you thrive as a Data Engineer? Essential Experience and Qualifications: Strong SQL skills (tables, stored procedures, performance tuning, etc.). Experience with Azure ETL tools (Azure Data Factory/Synapse). Strong experience in data movement methodologies and standards, ELT and ETL. A self-motivated …
Nottingham, Nottinghamshire, East Midlands, United Kingdom Hybrid / WFH Options
Experian Ltd
Degree or equivalent experience in a related field. Experience in data analytics delivery. Proficiency in tools like Alteryx, SAS, Python, and BI platforms (Tableau, Power BI). Strong SQL and Excel skills (including the ability to write advanced macros). Experience in controls testing or audit analytics. Collaborative mindset. More about you: Analytical, with experience in data-driven problem solving. Comfortable working with …
Associate Director, Data Analytics - Value Creation. Interpath. Leeds, Birmingham, Belfast, Manchester or Glasgow. Interpath is an international and fast-growing advisory business with deep expertise in a broad range of specialisms spanning deals, advisory and restructuring capabilities. …
Job Title: Senior Data Engineer. Location: Leeds, on-site 2x per week. Salary: Up to £85,000. Why Apply? Our client is looking for a talented Senior Data Engineer to play a key role in their ongoing digital transformation. This …
Experience: Expertise in cloud platforms (Azure) and data engineering best practices. Advanced proficiency in Power BI, including DAX, Power Query, and data modeling. Strong programming skills in Python, SQL, and/or Scala for data processing and automation. Experience with ETL/ELT, data warehousing, and event-driven architectures. Knowledge of AI/ML applications in data analytics …
outputs across the organisation. The role requires strong Power BI skills, an eye for detail, and the ability to manage your own workload from specification to release. Familiarity with SQL or similar languages is essential to support data validation and troubleshooting. You'll contribute to team discussions, promote good practice in reporting, and play an active role in improving how … field. Desirable: Experience with Azure Synapse Analytics or Azure Data Factory; production of guidance documentation. Knowledge and Skills (Essential): Skilled in Power BI, including data modelling and DAX. Knowledge of SQL or other languages. Understands principles of secure and scalable reporting, including access control, performance, and usability. Clear communicator, confident in presenting data and insights to varied audiences. Builds strong, collaborative …
london (city of london), south east england, united kingdom
Kubrick Group
enterprise-grade data solutions. Strong hands-on experience with Snowflake (or comparable cloud data warehouses such as BigQuery, Redshift, or Synapse), including data modelling, performance tuning, and cost management. Familiarity with SQL best practices, ELT patterns, and modern data transformation frameworks such as dbt. Competence in at least one programming language (Python preferred) for automation. Experience with cloud platforms (AWS, Azure, or …
london (city of london), south east england, united kingdom
Sanderson
pipelines using Fabric Pipelines, Azure Data Factory, Notebooks, and SSIS. Produce enterprise-grade Power BI dashboards and paginated reports. Translate business requirements into scalable technical BI solutions. Write advanced SQL for Fabric Lakehouse and Warehouse environments. Implement CI/CD processes using Azure DevOps for secure, reliable deployment. Technical Skills: Strong expertise in: Power BI and paginated reporting; SQL and … data transformation logic; Microsoft Fabric, Azure Data Factory, Synapse, Data Lakes, and Lakehouse/Warehouse technologies; ETL/ELT orchestration for structured and unstructured data. Proficiency in: PySpark, T-SQL, Notebooks, and advanced data manipulation; performance monitoring and orchestration of Fabric solutions; Power BI semantic models and Fabric data modelling; DevOps deployment using ARM/Bicep templates. End-to-end …
london (city of london), south east england, united kingdom
TRIA
transform raw data into trusted, actionable insights that power critical business decisions. Key Responsibilities: Design and implement scalable data pipelines and ETL/ELT workflows in Databricks using PySpark, SQL, and Delta Lake. Architect and manage the Medallion (Bronze, Silver, Gold) data architecture for optimal data organization, transformation, and consumption. Develop and maintain data models, schemas, and data quality frameworks … Proven track record implementing Medallion Architecture (Bronze, Silver, Gold layers) in production environments. Strong knowledge of data modeling, ETL/ELT design, and data lakehouse concepts. Proficiency in Python, SQL, and Spark optimization techniques. Experience working with cloud data platforms such as Azure Data Lake, AWS S3, or GCP BigQuery. Strong understanding of data quality frameworks, testing, and CI/CD …
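The listing above centres on Databricks and the Medallion (Bronze, Silver, Gold) architecture. A production pipeline would use PySpark DataFrames and Delta Lake tables; the dependency-free Python sketch below only illustrates the layering idea (raw landing, cleaned and typed, business-level aggregate), and every field name in it is invented for illustration.

```python
# Sketch of the Medallion flow (Bronze -> Silver -> Gold) with plain Python
# lists standing in for Delta tables, so the layering logic is easy to see.

bronze = [  # Bronze: raw, as-landed records (may contain nulls and bad types)
    {"order_id": "1", "region": "north", "total": "10.0"},
    {"order_id": "2", "region": None,    "total": "5.5"},
    {"order_id": "3", "region": "north", "total": "4.5"},
]

# Silver: cleaned and typed; rows failing validation are filtered out here
# (a real pipeline might quarantine them for inspection instead).
silver = [
    {"order_id": int(r["order_id"]), "region": r["region"], "total": float(r["total"])}
    for r in bronze
    if r["region"] is not None
]

# Gold: business-level aggregate ready for BI consumption.
gold: dict[str, float] = {}
for r in silver:
    gold[r["region"]] = gold.get(r["region"], 0.0) + r["total"]

print(gold)  # {'north': 14.5}
```

The design point is that each layer is derived from the one before it, so any layer can be rebuilt from raw data when cleaning rules or aggregations change.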
london (city of london), south east england, united kingdom
83data
a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You’ll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role, suited to … teams, not a “heads-down coder” type. Top 4 Core Skills: Python: workflow automation, data processing, and ETL/ELT development. Snowflake: scalable data architecture, performance optimisation, and governance. SQL: expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool): modular data modelling, testing, documentation, and version control. Key Responsibilities: Design, build, and maintain dbt models … and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production-grade SQL and ensure data quality …
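The role above pairs dbt-style modular SQL modelling with Python automation. As a rough illustration of that pattern (a staging model, a mart model, then a data-quality check), here is a sqlite3 sketch; in practice these would be dbt models running against Snowflake, and all table and column names here are invented.

```python
import sqlite3

# Mimics dbt's layered-model pattern: raw source -> staging view -> mart table,
# followed by a check in the spirit of dbt's generic not_null test.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_orders (id INTEGER, customer TEXT, amount REAL);
INSERT INTO raw_orders VALUES (1, 'acme', 100.0), (2, 'acme', 50.0), (3, 'bluth', 25.0);

-- staging model: light renaming/cleanup, as a dbt staging model would do
CREATE VIEW stg_orders AS
SELECT id AS order_id, customer AS customer_name, amount AS order_amount
FROM raw_orders;

-- mart model: business-level aggregate, as a dbt mart model would do
CREATE TABLE fct_customer_revenue AS
SELECT customer_name, SUM(order_amount) AS revenue
FROM stg_orders
GROUP BY customer_name;
""")

# Data-quality check: no null keys in the mart.
nulls = conn.execute(
    "SELECT COUNT(*) FROM fct_customer_revenue WHERE customer_name IS NULL"
).fetchone()[0]
assert nulls == 0

rows = dict(conn.execute(
    "SELECT customer_name, revenue FROM fct_customer_revenue ORDER BY customer_name"
))
print(rows)  # {'acme': 150.0, 'bluth': 25.0}
```

dbt adds version control, documentation, and dependency ordering on top of exactly this kind of layered SELECT logic.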
Better Placed Ltd - A Sunday Times Top 10 Employer!
We're looking for an experienced, hands-on Data Engineer to design and optimise data architecture, build robust pipelines, and deliver actionable insights across the business. Key Responsibilities: Data Engineering & SQL Development: Own and develop the data architecture, ensuring performance and scalability. Design, develop, and optimise SQL queries, stored procedures, and ETL pipelines. Work with AWS S3 data lakes and ERP … data validation and quality assurance to ensure reliable insights. Support forecasting and predictive analytics to guide strategic decisions. Translate complex data into clear, actionable recommendations. Required Skills & Experience: Strong SQL expertise, including optimisation and performance tuning. Experience in ETL development, data modelling, and large datasets. Proficiency with BI tools such as Power BI, Superset, or Tableau. Experience with AWS data …
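The role above emphasises SQL optimisation and performance tuning. One classic example is adding an index so the engine can switch from a full table scan to an index search; the sqlite sketch below makes that visible via EXPLAIN QUERY PLAN, using an invented schema purely for illustration.

```python
import sqlite3

# Populate an illustrative table large enough that scan vs. search matters.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("north" if i % 2 else "south", i * 1.0) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM sales WHERE region = 'north'"

# Before indexing: the plan typically reports a full-table SCAN of sales.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)

# After indexing the filter column, the plan switches to an index SEARCH.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)
```

The same scan-versus-seek reasoning carries over to server-grade engines, where execution plans are read with tools like `EXPLAIN` or `SET SHOWPLAN`.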
learn, Large Language Models, LLM, data preprocessing, REST API, microservices architecture, MLOps, CI/CD for ML, Power BI, Docker, Kubernetes, AI ethics, cloud platforms, AWS, Google Cloud Platform, SQL, NoSQL, DevOps, financial services, regulatory environments. Contract Type: Hybrid/Bedford. Daily Rate: £600-£650 (via Umbrella). 9 months initial contract. We are seeking two AI Solutions Engineers to lead …