e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics and BI tools …
metrics. Basic Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field Proficiency in automation using Python Excellent oral and written communication skills Experience with SQL, ETL processes, or data transformation Preferred Qualifications Experience with scripting and automation tools Familiarity with Infrastructure as Code (IaC) tools such as AWS CDK Knowledge of AWS services such as SQS … operational best practices experience. Basic qualifications: 5+ years of relevant professional experience in business intelligence, analytics, statistics, data engineering, data science or related field. Experience with Data modeling, SQL, ETL, Data Warehousing and Data Lakes. Strong experience with engineering and operations best practices (version control, data quality/testing, monitoring, etc.) Expert-level SQL. Proficiency with one or more general …
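To make the Python-automation and SQS requirements in the listing above more concrete, here is a minimal, illustrative sketch of a queue-driven automation script using boto3. The queue URL, message contents, and function names are invented for illustration, and credentials/region are assumed to come from the environment; this is a sketch, not the employer's actual tooling.

```python
import boto3  # AWS SDK for Python

# Hypothetical queue URL; in practice this would come from configuration or IaC outputs.
QUEUE_URL = "https://sqs.eu-west-2.amazonaws.com/123456789012/example-etl-jobs"

sqs = boto3.client("sqs")


def enqueue_job(table_name: str) -> None:
    """Publish a small message asking a downstream worker to process one table."""
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=table_name)


def drain_jobs() -> None:
    """Poll the queue and print each job; real code would trigger an ETL step instead."""
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=5
    )
    for message in response.get("Messages", []):
        print("processing", message["Body"])
        # Delete only after successful handling so failed jobs are retried.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])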
model performance and ensure smooth application operations with large-scale data handling. Data Management and Preprocessing: Manage the collection, cleaning, and preprocessing of large datasets. Implement data pipelines and ETL processes to ensure data quality and availability. Software Development: Write clean, efficient, and scalable code in Python. Implement CI/CD practices for version control, testing, and code review. Collaboration …
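As an illustration of the Python data-preprocessing work this listing describes, the following is a minimal sketch of a cleaning step in a pandas-based pipeline. The file names, column names, and rules are hypothetical, chosen only to show the extract-clean-load shape of such a step.

```python
import pandas as pd


def clean_events(path: str) -> pd.DataFrame:
    """Load a raw CSV extract, drop obviously bad rows, and normalise types."""
    df = pd.read_csv(path)                                 # extract
    df = df.drop_duplicates()                              # remove exact duplicate rows
    df["event_ts"] = pd.to_datetime(df["event_ts"], errors="coerce")
    df = df.dropna(subset=["event_ts", "user_id"])         # discard rows missing keys
    df["amount"] = df["amount"].fillna(0).astype(float)    # normalise numeric type
    return df                                              # ready for loading


if __name__ == "__main__":
    cleaned = clean_events("raw_events.csv")               # hypothetical input file
    cleaned.to_parquet("clean_events.parquet")             # load step: columnar output
```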
pipelines, and optimisation of Power BI-based analytics solutions. Key Responsibilities: Lead the management and development of the organisation's Modern Data Platform (Azure-based) Build and maintain robust ETL/ELT pipelines, data models, and reporting marts Develop Power BI dashboards and implement best practice modelling and workspace governance Integrate data from internal departments and external agencies to support …
Cardiff, South Glamorgan, Wales, United Kingdom Hybrid / WFH Options
Octad Recruitment Consultants (Octad Ltd)
team culture. Day-to-Day Responsibilities Infrastructure & Automation: Deploy and manage infrastructure using Bicep/Terraform, GitHub Actions, and PowerShell/DSC. Data Engineering: Architect and implement scalable ETL/ELT solutions; model schemas, optimize performance, and apply lakehouse best practices. Security & Resilience: Implement best-practice cloud security (NSGs, Defender, Conditional Access), automate DR/backups, and run quarterly …
and SQL (e.g., Snowflake or similar warehousing technology, real-time systems). Experience with AWS services such as Lambda, SNS, S3, EKS, API Gateway. Knowledge of data warehouse design, ETL/ELT processes, and big data technologies (e.g., Snowflake, Spark). Understanding of data governance and compliance frameworks (e.g., GDPR, HIPAA). Strong communication and stakeholder management skills. Analytical mindset …
frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level architecture design and implementation. Hands-on knowledge of database systems (SQL/NoSQL), ETL/ELT processes, and data modeling techniques. Exceptional leadership, communication, and stakeholder management skills. Ability to work in fast-paced, agile environments and balance long-term strategy with short-term …
Tandridge, Surrey, United Kingdom Hybrid / WFH Options
NHS
the attached document for the full job description. Person Specification Experience Design and implementation of complex data models for Business Intelligence and Advanced Analytics Design and implementation of complex ETL/ELT processes Experience in a supervisory role Deep knowledge of NHS data and issues Visualization and report writing skills Performance management experience Experience in change management and delivering successful …
Essential Core Technical Experience 5 to 10+ years of experience in SQL Server data warehouse or data provisioning architectures. Advanced SQL query writing and stored procedure experience. Experience developing ETL solutions in SQL Server, including SSIS & T-SQL. Experience with Microsoft BI technologies (SQL Server Management Studio, SSIS, SSAS, SSRS). Knowledge of data/system integration and dependency identification.
including handling large, complex datasets. Advanced SQL skills for querying and managing relational databases. Familiarity with data visualisation tools (e.g., Sisense, Power BI, Streamlit). Technical Skills Experience with ETL processes and APIs for data integration. Understanding of statistical methods and data modelling techniques. Familiarity with cloud platforms like Snowflake is advantageous. Knowledge of data governance frameworks and data security …
have a supportive culture with a keen focus on innovation, technical excellence, career development and mutual support. Responsibilities Data Pipeline Development: Design, develop, and optimize robust data pipelines and ETL processes to ensure efficient data flow and integration Data Infrastructure Management: Manage and enhance our data infrastructure to support performance, scalability, and long-term reliability Advanced Analytics Support: Build and …
Advanced proficiency in T-SQL, SQL Server and SSIS, with a proven track record in troubleshooting and optimizing data processes. Strong understanding of data warehousing concepts, data modelling, and ETL/ELT processes. Experience working within Agile frameworks, including leading stand-ups and sprint planning. Hands-on experience with Azure DevOps. Excellent problem-solving and analytical skills, with a strong …
engineering, mathematics, or a related technical discipline. - 3+ years of experience as a Data Engineer or in a similar role. - 3+ years of experience with data modeling, data warehousing, ETL/ELT pipelines and BI tools. - Experience with cloud-based big data technology stacks (e.g., Hadoop, Spark, Redshift, S3, EMR, SageMaker, DynamoDB etc.) - Knowledge of data management and data storage …
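For readers less familiar with the cloud big-data stack this listing names (Spark, S3, Redshift, EMR), here is a minimal PySpark sketch of an S3-to-S3 batch transform of the kind such a role typically owns. The bucket paths, columns, and job name are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_orders_rollup").getOrCreate()

# Read raw order events from S3 (paths are illustrative)
orders = spark.read.parquet("s3://example-raw-bucket/orders/")

# Aggregate to one row per customer per day
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write back to S3, partitioned for downstream warehouse loads (e.g. Redshift Spectrum)
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/daily_orders/"
)
```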
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
Test (SDET) with a strong focus on data automation to join our dynamic team. If you're driven by complex data challenges and thrive in high-performance environments where ETL testing has become second nature, this is the role for you. Important: All applicants must be eligible for SC clearance. Why Ten10? At Ten10, we're one of the UK's leading … for SC clearance. Development background in Scala, Python, or Java. Solid understanding of data engineering practices and data validation techniques. Experience using test automation frameworks for data pipelines and ETL workflows. Strong communication and stakeholder management skills. Nice-to-Have: Hands-on experience with Databricks, Apache Spark, and Azure Deequ. Familiarity with Big Data tools and distributed data processing.
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
quality and integrity. Applying test data management tools for crafting, managing, and maintaining test data sets. Developing and executing data transformation tests using DBT (Data Build Tool). Performing ETL testing to validate data extraction, transformation, and loading processes. Collaborating with data engineers, analysts, and other stakeholders to identify and resolve data quality issues. Automating data testing processes to improve … in defining and implementing data testing strategies. Hands-on experience with test data management tools. Proficiency in DBT (Data Build Tool) for data transformation and testing. Strong understanding of ETL processes and experience in ETL testing. Excellent problem-solving skills and attention to detail. Experience with data integration tools and frameworks. Understanding of data quality frameworks and standard methodologies. Understanding …
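The data-testing emphasis in this listing is easiest to picture with an example. dbt tests themselves are usually declared in YAML and SQL, but the same kinds of checks can be sketched in Python; the table, columns, and file below are hypothetical, and the functions mirror dbt's built-in not_null and unique tests rather than reproducing any real project.

```python
import pandas as pd


def check_not_null(df: pd.DataFrame, column: str) -> None:
    """Fail if the column contains nulls (analogous to dbt's not_null test)."""
    nulls = int(df[column].isna().sum())
    assert nulls == 0, f"{column} has {nulls} null values"


def check_unique(df: pd.DataFrame, column: str) -> None:
    """Fail if the column contains duplicates (analogous to dbt's unique test)."""
    dupes = int(df[column].duplicated().sum())
    assert dupes == 0, f"{column} has {dupes} duplicate values"


def test_customers_table():
    # In practice this would query the warehouse; a local extract is used here.
    customers = pd.read_parquet("customers_transformed.parquet")
    check_not_null(customers, "customer_id")
    check_unique(customers, "customer_id")
```

Run with pytest, these functions fail the build when the transformed table violates its contract, which is the behaviour ETL testing is meant to guarantee.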
of Databricks, Power BI, and enterprise integration tools. A developed understanding of TOGAF or equivalent enterprise architecture frameworks. Hands-on experience in data warehousing, data lakes, data modelling, and ETL processes. Excellent stakeholder engagement skills with the ability to influence and inspire across all levels. A proactive, forward-thinking mindset with a passion for innovation, sustainability, and continuous improvement. Your …
guidance to key stakeholders. Maintain and optimise reporting outputs, identifying areas for enhancement and automation. Conduct ad-hoc analysis to meet dynamic business needs. Write complex SQL queries to extract and manipulate data across large-scale database environments. Support transformation and change initiatives by providing insights and reporting capabilities that improve data integrity and performance. Lead delivery of advanced analytics … technical concepts clearly and concisely. Strong time management and multi-tasking abilities. Desirable: Familiarity with agile development methodologies and Jira. Experience with Power BI. Understanding of data warehousing and ETL concepts. Experience evaluating external data sources for quality and value. Key Attributes Highly motivated, proactive, and capable of working independently. Organised, efficient, and detail-oriented. Skilled in building collaborative relationships …
BI Data Analyst Associate Responsibilities: On a daily basis your varied role will include, but will not be limited to: Design, build, and optimize high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks or Microsoft Fabric. Implement scalable solutions to ingest, store, and transform vast datasets, ensuring data availability and quality across the organization. Write …
Develop and maintain interactive dashboards and reports using Qlik Sense. Collaborate with business units to gather requirements and translate them into technical specifications. Perform data extraction, transformation, and loading (ETL) from various sources. Ensure data accuracy, integrity, and consistency across all reports and dashboards. Analyse large datasets to identify trends, patterns, and actionable insights. Optimise Qlik applications for performance and … 2+ years of experience working with Qlik Sense. Strong SQL skills and experience with relational databases. Proficiency in data visualisation and storytelling with data. Experience with data modelling and ETL processes. Excellent analytical and problem-solving skills. Strong communication, presentation, and collaboration abilities. Preferred Qualifications: Experience with cloud platforms (e.g., AWS, Azure, GCP). Familiarity with other BI tools (e.g. …
Warehouse infrastructure architecture and best practices Practical database design, development, administration knowledge (any RDBMS) Strong understanding of data governance, quality, and privacy (e.g. GDPR compliance) Proficiency in ELT/ETL processes Strong experience in data ingestion, transformation & orchestration technology (ETL tools such as Informatica, Datastage, SSIS, etc.) or open source Meltano, Airbyte, and Airflow Proven experience with DBT (data build tool) …
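Since this listing names orchestration tools such as Airflow, a minimal Airflow DAG sketch follows to show what that orchestration experience looks like in practice. The DAG id, task names, callables, and schedule are all invented for illustration; the real pipelines behind the role are not described in the listing.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from source systems")    # placeholder for real extract logic


def transform():
    print("apply business rules")             # placeholder for real transform logic


def load():
    print("write to the warehouse")           # placeholder for real load logic


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ accepts `schedule`; older 2.x versions use `schedule_interval`
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies enforce the extract -> transform -> load order
    t_extract >> t_transform >> t_load
```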
well-established knowledge-sharing community. What you'll do Design and implement data solutions using Snowflake across cloud platforms (Azure and AWS) Build and maintain scalable data pipelines and ETL processes Optimise data models, storage, and performance for analytics and reporting Ensure data integrity, security, and best practice compliance Serve as a subject matter expert in Snowflake engineering efforts within … you'll need Proven experience designing and delivering enterprise-scale data warehouse solutions using Snowflake In-depth understanding of Snowflake architecture, performance optimisation, and best practices Strong experience in ETL development, data modelling, and data integration Proficient in SQL, Python, and/or Java for data processing Hands-on experience with Azure and AWS cloud environments Familiarity with Agile methodologies …
Amersham, Amersham on the Hill, Buckinghamshire, United Kingdom
VIQU IT
ensuring data accuracy and availability for reporting and analysis. Support the rollout of an Azure-based data lake, working alongside external partners and internal stakeholders. Design and implement scalable ETL packages using Microsoft tools such as SSIS and SSRS. Produce and enhance interactive reports and dashboards, facilitating clear communication of insights to non-technical users. Assist in training end users … for data tools and systems. Participate in peak reporting cycles and ad hoc data requests as needed. Key Requirements of the Data Engineer: Strong expertise in SQL, SSIS for ETL pipeline creation, and SSRS for report development. Experience with Power BI for data visualisation. Familiarity with Azure data services such as Data Lake, Data Factory, and Synapse Analytics is an …
City of London, London, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
Key Responsibilities 🔹 Build & Develop – Design and maintain a robust Databricks Data Platform, ensuring performance, scalability, and availability. 🔹 Data Pipelines – Connect APIs, databases, and data streams to the platform, implementing ETL/ELT processes. 🔹 Data Integrity – Embed quality measures, monitoring, and alerting mechanisms. 🔹 CI/CD & Automation – Create deployment pipelines and automate workflows. 🔹 Collaboration – Work with stakeholders across Global IT, Data … hands-on experience with Databricks, and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling expertise for analytics use cases. ✅ Strong ETL/ELT development skills. ✅ Python scripting experience for data automation. ✅ Experience with CI/CD methodologies for data platforms. ✅ Knowledge of MS SQL Server, SSIS, Visual Studio, and SSDT projects …
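To ground the "data pipelines into a Databricks platform" responsibility above, here is a minimal PySpark sketch of the ingest-and-transform pattern such a pipeline might follow, assuming a Spark environment with Delta Lake available (as on Databricks). The storage path, column names, and target table are hypothetical and do not come from the listing.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_sales").getOrCreate()

# Ingest raw JSON landed by an upstream process (path is illustrative)
raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/sales/")

# Light transformation: derive a date column and deduplicate on a business key
sales = (
    raw
    .withColumn("sale_date", F.to_date("sale_ts"))
    .dropDuplicates(["sale_id"])
)

# Persist as a Delta table for downstream reporting (e.g. Power BI);
# assumes the `curated` schema already exists in the metastore.
sales.write.format("delta").mode("append").saveAsTable("curated.sales")
```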