model performance and ensure smooth application operations with large-scale data handling. Data Management and Preprocessing: Manage the collection, cleaning, and preprocessing of large datasets. Implement data pipelines and ETL processes to ensure data quality and availability. Software Development: Write clean, efficient, and scalable code in Python. Implement CI/CD practices for version control, testing, and code review. Collaboration More ❯
skills, Experience in risk management (Market, Credit, Regulatory). Familiarity with risk measures: VAR, CE/PE, PFE. Success in managing multi-terabyte data warehouses. Skilled in data warehousing, ETL/ELT, and reporting tools. Scripting skills (Python, PowerShell). Knowledge of applications, data governance, and cybersecurity. Preferred: Experience with data modelling tools like dbt. Knowledge of orchestration tools and More ❯
City of London, London, Coleman Street, United Kingdom
Deerfoot Recruitment Solutions Limited
Employment Type: Permanent
Salary: £135000/annum bonus + good benefits package
and SQL (e.g., Snowflake or similar warehousing technology, real-time systems). Experience with AWS services such as Lambda, SNS, S3, EKS, API Gateway. Knowledge of data warehouse design, ETL/ELT processes, and big data technologies (e.g., Snowflake, Spark). Understanding of data governance and compliance frameworks (e.g., GDPR, HIPAA). Strong communication and stakeholder management skills. Analytical mindset More ❯
model development and deployment by providing data engineering expertise, ensuring data scientists have access to the data they need in the required format. Implement and optimize data transformations and ETL/ELT processes using appropriate tools. Work with various databases and data warehousing solutions to store and retrieve data efficiently. Monitor, troubleshoot, and maintain data pipelines to ensure high data More ❯
frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level architecture design and implementation. Hands-on knowledge of database systems (SQL/NoSQL), ETL/ELT processes, and data modeling techniques. Exceptional leadership, communication, and stakeholder management skills. Ability to work in fast-paced, agile environments and balance long-term strategy with short-term More ❯
Tandridge, Surrey, United Kingdom Hybrid / WFH Options
NHS
the attached document for the full job description. Person Specification Experience Design and implementation of complex data models for Business Intelligence and Advanced Analytics Design and implementation of complex ETL/ELT processes Experience in a supervisory role Deep knowledge of NHS data and issues Visualization and report writing skills Performance management experience Experience in change management and delivering successful More ❯
Essential Core Technical Experience 5 to 10+ years of experience in SQL Server data warehouse or data provisioning architectures. Advanced SQL query writing and stored procedure experience. Experience developing ETL solutions in SQL Server, including SSIS & T-SQL. Experience with Microsoft BI technologies (SQL Server Management Studio, SSIS, SSAS, SSRS). Knowledge of data/system integration and dependency identification. More ❯
to deliver value quickly and iteratively. Demonstrated experience building data infrastructure and platforms from scratch in a greenfield or early-stage environment. Proven experience building complex data pipelines and ETL processes from disparate sources in a production environment. Deep expertise with modern cloud data warehousing platforms, especially Snowflake (alternatively experience with BigQuery). Strong data modelling skills, including dimensional modelling More ❯
have a supportive culture with a keen focus on innovation, technical excellence, career development and mutual support. Responsibilities Data Pipeline Development: Design, develop, and optimize robust data pipelines and ETL processes to ensure efficient data flow and integration Data Infrastructure Management: Manage and enhance our data infrastructure to support performance, scalability, and long-term reliability Advanced Analytics Support: Build and More ❯
Advanced proficiency in T-SQL, SQL Server and SSIS, with a proven track record in troubleshooting and optimizing data processes. Strong understanding of data warehousing concepts, data modelling, and ETL/ELT processes. Experience working within Agile frameworks, including leading stand-ups and sprint planning. Hands-on experience with Azure DevOps. Excellent problem-solving and analytical skills, with a strong More ❯
and coaching junior engineers, fostering a culture of technical excellence within the team. Data Engineering Expertise: - Deep understanding of data engineering principles and best practices, including data modeling, observable ETL/ELT processes, data warehousing, and data governance. - Proficiency in data manipulation languages (e.g., SQL/DBT) and programming languages relevant to data engineering (e.g., Python). - Experience with a More ❯
A proactive, self-starting attitude with a passion for continuous learning and improvement. Experience with geospatial data or tools; QGIS is essential, ArcGIS and FME are desirable. Familiarity with ETL processes and collaboration with data engineering teams is a strong plus. Experience working in agile teams and contributing to iterative delivery cycles. Advanced Excel skills Openness to travel (expenses covered More ❯
engineering, mathematics, or a related technical discipline. - 3+ years of experience as a Data Engineer or in a similar role. - 3+ years of experience with data modeling, data warehousing, ETL/ELT pipelines and BI tools. - Experience with cloud-based big data technology stacks (e.g., Hadoop, Spark, Redshift, S3, EMR, SageMaker, DynamoDB etc.) - Knowledge of data management and data storage More ❯
or Tableau (data manipulation, macros, charts and pivot tables), - Knowledge of data visualization tools such as QuickSight, Tableau, Power BI or other BI packages - Fluency in SQL and ETL PREFERRED QUALIFICATIONS - Knowledge of data modeling and data pipeline design - Master's degree in Business, Engineering, Statistics, Computer Science, Data Science, Mathematics or related field - Experience with at least one statistical More ❯
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
Test (SDET) with a strong focus on data automation to join our dynamic team. If you're driven by complex data challenges and thrive in high-performance environments where ETL testing has become second nature, this is the role for you. Important: All applicants must be eligible for SC clearance Why Ten10? At Ten10, we're one of the UK's leading … for SC clearance. Development background in Scala, Python, or Java Solid understanding of data engineering practices and data validation techniques. Experience using test automation frameworks for data pipelines and ETL workflows Strong communication and stakeholder management skills. Nice-to-Have: Hands-on experience with Databricks, Apache Spark, and Azure Deequ. Familiarity with Big Data tools and distributed data processing. More ❯
of Databricks, Power BI, and enterprise integration tools. A developed understanding of TOGAF or equivalent enterprise architecture frameworks. Hands-on experience in data warehousing, data lakes, data modelling, andETL processes. Excellent stakeholder engagement skills with the ability to influence and inspire across all levels. A proactive, forward-thinking mindset with a passion for innovation, sustainability, and continuous improvement. Your More ❯
guidance to key stakeholders. Maintain and optimise reporting outputs, identifying areas for enhancement and automation. Conduct ad-hoc analysis to meet dynamic business needs. Write complex SQL queries to extract and manipulate data across large-scale database environments. Support transformation and change initiatives by providing insights and reporting capabilities that improve data integrity and performance. Lead delivery of advanced analytics … technical concepts clearly and concisely. Strong time management and multi-tasking abilities. Desirable: Familiarity with agile development methodologies and Jira. Experience with Power BI. Understanding of data warehousing and ETL concepts. Experience evaluating external data sources for quality and value. Key Attributes Highly motivated, proactive, and capable of working independently. Organised, efficient, and detail-oriented. Skilled in building collaborative relationships More ❯
Develop and maintain interactive dashboards and reports using Qlik Sense. Collaborate with business units to gather requirements and translate them into technical specifications. Perform data extraction, transformation, and loading (ETL) from various sources. Ensure data accuracy, integrity, and consistency across all reports and dashboards. Analyse large datasets to identify trends, patterns, and actionable insights. Optimise Qlik applications for performance and … 2+ years of experience working with Qlik Sense. Strong SQL skills and experience with relational databases. Proficiency in data visualisation and storytelling with data. Experience with data modelling and ETL processes. Excellent analytical and problem-solving skills. Strong communication, presentation, and collaboration abilities. Preferred Qualifications: Experience with cloud platforms (e.g., AWS, Azure, GCP). Familiarity with other BI tools (e.g. More ❯
Warehouse infrastructure architecture and best practices Practical database design, development, administration knowledge (any RDBMS) Strong understanding of data governance, quality, and privacy (e.g. GDPR compliance) Proficiency in ELT/ETL processes Strong experience in data ingestion, transformation & orchestration technology (ETL tools such as Informatica, DataStage, SSIS, etc.) or open-source tools such as Meltano, Airbyte, and Airflow Proven experience with dbt (data build More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
In Technology Group
Design and develop BI solutions using tools like Power BI, SQL, and DAX Collaborate with cross-functional teams to gather requirements and deliver insights Maintain and optimize data models, ETL processes, and reporting systems Ensure data accuracy, integrity, and security across all BI platforms Support ad-hoc data analysis and reporting needs Requirements Proven experience as a BI Developer or … in a similar data-focused role Strong SQL skills and experience with relational databases Proficiency in Power BI (or similar tools) Understanding of data warehousing concepts and ETL processes Excellent problem-solving and communication skills Nice to Have Experience with cloud platforms (e.g., Azure, AWS) Knowledge of Python or R for data analysis Familiarity with Agile methodologies What We Offer More ❯
Amersham, Buckinghamshire, South East, United Kingdom
VIQU IT Recruitment
ensuring data accuracy and availability for reporting and analysis. Support the rollout of an Azure-based data lake, working alongside external partners and internal stakeholders. Design and implement scalable ETL packages using Microsoft tools such as SSIS and SSRS. Produce and enhance interactive reports and dashboards, facilitating clear communication of insights to non-technical users. Assist in training end users … for data tools and systems. Participate in peak reporting cycles and ad hoc data requests as needed. Key Requirements of the Data Engineer: Strong expertise in SQL, SSIS for ETL pipeline creation, and SSRS for report development. Experience with Power BI for data visualisation. Familiarity with Azure data services such as Data Lake, Data Factory, and Synapse Analytics is an More ❯
well-established knowledge-sharing community. What you'll do Design and implement data solutions using Snowflake across cloud platforms (Azure and AWS) Build and maintain scalable data pipelines and ETL processes Optimise data models, storage, and performance for analytics and reporting Ensure data integrity, security, and best practice compliance Serve as a subject matter expert in Snowflake engineering efforts within … you'll need Proven experience designing and delivering enterprise-scale data warehouse solutions using Snowflake In-depth understanding of Snowflake architecture, performance optimisation, and best practices Strong experience in ETL development, data modelling, and data integration Proficient in SQL, Python, and/or Java for data processing Hands-on experience with Azure and AWS cloud environments Familiarity with Agile methodologies More ❯
Key Responsibilities 🔹 Build & Develop – Design and maintain a robust Databricks Data Platform, ensuring performance, scalability, and availability. 🔹 Data Pipelines – Connect APIs, databases, and data streams to the platform, implementing ETL/ELT processes. 🔹 Data Integrity – Embed quality measures, monitoring, and alerting mechanisms. 🔹 CI/CD & Automation – Create deployment pipelines and automate workflows. 🔹 Collaboration – Work with stakeholders across Global IT, Data … hands-on experience with Databricks, and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling expertise for analytics use cases. ✅ Strong ETL/ELT development skills. ✅ Python scripting experience for data automation. ✅ Experience with CI/CD methodologies for data platforms. ✅ Knowledge of MS SQL Server, SSIS, Visual Studio, and SSDT projects More ❯
City of London, London, United Kingdom Hybrid / WFH Options
KDR Talent Solutions