e.g., Snowflake or a similar warehousing technology, real-time systems). Experience in AWS cloud services, particularly Lambda, SNS, S3, EKS, and API Gateway. Knowledge of data warehouse design, ETL/ELT processes, and big data technologies (e.g., Snowflake, Spark). Familiarity with data governance and compliance frameworks (e.g., GDPR, HIPAA). Strong communication and stakeholder management skills. Analytical mindset…
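As a rough illustration of the Lambda/SNS/S3 pattern this role references, here is a minimal sketch of an event-driven handler; the bucket name, topic ARN, and event shape are hypothetical:

```python
import json
import boto3

# Clients are created at module scope so Lambda can reuse them across invocations.
s3 = boto3.client("s3")
sns = boto3.client("sns")

def handler(event, context):
    # Land the raw payload in S3, then fan out a notification via SNS.
    bucket = "example-landing-bucket"                      # hypothetical bucket
    key = f"events/{event.get('id', 'unknown')}.json"
    s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(event))
    sns.publish(
        TopicArn="arn:aws:sns:eu-west-2:123456789012:example-events",  # hypothetical ARN
        Message=json.dumps({"s3_key": key}),
    )
    return {"statusCode": 200, "body": key}
```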
Advanced proficiency in T-SQL, SQL Server, and SSIS, with a proven track record in troubleshooting and optimizing data processes. Strong understanding of data warehousing concepts, data modelling, and ETL/ELT processes. Experience working within Agile frameworks, including leading stand-ups and sprint planning. Hands-on experience with Azure DevOps. Excellent problem-solving and analytical skills, with a strong…
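For context, a hedged sketch of running a T-SQL window query from Python via pyodbc, the kind of query often used when troubleshooting duplicate or stale rows; the server, database, and table names are hypothetical:

```python
import pyodbc

# Connection-string values are placeholders; adjust driver and auth for your environment.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=sqlhost;DATABASE=dw;"
    "Trusted_Connection=yes;TrustServerCertificate=yes;"
)
cursor = conn.cursor()
# ROW_NUMBER() flags the latest order per customer, a common T-SQL dedup pattern.
cursor.execute("""
    SELECT order_id, customer_id, amount,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS rn
    FROM dbo.orders
""")
for row in cursor.fetchmany(5):
    print(row)
conn.close()
```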
and coaching junior engineers, fostering a culture of technical excellence within the team. Data Engineering Expertise: - Deep understanding of data engineering principles and best practices, including data modeling, observable ETL/ELT processes, data warehousing, and data governance. - Proficiency in data manipulation languages (e.g., SQL/DBT) and programming languages relevant to data engineering (e.g., Python). - Experience with a…
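To illustrate the SQL/DBT plus Python pairing mentioned above, a minimal sketch of a dbt Python model (supported in dbt 1.3+ on warehouses with a dataframe runtime such as Databricks); the upstream model name and columns are hypothetical:

```python
from pyspark.sql import functions as F
from pyspark.sql.window import Window

def model(dbt, session):
    # dbt resolves ref() to the upstream relation; "stg_orders" is hypothetical.
    orders = dbt.ref("stg_orders")
    # Keep only the most recent record per order_id - an observable, testable dedup step.
    w = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
    return (
        orders.withColumn("rn", F.row_number().over(w))
              .filter(F.col("rn") == 1)
              .drop("rn")
    )
```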
data services. This would include, but is not limited to: Azure Synapse Analytics for data warehousing and big data analytics; Azure Data Lake Gen1/Gen2; Azure Data Factory for ETL/ELT, hybrid data integration, and pipelines; Azure Databricks for data processing and analytics; Azure Cosmos DB, Azure Table Storage, and Azure Blob Storage; Microsoft Fabric; Azure Purview for data governance…
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
Test (SDET) with a strong focus on data automation to join our dynamic team. If you're driven by complex data challenges and thrive in high-performance environments where ETL testing has become second nature, this is the role for you. Important: All applicants must be eligible for SC clearance. Why Ten10? At Ten10, we're one of the UK's leading … for SC clearance. Development background in Scala, Python, or Java. Solid understanding of data engineering practices and data validation techniques. Experience using test automation frameworks for data pipelines and ETL workflows. Strong communication and stakeholder management skills. Nice-to-Have: Hands-on experience with Databricks, Apache Spark, and Azure Deequ. Familiarity with Big Data tools and distributed data processing.
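As a sketch of what automated ETL testing can look like in practice, here is a minimal pytest/PySpark data-validation suite; the Parquet paths and column names are hypothetical:

```python
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # A local Spark session is enough for pipeline-level validation tests.
    return SparkSession.builder.master("local[1]").appName("etl-tests").getOrCreate()

def test_no_null_customer_ids(spark):
    # Key columns in the transformed output should never be null.
    df = spark.read.parquet("/tmp/output/customers")   # hypothetical path
    assert df.filter(df.customer_id.isNull()).count() == 0

def test_no_records_dropped(spark):
    # The transform should not silently lose rows between source and target.
    source = spark.read.parquet("/tmp/input/customers")
    target = spark.read.parquet("/tmp/output/customers")
    assert source.count() == target.count()
```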
guidance to key stakeholders. Maintain and optimise reporting outputs, identifying areas for enhancement and automation. Conduct ad-hoc analysis to meet dynamic business needs. Write complex SQL queries to extract and manipulate data across large-scale database environments. Support transformation and change initiatives by providing insights and reporting capabilities that improve data integrity and performance. Lead delivery of advanced analytics … technical concepts clearly and concisely. Strong time management and multi-tasking abilities. Desirable: Familiarity with agile development methodologies and Jira. Experience with Power BI. Understanding of data warehousing and ETL concepts. Experience evaluating external data sources for quality and value. Key Attributes: Highly motivated, proactive, and capable of working independently. Organised, efficient, and detail-oriented. Skilled in building collaborative relationships…
Warehouse infrastructure architecture and best practices. Practical database design, development, and administration knowledge (any RDBMS). Strong understanding of data governance, quality, and privacy (e.g., GDPR compliance). Proficiency in ELT/ETL processes. Strong experience in data ingestion, transformation, and orchestration technology (ETL tools such as Informatica, DataStage, SSIS, etc.) or open-source tools such as Meltano, Airbyte, and Airflow. Proven experience with DBT (data build tool)…
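As a rough sketch of the orchestration stack above (Airflow coordinating an extract/load step followed by dbt transforms), assuming Airflow 2.x; the DAG id and shell commands are placeholders:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Daily ELT: extract/load first, then run warehouse transforms with dbt.
with DAG(
    dag_id="daily_elt",               # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="echo 'trigger Meltano/Airbyte sync here'",  # placeholder command
    )
    transform = BashOperator(
        task_id="transform",
        bash_command="dbt run --select marts",
    )
    extract_load >> transform  # transforms run only after the load succeeds
```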
well-established knowledge-sharing community. What you'll do: Design and implement data solutions using Snowflake across cloud platforms (Azure and AWS). Build and maintain scalable data pipelines and ETL processes. Optimise data models, storage, and performance for analytics and reporting. Ensure data integrity, security, and best practice compliance. Serve as a subject matter expert in Snowflake engineering efforts within … you'll need: Proven experience designing and delivering enterprise-scale data warehouse solutions using Snowflake. In-depth understanding of Snowflake architecture, performance optimisation, and best practices. Strong experience in ETL development, data modelling, and data integration. Proficient in SQL, Python, and/or Java for data processing. Hands-on experience with Azure and AWS cloud environments. Familiarity with Agile methodologies…
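For illustration, a minimal Python sketch of a Snowflake bulk load and sanity check using the official connector; the account, stage, and table names are hypothetical:

```python
import snowflake.connector

# Credentials and identifiers below are placeholders.
conn = snowflake.connector.connect(
    account="xy12345.eu-west-2.aws",
    user="ETL_USER",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # COPY INTO bulk-loads staged files; @landing_stage is a hypothetical named stage.
    cur.execute(
        "COPY INTO raw.orders FROM @landing_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '\"')"
    )
    cur.execute("SELECT COUNT(*) FROM raw.orders")
    print("rows loaded:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```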
City of London, London, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
Key Responsibilities
🔹 Build & Develop – Design and maintain a robust Databricks Data Platform, ensuring performance, scalability, and availability.
🔹 Data Pipelines – Connect APIs, databases, and data streams to the platform, implementing ETL/ELT processes.
🔹 Data Integrity – Embed quality measures, monitoring, and alerting mechanisms.
🔹 CI/CD & Automation – Create deployment pipelines and automate workflows.
🔹 Collaboration – Work with stakeholders across Global IT, Data … hands-on experience with Databricks and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL).
✅ Dimensional modelling expertise for analytics use cases.
✅ Strong ETL/ELT development skills.
✅ Python scripting experience for data automation.
✅ Experience with CI/CD methodologies for data platforms.
✅ Knowledge of MS SQL Server, SSIS, Visual Studio, and SSDT projects…
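As a minimal sketch of the Databricks pipeline work described, a bronze-to-silver hop in PySpark writing a Delta table; the mount point and table name are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a session is provided; getOrCreate() also works locally with delta-spark.
spark = SparkSession.builder.getOrCreate()

raw = spark.read.json("/mnt/landing/events/")            # hypothetical landing path
clean = (
    raw.dropDuplicates(["event_id"])                     # basic data-integrity measure
       .withColumn("ingested_at", F.current_timestamp()) # lineage/monitoring column
)

(clean.write
      .format("delta")
      .mode("append")
      .saveAsTable("silver.events"))                     # hypothetical target table
```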
Expertise in Microsoft Azure data services and hands-on experience with Microsoft Fabric (OneLake, Lakehouse, DirectLake, Power BI integration, etc.) would be a distinct advantage. Strong understanding of data modelling, ETL/ELT pipelines, and data warehousing principles. Skilled in designing scalable and secure solutions using best practices and industry frameworks. Excellent communication and stakeholder engagement skills. Bonus Points For: Certifications…
Employment Type: Permanent
Salary: £90,000 - £110,000 per annum plus bonus and package
City of London, London, United Kingdom Hybrid / WFH Options
Hanson Lee
models, database design and development, data mining, and segmentation techniques. Strong knowledge of and experience with reporting packages (Business Objects, Tableau, Power BI), databases (SQL, etc.), programming (XML, JavaScript, or ETL frameworks), and data visualization (Power BI, Tableau, etc.). Demonstrated analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and…
Data Architecture background (not Data Engineering). Proven experience working with customer data and CRM systems. Strong understanding of GDPR and data privacy impact assessments (DPIA). Experience in data integration, ETL, and data warehousing. Comfort working cross-functionally with tech, product, and business teams. Practical experience with tools like Salesforce or Customer Data Platforms (CDPs). Skilled at navigating complex data governance…
and will enable insights that inform strategic decisions both at the Board/Executive level and at the business unit level. Key Responsibilities: Design, develop, and maintain scalable ETL pipelines using technologies like dbt, Airbyte, Cube, DuckDB, Redshift, and Superset. Work closely with stakeholders across the company to gather data requirements and set up dashboards. Promote a data-driven culture … to ensure performance and scalability. Must Haves: 8+ years working in data engineering with large sets of data (e.g., millions of students or transactions). Proven experience in building and maintaining ETL pipelines and data infrastructure. Strong experience working with dbt Core/Cloud. Business savvy and capable of interfacing with finance, revenue, and ops leaders to build our business intelligence. Expertise…
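To give a flavour of the stack above, a small DuckDB sketch of the kind of local, warehouse-style query it enables; the Parquet files and columns are hypothetical:

```python
import duckdb

con = duckdb.connect()  # in-memory database; pass a file path to persist
# DuckDB can query Parquet extracts directly, which is handy for prototyping models
# before promoting them to dbt/Redshift. fetchdf() requires pandas to be installed.
result = con.execute("""
    SELECT course_id, COUNT(DISTINCT student_id) AS students
    FROM read_parquet('enrolments/*.parquet')  -- hypothetical local extract
    GROUP BY course_id
    ORDER BY students DESC
    LIMIT 10
""").fetchdf()
print(result)
```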
background as a Data Architect, ideally within financial services or commodities trading. Proven experience with Azure, Databricks, and enterprise data lake solutions. Hands-on experience with data modelling, ETL/ELT pipelines, and data integration across multiple systems. Familiarity with tools like Kafka, Spark, and modern API-based architectures. Experience with relational databases such as Oracle and SQL…
shape the way data is handled across the business, working with modern tools in a fast-moving, high-performance environment. Your responsibilities may include: Build and maintain scalable, efficient ETL/ELT pipelines for both real-time and batch processing. Integrate data from APIs, streaming platforms, and legacy systems, with a focus on data quality and reliability. Design and manage … of experience in data engineering or a related field. Strong programming skills in Java, Python, and SQL; familiarity with Rust is a plus. Proven experience designing and maintaining scalable ETL/ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as … for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)? What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines (Docker, GitHub Actions, Jenkins, etc.)? Which option best describes your experience building…
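As a hedged sketch of the streaming-ingestion side of this role, a minimal kafka-python consumer that buffers events for batch landing; the topic, broker, and batch size are hypothetical:

```python
import json
from kafka import KafkaConsumer  # kafka-python package

consumer = KafkaConsumer(
    "orders",                                    # hypothetical topic
    bootstrap_servers=["localhost:9092"],        # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 1000:
        # A real pipeline would land this batch to S3/Parquet with quality checks;
        # here we just report progress.
        print(f"landing batch of {len(batch)} events")
        batch.clear()
```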
are looking for a Lead Data Solutions Architect to work within a dynamic, remote-first data architecture capability, delivering cloud-based data solutions using best-in-class RDBMS, ETL/ELT, and cloud platforms for blue-chip customers across a range of sectors. You will lead cross-functional teams of Data Engineers, Architects, Business Analysts, and Quality Assurance Analysts … solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes and SQL/NoSQL databases is a nice-to-have, providing a well-rounded background. Experience with data visualization tools and DevOps principles/tools is advantageous. Familiarity with…
City of London, London, England, United Kingdom Hybrid / WFH Options
GreatFind Recruitment
and commitment to data accuracy. Desirable: Certifications in data management or architecture (e.g., CDMP, TOGAF). Project management experience (Agile methodologies preferred). Familiarity with tools such as SQL, NoSQL, and ETL platforms. Benefits: 33 days annual leave plus bank holidays. 8% employer pension contribution or NHS Pension continuation (if applicable). Enhanced family leave policies. Free health cashback plan and employee assistance…
Qualifications: Bachelor's degree with at least 5 years of experience, or equivalent. In-depth knowledge and expertise in data engineering, including: Snowflake (data warehousing and performance tuning); Informatica (ETL/ELT development and orchestration) - nice to have; Python (data processing and scripting) - required; AWS (data services such as S3, Glue, Redshift, Lambda) - required; cloud data practices and platform - AWS…
technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse. Shape cloud migration and modernization strategies with a strong focus on DevOps practices. Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks. Implement data governance and quality frameworks to ensure data integrity and compliance. Collaborate with clients’ senior leadership to influence data-driven transformation initiatives.
data architecture principles, big data technologies (e.g., Hadoop, Spark), and cloud platforms like AWS, Azure, or GCP. • Data Management Skills: Advanced proficiency in data modelling, SQL/NoSQL databases, ETL processes, and data integration techniques. • Programming & Tools: Strong skills in Python or Java, with experience in data visualization tools and relevant programming frameworks. • Governance & Compliance: Solid understanding of data governance…
data processing and reporting. Data Modelling using the Kimball Methodology. Experience in developing CI/CD pipelines using GitLab or similar. Comprehensive knowledge of data engineering, data modelling, and ETL best practice. Experience of working within a global team. Experience of working with multiple stakeholders as part of an Agile team. Experience in developing production-ready data ingestion and processing…