Proven track record leading multi-million-pound projects within consulting or enterprise-level engagements. • Strong stakeholder engagement at CxO or Director level. • Deep experience in cloud data lake architectures, ETL/ELT patterns, and metadata/data quality management. • Expertise in Matillion, Redshift, Glue, Lambda, DynamoDB, and data pipeline automation. • Familiarity with data visualisation platforms such as QuickSight, Tableau, or …
using .NET Framework backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC); AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL; Airflow pipelines; and Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for: End-to-end data migration … environments using AWS and our bespoke Conversion Framework. Build new and maintain existing bespoke systems. Implement .NET-based microservices with strong observability and integration with data platforms. Develop custom ETL pipelines using AWS, Python, and MySQL. Implement governance, lineage, and monitoring to ensure high availability and traceability. AI & Advanced Analytics Integration: Collaborate with AI/ML teams to enable model …
you'll be instrumental in designing, building, and optimising data pipelines and infrastructure that support advanced analytics and modelling. You'll collaborate with cross-functional teams to manage complex ETL processes, implement best practices in code management, and ensure seamless data flow across platforms. Projects may include connecting SharePoint to Databricks, optimising Spark jobs, and managing GitHub-based code promotion …
Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
experience in a Data Engineer role and a strong academic background. Python & SQL: Advanced-level Python for data applications and highly proficient SQL for complex querying and performance tuning. ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud …
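Several of these listings ask for both ETL and ELT pipeline experience. The difference can be sketched in a few lines; this is an illustrative toy, with sqlite3 standing in for a warehouse such as BigQuery and hypothetical table and column names, not a production Dataflow job:

```python
import sqlite3

def etl_load(conn, raw_rows):
    """ETL: shape the data in application code first, then load only clean rows."""
    conn.execute("CREATE TABLE orders_clean (id INTEGER, amount_gbp REAL)")
    cleaned = [(r["id"], r["amount_pence"] / 100) for r in raw_rows
               if r["amount_pence"] >= 0]  # drop bad rows before they land
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", cleaned)

def elt_load(conn, raw_rows):
    """ELT: land raw data untouched, then transform inside the warehouse with SQL."""
    conn.execute("CREATE TABLE orders_raw (id INTEGER, amount_pence INTEGER)")
    conn.executemany("INSERT INTO orders_raw VALUES (?, ?)",
                     [(r["id"], r["amount_pence"]) for r in raw_rows])
    conn.execute("""CREATE TABLE orders_model AS
                    SELECT id, amount_pence / 100.0 AS amount_gbp
                    FROM orders_raw WHERE amount_pence >= 0""")

rows = [{"id": 1, "amount_pence": 1250}, {"id": 2, "amount_pence": -5}]
conn = sqlite3.connect(":memory:")
etl_load(conn, rows)
elt_load(conn, rows)
```

Both shapes produce the same model; ELT pushes the compute into the warehouse, which is the pattern BigQuery-centred stacks usually favour.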
South East London, London, United Kingdom Hybrid / WFH Options
Context
You must be a highly effective communicator, both written and verbal. Desirable Experience: Experience with property management software (MRI Qube, Yardi, or similar). Experience with Microsoft SQL. Experience with ETL tools and data migration. Experience with data analysis, data mapping, and UML. Experience with programming languages (Python, Ruby, C++, PHP, etc.). Hybrid: 2-4 days onsite (the business work …
technical consulting experience ● Track record of leading enterprise analytics or data platform projects ● Hands-on experience with Snowflake platform solutions ● Strong understanding of end-to-end data architecture, including ETL/ELT, data modeling, and business intelligence tools ● Experience with SQL, Python, Java, and/or Spark in data engineering or analytics contexts ● Familiarity with large-scale data warehouse technologies …
AWS data services. Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications, and workflow orchestration using Apache Airflow. Familiarity with ETL frameworks; bonus experience with Big Data processing (Spark, Hive, Trino) and data streaming. Proven track record - you've made a demonstrable impact in your previous roles, standing out from …
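The "workflow orchestration using Apache Airflow" requirement boils down to running tasks in dependency order. A stdlib-only sketch of that idea, with hypothetical task names (this is not Airflow's actual API, just the scheduling concept underneath it):

```python
from graphlib import TopologicalSorter

# A DAG expressed as task -> set of upstream dependencies, mirroring how an
# orchestrator like Airflow wires operators together.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run(dag, tasks):
    """Execute each task once all of its upstream tasks have finished."""
    order = list(TopologicalSorter(dag).static_order())
    results = {name: tasks[name]() for name in order}
    return order, results

order, results = run(dag, {
    "extract": lambda: ["raw"],
    "transform": lambda: ["clean"],
    "quality_check": lambda: True,
    "load": lambda: "done",
})
```

A real scheduler layers retries, backfills, and parallel execution on top of exactly this topological ordering.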
of insurance data from source systems, producing high-quality user stories, specifications, and data flows to support reporting. Facilitate stakeholder discussions to ensure accurate data extraction, transformation, and loading (ETL) processes. Understand data across finance teams to define and prepare data requirements for finance initiatives. Assist with data analysis and presentation of Balance Sheet reconciliations. Investigate discrepancies in financial reconciliations …
Platform Implementation: Lead the design and development of a new global data platform (PaaS). Ensure scalable storage solutions (data lakes, data warehouses) to handle structured and unstructured data. Implement ETL/ELT pipelines using Dagster, Airflow, or similar tools. Optimize performance and scalability for large data volumes. Govern data security, compliance, and access controls. Development & DevOps: Strong programming and scripting …
technical architecture of our solutions, ensuring they meet performance, scalability, and security requirements. Design and develop scalable AWS architectures for API-based and data-centric applications. Define data pipelines, ETL processes, and storage solutions using AWS services such as S3, OpenSearch, Redshift, Step Functions, Lambda, Glue, and Athena. Architect RESTful APIs, ensuring security, performance, and scalability. Optimise microservices architecture and …
Amazon Music engages fans, artists, and creators on a global scale. Learn more at … BASIC QUALIFICATIONS - 5+ years of data engineering experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with SQL - Experience managing a data or BI team - Experience leading and influencing the data or BI strategy of your team or organization - Experience in at least one …
preferred) Nice-to-have experience: Familiarity with Streamlit or other lightweight internal tooling UIs. Exposure to LLMs, OpenAI tools, or agent-based systems. Experience using Google Analytics or reverse ETL tools like Hightouch. Building and scaling data products from scratch. Prior experience in startup or scale-up environments. Benefits: Competitive Salary: We offer pay that reflects your skills and the …
role, you will: Partner closely with engineers and business leaders to design and build Relay's core data models. Take ownership of Relay's data infrastructure, including our ingestion, ETL, data quality monitoring, and data catalogue. Collaborate with team members across every function and business area at Relay. Regularly spend time in the field understanding how your data models match …
with 4+ years in finance. Bachelor's degree in computer science or a related field. Strong Python and SQL skills for data processing and automation. Experience building and maintaining ETL pipelines and data transformations. Hands-on experience with Snowflake or similar cloud-based data platforms. Familiarity with data transformation tools such as dbt. Strong understanding of data structures, data modeling …
working with franchisees or external partners on performance management. Proficiency in data visualization tools (Power BI, Tableau, or similar). Familiarity with data storage and integration platforms (Snowflake, APIs, ETL processes). Understanding of POS systems and financial reporting. Strong stakeholder management and communication skills. Ability to work cross-functionally with IT, operations, and franchise partners. Problem-solving mindset with …
in relevant domains such as fraud, AML detection, or operations. Advanced proficiency in SQL and experience working with large, complex, and sometimes messy datasets. Experience designing, building, and maintaining ETL pipelines or data models, ideally using tools like dbt. Proficiency in Python for data analysis, including data manipulation, visualisation, and basic modelling. Strong data storytelling and communication skills: you can …
Technical Ownership: Own the end-to-end architecture and reliability of Zego's data platform, from ingestion and transformation to governance and observability. Guide the team in evolving our ETL/ELT pipelines, data models, and real-time processing frameworks. Lead the development of new data architectures and the introduction of new technologies. Innovation & Delivery: Drive the adoption of modern …
ETL Application Developer | Capacity Management & Data Engineering | 6-month contract, Inside IR35 | London (Moorgate tube) | Day rate to be confirmed. Our leading bank client is seeking an experienced ETL Application Developer to join their technology resilience and capacity management team. This is a contract opportunity based in London. As an ETL Application Developer, you will play a key role in … on developing robust data pipelines and solutions that support capacity monitoring, reporting, and forecasting across complex IT environments. Key Responsibilities: Design, develop, and maintain ETL processes to collect, transform, and load capacity and infrastructure data from multiple sources. Collaborate with cross-functional teams and DBAs to ensure seamless data integration and delivery. Analyse and normalise raw infrastructure and monitoring data … compliance with regulatory standards (including DORA) by providing accurate, transparent, and timely capacity data. Create clear technical documentation and contribute to ongoing process improvement. Technical Requirements: Proven expertise in ETL (Extract, Transform, Load) development. Strong hands-on experience with SQL Server and data engineering tools. Solid understanding of infrastructure monitoring, capacity management, and data analytics. Ability to work with large …
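As a hedged illustration of the "analyse and normalise raw infrastructure and monitoring data" responsibility above: two made-up monitoring feeds with different field names and units are mapped onto one capacity schema before loading. sqlite3 stands in for SQL Server purely to keep the sketch self-contained:

```python
import sqlite3

# Hypothetical feeds: one reports CPU as a percentage, the other as a fraction.
feed_a = [{"host": "app01", "cpu_pct": 71.0}]
feed_b = [{"server_name": "db01", "cpu_utilisation": 0.5}]

def normalise(feed_a, feed_b):
    """Map both source schemas onto common (host, cpu_pct) rows."""
    rows = [(r["host"], r["cpu_pct"]) for r in feed_a]
    rows += [(r["server_name"], r["cpu_utilisation"] * 100) for r in feed_b]
    return rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE capacity (host TEXT, cpu_pct REAL)")
conn.executemany("INSERT INTO capacity VALUES (?, ?)", normalise(feed_a, feed_b))
```

In the real role the unit reconciliation and schema mapping would be driven by each source system's contract, but the shape of the work is this: many source schemas in, one queryable capacity table out.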
is crucial for developing their data warehouse and centralising data across the business. Reporting to the Head of Digital Transformation, you'll manage the full data lifecycle, specialising in ETL processes, data collection from various sources, high-speed data cleansing, and thorough testing and validation to ensure data integrity. You'll also be a key player in stakeholder interaction, translating … requirements into effective data warehouse solutions for calculations and data movements. Lead data migration and ETL processes: Direct the migration of data and design, implement, and optimize ETL (Extract, Transform, Load) processes to integrate data from various sources. Manage Snowflake data warehouse: Build, maintain, and enhance the Snowflake data warehouse, ensuring it's scalable and resilient for future reporting. Ensure … has at least 4 years of engineering experience in a similar role. The client is a Microsoft house and is looking for someone with Snowflake, Python, and extensive ETL experience. As well as strong technical skills, the client is looking for someone who has excellent communication skills and can be the face of Data within the business. Salary and …
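The cleansing, testing, and validation work that role describes reduces to rules applied before rows reach the warehouse. A minimal sketch with made-up rules and field names:

```python
def validate(row):
    """Return a list of rule violations for one record (empty means clean)."""
    errors = []
    if not row.get("customer_id"):
        errors.append("missing customer_id")
    if not isinstance(row.get("amount"), (int, float)):
        errors.append("non-numeric amount")
    return errors

def cleanse(rows):
    """Split incoming rows into loadable records and rejects with reasons."""
    good, rejects = [], []
    for row in rows:
        errs = validate(row)
        if errs:
            rejects.append((row, errs))
        else:
            good.append(row)
    return good, rejects

good, rejects = cleanse([
    {"customer_id": "C1", "amount": 10.0},
    {"customer_id": "", "amount": "ten"},
])
```

Keeping the rejects with their reasons, rather than silently dropping them, is what makes the "thorough testing and validation" part auditable.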
Primary areas of focus will be 1) defining metrics and KPIs, and 2) deep-diving into ambiguous and large data sets. A successful candidate will be an expert in SQL and ETL (and general data wrangling) and have exemplary communication skills. The candidate will need to be a self-starter, comfortable with ambiguity in a fast-paced and ever-changing environment, and … self-service ability by stakeholders - Apply engineering excellence to reporting and analysis pipelines; automate and simplify self-service support for customers - Interface with other technology teams to extract, transform, and load data from a wide variety of data sources. A day in the life: Centralize the sprint and consolidate to one queue but still give stakeholders visibility and continue to … 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. - Experience with data visualization using Tableau, QuickSight, or similar tools - Experience with data modeling, warehousing, and building ETL pipelines - Experience with statistical analysis packages such as R, SAS, and MATLAB - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to …
standards, and culture of the company. What we are looking for: Back-end development: 5+ years of industry experience in back-end engineering developing data platforms or large-scale extract-transform-load (ETL) pipelines. Programming: Proficiency in Python for data pipelines, distributed systems, and microservices. Cloud technologies: Experience developing and deploying on cloud platforms (e.g., AWS, GCP, or …