Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Deloitte LLP
we make and action we take, guiding us to deliver impact how and where it matters most. Connect to your opportunity We're looking for a skilled Data Lake Reporting Assistant Manager to join our team and play a key role in developing and maintaining our data lake reporting environment. You'll work closely with … and engineers, contributing to the development and implementation of our reporting strategy. Learn and apply industry best practices for data warehousing, reporting, and visualization. Data Lake Reporting Environment: Become proficient in navigating and understanding data lake architecture. Assist in implementing data quality checks and validation processes to ensure accurate and … Information Systems, or a related field, or equivalent. Proven experience in data warehousing, business intelligence, or a related field, or equivalent. Strong understanding of data lake architectures, data modelling, and ETL processes. Expertise in SQL and data visualization tools (e.g., Power BI). Excellent communication, interpersonal, and presentation skills. Connect to …
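The listing above asks for help implementing data quality checks and validation processes in a data lake reporting environment. Below is a minimal, platform-neutral sketch of such a check; the record schema, field names, and validation rules are hypothetical, not taken from the listing.

```python
# Minimal data-quality validation sketch. Each record is assumed to be a
# dict with "customer_id" and "amount" fields (hypothetical schema); real
# checks would run against the data lake's actual tables.

def validate_records(records):
    """Split records into (valid, rejected) using two simple rules:
    customer_id must be present, and amount must be a non-negative number."""
    valid, rejected = [], []
    for rec in records:
        if rec.get("customer_id") is None:
            rejected.append((rec, "missing customer_id"))
        elif not isinstance(rec.get("amount"), (int, float)) or rec["amount"] < 0:
            rejected.append((rec, "invalid amount"))
        else:
            valid.append(rec)
    return valid, rejected

rows = [
    {"customer_id": 1, "amount": 10.0},
    {"customer_id": None, "amount": 5.0},
    {"customer_id": 2, "amount": -3.0},
]
good, bad = validate_records(rows)
print(len(good), len(bad))  # prints "1 2": one clean row, two rejections
```

In practice a check like this would emit the rejection reasons to a quarantine table or monitoring dashboard rather than discard them.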
Lead data engineers at Thoughtworks develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. They might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to … a program inception to build a new product. Alongside hands-on coding, they are leading the team to implement the solution. Job responsibilities You will lead and manage data engineering projects from inception to completion, including goal-setting, scope definition and ensuring on-time delivery with cross-team collaboration. You will collaborate with stakeholders to understand their strategic … approaches and can apply data security and privacy strategy to solve business problems. You have experience with different types of databases (e.g., SQL, NoSQL, data lake, data schemas). Professional Skills You understand the importance of stakeholder management and can easily liaise between clients and other key stakeholders throughout projects, ensuring buy-in …
Job summary The role holder is responsible for leading the design, development, and ongoing enhancement of the organisation's data infrastructure and pipelines to support advanced data collection, storage, processing, and analysis. Main duties of the job The role holder is accountable for a team of data engineers, fostering a culture of technical excellence and … continuous improvement. The role holder will work collaboratively with cross-functional teams, including analysts, software engineers, and key stakeholders, to ensure that data solutions are robust, scalable, and aligned with the strategic goals of delivering high-quality care for our patients. About us Since 2012, CHEC has been working with the NHS to increase patient choice and provide … applicants may be required to undertake an Enhanced Disclosure via the Disclosure and Barring Service (DBS). Job description Job responsibilities Lead the design and execution of scalable data storage solutions, including databases, data warehouses, and data lakes, ensuring efficient handling of large data volumes. Oversee development and optimisation of ETL pipelines …
Chadderton, Oldham, Lancashire, England, United Kingdom
Pure Resourcing Solutions
Are you ready to take ownership of a growing Data & BI function? This is a fantastic opportunity for a hands-on leader to shape and evolve a modern Microsoft data platform, deliver insightful analytics, and mentor a small team while remaining close to the technology. We’re looking for someone who thrives at the intersection of strategy … turning complex datasets into impactful Power BI dashboards that drive decision-making. What you’ll be doing Lead the design, optimisation, and ongoing evolution of a cloud-based Azure data estate. Build and maintain SQL models, queries, and stored procedures to deliver accurate, performant datasets. Develop and manage ETL/ELT pipelines using Azure Data Factory and … Deep expertise in Microsoft SQL Server, including T-SQL, indexing, and performance tuning. Proven experience with the Azure data stack (Data Factory, Data Lake Gen2, SQL MI, Power BI, Purview). Strong Power BI skills (advanced DAX, Power Query/M). Solid understanding of data modelling, ETL design, and relational …
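The role above involves building and managing ETL/ELT pipelines with Azure Data Factory. A common pattern in such pipelines is the high-watermark incremental load; the sketch below illustrates the idea in plain Python, with in-memory sqlite3 standing in for the source and target stores. The table names, column names, and dates are hypothetical.

```python
import sqlite3

# High-watermark incremental load: copy only source rows modified since
# the last run, then advance the stored watermark. This mirrors the
# pattern used in Azure Data Factory incremental copy pipelines; sqlite3
# is used here purely as a stand-in, and the schema is invented.

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, modified_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-01-02"), (3, "2024-01-03")])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders (id INTEGER, modified_at TEXT)")
tgt.execute("CREATE TABLE watermark (last_modified TEXT)")
tgt.execute("INSERT INTO watermark VALUES ('2024-01-01')")

def incremental_load(src, tgt):
    """Copy source rows newer than the watermark; advance it; return row count."""
    (wm,) = tgt.execute("SELECT last_modified FROM watermark").fetchone()
    new_rows = src.execute(
        "SELECT id, modified_at FROM orders WHERE modified_at > ?", (wm,)
    ).fetchall()
    tgt.executemany("INSERT INTO orders VALUES (?, ?)", new_rows)
    if new_rows:
        tgt.execute("UPDATE watermark SET last_modified = ?",
                    (max(r[1] for r in new_rows),))
    return len(new_rows)

print(incremental_load(src, tgt))  # prints 2: rows newer than the watermark
```

Running the function a second time copies nothing, because the watermark has advanced past the newest source row.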
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
multi-account Organization. This should include: • Networking and DNS components, Compute, Storage and Backup platforms, Logging, Messaging, and Alerting services, Organisational account management platforms, Serverless platforms, Data Lake/Lakehouse platforms, services & architecture, IAM roles and policies, Security and audit tools and practices • Experience with implementing complex Terraform IaC and multi-branch CI/CD …
You'll join the Platform & AI Enablement team within our GPTO Engineering organization, reporting to the Sr. Director Engineering API. The team is responsible for building and supporting the data and AI platforms that underpin our industry-leading Business Planning Software solutions. This is a highly hands-on role focused on designing, building, and operating scalable systems that integrate … event-driven, batch & streaming data flows, and AI capabilities into the platform. Your Impact Build and maintain core platform capabilities that support high-throughput batch, streaming, and AI-powered workloads. Develop resilient, observable, and scalable systems using Apache Kafka, Flink, Pulsar, and cloud-native tools. Collaborate with AI/ML engineers to operationalize models and enable generative AI … use cases such as prompt-based insights or automation. Deliver reliable integrations with data lakes, event stores, and analytics systems, ensuring data flows efficiently across the business planning ecosystem. Contribute full-stack code where needed, including React-based frontends and backend services in Java and Python. Write clean, maintainable, well-tested code with an emphasis on …
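The role above centres on batch and streaming data flows built with Apache Kafka and Flink. As a broker-free illustration of one core streaming concept, tumbling-window aggregation, here is a plain-Python sketch; the event shape and keys are invented for the example, and a real pipeline would consume events from Kafka and run the aggregation in a Flink job rather than over a list.

```python
from collections import defaultdict

# Tumbling-window count: events are assigned to fixed, non-overlapping
# time windows and counted per key. Events here are (timestamp_seconds,
# key) tuples; both the timestamps and the keys are illustrative.

def tumbling_window_counts(events, window_size):
    """Group events into fixed windows of window_size seconds, counting per key."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_size) * window_size
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

stream = [(0, "plan"), (3, "plan"), (7, "forecast"), (12, "plan")]
print(tumbling_window_counts(stream, 10))
# prints {0: {'plan': 2, 'forecast': 1}, 10: {'plan': 1}}
```

Production streaming engines add what this sketch omits: event-time watermarks for late data, state checkpointing, and exactly-once delivery semantics.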
Skills/requirements Deploy comprehensive cloud infrastructure for various products, including Astronomer Airflow and AccelData environments. Facilitate cross-functional integration between vendor products and other systems, such as data lakes, storage, and compute services. Establish best practices for cloud security, scalability, and performance. Manage and configure vendor product deployments, ensuring the setup and maintenance of environments. Ensure high … control. Collaborate with cloud providers (e.g., AWS) for pipeline integration and scaling requirements. Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Develop infrastructure for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies. Work with stakeholders, including … design, product, and executive teams, to address platform-related technical issues. Build analytical tools to leverage the data pipeline, providing actionable insights into key business performance metrics, such as operational efficiency and customer acquisition. All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to …
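The posting above describes developing extraction, transformation, and loading infrastructure with AWS and SQL technologies, plus analytical tools over the pipeline. The toy end-to-end sketch below shows that shape, with in-memory sqlite3 standing in for an AWS-hosted SQL store; the source records and the error-rate metric are illustrative assumptions, not details from the posting.

```python
import sqlite3

# Toy extract-transform-load flow feeding a simple analytics query.
# sqlite3 stands in for the warehouse; the records, table schema, and
# the error-rate "efficiency" metric are all invented for illustration.

raw = [  # extract: records as they might arrive from a source system
    {"region": " North ", "orders": 10, "errors": 1},
    {"region": "south",   "orders": 20, "errors": 4},
    {"region": "North",   "orders": 5,  "errors": 0},
]

def transform(rec):
    """Normalise region names so rows aggregate cleanly after loading."""
    return (rec["region"].strip().title(), rec["orders"], rec["errors"])

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE ops (region TEXT, orders INTEGER, errors INTEGER)")
db.executemany("INSERT INTO ops VALUES (?, ?, ?)", [transform(r) for r in raw])

# analyse: error rate per region as a crude operational-efficiency metric
rows = db.execute(
    "SELECT region, SUM(errors) * 1.0 / SUM(orders) FROM ops "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # North ~ 0.067 (1 error / 15 orders), South = 0.2 (4 / 20)
```

The transform step matters: without normalising " North " and "North" to one value, the GROUP BY would silently split the metric across duplicate region rows.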
Description As part of the Platform & AI Enablement team under GPTO Engineering, you'll report to the Sr. Director Engineering API. This team is accountable for shaping enterprise data architecture, enabling high-performance AI-driven workloads, and acting as a technical bridge between engineering and architecture. This is a hands-on role for a deeply experienced engineer who … thrives on solving complex problems and scaling robust platforms. Your Impact Influence the design and implementation of platform capabilities for data processing, AI enablement, and developer acceleration across batch, streaming, and real-time systems. Collaborate with the architecture function to represent engineering needs and help translate architectural direction into practical implementation patterns. Guide teams in integrating AI/… strategy discussions. Help teams balance speed and sustainability, delivering under tight deadlines without compromising quality. Your Qualifications 12+ years of software engineering experience, ideally in platform, infrastructure, or data-centric product development. Expertise in Apache Kafka, Apache Flink, and/or Apache Pulsar. Deep understanding of event-driven architectures, data lakes, and streaming pipelines. Strong experience …