Slough, South East England, United Kingdom Hybrid / WFH Options
Amarji
This is an exciting opportunity for an ambitious individual to contribute to our success while gaining valuable experience in a fast-paced and innovative environment. Responsibilities: - Implement data modelling and ETL processes to integrate data from different sources into the Fabric data lake/data warehouse. - Collaborate with the data and analytics team to design, develop, and maintain Power BI reports and …
Slough, South East England, United Kingdom Hybrid / WFH Options
Publicis Production
and product to ensure data and reporting needs are met. Implement data quality checks, data governance practices, and monitoring systems to ensure reliable and trustworthy data. Optimize performance of ETL/ELT workflows and improve infrastructure scalability.
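As a loose illustration of the "data quality checks" responsibility above, here is a minimal pandas sketch; the column names and rules are invented for illustration, not taken from the listing:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list:
    """Return a list of human-readable data quality failures."""
    failures = []
    # Completeness: key business columns must not contain nulls.
    for col in ("customer_id", "event_date"):
        if df[col].isna().any():
            failures.append(f"{col} contains nulls")
    # Uniqueness: the primary key must not be duplicated.
    if df["customer_id"].duplicated().any():
        failures.append("customer_id is not unique")
    # Validity: transaction amounts should never be negative.
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    return failures

sample = pd.DataFrame({
    "customer_id": [1, 2, 2],
    "event_date": ["2024-01-01", None, "2024-01-03"],
    "amount": [10.0, -5.0, 3.5],
})
for failure in run_quality_checks(sample):
    print("FAILED:", failure)
```

In practice, checks like these would run inside the monitoring system the listing mentions, raising alerts rather than printing.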
Slough, South East England, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
architectures that drive measurable transformation aligned to banking sector priorities. Client Advisory & Engagement – Act as a trusted advisor, leading workshops and influencing senior stakeholders on strategy, roadmaps, and innovation. ETL/ELT & Data Engineering – Provide oversight on robust data pipeline and integration designs using SSIS, ADF, Informatica, or IBM DataStage. Data Governance & Quality – Define governance, metadata, and quality frameworks using …
must be confident working across: Azure Data Services, including: Azure Data Factory Azure Synapse Analytics Azure Databricks Microsoft Fabric (desirable) Python and PySpark for data engineering, transformation, and automation ETL/ELT pipelines across diverse structured and unstructured data sources Data lakehouse and data warehouse architecture design Power BI for enterprise-grade reporting and visualisation Strong knowledge of data modelling …
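For context, a minimal PySpark sketch of the kind of ETL/ELT work this stack implies; the storage account, container, and column names are placeholders, not drawn from the listing:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("elt-example").getOrCreate()

# Ingest raw CSV from the lake (path is a placeholder).
raw = (spark.read
       .option("header", True)
       .csv("abfss://raw@yourstorageaccount.dfs.core.windows.net/sales/"))

# Light transformation: cast types, stamp the load date, de-duplicate.
clean = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("load_date", F.current_date())
         .dropDuplicates(["order_id"]))

# Land the curated output as partitioned Parquet for downstream Power BI models.
(clean.write
      .mode("overwrite")
      .partitionBy("load_date")
      .parquet("abfss://curated@yourstorageaccount.dfs.core.windows.net/sales/"))
```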
Reading, England, United Kingdom Hybrid / WFH Options
HD TECH Recruitment
initiatives Ensure high standards of documentation and data security compliance Technical Skills (desirable): Microsoft Azure Data Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric) Data warehousing and lakehouse design ETL/ELT pipelines SQL, Python for data manipulation and machine learning Big Data frameworks (e.g., Hadoop, Spark) Data visualisation (e.g., Power BI) Understanding of statistical analysis and predictive modelling Experience …
Slough, South East England, United Kingdom Hybrid / WFH Options
HD TECH Recruitment
initiatives Ensure high standards of documentation and data security compliance Technical Skills (desirable): Microsoft Azure Data Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric) Data warehousing and lakehouse design ETL/ELT pipelines SQL, Python for data manipulation and machine learning Big Data frameworks (e.g., Hadoop, Spark) Data visualisation (e.g., Power BI) Understanding of statistical analysis and predictive modelling Experience …
error handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity with data warehousing solutions (e.g., Snowflake …
Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues as of 12 months ending December 2024 totaled $13.8 billion. Responsibilities: Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse. Implement data transformations and build analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for …
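A rough sketch of the Snowflake side of such a pipeline, using the snowflake-connector-python package to materialise the kind of aggregate a dbt model would normally own; every account, schema, and table name here is invented:

```python
import snowflake.connector

# Connection details are placeholders; in production these come from a secrets store.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# In dbt this SELECT would live in a model file (e.g. fct_daily_orders.sql)
# and dbt would handle the materialisation; it is shown inline for brevity.
MODEL_SQL = """
create or replace table fct_daily_orders as
select order_date,
       count(*)    as order_count,
       sum(amount) as total_amount
from staging.stg_orders
group by order_date
"""

cur = conn.cursor()
try:
    cur.execute(MODEL_SQL)
finally:
    cur.close()
    conn.close()
```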
A proactive, self-starting attitude with a passion for continuous learning and improvement. Experience with geospatial data or tools; QGIS is essential, ArcGIS and FME are desirable. Familiarity with ETL processes and collaboration with data engineering teams is a strong plus. Experience working in agile teams and contributing to iterative delivery cycles. Advanced Excel skills Openness to travel (expenses covered) …
Slough, Berkshire, United Kingdom Hybrid / WFH Options
Halton Housing
vibrant organisation. What You'll Do: Coding DAX Measures and Dimensional Models Developing & delivering visually compelling Power BI Dashboards & Reports to specification Developing and maintaining SSRS reports Developing & maintaining ETL pipeline solutions in Azure Data Factory and SSIS, utilising Azure Data Lake & DevOps Providing second & third line support for Data Team and own allocated support tickets Monitoring the Data …
Slough, South East England, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse. Shape cloud migration and modernization strategies with a strong focus on DevOps practices. Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks. Implement data governance and quality frameworks to ensure data integrity and compliance. Collaborate with clients’ senior leadership to influence data-driven transformation initiatives.
implementing data engineering best practice (e.g., source-to-target mappings, coding standards, data quality, etc.), working closely with the external party who set up the environment. Create and maintain ETL processes, data mappings & transformations to orchestrate data integrations. Ensure data integrity, quality, privacy, and security across systems, in line with client and regulatory requirements. Optimize data solutions for performance and … Set up monitoring and data quality exception handling. Strong data modelling experience. Experience managing and developing CI/CD pipelines. Experience with Microsoft Azure products and services, and proficiency in ETL processes. Experience of working with APIs to integrate data flows between disparate cloud systems. Strong analytical and problem-solving skills, with the ability to work independently and collaboratively. The aptitude …
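As a sketch of "working with APIs to integrate data flows between disparate cloud systems", a minimal Python ingestion step; the API URL, container name, and connection string are hypothetical:

```python
import json

import requests
from azure.storage.blob import BlobServiceClient

API_URL = "https://api.example.com/v1/orders"   # hypothetical source system
CONN_STR = "<azure-storage-connection-string>"  # placeholder secret

# Pull one page of records from the upstream API.
resp = requests.get(API_URL, params={"page": 1}, timeout=30)
resp.raise_for_status()
records = resp.json()

# Land the raw payload unchanged, so downstream mappings and
# transformations can be replayed against the original source shape.
blob_service = BlobServiceClient.from_connection_string(CONN_STR)
blob = blob_service.get_blob_client(container="raw", blob="orders/page_1.json")
blob.upload_blob(json.dumps(records), overwrite=True)
```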
in designing scalable data models and architectures, with proficiency in cloud platforms such as AWS, GCP, or Azure, and familiarity with data warehousing solutions. • Experience with data integration tools, ETL processes, and data pipeline development, along with a solid understanding of data governance and security best practices. • Proven leadership skills in managing cross-functional teams, driving data strategy, and delivering …
leading enterprise data services, ideally in [government, healthcare, financial services, or large enterprise environments]. Strong background in data engineering, data platforms (AWS, Azure, GCP), and data integration/ETL pipelines. Proven track record of managing large, complex data programmes and services at scale. Strong knowledge of data governance, quality management, and regulatory compliance. Excellent leadership, stakeholder management, and influencing …
informed decisions. With the platform foundation already in place, your mission will be to scale, optimise, and operationalise it: from designing new integrations to building tools and processes that extract maximum business value. Key Responsibilities Develop and maintain a roadmap for expanding the Azure data warehouse to support advanced reporting and analytics. Enhance and maintain the existing environment, embedding data engineering best practices. Design and implement ETL workflows, transformations, and data mapping solutions. Monitor and uphold data quality, privacy, and compliance with all relevant regulations. Improve performance and scalability while researching emerging data technologies. Act as a go-to expert for available datasets and collaborate with teams to design suitable data models. Essential Skills & Experience 5+ years in data engineering …
platforms) Deep understanding of Fabric ecosystem components and best practices Experience with medallion architecture implementation in Fabric Technical Skills: PySpark: Advanced proficiency in PySpark for data processing Data Engineering: ETL/ELT pipeline development and optimization Real-time Processing: Experience with streaming data and real-time analytics Performance Tuning: Optimization of data models and query performance Data Governance: Implementation of …
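To make the medallion reference concrete, a minimal PySpark/Delta sketch of bronze → silver → gold promotion; the table paths and columns are illustrative, not Fabric-specific:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-example").getOrCreate()

# Bronze: raw events landed as-is.
bronze = spark.read.format("delta").load("Tables/bronze_events")

# Silver: de-duplicated, typed, conformed records.
silver = (bronze
          .dropDuplicates(["event_id"])
          .withColumn("event_ts", F.to_timestamp("event_ts"))
          .filter(F.col("event_id").isNotNull()))
silver.write.format("delta").mode("overwrite").save("Tables/silver_events")

# Gold: a reporting-ready daily aggregate.
gold = silver.groupBy(F.to_date("event_ts").alias("event_date")).count()
gold.write.format("delta").mode("overwrite").save("Tables/gold_daily_event_counts")
```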
on causal ML and probabilistic modelling Experience developing and implementing knowledge graphs Proficiency in scaling AI solutions from concept to production Working knowledge of backend systems, data pipelines, and ETL processes Familiarity with cloud platforms, particularly Microsoft Azure Understanding of microservices architecture and distributed systems Experience with DevOps practices for AI/ML workflows (MLOps) Strong programming skills in Python …
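For the knowledge-graph requirement, a toy networkx sketch showing the basic idea of entities linked by typed relations; the entities and relations are invented for illustration:

```python
import networkx as nx

# Nodes are entities; each edge carries a relation label.
kg = nx.DiGraph()
kg.add_edge("Aspirin", "Inflammation", relation="treats")
kg.add_edge("Aspirin", "COX-1", relation="inhibits")
kg.add_edge("COX-1", "Prostaglandin synthesis", relation="mediates")

# Multi-hop traversal: everything reachable from a given entity.
for target in nx.descendants(kg, "Aspirin"):
    print("Aspirin is linked to:", target)

# Relation labels are what downstream (e.g. causal) models reason over.
for u, v, data in kg.edges(data=True):
    print(f"{u} --{data['relation']}--> {v}")
```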
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
AWS stack and infrastructure-as-code tools to build robust data pipelines and applications that process complex datasets from multiple operational systems. Key Responsibilities: Build and maintain AWS-based ETL/ELT pipelines using S3, Glue (PySpark/Python), Lambda, Athena, Redshift, and Step Functions Develop backend applications to automate and support compliance reporting Process and validate complex data formats …
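A minimal boto3 sketch of the S3-and-Athena portion of such a pipeline; the bucket, database, and table names are placeholders, and AWS credentials are assumed to be configured in the environment:

```python
import boto3

s3 = boto3.client("s3")
athena = boto3.client("athena")

# Stage a validated file into the curated S3 prefix.
s3.upload_file("report.parquet", "compliance-data-lake", "curated/report.parquet")

# Run a compliance check over the curated data via Athena.
response = athena.start_query_execution(
    QueryString="SELECT count(*) FROM curated.reports WHERE status = 'invalid'",
    QueryExecutionContext={"Database": "curated"},
    ResultConfiguration={"OutputLocation": "s3://compliance-data-lake/athena-results/"},
)
print("Query started:", response["QueryExecutionId"])
```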
Slough, South East England, United Kingdom Hybrid / WFH Options
Hexegic
promises exciting, engaging and rewarding projects for those that are keen to develop and build a successful career. Core Responsibilities Establishing new data integrations within the data foundation Conduct ETL activities as directed by SMEs Configuring connections to other datasets within the data foundation Collaborate with SMEs to create, test and validate data models and outputs Set up monitoring and …
understanding of modern data architecture, including robust SQL skills and database performance tuning. Experienced in orchestrating complex data workflows to manage scheduling, dependencies, and data quality in ELT/ETL pipelines. Familiarity with modern data warehouses and experience with orchestration tools. Company Overview: You’ll become part of a wider data and analytics team that’s working on some genuinely …
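As an illustration of orchestration with scheduling, dependencies, and a quality gate, a minimal Apache Airflow DAG; the task bodies are stubs standing in for real extract, load, and validation logic:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")  # stub

def load():
    print("load data into the warehouse")  # stub

def check_quality():
    print("validate row counts and null rates")  # stub

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_check = PythonOperator(task_id="check_quality", python_callable=check_quality)

    # The quality gate only runs after the load has completed.
    t_extract >> t_load >> t_check
```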
bring fresh thinking to your work. Support the execution of the data advertising strategy to drive increased direct/programmatic direct revenue. Work collaboratively with stakeholders to develop the ETL process and develop initiatives that increase zero- and first-party data collection and enrichment. Skills and Experience Essential Experience in analysing data and using data visualisation tools (preferably DOMO).
own team in the future. Key Responsibilities Data Architecture & Integration (50%) Design and implement scalable data pipelines across multiple platforms (ERP, CRM, project management tools, etc.) Develop and maintain ETL/ELT processes to unify cross-functional datasets. Build foundational datasets and collaborate with senior leadership to define data architecture. Design for future scalability and cloud warehouse implementation Data Visualisation … BI. What They’re Looking For Technical Experience 7+ years’ experience across data engineering, analytics, or data science. 5+ years of strong SQL/data modelling experience. Expertise building ETL/ELT pipelines across cloud-based platforms. Advanced Power BI skills and a deep understanding of scalable architecture. Familiarity with CRM/ERP systems (e.g., Salesforce, Netsuite, or similar).
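A small pandas sketch of "unifying cross-functional datasets" on a shared business key; the systems and columns are invented for illustration:

```python
import pandas as pd

# Illustrative extracts from two operational systems.
crm = pd.DataFrame({
    "account_id": ["A1", "A2"],
    "account_name": ["Acme", "Globex"],
})
erp = pd.DataFrame({
    "account_id": ["A1", "A2", "A3"],
    "invoice_total": [1200.0, 450.0, 980.0],
})

# Join on the shared key; a left join keeps every ERP invoice even
# when the CRM record is missing, surfacing gaps for data quality work.
unified = erp.merge(crm, on="account_id", how="left")
print(unified)
```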
in guiding customers through system and data architecture, integrating with cloud data warehouses such as Snowflake and Databricks; supporting the integration of external data sources, developing data models and ETL processes; to driving value through dashboards or business applications. Integrating AI tools and LLMs to unlock valuable insights, optimise performance, and automate business processes will also be a part of the role. … Technical Problem Solving: Assist customers in resolving technical challenges, working with SQL, APIs, and Python, and provide support related to data ingestion, transformation, and visualisation Develop and optimise data models, ETL processes, workflows and dashboards within the platform Provide best practices for analytics pipelines, integration strategies, performance optimisation, data governance and AI readiness Leverage AI capabilities within the platform and …