Slough, South East England, United Kingdom Hybrid / WFH Options
Uneek Global
to save time and improve efficiency

Requirements:
• 2+ years’ experience working with Power BI (Desktop & Service)
• Strong skills in DAX, Power Query, and SQL
• Experience in data modelling and ETL processes
• Able to translate technical detail into clear insights
• Detail-focused and able to manage multiple priorities
• Finance/business background is a plus
• Azure, Power Automate, or Python knowledge
Kafka/Pulsar/Kinesis
Strong SQL & relational DBs (Postgres/MySQL)
Coding experience in Java/Scala
Cloud (AWS preferred) + Infrastructure-as-Code familiarity
Solid understanding of ETL/ELT and data modeling
Nice to have: Iceberg/Delta/Hudi, Airflow/Dagster/Prefect, Python/TypeScript, data governance.
Slough, South East England, United Kingdom Hybrid / WFH Options
Formula Recruitment
for growth and the chance to collaborate across the wider technology teams.

Key Skills:
Experience with Azure cloud data lakes and services (Data Factory, Synapse, Databricks).
Skilled in ETL/ELT pipeline development and big data tools (Spark, Hadoop, Kafka).
Strong Python/PySpark programming and advanced SQL with query optimisation.
Experience with relational, NoSQL, and graph databases.
Milton Keynes, South East England, United Kingdom Hybrid / WFH Options
Atrium (EMEA)
the Amazon Web Services (AWS) cloud platform. Skilled in scalable, reliable, and efficient data solutions, often using AWS services like S3, Redshift, EMR, Glue, and Kinesis. This involves designing ETL processes, ensuring data security, and collaborating with other teams on data analysis and business requirements.

Role Overview:
Job Title: Data AWS Engineer
Location: Northampton (hybrid: 2-3 days in office …
Creating and implementing data pipelines to move data between different systems and applications on AWS.
Data Warehouse Management: Designing, building, and maintaining data warehouses using AWS services like Redshift.
ETL Process Development: Developing and maintaining Extract, Transform, Load (ETL) processes to move and transform data.
Data Governance and Security: Implementing data governance and security policies for data storage and processing. …
Programming Languages: Proficiency in programming languages like Python or Java, used for designing and building data pipelines.
SQL: Knowledge of SQL for querying and manipulating data in relational databases.
ETL Processes: Experience with ETL tools and techniques.
Cloud Computing: Familiarity with cloud computing concepts and principles.
Data Architecture: Understanding of data architecture principles and best practices.
Data Modeling: Understanding of
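The Extract, Transform, Load pattern this role centres on can be sketched in a few lines. This is an illustrative stand-in only: in the listing's stack the extract would read from S3 and the load would target Redshift, but here `csv` and `sqlite3` substitute so the sketch is self-contained, and the schema and sample data are made up.

```python
# Minimal sketch of the Extract-Transform-Load pattern described above.
# S3/Redshift are replaced by csv text and an in-memory sqlite3 database;
# the orders schema and sample rows are illustrative assumptions.
import csv
import io
import sqlite3

RAW_CSV = """order_id,amount,currency
1,19.99,GBP
2,5.00,usd
3,,GBP
"""

def extract(raw: str) -> list:
    """Parse raw CSV text into row dicts (stand-in for an S3 read)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Drop rows with missing amounts and normalise the currency code."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # basic data-quality rule: skip incomplete rows
        out.append((int(r["order_id"]), float(r["amount"]), r["currency"].upper()))
    return out

def load(rows: list, conn: sqlite3.Connection) -> int:
    """Write cleaned rows to the warehouse table (stand-in for Redshift)."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 2: one row fails the quality rule
```

The three-stage split keeps each step independently testable, which is the property that carries over when the stand-ins are swapped for Glue jobs and Redshift tables.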
Slough, South East England, United Kingdom Hybrid / WFH Options
Immersum
re Looking For
Strong experience in data engineering and architecture roles.
Deep knowledge of SQL, Snowflake (or similar DWHs), and Python.
Proven track record of building robust, automated ETL/ELT pipelines.
Familiarity with distributed systems and handling large-scale data (millions of rows/sec).
Experience with data hygiene best practices: data models, versioning, reproducibility.
Hands
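The "versioning, reproducibility" hygiene point above is often implemented by fingerprinting a dataset so a pipeline run can record exactly which data it saw. A loose sketch, with illustrative rows and a truncated hash purely for readability:

```python
# Hedged sketch of dataset versioning for reproducibility: serialise rows
# canonically, then hash. Equal data yields an equal fingerprint regardless
# of how the object was constructed. Row content is illustrative.
import hashlib
import json

def dataset_fingerprint(rows):
    """Deterministic short fingerprint of a list of row dicts."""
    canonical = json.dumps(rows, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

rows = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
same = dataset_fingerprint(rows) == dataset_fingerprint(list(rows))
changed = dataset_fingerprint(rows) != dataset_fingerprint(rows + [{"id": 3, "v": "c"}])
print(same, changed)  # True True
```

Logging the fingerprint next to each pipeline run is what makes a result reproducible: rerunning against data with a different fingerprint flags that the inputs changed.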
Certified Google Cloud (GCP) Data Engineer with strong expertise in data acquisition, modelling, ETL, and data warehousing.
• Proficient in Terraform, YAML, and GitLab for environment management.
• Skilled in MS SQL, GCP SQL, and Oracle DB design, with a focus on data quality and governance.
• Strong understanding of contact centre metrics and systems (e.g. WFM, IVR, Call Routing).
• Proven ability
months ending December 2024 totaled $13.8 billion.
Experience: Minimum 10+ Years
Strong knowledge in Hadoop, Kafka, SQL/NoSQL
Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems
Should be able to work independently with minimal help/guidance
Good understanding of Airflow, Data Fusion and Data Flow
Strong background and experience in data ingestions
database architecture
Practical experience with data warehousing platforms (Snowflake, PostgreSQL, MySQL)
Solid Python programming capabilities for data engineering workflows
Proven track record designing and maintaining production-grade ELT/ETL solutions
Services offered by Computappoint Limited are those of an Employment Business and/or Employment Agency in relation to this vacancy.
requirements into technical solutions
Take ownership of cloud migration and data platform modernization strategies
Undertake data modelling tasks with a focus on long-term maintainability and scalability
Design robust ETL/ELT pipelines using cloud-based tooling
Implement governance frameworks and data quality controls
To be successful in this role, you will have: Strong background in Data Warehousing/Data
expertise in Azure Databricks to join a global company on a 12-month initial contract.

Key Responsibilities:
Design, build, and manage large-scale data platforms with a focus on ETL, Data Management, and Data Governance
Develop and maintain Databricks Delta Live Tables (DLT) pipelines with a strong emphasis on cost and performance optimisation
Implement and manage structured streaming and real-time
throughout all project phases

Required Skills:
Strong application development experience in Scala/Python
Strong database SQL experience, preferably Redshift
Experience in Snowflake is an added advantage
Experience with ETL/ELT processes and frameworks is a must

What is in it for you: Be part of the fastest-growing AI-first digital transformation and engineering company in the world
technologies
· Experience in monetary policy analysis
· Experience in time series database analysis
· Experience in large Business Transformation Programs involving process reengineering
· Extensive experience in UCD-led approaches, UI development, ETL processes, data flows analysis, cloud architecture, and data modernisation/transformation programs
· Excellent analytical and problem-solving skills, with the ability to think critically and propose innovative ways forward and
depth experience with all things data, including the ability to work with a variety of datasets from multiple sources, familiarity with standard data processing tools/concepts (e.g. SQL, ETL), and experience driving robust QA processes
Familiarity with DBT is highly valued
In-depth experience of the advertising ecosystem (e.g. ad trafficking, ad servers, DSPs, Media Strategy and Activation, etc.
Data Factory, Databricks, and related tools. Expand the existing data lake and warehouse to include new domains such as supply chain, marketing, finance, and customer data. Develop and optimise ETL/ELT processes to integrate data from diverse global sources (POS systems, e-commerce platforms, CRM, ERP, etc.). Implement data quality frameworks, monitoring, and alerting to ensure high data …
Azure data ecosystem, including:
Azure Data Lake/Data Lakehouse
Azure Data Factory/Synapse Pipelines
Databricks
Azure SQL Database or Synapse Analytics
Solid understanding of data warehousing principles, ETL/ELT design, and data modelling
Experience with CI/CD
Familiarity with data quality frameworks and data governance
Strong SQL and Python skills
Experience working in Agile delivery environments.
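A data quality framework of the kind described above boils down to named checks that emit failure messages, with a non-empty failure list triggering the monitoring/alerting path. A hedged sketch; the field names, thresholds, and sample rows are illustrative, not taken from the role:

```python
# Sketch of a data-quality gate: each check returns failure messages,
# and any failure would fire an alert in a real pipeline (e.g. a webhook
# or Azure Monitor signal). Fields and rules are illustrative assumptions.
def check_completeness(rows, required=("sku", "price")):
    """Flag rows where a required field is missing or empty."""
    failures = []
    for i, row in enumerate(rows):
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            failures.append(f"row {i}: missing {missing}")
    return failures

def check_ranges(rows):
    """Flag rows violating a simple business rule: prices must be >= 0."""
    return [f"row {i}: negative price" for i, row in enumerate(rows)
            if isinstance(row.get("price"), (int, float)) and row["price"] < 0]

def run_quality_gate(rows):
    """Run all checks; a non-empty failure list is the alerting trigger."""
    failures = check_completeness(rows) + check_ranges(rows)
    return {"passed": not failures, "failures": failures}

sample = [{"sku": "A1", "price": 9.5}, {"sku": "", "price": -2.0}]
report = run_quality_gate(sample)
print(report["passed"], len(report["failures"]))  # False 2
```

Keeping each rule as its own function is what lets the framework grow per-domain rule sets (supply chain, finance, etc.) without entangling the alerting logic.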
modern cloud data environment.

Key Responsibilities:
Build and maintain Power BI reports and dashboards that deliver accurate and timely insights for operational and business stakeholders.
Develop, test, and maintain ETL pipelines in Azure Data Factory (ADF) to ensure reliable data integration and transformation.
Work with Azure Synapse Analytics to model, query, and optimise data for analysis and reporting.
Support the …

Technical Skills:
Power BI: Strong experience in developing and publishing reports, dashboards, and data models using DAX and Power Query.
Azure Data Factory: Hands-on experience building and maintaining ETL pipelines.
Azure Synapse Analytics: Working knowledge of data modelling, views, and performance tuning.
SQL: Proficient in writing, optimising, and troubleshooting queries.
Azure Data Lake: Understanding of data storage and structure
s future data strategy.

What you’ll bring:
Strong background in data engineering – Python, SQL, and cloud platforms (GCP or Azure).
Experience designing or managing data pipelines and ETL processes.
Exposure to orchestration tools such as dbt, Airflow, or Azure Data Factory.
Good understanding of data architecture and automation.
AdTech or MarTech experience would be a
looking for a skilled Data Engineer to play a crucial role in building and optimizing the firm's data infrastructure.

Responsibilities:
Design, develop, and maintain robust data pipelines and ETL processes.
Collaborate with data scientists and analysts to support data-driven strategies.
Ensure data accuracy, consistency, and security across various data sources.
Drive innovation by exploring new technologies and best
impactful, data-driven solutions, while documenting processes for team-wide support.

What skills you’ll need:
• Certified Google Cloud (GCP) Data Engineer with strong expertise in data acquisition, modelling, ETL, and data warehousing.
• Proficient in Terraform, YAML, and GitLab for environment management.
• Skilled in MS SQL, GCP SQL, and Oracle DB design, with a focus on data quality and governance.
Slough, South East England, United Kingdom Hybrid / WFH Options
Hexegic
promises exciting, engaging and rewarding projects for those who are keen to develop and build a successful career.

Core Responsibilities:
Establish new data integrations within the data foundation
Conduct ETL activities as directed by SMEs
Configure connections to other datasets within the data foundation
Collaborate with SMEs to create, test and validate data models and outputs
Set up monitoring and
an experienced engineer, a confident communicator, business-facing, and with several years' experience delivering cloud engineering solutions (they’re open to any platform) using Python, Airflow, SQL, data modelling, ETL and real-time analytics. Any experience within the life sciences space (through university and/or industry) would also be a huge plus. Please note: if you hit the requirements
Oxford District, South East England, United Kingdom
La Fosse
an experienced engineer, a confident communicator, business-facing, and with several years' experience delivering cloud engineering solutions (they’re open to any platform) using Python, Airflow, SQL, data modelling, ETL and real-time analytics. Any experience within the life sciences space (through university and/or industry) would also be a huge plus. Please note: if you hit the requirements
Strong SQL skills for data extraction, transformation, and pipeline development.
Proficiency with data visualization tools (Tableau, Qlik, or similar).
Experience with big data platforms (Snowflake, Databricks, Spark) and ETL processes.
Working knowledge of Python or R for analytics or automation (preferred).
Understanding of statistical methods and A/B testing.
Excellent storytelling and communication skills to
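The "statistical methods and A/B testing" requirement above typically means being able to run something like a two-proportion z-test on conversion data. A hedged sketch using only the standard library; the conversion counts are invented for illustration:

```python
# Illustrative two-proportion z-test for an A/B conversion comparison.
# Counts are made up; in practice they come from the experiment's data.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def normal_sf(z):
    """One-sided p-value via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# Variant B converts 13% vs A's 10% on 2,000 users each.
z = two_proportion_z(conv_a=200, n_a=2000, conv_b=260, n_b=2000)
p = normal_sf(z)
print(round(z, 2), p < 0.05)  # 2.97 True
```

The same arithmetic is what hides behind one-click significance calculators; knowing it lets an analyst sanity-check tool output and explain why a small lift on a small sample is not yet a result.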
to power our trading, operational, and marketing intelligence.

About the Role
We are seeking a Senior Data Engineer to lead the technical build-out of this initiative — from designing ETL pipelines to creating a secure, scalable data warehouse that underpins business-critical reporting and analytics. This is a high-impact role at the intersection of technology and financial insight, ideal …
trading-driven organisations, willing to grow into a Head of Data.

Responsibilities
Design, develop, and deploy a robust data infrastructure leveraging cloud-based services (AWS).
Build and maintain ETL pipelines feeding data from multiple internal systems (trading, CRM, RUM, finance) into a central Data Lakehouse.
Implement best practices for data ingestion, validation, transformation, and storage using modern cloud tools …

Key Objectives for the first six months
Define the core data architecture and choice of toolset.
Define data models.
Establish a secure and automated data development cycle.
Build foundational ETL pipelines.

Project Ownership
Drive the full lifecycle of the data centre project, establishing foundations for downstream reporting and BI functions. Support future integration of a dedicated Data & Reporting Analyst role
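Feeding several internal systems into one lakehouse, as described above, usually relies on watermark-based incremental ingestion: each run pulls only records newer than the last stored watermark. A sketch with in-memory stand-ins; the source rows and timestamps are illustrative:

```python
# Hedged sketch of watermark-based incremental ingestion. In a real
# pipeline the source is a trading/CRM system and the watermark lives
# in a state store; here both are in-memory illustrative stand-ins.
from datetime import datetime, timezone

SOURCE = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 2, 1, tzinfo=timezone.utc)},
    {"id": 3, "updated_at": datetime(2024, 3, 1, tzinfo=timezone.utc)},
]

def incremental_load(source, watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in source if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

wm = datetime(2024, 1, 15, tzinfo=timezone.utc)
batch, wm = incremental_load(SOURCE, wm)
print(len(batch))  # 2: only rows updated after the stored watermark
```

The key design choice is advancing the watermark only from rows actually ingested, so a failed or empty run never skips data on the next attempt.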
to define the data vision and enable meaningful business transformation.

Key Responsibilities:
Lead the design and implementation of a modern cloud data platform (Azure, AWS, or GCP).
Develop ETL/ELT pipelines to manage structured and unstructured data at scale.
Enable self-service BI and deliver insights through Power BI dashboards and advanced analytics.
Integrate AI and automation into …
teams in complex organisations.
Expertise in cloud data platforms and data processing services.
Strong skills in Python, SQL, and Power BI (DAX, Power Query, data modelling).
Knowledge of ETL/ELT pipelines, data warehousing, and data mesh architectures.
Familiarity with AI/ML applications, metadata management, and data lineage tracking.
Excellent communication and stakeholder management skills.
Degree in Computer
them to SDR requirements.
Advanced knowledge of data quality frameworks, master data management, metadata, and reference data hierarchies.
Proficiency in data validation, cleansing technologies, and lineage tracking.
Familiarity with ETL/integration pipelines and validation of outputs for regulatory purposes.
Excellent written and verbal communication skills, able to interpret regulatory guidance and influence business stakeholders.
Experience of leading forums, stewardship