quality controls. Ability to design, build, and optimise scalable data models that support analytics and machine learning workloads. Solid working knowledge of AWS data services (e.g., S3, Kinesis, Glue, Redshift, Lambda, EMR) or Azure equivalents (e.g., ADF, Synapse, Fabric, Azure Functions). Familiarity with Palantir Foundry or Gotham is a significant advantage. Experience working within Data Lakehouse platforms such as Databricks, Snowflake, and/or Microsoft Fabric is an …
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
strategic business decisions. Responsibilities for the AWS Data Engineer: Design, build and maintain scalable data pipelines and architectures within the AWS ecosystem Leverage services such as AWS Glue, Lambda, Redshift, EMR and S3 to support data ingestion, transformation and storage Work closely with data analysts, architects and business stakeholders to translate requirements into robust technical solutions Implement and optimise …
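None of these listings include code, but as a rough sketch of the Glue/EMR-style ingestion and transformation work the role above describes, here is a minimal PySpark job. The bucket paths and column names are hypothetical, and a real AWS Glue job would typically wrap the same logic in the awsglue GlueContext/DynamicFrame APIs rather than plain Spark.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical ingestion job: read raw CSV events from S3, apply light
# cleansing/typing, and write partitioned Parquet back to a curated zone.
# s3:// paths resolve directly on EMR/Glue; standalone Spark would need
# the s3a:// scheme plus the Hadoop AWS connector.
spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/orders/")          # hypothetical source path
)

curated = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
    .filter(F.col("amount").isNotNull())
)

(
    curated
    .withColumn("order_date", F.to_date("order_ts"))
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")  # hypothetical target path
)
```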
anomaly detection, and GenAI-powered automation Support GenAI initiatives through data readiness, synthetic data generation, and prompt engineering. Mandatory Skills: Cloud Platforms: Deep experience with AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica …
Data Engineers to join a large on-prem to AWS migration programme, replicating data pipelines in readiness for a lift and shift to the cloud. Essential: AWS and Redshift In full: Role purpose Reporting to the Lead Data Engineer, the Data Engineer is responsible for designing and maintaining scalable data pipelines, ensuring data availability, quality, and performance to …
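For the Redshift side of a lift-and-shift like the one above, a common pattern is to stage extracts in S3 and bulk-load them with Redshift's COPY command. A minimal sketch, assuming psycopg2 and purely hypothetical connection details, table, bucket and IAM role:

```python
import psycopg2

# Hypothetical connection details for a Redshift cluster.
conn = psycopg2.connect(
    host="example-cluster.abc123.eu-west-2.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="example-password",
)

copy_sql = """
    COPY staging.orders
    FROM 's3://example-staging-bucket/orders/2024-06-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS PARQUET;
"""

# The connection context manager commits the transaction on success.
with conn, conn.cursor() as cur:
    cur.execute(copy_sql)   # bulk load staged files from S3 into the staging table
conn.close()
```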
South West London, London, United Kingdom Hybrid/Remote Options
ARC IT Recruitment Ltd
based architecture. You will need to have skills and experience in the following: Cloud Data Architecture: Strong knowledge of modern, cloud-based data architecture and tooling (e.g., S3, Glue, Redshift, Athena, Lake Formation, Iceberg/Delta). AWS Platform Build: Demonstrable experience designing and building modern data platforms in AWS. ETL/Orchestration Expertise: Expertise in ETL/ELT …
with analysts and architects to ensure compliance with government security standards. Troubleshoot and resolve issues in complex cloud environments. Essential Skills Strong experience with AWS services (Glue, Lambda, S3, Redshift, IAM). Proficiency in Python and SQL for data engineering tasks. Knowledge of data modelling, ETL frameworks, and best practices. Familiarity with security and compliance in government or regulated …
or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience, with the ability to mentor, influence, and set technical direction. …
and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments: e.g. AWS, GCP …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid/Remote Options
Fruition Group
engineers and support their growth. Implement best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and …
scale that to 500 million events per day. You'll be using Kafka for real-time data streaming and AWS to build, optimise and monitor data warehousing solutions in Amazon Redshift, owning and managing AWS-based systems to ensure cost-effective, secure and high-performance data operations. Location/WFH: You can work from home from anywhere in … data engineering skills and experience, having been through multiple end-to-end data pipeline builds You have a deep knowledge of AWS cloud services including: S3, EC2, Lambda, RDS, Redshift, Glue as well as Kinesis, MongoDB and PostgreSQL You have experience of working in environments with high throughput data (millions of events per hour) You have strong Kafka experience …
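As an illustration of the Kafka-to-warehouse flow this listing describes, here is a minimal consumer sketch using the kafka-python client; the topic, broker, batch size and the staging/COPY loader it hands off to are all hypothetical.

```python
import json
from kafka import KafkaConsumer   # kafka-python client; one of several possible clients

# Hypothetical consumer that reads click events and micro-batches them
# before handing off to a warehouse loader (e.g. S3 staging + Redshift COPY).
consumer = KafkaConsumer(
    "click-events",                               # hypothetical topic
    bootstrap_servers=["broker-1:9092"],          # hypothetical broker
    group_id="warehouse-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=False,
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 10_000:
        # stage_to_s3_and_copy(batch)  # hypothetical loader, not shown here
        consumer.commit()              # commit offsets only after a successful load
        batch.clear()
```

Committing offsets manually after the load step (rather than auto-committing) is one common way to keep the warehouse and the consumer group position consistent if a batch fails.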
Belfast, City of Belfast, County Antrim, United Kingdom Hybrid/Remote Options
Aspire Personnel Ltd
cloud technologies for ETL pipeline, data warehouse and data lake design/building and data movement. AWS data and analytics services (or open-source equivalent) such as EMR, Glue, Redshift, Kinesis, Lambda, DynamoDB. What you can expect Work to agile best practices and cross-functionally with multiple teams and stakeholders. You’ll be using your technical skills to problem …
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
AJ Bell
AJ Bell's Data Governance and Data Classification policies. Maintain data dictionary. Maintain business level data model. Recommending and introducing new technology where needed. Core: Cloud data platforms (e.g. Snowflake, BigQuery, Redshift) Data transformation technology such as DBT Visual Studio Code Python CI automation systems such as Jenkins A git-based source control system such as BitBucket Data Warehouse/Kimball …
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Lorien
with 2+ years in a technical leadership or management role Strong technical proficiency in data modelling, data warehousing, and distributed systems Hands-on experience with cloud data services (AWS Redshift, Glue, EMR or equivalent) Solid programming skills in Python and SQL Familiarity with DevOps practices (CI/CD, Infrastructure as Code - e.g., Terraform) Excellent communication skills with both technical …
Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of data modelling, governance, and architecture best practices. Excellent leadership, communication, and stakeholder management skills. NICE …
the organisation. What you'll be doing: Design, develop, and maintain scalable data architectures and ETL pipelines Build and manage data models and data warehouse solutions (Airflow, dbt, and Redshift) Write clean, efficient Python and SQL code for data processing and transformation Integrate data from internal and third-party APIs and services Optimise data pipelines for performance, scalability, and …
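A minimal sketch of the "integrate data from internal and third-party APIs" responsibility above, assuming requests and boto3; the endpoint, bucket and key layout are hypothetical placeholders.

```python
import json
from datetime import datetime, timezone

import boto3
import requests

# Hypothetical extract step: pull a page of records from a third-party API
# and land the raw JSON in S3 for downstream transformation (dbt, Spark, etc.).
def extract_to_s3(bucket: str = "example-raw-bucket") -> str:
    resp = requests.get(
        "https://api.example.com/v1/invoices",        # hypothetical endpoint
        params={"updated_since": "2024-06-01"},
        timeout=30,
    )
    resp.raise_for_status()

    key = f"invoices/raw/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.json"
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=key,
        Body=json.dumps(resp.json()).encode("utf-8"),
    )
    return key

if __name__ == "__main__":
    print(f"landed raw extract at: {extract_to_s3()}")
```

Landing the untransformed payload first keeps the raw layer replayable; transformation then happens in the warehouse or Spark layer rather than in the extract script.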
everyone feels supported to learn and grow. Skills & Experience We’re Looking For Technical Expertise Hands-on experience with modern data stacks (e.g. Dataform, dbt, Snowflake/BigQuery/Redshift). Expert SQL skills, including advanced analytics and performance optimisation. Strong experience with BI tools (Looker, Power BI, Tableau) and data storytelling. Skilled in data modelling, ELT/ETL …
leveraging data to improve the OTM user and agent experience. This role would be suited to an individual with a background in Python, SQL, AWS, Looker, PyTorch, Airflow and Redshift/Postgres. The candidate must be able to exhibit proficiency in working with large volumes of data, automating data loads, and the ability to understand end user requirements with … goal of building a solution to provide high quality data processing and analysis as well as building APIs. RESPONSIBILITIES Develop data pipelines using Python and Airflow Data warehousing (Redshift/Spectrum) Develop data solutions to address business needs Maintain existing reports and improve their performance QUALIFICATIONS Bachelor's degree in Computer Science, Data Science, Mathematics, Engineering or equivalent experience … query performance. Strong skills in Python Experience with AWS or GCP Demonstrated experience and responsibility with data, processes, and building ETL pipelines. Experience with cloud data warehouses such as Amazon Redshift, and Google BigQuery. Building visualizations using tools such as Looker Studio or equivalent Experience in designing ETL/ELT solutions, preferably using tools like Airflow or AWS …
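Several of these roles centre on Airflow pipelines feeding Redshift/Postgres. As an illustration only, here is a minimal Airflow 2.x DAG skeleton; the DAG id, schedule and task callables are hypothetical stand-ins for real extract and load logic.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical callables standing in for real extract/load steps.
def extract_events():
    print("pulling events from the source API")

def load_to_redshift():
    print("COPYing staged files into Redshift")

# Minimal daily pipeline sketch (Airflow 2.x style).
with DAG(
    dag_id="events_to_redshift",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_redshift", python_callable=load_to_redshift)

    extract >> load   # run the load only after the extract has succeeded
```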
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Lorien
experience in similar environments is essential. The Skill Requirements: We're looking for candidates with a blend of the following: Strong knowledge of AWS data services (Glue, S3, Lambda, Redshift, etc.) Solid understanding of ETL processes and data pipeline management Proficiency in Python and PySpark Experience working with SQL-based platforms Previous involvement in migrating on-premise solutions to …