on experience with Big Data ecosystems (Hadoop, Spark, Kafka, Hive, HBase, etc.). Strong experience with cloud platforms (AWS/Azure/GCP) and services such as: AWS: S3, Glue, EMR, Redshift, Lambda, Kinesis; Azure: Data Factory, Synapse, Databricks, ADLS; GCP: BigQuery, Dataflow, Pub/Sub. Experience with Data Warehouse/Data Lake/Lakehouse design and modelling (Kimball, OLAP, OLTP) …
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
strategic business decisions. Responsibilities for the AWS Data Engineer: Design, build and maintain scalable data pipelines and architectures within the AWS ecosystem. Leverage services such as AWS Glue, Lambda, Redshift, EMR and S3 to support data ingestion, transformation and storage. Work closely with data analysts, architects and business stakeholders to translate requirements into robust technical solutions. Implement and optimise …
anomaly detection, and GenAI-powered automation. Support GenAI initiatives through data readiness, synthetic data generation, and prompt engineering. Mandatory Skills: Cloud Platforms: Deep experience with AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark, etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica …
Data Engineers to join a large on-prem-to-AWS migration programme, replicating data pipelines in readiness for a lift and shift to the cloud. Essential: AWS and Redshift. In full: Role purpose: Reporting to the Lead Data Engineer, the Data Engineer is responsible for designing and maintaining scalable data pipelines, ensuring data availability, quality, and performance to …
AWS, or GCP) and cloud-native BI services. Strong understanding of data warehousing, OLAP, semantic models, and query optimization. Experience working with modern data platforms such as Snowflake, Databricks, Redshift, BigQuery, or Synapse. Soft Skills: Excellent communication and collaboration skills. Ability to lead technical teams and manage multiple workstreams. Strong problem-solving, analytical thinking, and decision-making skills. Ability …
South West London, London, United Kingdom Hybrid/Remote Options
ARC IT Recruitment Ltd
based architecture. You will need to have skills and experience in the following: Cloud Data Architecture: Strong knowledge of modern, cloud-based data architecture and tooling (e.g., S3, Glue, Redshift, Athena, Lake Formation, Iceberg/Delta). AWS Platform Build: Demonstrable experience designing and building modern data platforms in AWS. ETL/Orchestration Expertise: Expertise in ETL/ELT …
with analysts and architects to ensure compliance with government security standards. Troubleshoot and resolve issues in complex cloud environments. Essential Skills: Strong experience with AWS services (Glue, Lambda, S3, Redshift, IAM). Proficiency in Python and SQL for data engineering tasks. Knowledge of data modelling, ETL frameworks, and best practices. Familiarity with security and compliance in government or regulated …
or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience, with the ability to mentor, influence, and set technical direction. …
Glasgow, Central Scotland, United Kingdom Hybrid/Remote Options
Capgemini
that power both operational and analytical workloads, leveraging leading cloud technologies (AWS, GCP, Azure). We ENABLE hybrid cloud transformation – You will implement modern hybrid data architectures (e.g. Snowflake, Redshift, BigQuery) and drive the migration of legacy systems to cloud-native solutions that unlock agility and performance. We INNOVATE with AI-driven data solutions – You will create Proof of …
engineers and support their growth. Implement best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and …
expertise in SQL, NoSQL and data integration technologies. Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud. Experience with data warehousing technologies such as Snowflake, Redshift, BigQuery, Synapse, or Teradata. Skilled in ETL/ELT tools and data pipeline technologies such as Informatica, Talend, Databricks, or Apache tools. Strong understanding of data governance, MDM, metadata …
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
AJ Bell
s Data Governance and Data Classification policies. Maintain the data dictionary. Maintain the business-level data model. Recommend and introduce new technology where needed. Core: Cloud data platforms (e.g. Snowflake, BigQuery, Redshift). Data transformation technology such as dbt. Visual Studio Code. Python. CI automation systems such as Jenkins. A git-based source control system such as Bitbucket. Data Warehouse/Kimball …
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Lorien
with 2+ years in a technical leadership or management role. Strong technical proficiency in data modelling, data warehousing, and distributed systems. Hands-on experience with cloud data services (AWS Redshift, Glue, EMR or equivalent). Solid programming skills in Python and SQL. Familiarity with DevOps practices (CI/CD, Infrastructure as Code - e.g., Terraform). Excellent communication skills with both technical …
Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of data modelling, governance, and architecture best practices. Excellent leadership, communication, and stakeholder management skills. NICE …
Coventry, West Midlands, United Kingdom Hybrid/Remote Options
Coventry Building Society
office. A team-led hybrid working arrangement is in place. About you: To be successful in this role it's essential you have: Extensive experience with AWS (S3, Glue, Redshift, SageMaker). Experience of building and automating data pipelines in finance. The ability to demonstrate, automate and manage data systems so they run smoothly and can grow easily. Familiarity with …
the organisation. What you'll be doing: Design, develop, and maintain scalable data architectures and ETL pipelines. Build and manage data models and data warehouse solutions (Airflow, dbt, and Redshift). Write clean, efficient Python and SQL code for data processing and transformation. Integrate data from internal and third-party APIs and services. Optimise data pipelines for performance, scalability, and …
teams as needed. Skills required to contribute: Years of overall Data and Analytics experience, with: 2. Minimum 10+ years in an AWS data platform, including AWS S3, AWS Glue, AWS Redshift, AWS Athena, AWS SageMaker, AWS QuickSight and AWS MLOps. 3. Snowflake DWH architecture, Snowflake Data Sharing, Snowpipe, Polaris catalog and data governance (metadata/business catalogs). 4. …
may be a great fit if you have experience with any of the following... Workflow orchestration tooling (e.g. Prefect, Airflow, Dagster). Experience with cloud data warehouses (e.g. BigQuery, Snowflake, Redshift). Data transformation tools (e.g. dbt) and data quality frameworks (e.g. Great Expectations). Backend Python frameworks (e.g. Django, FastAPI, Flask) for API development. Modern data processing libraries (e.g. Polars, DuckDB …
leveraging data to improve the OTM users' and agents' experience. This role would suit an individual with a background in Python, SQL, AWS, Looker, PyTorch, Airflow and Redshift/Postgres. The candidate must be able to exhibit proficiency in working with large volumes of data, automating data loads, and the ability to understand end-user requirements with … goal of building a solution to provide high-quality data processing and analysis, as well as building APIs. RESPONSIBILITIES: Develop data pipelines using Python and Airflow. Data warehousing (Redshift/Spectrum). Develop data solutions to address business needs. Maintain existing reports and improve their performance. QUALIFICATIONS: Bachelor's degree in Computer Science, Data Science, Mathematics, Engineering or equivalent experience … query performance. Strong skills in Python. Experience with AWS or GCP. Demonstrated experience and responsibility with data, processes, and building ETL pipelines. Experience with cloud data warehouses such as Amazon Redshift and Google BigQuery. Building visualizations using tools such as Looker Studio or equivalent. Experience in designing ETL/ELT solutions, preferably using tools like Airflow or AWS …
experience in similar environments is essential. The Skill Requirements: We're looking for candidates with a blend of the following: Strong knowledge of AWS data services (Glue, S3, Lambda, Redshift, etc.). Solid understanding of ETL processes and data pipeline management. Proficiency in Python and PySpark. Experience working with SQL-based platforms. Previous involvement in migrating on-premise solutions to …
SC cleared, or eligible to obtain SC-level security clearance. Job description: We are looking for a developer with expertise in Python, AWS Glue, Step Functions, EMR clusters and Redshift to join our AWS Development Team. You will serve as the functional and domain expert in the project team to ensure client expectations are met. The role involves …
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
what to do, run some whiteboard design sessions (realising you've used a permanent marker, you quickly leave the room). Qualifications: Data Warehousing, Data Modelling, Database Design, ETL. AWS Redshift, AWS Glue, S3, SQL Server, Power BI. Cloud/DevOps: AWS, Docker, Terraform. Bitbucket, Bamboo, CI/CD deploy pipeline. Agile, pair programming, code reviews. Additional Information: We're a …