Manchester, North West, United Kingdom Hybrid / WFH Options
Client Server
Dagster or AWS Step Functions) for ETL design and orchestration, work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and de-duplication), use Redshift for efficient loading strategies and write ETL pipelines that handle large volumes of data efficiently, with low latency. Location/WFH: You'll join a small but growing team
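The transformation steps this listing describes (handling missing values, standardising formats, de-duplication) could be sketched in plain Python as below; the record fields (`email`, `signup_date`) and fallback values are hypothetical examples, not taken from the role:

```python
# Illustrative sketch of ETL transformation logic: handle missing values,
# standardise formats, and de-duplicate records. Field names are hypothetical.
from datetime import datetime

def transform(records):
    seen = set()
    cleaned = []
    for rec in records:
        # Handle missing values: fall back to a sentinel rather than dropping the row
        email = (rec.get("email") or "unknown@example.com").strip().lower()
        # Standardise formats: normalise UK-style dates to ISO 8601
        raw_date = rec.get("signup_date", "")
        try:
            date = datetime.strptime(raw_date, "%d/%m/%Y").date().isoformat()
        except ValueError:
            date = None  # leave genuinely unparseable dates as missing
        # De-duplicate on the normalised email
        if email in seen:
            continue
        seen.add(email)
        cleaned.append({"email": email, "signup_date": date})
    return cleaned
```

In a real pipeline the same pattern would typically run inside a Dagster op or Step Functions task over batched or streamed records rather than an in-memory list.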
and a seasoned architect with solid knowledge of the current technology trends • Expertise in relational and non-relational databases, including Oracle, MS SQL, and modern cloud-based solutions (Snowflake, Redshift, etc.) • 7 to 10 years of experience with enterprise data architecture, database management, and data engineering • Proven experience leading a data transformation project(s) from concept to delivery • Deep
Strong experience with Spark using Scala and Python Advanced SQL knowledge, with experience in pulling complex queries, query authoring, and strong familiarity with Snowflake and various relational databases like Redshift, Postgres, etc. Experience with data modeling and system design architecting scalable data platforms and applications for large enterprise clients. A dedicated focus on building high-performance systems Exposure to
… dedicated data teams Nice to have: Previous engagement with healthcare and/or social determinants of health data products. Experience leveraging agentic-assisted coding tools (e.g. Cursor, Codex AI, Amazon Q, GitHub Copilot) Experience working with R Experience with processing health care eligibility and claims data Exposure to Matillion ETL Experience using and building solutions to support various reporting
City of London, London, United Kingdom Hybrid / WFH Options
I3 Resourcing Limited
Skillset Delivery experience Building solutions in Snowflake implementing data warehousing solutions using Snowflake and AWS Hands-on experience with AWS services such as Glue (Spark), Lambda, Step Functions, ECS, Redshift, and SageMaker. Enthusiasm for cross-functional work and adaptability beyond traditional data engineering. Examples like building APIs, integrating with microservices, or contributing to backend systems - not just data pipelines
… strong hands-on programming skills and software engineering fundamentals, with experience building scalable solutions in cloud environments (AWS preferred) Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway Solid foundation in software engineering principles, including version control (Git), testing, CI/CD, modular design, and clean code practices. Experience developing reusable components and APIs
… preferred for data processing, automation, and pipeline development AWS or Snowflake certifications are a plus
Security SQL Configuration management Team building and communication Familiarity with JIRA and Confluence Familiarity with Agile methodologies including Scrum and Kanban Desired Skills: WebFOCUS v9.2 Linux AWS/Cloud Amazon Redshift Understanding of financial, acquisition, and budget data Release management and SDLC Proficiency with Agile methodologies including Scrum and Kanban
Reston, Virginia, United States Hybrid / WFH Options
ICF
related discipline) 6-8 years' experience in Data engineering with strong background in pipeline development and data integration. 3+ years of hands-on experience with AWS data services, including: AWS Glue, Lambda, S3, Step Functions and Athena; familiarity with Redshift and Lake Formation is a plus. 6+ years of experience in SQL and programming, preferably in Python. Experience with
… BI Tools like Tableau, PowerBI or Amazon QuickSight. Experience with cloud integration tools such as Talend, Informatica Excellent oral communications, thought leadership and formal presentation skills US Citizen or Permanent Lawful Resident (Green Card Holder). Must be able to obtain and maintain a Public Trust MUST RESIDE IN THE United States (U.S.) and the work MUST BE PERFORMED
management. Track record of delivering data analytics and AI/ML-enabling solutions across complex environments. Hands-on experience with cloud data platforms, ideally AWS (S3, Kinesis, Glue, Redshift, Lambda, EMR). Experience with Azure technologies (ADF, Synapse, Fabric, Azure Functions) is also valued. Strong understanding of modern data lakehouse architectures, such as Databricks, Snowflake, or Microsoft Fabric
Strong stakeholder management and communication skills, with the ability to translate technical outputs into business-friendly insights. Experience with Snowflake (or alternative cloud data warehouses such as BigQuery or Redshift).
Reston, Virginia, United States Hybrid / WFH Options
SRC
Experience with LLM or AI/ML pipelines and secure deployment of ML models in production -Experience deploying cloud-native architectures on AWS to include S3, EMR, EKS, IAM and Redshift -Experience with SQL to include optimizing queries with window functions, CTEs, and large datasets -Familiarity with Apache Airflow -Experience implementing FedRAMP/FISMA controls -Experience in a customer facing
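The SQL skills this listing names (optimizing queries with window functions and CTEs) can be illustrated with a minimal, self-contained example run against an in-memory SQLite database (window functions require SQLite 3.25+); the `orders` table and its columns are hypothetical:

```python
# Minimal illustration of the CTE + window-function pattern: a CTE computes
# per-customer totals, then RANK() orders customers without a self-join.
# Table and column names are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES ('a', 10), ('a', 30), ('b', 20);
""")

rows = conn.execute("""
WITH totals AS (
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT customer, total,
       RANK() OVER (ORDER BY total DESC) AS rnk
FROM totals
ORDER BY rnk
""").fetchall()
print(rows)  # -> [('a', 40.0, 1), ('b', 20.0, 2)]
```

On large datasets the same shape (aggregate once in a CTE, rank with a window function) avoids a second scan or join that a naive subquery-per-row formulation would incur; the equivalent query runs unchanged on Redshift.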
Coventry, West Midlands, England, United Kingdom Hybrid / WFH Options
Lorien
and creating/executing tests Requirements Strong experience as a Data Engineer (migrating legacy systems onto AWS, building data pipelines) Strong Python experience Tech stack experience required: AWS Glue, Redshift, Lambda, PySpark, Airflow SSIS or SAS experience (Desirable) Benefits Salary up to £57,500 + up to 20% bonus Hybrid working: 1 to 2 days a week in the
wealth management. Excellent leadership, stakeholder management, and team-building skills. Some degree of exposure to: Cloud Data Platforms and AI integration (Azure, Databricks, etc); Data Storage and Warehousing (Snowflake, Redshift, Databricks); Data Processing and ETL (Apache, etc); Data Manipulation & Processing (Pandas, SQL, etc); Data Visualisation and BI (Tableau, PowerBI); ML Ops.