services - particularly Glue, Athena, S3, Lambda, and Step Functions. Solid understanding of data modelling principles for analytics - including partitioning, denormalisation, and file formats (e.g., Parquet, ORC). Experience building and maintaining production-grade ETL pipelines with an emphasis on performance, quality, and maintainability. AWS certification desirable - Data Engineer or …
Preston, Lancashire, North West England, United Kingdom
Chemist4U
services - particularly Glue, Athena, S3, Lambda, and Step Functions. Solid understanding of data modelling principles for analytics - including partitioning, denormalisation, and file formats (e.g., Parquet, ORC). Experience building and maintaining production-grade ETL pipelines with an emphasis on performance, quality, and maintainability. AWS certification desirable - Data Engineer or …
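As a hedged illustration of the partitioning and Parquet skills this listing names, here is a minimal sketch, assuming pandas and pyarrow are installed; the column names, values, and output path are invented placeholders rather than anything specified by the role.

```python
# A minimal sketch, assuming pandas and pyarrow are installed; the columns,
# values, and output path below are invented placeholders.
import pandas as pd

# Small example dataset with columns suitable for partitioning.
orders = pd.DataFrame(
    {
        "order_id": [1, 2, 3, 4],
        "amount": [19.99, 5.50, 42.00, 7.25],
        "year": [2024, 2024, 2025, 2025],
        "month": [11, 12, 1, 1],
    }
)

# Hive-style partition folders (year=.../month=...) let engines such as
# Athena or Glue prune partitions instead of scanning every file.
orders.to_parquet(
    "orders_parquet",  # in practice this would be an s3:// URI with s3fs configured
    engine="pyarrow",
    partition_cols=["year", "month"],
    index=False,
)
```

Denormalised, partitioned Parquet laid out this way is usually cheaper to query than row-oriented formats, since Athena only reads the columns and partitions a query actually touches.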
solutions • Contribute to infrastructure automation using CI/CD practices and infrastructure as code • Work with various data formats including JSON, XML, CSV, and Parquet • Create and maintain metadata, data dictionaries, and schema documentation Required Experience: • Strong experience with data engineering in a cloud-first environment • Hands-on expertise …
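To ground the data-format requirement above, a minimal sketch follows, assuming pandas 1.3+ with lxml and pyarrow available; the tiny CSV, JSON, and XML payloads are invented purely for illustration.

```python
# A minimal sketch, assuming pandas 1.3+ with lxml and pyarrow available;
# the sample payloads are invented for illustration.
import io

import pandas as pd

csv_frame = pd.read_csv(io.StringIO("id,name\n1,alpha\n2,beta"))
json_frame = pd.read_json(io.StringIO('[{"id": 1, "name": "alpha"}]'))
xml_frame = pd.read_xml(io.StringIO(
    "<rows><row><id>1</id><name>alpha</name></row></rows>"
))
print(json_frame.shape, xml_frame.shape)

# Once normalised into DataFrames, any of them can be persisted as Parquet
# for efficient columnar querying downstream.
csv_frame.to_parquet("sample.parquet", index=False)
print(pd.read_parquet("sample.parquet"))
```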
similar tools Leading on solution deployment using infrastructure-as-code and CI/CD practices Transforming diverse data formats including JSON, XML, CSV, and Parquet Creating and maintaining clear technical documentation, metadata, and data dictionaries Your previous experience as Principal Data Engineer will include: Strong background across AWS data …
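For the infrastructure-as-code and CI/CD deployment duties mentioned above, a minimal sketch using the AWS CDK (v2) in Python follows, assuming aws-cdk-lib is installed; the stack, bucket, and function names are hypothetical placeholders, not the employer's actual resources.

```python
# A minimal sketch, assuming aws-cdk-lib (CDK v2) is installed; the stack,
# bucket, and function names are hypothetical placeholders.
import aws_cdk as cdk
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_s3 as s3


class IngestStack(cdk.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)

        # Landing bucket for raw files (JSON, XML, CSV, Parquet).
        raw_bucket = s3.Bucket(self, "RawLandingBucket")

        # Placeholder function a pipeline would redeploy on every merge.
        _lambda.Function(
            self,
            "TransformFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_inline(
                "def handler(event, context):\n    return {'ok': True}\n"
            ),
            environment={"RAW_BUCKET": raw_bucket.bucket_name},
        )


app = cdk.App()
IngestStack(app, "IngestStack")
app.synth()
```

A CI/CD pipeline would typically run cdk synth and cdk deploy on each merge, so the stack definition held in version control stays the single source of truth.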
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
Anson McCade
cleanse data using a range of tools and techniques. Manage and process structured and semi-structured data formats such as JSON, XML, CSV, and Parquet. Operate effectively in Linux and cloud-based environments. Support CI/CD processes and adopt infrastructure-as-code principles. Contribute to a …
Bradford, Yorkshire and the Humber, United Kingdom
Anson McCade
cleanse data using a range of tools and techniques. Manage and process structured and semi-structured data formats such as JSON, XML, CSV, and Parquet. Operate effectively in Linux and cloud-based environments. Support CI/CD processes and adopt infrastructure-as-code principles. Contribute to a …
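As one way to picture the cleansing work these listings describe, the sketch below applies a few invented rules (whitespace trimming, date coercion, de-duplication) with pandas; the field names and rules are hypothetical examples, not the employer's pipeline.

```python
# A minimal sketch, assuming pandas; the record fields and cleansing rules
# are invented examples only.
import pandas as pd

raw = pd.DataFrame(
    [
        {"customer_id": "001", "email": "a@example.com ", "signup": "2024-01-05"},
        {"customer_id": "001", "email": "a@example.com",  "signup": "2024-01-05"},
        {"customer_id": "002", "email": None,             "signup": "not-a-date"},
    ]
)

cleaned = (
    raw.assign(
        # Trim stray whitespace; missing values stay missing.
        email=lambda d: d["email"].str.strip(),
        # Unparseable dates become NaT instead of raising.
        signup=lambda d: pd.to_datetime(d["signup"], errors="coerce"),
    )
    # Drop exact repeats, then require an email address.
    .drop_duplicates(subset=["customer_id", "email"])
    .dropna(subset=["email"])
)

print(cleaned)
```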
large-scale datasets. Implement and manage Lake Formation and AWS Security Lake, ensuring data governance, access control, and security compliance. Optimise file formats (e.g., Parquet, ORC, Avro) for S3 storage, ensuring efficient querying and cost-effectiveness. Automate infrastructure deployment using Infrastructure as Code (IaC) tools such as Terraform or …
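To illustrate the file-format optimisation this role calls for, a minimal sketch follows that converts a small CSV into compressed Parquet with pyarrow; the file names and the zstd compression choice are assumptions for illustration only.

```python
# A minimal sketch, assuming pyarrow is installed; the file names and the
# zstd compression choice are illustrative assumptions.
import pyarrow.csv as pv
import pyarrow.parquet as pq

# Write a tiny CSV locally so the example is self-contained.
with open("events.csv", "w") as fh:
    fh.write("event_id,event_type,bytes\n1,click,120\n2,view,80\n3,click,95\n")

table = pv.read_csv("events.csv")

# Columnar storage plus compression typically shrinks Athena scan costs;
# zstd and snappy are the usual size-versus-CPU trade-offs.
pq.write_table(table, "events.parquet", compression="zstd")

print(pq.read_metadata("events.parquet"))
```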
Leatherhead, South East England, United Kingdom (Hybrid/WFH options)
JCW
Logic Apps, ADF, Service Bus, Functions) Comfortable working with Git, Azure DevOps, and unit testing practices Knowledge of common data formats: CSV, JSON, XML, Parquet Ability to lead integration designs with minimal rework required Preferred Qualifications: Certification in SSIS or relevant Microsoft technologies Proven track record of delivering robust …
Guildford, South East England, United Kingdom (Hybrid/WFH options)
JCW
Logic Apps, ADF, Service Bus, Functions) Comfortable working with Git, Azure DevOps, and unit testing practices Knowledge of common data formats: CSV, JSON, XML, Parquet Ability to lead integration designs with minimal rework required Preferred Qualifications: Certification in SSIS or relevant Microsoft technologies Proven track record of delivering robust …
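For the unit-testing practice these Azure-focused listings mention, a minimal pytest sketch follows; the normalise_order transform and its rules are invented solely to illustrate the practice, not taken from the role.

```python
# A minimal sketch, assuming pytest is installed; the transform and its
# rules are invented purely to illustrate unit testing of a data mapping.
import pytest


def normalise_order(record: dict) -> dict:
    """Uppercase the currency code and coerce the amount to float."""
    return {
        "order_id": record["order_id"],
        "currency": record["currency"].upper(),
        "amount": float(record["amount"]),
    }


def test_normalise_order_uppercases_currency():
    result = normalise_order({"order_id": 7, "currency": "gbp", "amount": "12.50"})
    assert result["currency"] == "GBP"
    assert result["amount"] == pytest.approx(12.5)


def test_normalise_order_rejects_missing_amount():
    with pytest.raises(KeyError):
        normalise_order({"order_id": 8, "currency": "gbp"})
```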