ELT processes, automation

Technical Requirements:
- Strong proficiency in SQL and Python programming
- Extensive experience with data modeling and data warehouse concepts
- Advanced knowledge of AWS data services, including S3, Redshift, AWS Glue, and AWS Lambda
- Experience with Infrastructure as Code using AWS CDK
- Proficiency in ETL/ELT processes and best practices
- Experience with data visualization tools (Amazon QuickSight)
Employment Type: Contract
Rate: £350 - £400/day plus PTO, pension and national insurance
Python or similar scripting language for test automation
- Experience with cloud platforms (AWS, GCP, or Azure), especially in data-related services
- Familiarity with data warehousing concepts (e.g., Snowflake, BigQuery, Redshift)
- Strong understanding of data governance, data profiling, and quality metrics
- Excellent problem-solving and communication skills
- Ability to work independently and as part of a distributed team
Nice to …
scalable test automation frameworks, with a focus on backend, API, and data systems using tools like Pytest and Postman.
- Expertise in Pandas, SQL, and AWS analytics services (Glue, Athena, Redshift) for data profiling, transformation, and validation within data lakes.
- Solid experience with AWS (S3, Lambda, EMR, ECS/EKS, CloudFormation/Terraform) and understanding of cloud-native architectures and …
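Several of these roles centre on data profiling and quality metrics within data lakes. As a rough illustration of what a profiling pass computes, here is a minimal plain-Python sketch; the function and field names are invented for the example and are not taken from Pandas or any toolkit named above:

```python
from collections import Counter

def profile_column(rows, column):
    """Summarise basic data-quality metrics for one column of a record batch."""
    values = [row.get(column) for row in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "null_rate": (len(values) - len(non_null)) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "top_value": Counter(non_null).most_common(1)[0][0] if non_null else None,
    }

# A small batch, as might be staged from a data lake
rows = [
    {"user_id": 1, "country": "GB"},
    {"user_id": 2, "country": "GB"},
    {"user_id": 3, "country": None},
]
report = profile_column(rows, "country")
```

A real validation suite would run checks like these per partition (typically with Pandas for speed) and fail the load when a metric breaches an agreed threshold, e.g. a null rate above some limit.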
Engineers and Associates, and lead technical discussions and design sessions.

Key requirements (must-have):
- Strong experience with AWS services: Glue, Lambda, S3, Athena, Step Functions, EventBridge, EMR, EKS, RDS, Redshift, DynamoDB
- Strong Python development skills
- Proficient with Docker, containerization, and virtualization
- Hands-on experience with CI/CD, especially GitLab CI
- Solid experience with Infrastructure as Code (Terraform, CloudFormation) …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
cloud platforms (GCP, AWS, Azure) and their data-specific services
- Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT)
- Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.)
- Strong understanding of data modeling, data governance, and data quality principles
- Excellent communication skills with the ability to translate complex technical concepts for business stakeholders
- Strategic thinking with …
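Orchestration tools such as Airflow and DBT both model a pipeline as a directed acyclic graph of dependent tasks. The core idea can be sketched with the Python standard library alone; the task names are invented, and this is not the Airflow API:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on:
# "transform" may only run after both extracts have finished.
pipeline = {
    "extract_orders": set(),
    "extract_users": set(),
    "transform": {"extract_orders", "extract_users"},
    "load_warehouse": {"transform"},
}

# static_order() yields an execution order that respects every dependency
run_order = list(TopologicalSorter(pipeline).static_order())
```

Airflow adds scheduling, retries, backfills, and distributed execution on top of this ordering, but the dependency model underneath is the same.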
Sub; fluent Python and SQL skills with real-life project experience; experience with orchestration tools such as Airflow and DBT; experience with one of the major analytical DWHs is a plus: BigQuery, Redshift, Snowflake, Databricks, Synapse. Work experience with the following technologies is noteworthy and may be seen as a bonus: AWS (and data-related proprietary technologies), Azure (and data-related proprietary …
Would you like to work on one of the world's largest transactional distributed systems? How about working with customers and peers from the entire range of Amazon's business on cool new features? Whether you're passionate about building highly scalable and reliable systems or a software developer who likes to solve business problems, Selling Partner Services (SPS) …

Responsibilities:
1. Design/implement automation and manage our massive data infrastructure to scale for the analytics needs of case management.
2. Build solutions to achieve BAA (Best At Amazon) standards for system efficiency, IMR efficiency, data availability, consistency & compliance.
3. Enable efficient data exploration and experimentation on large datasets on our data platform and implement data access control mechanisms …
… and WW) use cases on our data platform.
5. Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, Amazon, and AWS big data technologies.
6. Must possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment.
7. Drive …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
and implementing data warehousing solutions using Snowflake and AWS. The ideal candidate must have:
- Strong experience as an AWS Data Engineer
- Lead the design and implementation of Snowflake [and Redshift] based data warehousing solutions within an AWS environment
- Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway
- Develop and optimize ETL processes using …
build a comprehensive data platform? At AEE, you will:
- Design, implement, and support analytical data infrastructure for both business intelligence and machine learning applications
- Manage AWS resources including EC2, Redshift, EMR-Spark, etc.
- Collaborate with applied scientists to build data pipelines for training machine learning models
- Work with Product Managers, Financial, and Business analysts to improve reporting practices: data … for customers

Requirements include:
- 3+ years of data engineering experience
- 4+ years of SQL experience
- Experience with data modeling, warehousing, and ETL pipelines
- Proficiency with AWS technologies such as Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles
- Experience with non-relational databases and data stores (object storage, document, key-value, graph, column-family)

Our inclusive culture empowers …
excellence — this could be the one.

What you’ll do as a Principal Data Engineer:
- Lead the architecture and build of modern data platforms using AWS-native tools (S3, Redshift, Lambda, Glue, DynamoDB, Matillion, etc.)
- Drive best practices in CI/CD, Infrastructure-as-Code, and DataOps methodologies
- Liaise directly with clients to define requirements, shape technical … production-ready
- Be an advocate of clean code, DevOps, and hands-on engineering leadership

Skills you’ll bring as a Principal Data Engineer:
- Extensive experience with AWS data services (Redshift, Glue, S3, etc.)
- Hands-on with ETL/ELT using Matillion or similar tools
- Strong Python and SQL skills, with a good understanding of data formats
- Demonstrated experience …
GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience and ability to design and build solutions … experience in a similar role. Ability to lead and mentor the architects.

Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF

Desirable Skills: Designing Databricks-based solutions for Azure/AWS, Jenkins, Terraform …
Proficiency in SQL, Python, and ETL tools (Streamsets, DBT, etc.)
> Hands-on experience with Oracle RDBMS
> Data migration experience to Snowflake
> Experience with AWS services such as S3, Lambda, Redshift, and Glue
> Strong understanding of data warehousing concepts and data modeling
> Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions
> Understanding/hands-on …
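The ETL pattern these requirements describe (extract raw records, transform them, load them idempotently into a warehouse) can be sketched with Python's built-in sqlite3 standing in for Snowflake or Redshift; the table and column names are invented for the example:

```python
import sqlite3

def load_customers(conn, raw_rows):
    """Transform raw records and upsert them, so reruns do not create duplicates."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dim_customer ("
        "customer_id INTEGER PRIMARY KEY, email TEXT)"
    )
    # Transform: normalise emails before loading
    cleaned = [(r["id"], r["email"].strip().lower()) for r in raw_rows]
    conn.executemany(
        "INSERT INTO dim_customer (customer_id, email) VALUES (?, ?) "
        "ON CONFLICT(customer_id) DO UPDATE SET email = excluded.email",
        cleaned,
    )

conn = sqlite3.connect(":memory:")
load_customers(conn, [{"id": 1, "email": " Ada@Example.COM "}])
load_customers(conn, [{"id": 1, "email": "ada@example.com"}])  # rerun is safe
rows = conn.execute("SELECT customer_id, email FROM dim_customer").fetchall()
```

In a real Snowflake or Redshift pipeline the same idempotency is usually achieved with a MERGE statement or a staging-table swap rather than SQLite's ON CONFLICT clause, but the extract-transform-upsert shape is the same.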
real ownership. You'll need:
- Strong Python and SQL
- Solid AWS experience (Glue, Lambda, SQS, S3, etc.)
- Experience building/maintaining ETL pipelines
- Data modelling and warehousing knowledge (PostgreSQL, Redshift, Snowflake, etc.)
- Self-starter mindset: you'll get stuck in and find answers independently to bring back to the team
- Drive to want to own projects and put forward …
data lake architectures. Develop conceptual, logical, and physical data models to support analytical requirements. Build and optimise data pipelines (ETL/ELT) using tools such as Azure Synapse, Snowflake, Redshift, or similar. Ensure robust data governance, security, and quality management practices. Support cloud data migrations and architecture modernisation initiatives.

Front-End BI & Analytics: Translate complex data into clear, actionable …
Strong proficiency in SQL, ETL processes, and database management systems (e.g., MySQL, PostgreSQL, MongoDB). Hands-on experience with AWS services for data processing, storage, and analysis (e.g., S3, Redshift, EMR, Glue). Familiarity with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop …
York, North Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
WRK DIGITAL LTD
detail. Desirable: Familiarity with AWS Lake Formation and real-time streaming tools (e.g., AWS Kinesis/Firehose). Experience within the transport, travel, or leisure industries. Knowledge of AWS Redshift and traditional data warehousing approaches. Why Join? This is a unique opportunity to join a forward-thinking organisation investing heavily in its data capability. You'll be part of a …
Programming experience in Python

Skills and experience we’d love you to have...
- Understanding of cloud computing security concepts
- Experience in relational cloud-based database technologies like Snowflake, BigQuery, Redshift
- Experience in open-source technologies like Spark, Kafka, Beam
- Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud
- Familiarity with DBT, Delta Lake, Databricks
- Experience working in an …
City of London, Greater London, UK Hybrid / WFH Options
Monument Technology
in multi-account organisations. Expertise in configuring and deploying AWS infrastructure components; use of a broad set of AWS services including EC2, EKS, S3, EFS, RDS, DynamoDB, ElastiCache, AppFlow, Glue, Athena, Redshift, API Gateway, Lambdas, WAF, CloudFormation, and Control Tower; experience with corresponding services from other cloud providers is highly desirable. Experience using IaC tools, including Terraform and CloudFormation; comfortable writing …
or Kafka Streams)?
Which statement best describes your hands-on responsibility for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)?
What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines …