Job ID: Amazon Web Services Australia Pty Ltd Are you a Senior Cloud Architect and GenAI consulting specialist? Do you have real-time Data Analytics, Data Warehousing, Big Data, Modern Data Strategy, Data Lake, Data Engineering and GenAI experience? Do you have senior stakeholder engagement experience to support pre-sales and deliver consulting engagements? Do you like to solve … Vetting Agency clearance. Key job responsibilities Expertise: Collaborate with pre-sales and delivery teams to help partners and customers learn and use services such as AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic MapReduce (EMR), Amazon Kinesis, Amazon Redshift, Amazon Athena, AWS Lake Formation … Amazon DataZone, Amazon SageMaker, Amazon QuickSight and Amazon Bedrock. Solutions: Support pre-sales and deliver technical engagements with partners and customers. This includes participating in pre-sales visits, understanding customer requirements, creating consulting proposals and creating packaged data analytics service offerings. Delivery: Engagements include projects proving the use of AWS services to support new distributed computing …
Senior Delivery Consultant - Data Analytics & GenAI, AWS Professional Services Public Sector Job ID: Amazon Web Services Australia Pty Ltd Are you a Senior Data Analytics and GenAI consulting specialist? Do you have real-time Data Analytics, Data Warehousing, Big Data, Modern Data Strategy, Data Lake, Data Engineering and GenAI experience? Do you have senior stakeholder engagement experience to support … decisions and desired customer outcomes. Key job responsibilities Expertise: Collaborate with pre-sales and delivery teams to help partners and customers learn and use services such as AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic MapReduce (EMR), Amazon Kinesis, Amazon Redshift, Amazon Athena, AWS Lake Formation … Amazon DataZone, Amazon SageMaker, Amazon QuickSight and Amazon Bedrock. Solutions: Support pre-sales and deliver technical engagements with partners and customers. This includes participating in pre-sales visits, understanding customer requirements, creating consulting proposals and creating packaged data analytics service offerings. Delivery: Engagements include projects proving the use of AWS services to support new distributed computing …
AWS Data Engineer London, UK Permanent Strong experience in Python, PySpark, AWS S3, AWS Glue, Databricks, Amazon Redshift, DynamoDB, CI/CD and Terraform. A total of 7+ years of experience in data engineering is required. Design, develop, and optimize ETL pipelines using AWS Glue, Amazon EMR and Kinesis for real-time and batch data processing. Implement data … transformation, streaming, and storage solutions on AWS, ensuring scalability and performance. Collaborate with cross-functional teams to integrate and manage data workflows. Skill set: Amazon Redshift, S3, AWS Glue, Amazon EMR, Kinesis Analytics. Ensure data security, compliance, and best practices in cloud data engineering. Experience with programming languages such as Python, Java, or Scala. …
Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka). Familiarity with AWS and its data services (e.g. S3, Athena, AWS Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Familiarity with data orchestration tools (e.g. Prefect, Apache Airflow). Familiarity with CI/CD …
processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory, Databricks, BigQuery, Dataflow, Dataproc) Would you like to join us as we work hard, have fun and make history? Apply for this job …
Java Experience in a Marketing technical stack and 3rd party tools Broad experience of working within AWS; from infrastructure (VPC, EC2, security groups, S3, etc.) to AWS data platforms (Redshift, RDS, Glue, etc.). Working knowledge of infrastructure automation with Terraform. Working knowledge of test tools and frameworks such as JUnit, Mockito, ScalaTest, pytest. Working knowledge of build tools …
broad range of problems using your technical skills. Demonstrable experience of utilising strong communication and stakeholder management skills when engaging with customers. Significant experience with cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP). Strong proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big … data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java, or Scala. Willingness to mentor more junior members of the team. Strong analytical and problem-solving skills with the ability to …
focus on building scalable data solutions. Experience with data pipeline orchestration tools such as Dagster or similar. Familiarity with cloud platforms (e.g. AWS) and their data services (e.g., S3, Redshift, Snowflake). Understanding of data warehousing concepts and experience with modern warehousing solutions. Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows and …
skills. Good written and verbal skills, able to translate concepts into easily understood diagrams and visuals for both technical and non-technical people alike. AWS cloud products (Lambda functions, Redshift, S3, Amazon MQ, Kinesis, EMR, RDS (Postgres)). Apache Airflow for orchestration. dbt for data transformations. Machine Learning for product insights and recommendations. Experience with microservices using technologies like Docker …
Nice to Haves": • Certification in dbt or Google Cloud Platform or related technologies. • Experience with other cloud platforms (e.g. AWS, Azure, Snowflake) and data warehouse/lakehouse technologies (e.g. Redshift, Databricks, Synapse) • Knowledge of distributed big data technologies. • Proficiency in Python. • Familiarity with data governance and compliance frameworks. Your characteristics as a Consultant will include: • Driven by delivering quality …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work closely with …
Farnborough, Hampshire, South East, United Kingdom
Peregrine
and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) would be beneficial. Data Platform Management: Comfortable operating in hybrid environments (cloud and on-prem). Experience integrating diverse data sources and systems. Understanding of secure data transfer …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fruition Group
engineers and support their growth. Implement best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and …
Snowflake (data warehousing and performance tuning) Informatica (ETL/ELT development and orchestration) - nice to have Python (data processing and scripting) - required AWS (data services such as S3, Glue, Redshift, Lambda) - required Cloud data practices and platform - AWS required. Basic knowledge of related disciplines such as data science, software engineering, and business analytics. Proven ability to independently resolve complex …
scalable test automation frameworks, with a focus on backend, API, and data systems using tools like Pytest and Postman. Expertise in Pandas, SQL, and AWS analytics services (Glue, Athena, Redshift) for data profiling, transformation, and validation within data lakes. Solid experience with AWS (S3, Lambda, EMR, ECS/EKS, CloudFormation/Terraform) and understanding of cloud-native architectures and …
journey of our data platform (AWS) Cloud Proficiency: Hands-on experience with at least one major cloud platform (AWS, Azure, or GCP) and its core data services (e.g., S3, Redshift, Lambda/Functions, Glue). Data Modelling: Deep understanding of ELT/ETL patterns, and data modelling techniques. CRM/Customer Data Focus: Experience working directly with data from …
may either leverage third party tools such as Fivetran, Airbyte, Stitch or build custom pipelines. We use the main data warehouses for dbt modelling and have extensive experience with Redshift, BigQuery and Snowflake. Recently we've been rolling out a serverless implementation of dbt and progressing work on an internal product to build modular data platforms. When initially working with …
data models to support analytics and reporting Monitor and maintain data infrastructure to ensure availability and performance What You'll Need to Succeed Experience with cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP). Proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big data … technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in at least one programming language such as Python, Java, or Scala. Strong analytical and problem-solving skills with the ability to work independently and in a team environment. How You'll Grow …
Bash, Python, Go, PowerShell Monitoring and logging tools such as Prometheus, Grafana, Dynatrace Solid understanding of networking and security (VPC, Nginx, AWS WAF, etc.) Database experience with DynamoDB, Aurora, Redshift, SQL Comfortable with Linux/Unix OS administration Ideally, AWS DevOps Engineer certification Exposure to Ping Identity (ForgeRock) is also desirable Business & People Skills Ability to work independently and …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to translate complex technical concepts for business stakeholders Strategic thinking with …
NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices Familiarity with data …
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
input into technical decisions, peer reviews and solution design. Requirements Proven experience as a Data Engineer in cloud-first environments. Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift). Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs, etc.). Proficient in SQL (T-SQL/SparkSQL) and Python for data transformation and scripting. Hands-on …