Senior Delivery Consultant - Data Analytics & GenAI, AWS Professional Services Public Sector Job ID: Amazon Web Services Australia Pty Ltd Are you a Senior Data Analytics and GenAI consulting specialist? Do you have real-time Data Analytics, Data Warehousing, Big Data, Modern Data Strategy, Data Lake, Data Engineering and GenAI experience? Do you have senior stakeholder engagement experience to support … decisions and desired customer outcomes? Key job responsibilities Expertise: Collaborate with pre-sales and delivery teams to help partners and customers learn and use services such as AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic MapReduce (EMR), Amazon Kinesis, Amazon Redshift, Amazon Athena, AWS Lake Formation … Amazon DataZone, Amazon SageMaker, Amazon QuickSight and Amazon Bedrock. Solutions: Support pre-sales and deliver technical engagements with partners and customers. This includes participating in pre-sales visits, understanding customer requirements, creating consulting proposals and creating packaged data analytics service offerings. Delivery: Engagements include projects proving the use of AWS services to support new distributed computing …
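The listing above names a broad AWS analytics toolkit; as a rough illustration of how a couple of those services fit together (not taken from AWS's delivery material), here is a minimal boto3 sketch that submits an Amazon Athena query over data in S3 and prints the result rows. The region, database, table, and bucket names are hypothetical placeholders.

```python
# Minimal sketch: run an ad-hoc Amazon Athena query with boto3 and read the results.
import time
import boto3

athena = boto3.client("athena", region_name="ap-southeast-2")  # placeholder region

query = (
    "SELECT event_type, COUNT(*) AS events "
    "FROM analytics_db.clickstream GROUP BY event_type"  # hypothetical database/table
)

# Submit the query; Athena writes results to the (hypothetical) S3 output location.
run = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = run["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```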
modular, efficient, and scalable data pipelines. Deep expertise in data modeling for both relational databases and data warehouses, including Star and Snowflake schema designs. Extensive experience working with AWS Redshift and Aurora for data warehousing and transactional workloads. Experience using dbt (Data Build Tool) for building modular, version-controlled, and testable data transformation workflows, with a strong understanding of …
London (City of London), South East England, United Kingdom
HCLTech
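For context on the dbt workflow described in the listing above, here is a minimal sketch of a programmatic dbt invocation, assuming dbt-core 1.5 or later (which exposes dbtRunner). The selectors and the failure handling are hypothetical; the actual models would live as version-controlled SQL in the dbt project, e.g. staging models feeding star-schema marts on Redshift.

```python
# Minimal sketch: run and test groups of dbt models from Python (dbt-core >= 1.5).
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Build the (hypothetical) staging layer, then the dimensional models, then test everything.
for args in (["run", "--select", "staging"], ["run", "--select", "marts"], ["test"]):
    res: dbtRunnerResult = dbt.invoke(args)
    if not res.success:
        raise RuntimeError(f"dbt step failed: {' '.join(args)}")
```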
solutions. Support and mentor junior engineers, contributing to knowledge sharing across the team. What We're Looking For Strong hands-on experience across AWS Glue, Lambda, Step Functions, RDS, Redshift, and Boto3. Proficient in one of Python, Scala or Java, with strong experience in Big Data technologies such as Spark and Hadoop. Practical knowledge of building real-time event …
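As a rough illustration of the Glue and Boto3 skills named above (not the team's actual pipeline), the sketch below starts an AWS Glue job from Python and polls it to completion. The job name and arguments are hypothetical, and in practice this hand-off is often modelled as a Step Functions state machine rather than a polling loop.

```python
# Minimal sketch: start an AWS Glue job run with boto3 and wait for a terminal state.
import time
import boto3

glue = boto3.client("glue")

run = glue.start_job_run(
    JobName="nightly-orders-etl",               # hypothetical Glue job
    Arguments={"--target_date": "2024-01-01"},  # passed through to the job script
)
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="nightly-orders-etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
        break
    time.sleep(30)

print(f"Glue run {run_id} finished with state {state}")
```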
solving skills and ability to explain technical ideas to non-technical audiences. Experience with real-time data pipelines, event-driven architectures (Kafka/Kinesis), or modern data warehouses (Snowflake, Redshift, BigQuery) is a plus. Our Benefits: Paid Vacation Days, Health insurance, Commuter benefit, Employee Stock Purchase Plan (ESPP), Mental Health & Family Forming Benefits, Continuing education and corridor travel benefits …
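To make the event-driven pattern mentioned above concrete, here is a minimal, hypothetical sketch of an AWS Lambda handler consuming a batch of Amazon Kinesis records; the JSON payload shape is an assumed example, not part of the role description.

```python
# Minimal sketch: Lambda handler for a Kinesis event source.
import base64
import json


def handler(event, context):
    """Decode each Kinesis record and hand it to downstream processing."""
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        click = json.loads(payload)  # hypothetical click-event payload
        # A real pipeline would validate, enrich, and write to a warehouse here.
        print(click.get("user_id"), click.get("event_type"))
    return {"processed": len(event["Records"])}
```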
Engineers and Associates, and lead technical discussions and design sessions. Key requirements: Must-Have: Strong experience with AWS services: Glue, Lambda, S3, Athena, Step Functions, EventBridge, EMR, EKS, RDS, Redshift, DynamoDB. Strong Python development skills. Proficient with Docker, containerization, and virtualization. Hands-on experience with CI/CD, especially GitLab CI. Solid experience with Infrastructure as Code (Terraform, CloudFormation …
systems (e.g., Salesforce, NetSuite, or similar). Strong grasp of data governance, quality assurance, and security best practices. Bonus: Experience with Microsoft Fabric, cloud data warehouses (Azure, Snowflake, BigQuery, Redshift), or orchestration tools like dbt or Airflow. Collaboration & Communication Ability to communicate technical insights clearly to non-technical audiences. Experience partnering with leadership and cross-functional teams. Mindset & Growth …
London (City of London), South East England, United Kingdom
TGS International Group
data governance, data quality, and compliance standards. Strong analytical and problem-solving skills with attention to detail. Excellent communication and documentation abilities. Preferred: Exposure to cloud data platforms (AWS Redshift, Azure Synapse, Google BigQuery, Snowflake). Knowledge of Python, R, or other scripting languages for data manipulation. Experience in banking, financial services, or enterprise IT environments.
models, not dashboarding. YOUR SKILLS & EXPERIENCE The successful candidate will have: Strong SQL and dbt experience. Familiarity with Kimball methodology, dimensional modelling, and star schema design. Proven experience with Redshift or Snowflake. Strong background in cloud-based data environments (AWS preferred). Hands-on experience with Airflow for orchestration. (Nice-to-have) Python for data engineering tasks. (Nice-to …
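As an illustration of the Airflow-plus-dbt orchestration mentioned above, here is a minimal DAG sketch, assuming Airflow 2.4+ and a dbt project at a hypothetical path; it shows a generic nightly run-then-test pattern rather than the team's actual setup.

```python
# Minimal sketch: nightly Airflow DAG that builds and then tests dbt models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_nightly",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",              # nightly at 02:00
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",   # hypothetical project path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test",
    )

    dbt_run >> dbt_test  # only test once the models have been rebuilt
```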
the platform post-deployment. About You Requirements: Minimum of 8 years' experience in data engineering within cloud environments, preferably AWS, including use of services such as S3, Lambda, Glue, Redshift, and IAM. Must be eligible for SC Clearance. A strong understanding of secure data practices in government or similarly regulated environments. Practical experience building and managing Python-based analytical …
London, South East England, United Kingdom (Hybrid / WFH Options)
Awin
acceptable). Working knowledge of SQL and data modelling concepts. Experience with BI tools (e.g., Power BI, Looker, Tableau). Familiarity with cloud data platforms such as Snowflake, BigQuery, or AWS Redshift. Understanding of modern data architecture and APIs. Our Offer Flexi-Week and Work-Life Balance: We prioritise your mental health and wellbeing, offering you a flexible four-day Flexi …
Tech Stack You’ll be working with: Programming: Python, SQL, Scala/Java. Big Data: Spark, Hadoop, Databricks. Pipelines: Airflow, Kafka, ETL tools. Cloud: AWS, GCP, or Azure (Glue, Redshift, BigQuery, Snowflake). Data Modelling & Warehousing. 🔹 What’s on Offer 💷 £80,000 pa (Permanent role) 📍 Hybrid – 2 days per week in Central London 🚀 Opportunity to work in a scaling HealthTech company …
London (City of London), South East England, United Kingdom
Roc Search
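To ground the Spark part of the tech stack in the listing above, here is a minimal, illustrative PySpark sketch that aggregates raw event data into a warehouse-ready table; the storage paths and column names are hypothetical, not taken from the role.

```python
# Minimal sketch: aggregate daily event counts with PySpark and write a partitioned table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-rollup").getOrCreate()

# Hypothetical lake location holding raw events with an event_ts and event_type column.
events = spark.read.parquet("s3a://example-data-lake/events/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Partitioned output ready for loading into Redshift/BigQuery/Snowflake downstream.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-data-lake/marts/daily_events/"
)

spark.stop()
```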
Python and related ML libraries. Strong background in applied machine learning, model development and data engineering. Experience with cloud environments (Azure, AWS, GCP) and tools such as Spark, Hive, Redshift. Demonstrated ability to lead cross-functional teams and mentor junior practitioners. Ability to communicate complex technical concepts clearly to non-technical audiences. Bonus Points For Participation in Kaggle or …
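As a generic illustration of the model-development work described above (not a specific project), the sketch below trains and evaluates a scikit-learn classifier on a synthetic dataset that stands in for features engineered upstream, e.g. in Spark or Redshift.

```python
# Minimal sketch: train a gradient-boosted classifier and report held-out AUC.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for an engineered feature table.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

print("test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```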
required: Strong data visualisation using Power BI, and coding ability using normalisation, SQL, or Python. Desirable: Experience working in data warehouse or lake environments (e.g. Snowflake, Redshift, Databricks) and with ELT and data pipelines (e.g. dbt). Familiarity with predictive analytics techniques. Please apply if this sounds like you.