Senior Delivery Consultant - Data Analytics & GenAI, AWS Professional Services Public Sector Job ID: Amazon Web Services Australia Pty Ltd Are you a Senior Data Analytics and GenAI consulting specialist? Do you have real-time Data Analytics, Data Warehousing, Big Data, Modern Data Strategy, Data Lake, Data Engineering and GenAI experience? Do you have senior stakeholder engagement experience to … learn and use services such as AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic MapReduce (EMR), Amazon Kinesis, Amazon Redshift, Amazon Athena, AWS Lake Formation, Amazon DataZone, Amazon SageMaker, Amazon QuickSight and Amazon … clearance. PREFERRED QUALIFICATIONS - AWS Professional level certification - 10+ years of experience implementing IT platforms in a technical and analytical role. Acknowledgement of country: In the spirit of reconciliation, Amazon acknowledges the Traditional Custodians of country throughout Australia and their connections to land, sea and community. We pay our respect to their elders past and present and extend that respect to all Aboriginal and Torres Strait Islander peoples today.
the following: Python, SQL, Java Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams Deep knowledge of database technologies: Distributed systems (e.g., Spark, Hadoop, EMR) RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL) NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j) Solid understanding of software engineering best practices - code reviews, testing frameworks, CI/CD, and code …
Leeds, West Yorkshire, England, United Kingdom Hybrid / WFH Options
Robert Walters
Key Skills & Experience: Proven experience as a Senior/Lead Data Engineer in a large-scale environment. Strong expertise with AWS data services (e.g., S3, Glue, Lambda, Redshift, Athena, EMR). Experience designing and building data lakes and modern data platforms. Proficiency with Python, SQL, and data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of data modelling …
to junior engineers Keep up to date with emerging data technologies and apply them where relevant The Skill Requirements: Hands-on experience with AWS services (Glue, Lambda, S3, Redshift, EMR) Strong skills in Python, SQL, PySpark and pipeline orchestration Proven understanding of data warehousing and data lakehouse concepts Excellent problem-solving skills with the ability to resolve performance bottlenecks …
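Several of these roles centre on Glue/PySpark-style ETL over a data lake. As a hedged illustration only (not any employer's actual stack, and with record fields invented here), this is the shape of a filter-transform-aggregate pass in plain Python; a real Glue or EMR job would express the same steps as distributed DataFrame operations:

```python
from collections import defaultdict

def run_etl(rows):
    """Toy ETL pass: filter bad records, normalise a field, aggregate.

    `rows` stands in for records read from S3; a PySpark job would
    distribute the same filter/map/groupBy logic across executors.
    """
    # Extract + clean: drop rows missing a customer id
    clean = [r for r in rows if r.get("customer_id") is not None]

    # Transform: normalise currency amounts to minor units (pence/cents)
    for r in clean:
        r["amount_minor"] = int(round(r["amount"] * 100))

    # Aggregate: total spend per customer, ready for the load step
    totals = defaultdict(int)
    for r in clean:
        totals[r["customer_id"]] += r["amount_minor"]
    return dict(totals)

orders = [
    {"customer_id": "c1", "amount": 10.50},
    {"customer_id": None, "amount": 3.00},   # dropped by the filter
    {"customer_id": "c1", "amount": 4.25},
    {"customer_id": "c2", "amount": 7.99},
]
print(run_etl(orders))  # {'c1': 1475, 'c2': 799}
```

The point of the sketch is the pipeline shape — clean, transform, aggregate as separate stages — which is what survives the move from local Python to Glue or EMR.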
IT experience, with 5+ years in:
• Python, PySpark, and SQL for big data processing
• Data lakes (Iceberg format), ETL (Informatica), and data quality
• AWS services: S3, Glue, Redshift, Lambda, EMR, Airflow, Postgres
• Bash/shell scripting
• Experience with healthcare data and leading data teams
• Agile development experience
• Strong problem-solving and communication skills
Responsibilities
• Design and maintain scalable data …
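This posting pairs ETL with data quality. A minimal, hypothetical sketch of rule-based checks (rule names and fields are invented for illustration) of the kind that tools like Informatica or Glue Data Quality formalise:

```python
def check_rows(rows, required, types):
    """Return (row_index, problem) pairs for two basic quality rules:
    required fields must be present and non-null, and typed fields
    must hold values of the expected Python type."""
    problems = []
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) is None:
                problems.append((i, f"missing {field}"))
        for field, expected in types.items():
            value = row.get(field)
            if value is not None and not isinstance(value, expected):
                problems.append((i, f"{field} is not {expected.__name__}"))
    return problems

rows = [
    {"id": 1, "age": 34},
    {"id": None, "age": 28},   # fails the required-field rule
    {"id": 3, "age": "n/a"},   # fails the type rule
]
issues = check_rows(rows, required=["id"], types={"age": int})
print(issues)  # [(1, 'missing id'), (2, 'age is not int')]
```

In production these checks would run inside the pipeline and either quarantine failing rows or fail the job, rather than just reporting.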
data pipelines and services
• Ingest, transform, and model structured and unstructured data for analytics and ML
• Work with technologies like Apache Spark, Apache Iceberg, Trino, NiFi, OpenSearch, and AWS EMR
• Ensure data integrity, lineage, and security across the entire lifecycle
• Collaborate with DevOps to deploy containerized data solutions using Kubernetes
• Support Agile delivery, version control, and data governance activities …
City of London, London, United Kingdom Hybrid / WFH Options
Rise Technical Recruitment Limited
Experience in a data-focused SRE, Data Platform, or DevOps role
* Strong knowledge of Apache Flink, Kafka, and Python in production environments
* Hands-on experience with AWS (Lambda, EMR, Step Functions, Redshift, etc.)
* Comfortable with monitoring tools, distributed systems debugging, and incident response
Reference Number: BBBH259303 To apply for this role or to be considered for further …
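The Flink/Kafka requirement above is, at its core, windowed aggregation over a keyed event stream. A hedged single-process sketch of a tumbling (fixed, non-overlapping) window count — real Flink distributes this and additionally handles watermarks and late events, none of which is modelled here:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per (key, window start), where windows are fixed
    intervals of `window_ms` — the simplest of Flink's window types.
    `events` is an iterable of (timestamp_ms, key) pairs."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms  # floor to window boundary
        counts[(key, window_start)] += 1
    return dict(counts)

events = [
    (1000, "clicks"), (1500, "clicks"), (2100, "clicks"),
    (1200, "views"),  (3050, "views"),
]
print(tumbling_window_counts(events, window_ms=1000))
# {('clicks', 1000): 2, ('clicks', 2000): 1, ('views', 1000): 1, ('views', 3000): 1}
```

In a streaming runtime the same logic runs incrementally per partition instead of over a materialised list, which is what makes state checkpointing and incident response (the SRE angle) non-trivial.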
Falls Church, Virginia, United States Hybrid / WFH Options
Rackner
moving Agile DevSecOps team that builds secure, scalable data platforms - and get paid weekly.
What You'll Do
Build OpenAPI-compliant APIs, data schemas, & pipelines in AWS (S3, RDS, EMR)
Develop with Python (FastAPI, Django, Flask) & JavaScript (Node.js, Vue, React)
Deploy containerized workloads in Kubernetes (AWS EKS, Rancher) with CI/CD
Apply DevSecOps + security-first practices from …
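The OpenAPI/data-schema work described above boils down to validating request payloads against a declared schema. A hedged stdlib sketch (the schema and field names are invented, and this covers only a tiny fragment of what OpenAPI allows; in FastAPI, Pydantic models do this automatically):

```python
def validate(payload, schema):
    """Validate a JSON-like dict against a tiny OpenAPI-style schema
    fragment (only `required` and per-property `type` are checked).
    Returns a list of error strings; an empty list means valid."""
    errors = []
    type_map = {"string": str, "integer": int, "number": (int, float)}
    for name in schema.get("required", []):
        if name not in payload:
            errors.append(f"'{name}' is required")
    for name, spec in schema.get("properties", {}).items():
        if name in payload and not isinstance(payload[name], type_map[spec["type"]]):
            errors.append(f"'{name}' should be {spec['type']}")
    return errors

# Hypothetical schema for one resource of such an API
record_schema = {
    "required": ["id", "name"],
    "properties": {
        "id": {"type": "integer"},
        "name": {"type": "string"},
        "weight_kg": {"type": "number"},
    },
}
print(validate({"id": "abc", "name": "Ada"}, record_schema))
# ["'id' should be integer"]
```

The value of declaring the schema separately from the handler is exactly what the posting implies: the same document drives validation, client generation, and API documentation.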
apply to real-world use cases Required Skills: 5+ years of Machine Learning experience in production environments Strong hands-on experience with AWS (SageMaker, S3, Lambda, DynamoDB, Kinesis) Spark/EMR, Databricks for ETL & large-scale data processing MLflow for model tracking/monitoring Experience in CI/CD, testing, and scalable deployments Strong problem-solving & cross-team collaboration skills …
as a technical liaison among system engineers, data scientists, analysts, and non-technical stakeholders to ensure aligned, mission-driven solutions. Key responsibilities include working with AWS cloud services (including EMR and Databricks), SQL database structures, and executing large-scale data migrations. The role also involves optimizing database architecture and performance, implementing DevSecOps practices, and building CI/CD pipelines …
are seeking an experienced Data Engineer with expertise in AWS cloud technologies to design and build ETL pipelines, data warehouses, and data lakes. Key Skills: AWS services like EMR, Glue, Redshift, Kinesis, Lambda, DynamoDB (or equivalent open-source tools). Note: We require candidates who are eligible for SC Clearance or possess a higher level of clearance.
software solutions at one or more layers of the technical stack (data, application, UI) Programming languages, particularly Python, Java, JavaScript, TypeScript, SQL Cloud application and data deployment in AWS (EMR, EKS, ECR, RDS, etc.) DevOps tools and services (Kubernetes, Terraform, Docker, Packer, etc.) Integration with applications and data across a platform (e.g. APIs) Developing software within Agile methodologies Preferred …
and grow your skillset. We are looking for someone who can demonstrate an aptitude or willingness to learn some or all of the following technologies. AWS - S3, IAM, RDS, EMR, EC2, etc. Linux commands Trino Apache Spark Node.js JavaScript Preact.js Postgres MySQL HTML CSS Target Salary Range is $125k-$150k or more depending on experience. We recognize this skillset …
all levels. REST API-based microservices using Python, Java or similar. Cloud data warehouse & analytics platforms (Snowflake, BigQuery). Designing, building, and managing data pipelines using Spark, BigQuery, AWS EMR, and GCP Cloud Analytics. Caching implementations (like Redis, Memcached, etc.). Messaging platforms and stream data processing (like Kafka, Flink, Pub/Sub). Data pipeline expectations framework experience … and Confluence. Nice to have expertise in designing and maintaining public APIs and data contracts. Nice to have security basics (e.g., OAuth2 standard). Nice to have foundational Amazon Cloud Services, like S3, CloudTrail, SQS, SNS, Lambda, API Gateway, ElastiCache, Athena, Kinesis, EKS, Cognito, and others. Nice to have NoSQL databases knowledge (Druid, DynamoDB, SingleStore, or others).
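The caching requirement above (Redis, Memcached) rests on one mechanism: values that expire after a time-to-live. A hedged in-process sketch of those expiry semantics — Redis provides the same behaviour externally and shared across services, which this local class deliberately does not model. The injectable `clock` is an assumption of this sketch so the example stays deterministic:

```python
import time

class TTLCache:
    """Minimal in-process cache with per-entry time-to-live,
    illustrating the expiry semantics a Redis-backed cache provides.
    `clock` is injectable so examples and tests need not sleep."""

    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock
        self._store = {}  # key -> (expires_at, value)

    def set(self, key, value):
        self._store[key] = (self.clock() + self.ttl, value)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        expires_at, value = entry
        if self.clock() >= expires_at:
            del self._store[key]  # lazy eviction on read
            return default
        return value

# Simulated clock keeps the demonstration deterministic
now = [0.0]
cache = TTLCache(ttl_seconds=5, clock=lambda: now[0])
cache.set("session:42", "alice")
print(cache.get("session:42"))   # alice
now[0] = 6.0                     # advance past the TTL
print(cache.get("session:42"))   # None
```

Lazy eviction on read is a simplification; Redis combines it with periodic background expiry so stale keys don't accumulate unread.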
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Searchability
mix of real-time and batch ETL processes, ensuring accuracy, integrity, and scalability across vast datasets. You'll work with Python, SQL, Apache Spark, and AWS services such as EMR, Athena, and Lambda to deliver robust, high-performance solutions. You'll also play a key role in optimising data pipeline architecture, supporting knowledge sharing within the team, and collaborating with … application to our client in conjunction with this vacancy only. KEY SKILLS: Data Engineer/Python/SQL/AWS/ETL/Data Pipelines/Apache Spark/EMR/Athena/Lambda/Big Data/Manchester/Hybrid Working
alongside Associates, Mid-level Engineers, and Senior/Principal engineers. What You'll Do: Design, build, and scale data pipelines and services using AWS Glue, Lambda, Step Functions, S3, EMR, Athena, and more. Lead projects involving serverless, event-driven architectures and CI/CD workflows (GitLab CI). Write clean, production-grade code in Python (Scala is a bonus … Actively mentor Data Engineers and Associates, and lead technical discussions and design sessions. Key requirements: Must-Have: Strong experience with AWS services: Glue, Lambda, S3, Athena, Step Functions, EventBridge, EMR, EKS, RDS, Redshift, DynamoDB. Strong Python development skills. Proficient with Docker, containerization, and virtualization. Hands-on experience with CI/CD, especially GitLab CI. Solid experience with Infrastructure as Code …
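The Step Functions work mentioned above is orchestration of a chain of states, each handing its output to the next. A hedged toy runner for a linear chain (the state names and handlers are invented; real Step Functions adds branching, retries, and error states, and each handler would typically be a Lambda):

```python
def run_state_machine(states, start, payload):
    """Run a linear chain of states, Step Functions-style: each state's
    handler transforms the payload and names the next state (None = done).
    An in-process stand-in for an actual state machine definition."""
    state = start
    while state is not None:
        handler, next_state = states[state]
        payload = handler(payload)
        state = next_state
    return payload

# Hypothetical three-step ingest flow
states = {
    "Extract":   (lambda p: {**p, "rows": [1, 2, 3]},                   "Transform"),
    "Transform": (lambda p: {**p, "rows": [r * 10 for r in p["rows"]]}, "Load"),
    "Load":      (lambda p: {**p, "loaded": len(p["rows"])},            None),
}
result = run_state_machine(states, "Extract", {"job": "daily"})
print(result)  # {'job': 'daily', 'rows': [10, 20, 30], 'loaded': 3}
```

Keeping each state a pure payload-in/payload-out function is what makes the same logic portable between a local test harness and a serverless, event-driven deployment.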
Connect, Redox, InterSystems, AWS HealthLake). Certifications in HL7, FHIR, or other relevant interoperability standards. Familiarity with healthcare data privacy and cybersecurity principles (e.g., HIPAA, GDPR). Exposure to EMR/EHR systems such as Epic, Cerner, or Meditech. If you're a motivated Integration Engineer passionate about transforming healthcare through technology, we'd love to hear from you.
We are seeking a Sr. Java Developer with strong experience in systems, software, cloud, and Agile methodologies to support a complex program providing Agile development, operations, and maintenance for critical systems. Working within a DevOps framework, you will participate in …
Delivery Consultant - GenAI/ML, AWS, Industries Job ID: Amazon Web Services EMEA Dubai FZ Branch The Amazon Web Services Professional Services (ProServe) team is seeking a skilled Delivery Consultant to join our team at Amazon Web Services (AWS). In this role, you'll work closely with customers to design, implement, and manage … to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating - that's why customers from the most … GenAI/ML solutions (e.g., for training, fine-tuning, and inference) PREFERRED QUALIFICATIONS - AWS Experience preferred, with proficiency in a wide range of AWS services (e.g. SageMaker, Bedrock, EMR, S3, OpenSearch Service, Step Functions, Lambda, and EC2) - AWS Professional level certifications (e.g., Solutions Architect Professional, DevOps Engineer Professional) preferred - Experience with automation and scripting (e.g., Terraform, Python) - Knowledge …
experience with Amazon Web Services (AWS), including but not limited to core IaaS (IAM, VPC, EC2, S3, EBS, ELB) and native AWS PaaS services (Lambda, AWS Config, EMR, Athena, EKS, etc.) Hands-on experience with Azure public cloud, including core (compute, storage, networking) platform, PaaS and IAM offerings. Infrastructure experience (Server Administration, Networking, Storage, Security) A commitment …