the latest data analytics technologies? Would you like a career path that enables you to progress with the rapid adoption of cloud computing? At Amazon Web Services, we're hiring highly technical cloud architects specialised in data analytics to collaborate with our customers and partners to derive business value … Key job responsibilities Expertise - Collaborate with pre-sales and delivery teams to help partners and customers learn and use services such as AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic MapReduce (EMR), Amazon Kinesis, Amazon Redshift, Amazon Athena, AWS Lake Formation, Amazon DataZone, Amazon SageMaker and Amazon QuickSight. Solutions - Deliver technical engagements with partners and customers. This includes participating in pre-sales visits, understanding customer requirements, creating consulting proposals and creating packaged data analytics service offerings. Delivery - Engagements include projects proving the …
that address complex business requirements and drive decision-making. Your skills and experience Proficiency with AWS Tools: Demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS and AWS Step Functions. Programming Skills: Strong …
React or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed … in the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio This is an environment that has been described as the only corporate environment with a start-up/…
engineering background Exposure to building or deploying AI/ML models into a production environment Previously used AWS data services e.g. S3, Kinesis, Glue, Athena, DynamoDB, SNS/SQS Experience using any data streaming technologies/paradigms for real-time or near-real-time analytics …
Develop new tools and infrastructure using Python (Flask/FastAPI) or Java (Spring Boot) and a relational data backend (AWS - Aurora/Redshift/Athena/S3) Support users and operational flows for quantitative risk, senior management and portfolio management teams using the tools developed Qualifications/Skills Required …
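Not code from the advert itself, but as a rough illustration of the pattern this role describes (a thin Python web layer serving data held in an AWS analytical backend), the following minimal sketch exposes an Athena query through a FastAPI endpoint via boto3. The database name, results bucket, table and route are hypothetical placeholders, and pagination and error handling are left out.

    import time

    import boto3
    from fastapi import FastAPI

    app = FastAPI()
    athena = boto3.client("athena")

    # Hypothetical values - replace with a real database, table and results bucket.
    DATABASE = "risk_db"
    OUTPUT_LOCATION = "s3://example-athena-results/"

    def run_query(sql: str) -> list:
        """Submit a query to Athena, poll until it finishes, and return rows as dicts."""
        execution = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": DATABASE},
            ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
        )
        query_id = execution["QueryExecutionId"]
        state = "QUEUED"
        while state in ("QUEUED", "RUNNING"):
            time.sleep(1)
            status = athena.get_query_execution(QueryExecutionId=query_id)
            state = status["QueryExecution"]["Status"]["State"]
        if state != "SUCCEEDED":
            raise RuntimeError(f"Athena query finished in state {state}")
        rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
        header = [col.get("VarCharValue") for col in rows[0]["Data"]]
        return [dict(zip(header, [c.get("VarCharValue") for c in row["Data"]])) for row in rows[1:]]

    @app.get("/books/{book}/exposure")
    def book_exposure(book: str):
        # Simplified for illustration; production code should parameterise queries safely.
        return run_query(f"SELECT trade_date, notional FROM exposures WHERE book = '{book}'")

A Flask version would look almost identical; the point is only that the web layer stays thin while Athena (or Aurora/Redshift) does the heavy lifting.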
stack, including (but not limited to) Spark/Hadoop, Kafka, Aerospike/DynamoDB Experience with AWS tech stack, including but not limited to EMR, Athena, EKS Expert knowledge of multi-threading, memory model, etc. Understanding of database fundamentals and MySQL knowledge Experience with CI/CD tools such as Jenkins, Graphite …
and automated deployments. Have excellent knowledge of AWS services (ECS, IAM, EC2, S3, DynamoDB, MSK). Our Technology Stack: Python and Scala; Starburst and Athena; Kafka and Kinesis; DataHub; MLflow and Airflow; Docker and Terraform; Kafka, Spark, Kafka Streams and KSQL; DBT; AWS, S3, Iceberg, Parquet, Glue and …
London, South East England, United Kingdom (Hybrid / WFH Options)
Datatech Analytics
optimizing data delivery, re-designing infrastructure for greater scalability and performance. Essential Skills and Experience: Hands-on experience with AWS services, including Lambda, Glue, Athena, RDS, and S3. Strong SQL skills for data transformation, cleaning, and loading. Strong coding experience with Python and Pandas. Experience of data pipeline and …
optimizing data delivery, re-designing infrastructure for greater scalability and performance. Essential Skills and Experience: · Hands-on experience with AWS services, including Lambda, Glue, Athena, RDS, and S3. · Strong SQL skills for data transformation, cleaning, and loading. · Strong coding experience with Python and Pandas. · Experience with any flavour of …
with DBT, building and maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem …
Central London, London, United Kingdom (Hybrid / WFH Options)
167 Solutions Ltd
warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to streamline feature engineering and model … Expertise in Snowflake and modern data architectures. Experience designing and managing data pipelines, ETL, and ELT workflows. Knowledge of AWS services such as Athena, Glue, and EMR. Experience working closely with data scientists and ML engineers. Strong problem-solving skills and ability to work independently. What's on …
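As a purely illustrative sketch of the Airflow/Glue/Athena orchestration this advert mentions (not code supplied by the employer), a DAG can chain a Glue transformation job and an Athena publishing query using operators from the apache-airflow-providers-amazon package. Every job, database, table and bucket name below is made up.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.athena import AthenaOperator
    from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

    with DAG(
        dag_id="daily_sales_elt",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",          # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        # Run a pre-defined Glue job that lands cleaned Parquet files in the lake.
        transform = GlueJobOperator(
            task_id="run_glue_transform",
            job_name="sales_daily_transform",   # hypothetical Glue job name
            wait_for_completion=True,
        )

        # Publish the day's partition to a reporting table via Athena (the ELT step).
        publish = AthenaOperator(
            task_id="publish_reporting_table",
            query=(
                "INSERT INTO reporting.daily_sales "
                "SELECT * FROM lake.sales_clean WHERE ds = '{{ ds }}'"
            ),
            database="lake",
            output_location="s3://example-athena-results/",
        )

        transform >> publish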
Experience Required: 13-18 years of overall Data and Analytics experience. At least 10 years on AWS data platforms including AWS S3, Glue, Redshift, Athena, SageMaker, QuickSight, and MLOps. Expertise in Snowflake DWH architecture, Snowpipe, Data Sharing, Polaris catalog, and data governance. Knowledge of additional technologies such as Python …
easy and safe for teams to use and contribute to data systems. You’ll work with services like Lambda, S3, Lake Formation, Glue, Step Functions, Athena, EventBridge, SNS, SQS, and DynamoDB, and will be expected to navigate and manage data systems with a high degree of rigour and compliance. Familiarity …
Engineering colleagues. What you'll need Required: Experience designing and building high-throughput, scalable, resilient, and secure data pipelines. Experience with AWS technologies: Glue, Athena, S3, Step Functions, Lambda, RDS (Aurora Postgres), DMS, Redshift, QuickSight, Kinesis Firehose. Expert knowledge of SQL. Hands-on experience with Kafka. Hands-on experience …
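To make the streaming side of that stack concrete, here is a small hypothetical sketch (not taken from the advert) of a Lambda handler pushing validated events into a Kinesis Data Firehose delivery stream, which buffers them to S3 where Glue and Athena can pick them up. The stream name, the SQS trigger and the event shape are all assumptions.

    import json

    import boto3

    firehose = boto3.client("firehose")
    STREAM_NAME = "clickstream-delivery"  # hypothetical Firehose delivery stream

    def handler(event, context):
        """Lambda entry point: forward well-formed events to Firehose.

        Assumes an SQS trigger, so each incoming record carries a JSON body.
        """
        records = []
        for item in event.get("Records", []):
            body = json.loads(item["body"])
            if "event_type" not in body:      # drop malformed events
                continue
            records.append({"Data": (json.dumps(body) + "\n").encode("utf-8")})

        if not records:
            return {"delivered": 0}

        # put_record_batch accepts up to 500 records per call; production code would
        # chunk the list and retry any failures reported in the response.
        response = firehose.put_record_batch(
            DeliveryStreamName=STREAM_NAME,
            Records=records,
        )
        return {"delivered": len(records) - response["FailedPutCount"]}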
SQL skills. In addition, you will have a strong desire to work with Docker, Kubernetes, Airflow and AWS data technologies such as Athena, Redshift, EMR and various other tools in the AWS ecosystem. You would be joining a team of 25+ engineers across mobile, web, data and …
pipelines and doing transformation and ingestion in a certain tech suite. Extensive experience in implementing solutions around the AWS cloud environment (S3, Snowflake, Athena, Glue), In-depth understanding of database structure principles, Strong knowledge of database structure systems and data mining, Excellent understanding of Data Modelling (ERwin, PowerDesigner …
and identify future opportunities for leveraging AWS services and implement effective metrics and monitoring processes. Your Skills and Experience Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR), Java, Scala, Python, Spark. Experience of developing enterprise-grade ETL/ELT data pipelines and demonstrable knowledge of applying Data Engineering …
integrating and automating business workflows, including data-driven processes and system integrations. Familiarity with analytics platforms and tools such as GCP (BigQuery), AWS (Glue, Athena), or Azure Databricks. Proficiency in Python or .NET, with experience in both or the ability to quickly learn new technologies. Experience with front-end …
Computer Engineering or related field. Experience in containerization - Docker/Kubernetes. Experience in AWS cloud and services (S3, Lambda, Aurora, ECS, EKS, SageMaker, Bedrock, Athena, Secrets Manager, Certificate Manager, etc.) Proven DevOps/MLOps experience provisioning and maintaining infrastructure leveraging some of the following: Terraform, Ansible, AWS CDK, CloudFormation. …
services in production. Exposure to some of the following technologies (or equivalent): Apache Spark, AWS Redshift, AWS S3, Cassandra (and other NoSQL systems), AWS Athena, Apache Kafka, Apache Flink, AWS, and service-oriented architecture. What you'll get: Full responsibility for projects from day one, a collaborative team, and …
TypeScript or Python A degree in Computer Science or equivalent Strong attention to detail, communication, and documentation skills Experience with AWS (Lambda, S3, DynamoDB, Athena), Linux BASH, Confluence, or JIRA Hybrid role, two days at the office …