algorithms. You would help lead efforts to lower costs for customers and improve the stability and durability of our services. Amazon Aurora: Imagine a database where you don't have to worry about the configuration or capacity of your database. Where you don't have … your product or service will put on it and you only pay for what you use. Want to learn more? Then read on. Amazon Aurora Serverless is an on-demand, auto-scaling configuration for Amazon Aurora where the database will automatically start up, shut down, and scale … Serverless we aim to do nothing less than revolutionize the database business. Aurora Serverless builds on top of foundational AWS services such as EC2, S3 and DynamoDB, and we are breaking new ground in the way that customers experience databases. To learn more about Amazon Aurora Serverless more »
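The start-up, shut-down, and scale behaviour the listing describes can be illustrated with a toy capacity controller. This is a minimal sketch in plain Python; the `CapacityTracker` class, the connections-per-capacity-unit ratio, and the thresholds are all made up for illustration and are not the actual Aurora Serverless implementation.

```python
# Toy model of on-demand auto-scaling as Aurora Serverless advertises:
# scale capacity up under load, down when light, and pause when idle.
# Class name, ratio, and limits are illustrative assumptions only.

class CapacityTracker:
    def __init__(self, min_acu=1, max_acu=64):
        self.min_acu = min_acu      # minimum capacity units when active
        self.max_acu = max_acu      # hard ceiling on capacity units
        self.acu = 0                # 0 == paused (no demand yet)

    def observe(self, active_connections):
        """Adjust capacity from a single load sample."""
        if active_connections == 0:
            self.acu = 0            # auto-pause when fully idle
            return self.acu
        # Assume ~25 connections per capacity unit (made-up ratio).
        needed = max(self.min_acu, -(-active_connections // 25))
        self.acu = min(self.max_acu, needed)
        return self.acu

tracker = CapacityTracker()
print(tracker.observe(10))    # light load -> minimum capacity: 1
print(tracker.observe(300))   # heavy load -> scales up: 12
print(tracker.observe(0))     # idle -> pauses: 0
```

The real service makes these decisions from CPU, connections, and memory metrics; the sketch only shows the scale-up/pause shape of the behaviour.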
to join our Network Fabric Engineering (NFE) team. As a Network Development Engineer, you will be responsible for building, deploying and scaling the Amazon networks that support AWS, customers, and other business units, across multiple global datacenters. AWS Core Networking is focused on building Data Centers and the … the areas of network sustaining engineering, network deployment/implementation, network scaling, technology refresh, best practices application, and/or network optimization. • Sustain Amazon Web Services' next generation networks for significant AWS customers by providing critical network support to diagnose, mitigate impact, and resolve large-scale networking events. … needed. About the team AWS NFE (Network Fabric Engineering) owns datacenter network services that provide unconstrained connectivity to our customers, such as EC2, EBS, S3 and Amazon CDO Services. They own the networks internal to the datacenters end to end, including scaling and operational functions, for both more »
assistant. Our current tech stack includes, but is not limited to: Java 8, MySQL, Gradle, XML, JSON, various AWS services including EC2, RDS and S3, JavaScript, HTML, CSS, jQuery. Required Skills & Experience: At least 8 years' experience as a software engineer, working with Java-based technologies. At least more »
services such as AWS Glue, AWS Lambda, and AWS Data Pipeline. Data Storage Management : Manage and optimize data storage solutions, including Amazon S3, Amazon Redshift, and AWS RDS. Data Quality and Validation : Ensure data quality and integrity through validation and cleansing techniques. Collaboration : Work closely … a focus on AWS and cloud technologies. Technical Skills : Proficiency in programming languages such as Python, Java, or Scala. Experience with AWS services including S3, Redshift, Glue, Lambda, and RDS. Strong SQL skills for data manipulation and querying. Familiarity with ETL tools and processes. Understanding of data modeling and more »
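The "validation and cleansing" duty above can be sketched as a small routine. The field names and rules here are hypothetical; a real pipeline would run the equivalent logic inside Glue, Lambda, or a similar managed service.

```python
# Minimal data-quality sketch: reject rows that fail validation and
# normalise the ones that pass. Field names ("email", "age") and the
# rules are made-up placeholders for illustration.

def clean_records(records):
    valid, rejected = [], []
    for row in records:
        email = (row.get("email") or "").strip().lower()
        age = row.get("age")
        if "@" not in email or not isinstance(age, int) or age < 0:
            rejected.append(row)        # quarantine bad rows for review
            continue
        valid.append({"email": email, "age": age})
    return valid, rejected

rows = [
    {"email": "  A@Example.com ", "age": 30},   # cleansed and kept
    {"email": "not-an-email", "age": 25},        # rejected: bad email
    {"email": "b@example.com", "age": -1},       # rejected: bad age
]
valid, rejected = clean_records(rows)
print(len(valid), len(rejected))   # 1 2
```

Separating the cleansed output from the rejects, rather than silently dropping bad rows, is what lets a pipeline report data-quality metrics downstream.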
Greater London, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
Design, develop, and maintain backend services and APIs using Java and Scala. Implement scalable solutions on AWS cloud platform, leveraging services such as EC2, S3, Lambda, and DynamoDB. Utilize the Spring Framework to build and manage application components, including dependency injection, data access, and transaction management. Collaborate with product more »
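The dependency-injection style mentioned above (which the Spring Framework provides for Java beans via annotations) can be shown language-agnostically. This is a minimal constructor-injection sketch with hypothetical class names, not Spring itself.

```python
# Constructor injection: the service receives its data-access
# dependency instead of constructing it, so a fake can be swapped in
# for tests. Class names are hypothetical; in Spring the equivalent
# Java beans would be wired automatically via annotations.

class OrderRepository:
    def find(self, order_id):
        return {"id": order_id, "status": "shipped"}

class OrderService:
    def __init__(self, repository):
        self.repository = repository    # injected, not hard-wired

    def status(self, order_id):
        return self.repository.find(order_id)["status"]

service = OrderService(OrderRepository())
print(service.status(42))   # shipped
```

Because `OrderService` never names a concrete repository, a test can pass in a stub without touching a real database.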
in our on-call rotation, and resolve production issues Continuously improve automation, monitoring and deployment processes Requirements: Experience with AWS services such as ECS, S3, RDS, Lambda, CloudFront, etc. Experience with monitoring tools like DataDog, CloudWatch, and Grafana Experience with Docker, ECS, Kubernetes or similar containerisation technologies Knowledge of more »
deadlines/sprint goals. Strong experience with infrastructure as code on AWS using a wide range of AWS services; ECS and networking especially, but including S3, SQS, RDS, CloudWatch. Python application or Java expert, but keen polyglot - very confident in at least one other mainstream language. Comfortable with SQL and NoSQL databases more »
as code, ensuring consistency and efficiency in deployment processes. Implement and manage hosting and orchestration solutions using Docker, Fargate, Service Catalogue, Service Manager, AWS S3, MKS, Kubernetes, AWS Autoscaling Groups (ASG), EC2 instances, and ELB (Elastic Load Balancer). Utilize Consul for configuration management and Vault for secure more »
mid-term solutions Experience of building data warehouses, enterprise data warehouses or data lakes with ETL pipelines Experience with AWS tools including Glue, Lambda, S3, RDS, is preferred An understanding of CICD concepts including having a working knowledge of GitHub Understanding of different data warehouse architectures is preferred (especially more »
/CD. Some experience in system architecture and making technical decisions. Some exposure to AWS (or another cloud platform) and its services and best practices, such as EC2, S3, auto-scaling, and security. Passion for learning and problem solving. At least one of the following: Working on an AI driven product. Used to working more »
responsibilities: Requirements analysis and design of solutions for the data platform; Integration and analysis of solutions; Understanding of toolsets including AWS hosted databases (Postgres, S3-based data lake, AWS Glue, MySQL, Athena); Integration services (AWS API Gateway, Lambda functions), nice to have MuleSoft or Axway; ETL services (AWS Glue more »
Leeds, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
for certifications and loads more Location: Leeds + very flexible hybrid working model (remote) Key skills desired/what you will learn: Terraform AWS: S3, Lambda/Serverless TypeScript, NodeJS, and ReactJS Python CI/CD: CircleCI, GitHub Actions Data Engineering technologies can be taught longer-term Role overview more »
experience Designing, creating and calling HTTP APIs SQL databases, e.g. Postgres, MySQL, MariaDB Experience in using AWS services - 3+ of the following: EC2, RDS, S3, Route 53, Elasticsearch, EKS, CloudWatch, CloudFront. Works well in a team and with minimal supervision Desirable Requirements: Experience with: CI/CD pipelines more »
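The SQL-database requirement above can be illustrated with parameterised queries, the habit that matters most across Postgres, MySQL, and MariaDB alike. This sketch uses the standard-library `sqlite3` module as a stand-in for those servers; the table and columns are made up.

```python
import sqlite3

# Parameterised queries (the "?" placeholders) keep user input out of
# the SQL string, which is the portable defence against injection.
# sqlite3 stands in here for Postgres/MySQL/MariaDB; the "users"
# table is an illustrative example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("Ada",))
conn.commit()

row = conn.execute(
    "SELECT name FROM users WHERE id = ?", (1,)
).fetchone()
print(row[0])   # Ada
```

The same DB-API shape carries over to `psycopg2` or `mysqlclient`, though those drivers use `%s` placeholders rather than `?`.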
Reading, England, United Kingdom Hybrid / WFH Options
BJSS
projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services more »
Leeds, England, United Kingdom Hybrid / WFH Options
BJSS
projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services more »
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
BJSS
projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services more »
Greater Leeds Area, United Kingdom Hybrid / WFH Options
Trust In SODA
to hit the ground running, they are looking to hire a Lead Engineer to spearhead the project with good knowledge of: AWS services (EC2, S3, ECS) Databases (RDS, Glue) Serverless (Lambda, SNS, SQS) Terraform GitHub Actions CircleCI TypeScript Node.js Driving DevOps adoption by defining best practices and guiding developers more »
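The serverless stack above (Lambda consuming SNS/SQS) centres on a handler function. This sketch shows the documented shape of an SQS-triggered Lambda handler; the `order_id` field and the processing logic are made-up placeholders.

```python
import json

# Shape of an AWS Lambda handler consuming an SQS-style event.
# The Records/body envelope follows the documented SQS->Lambda event
# format; the business logic ("order_id") is a hypothetical example.
def handler(event, context=None):
    processed = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])   # SQS delivers the message body as a string
        processed.append(body["order_id"])
    return {"statusCode": 200, "processed": processed}

event = {"Records": [{"body": json.dumps({"order_id": 7})}]}
print(handler(event))   # {'statusCode': 200, 'processed': [7]}
```

Keeping the handler a pure function of its event makes it trivially unit-testable before any Terraform or CircleCI deployment is involved.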
such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/Fargate. Collaborative Approach: Proven ability to work effectively with both business and technical stakeholders, taking ownership of end-to-end more »
London, England, United Kingdom Hybrid / WFH Options
Vanguard
technology presence, and this role will work on leading technologies within a Microservice architecture, with the development stack including Java 8, Spring Framework, Amazon Web Services (including AWS Analytics), Atlassian toolsets (Bamboo, BitBucket), Cucumber automated testing and Tableau. The team is Agile based and operates leveraging a continuous … technology, e.g. DynamoDB Experience using Tableau and data analytics on top of a data hub in AWS Experience with AWS technology such as Amazon Simple Storage Service (S3), Lambda, DynamoDB, Amazon Elastic Compute Cloud (EC2), Amazon Elastic MapReduce Experience with Python, Selenium Experience using Maven more »
Exposure to the following is advantageous: Experience of storage environment protocols (NAS, SAN, RAID, distributed file system, object storage, RESTful API, Amazon S3). Solid networking knowledge: low level networking concepts, bridging, routing, VLAN, TCP/IP etc. Knowledge of High Availability environments (distributed system, load balancing more »
Develop solutions to parse and process tabular data from PDF and HTML documents Maintain, support and expand existing data pipelines using dbt, Snowflake and S3 Implement standardised data ingress/egress pipelines Onboard new, disparate data sets, sourced from many and varied data vendors, covering all asset types and more »
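Parsing tabular data out of HTML documents, as the listing describes, can be sketched with only the standard library. The `TableParser` class and the sample table are illustrative; real vendor feeds would need far more robust handling before the rows reach dbt/Snowflake/S3 stages.

```python
from html.parser import HTMLParser

# Minimal sketch of extracting rows from an HTML <table> using the
# stdlib parser. Class name and sample markup are made up; this
# ignores colspans, nested tables, and other real-world mess.
class TableParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

parser = TableParser()
parser.feed("<table><tr><th>asset</th></tr><tr><td>AAPL</td></tr></table>")
print(parser.rows)   # [['asset'], ['AAPL']]
```

PDF tables have no equivalent in the standard library and usually require a dedicated extraction tool, which is why the listing calls them out separately.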
identifying the necessary edge cases that need to be tested in order to fully understand the data; programming in Python or Ruby, utilizing AWS S3, MongoDB, PostgreSQL, AWS Redshift or similar database technologies; using Jupyter notebooks and one or more statistical visualization or graphing toolkits such as Excel, Qlik more »
and programming principles knowledge (C++, Rust, C). Experience in running workloads in the cloud and designing clients interacting with it (networking, protocols, AWS, Docker, S3, databases, Kubernetes). Knowledge in asset management for the game and movie industry, including revision control, cook pipelines, and digital asset management (Git, Perforce, DDC more »