Hive, Redis, and MySQL. Experience in using cloud-native services like GKE, EKS, AWS/GCP load balancing, AWS/GCP cloud storage platforms (S3, storage buckets). Experience in designing and analyzing large-scale distributed systems. Experience leading and hiring engineers. TikTok is committed to creating an inclusive space where…
or more consecutive years. Have practical experience designing and delivering using AWS or Azure, such as: Azure SQL Data Warehouse, Azure Data Lake, AWS S3, AWS RDS, AWS Lambda or similar. Have experience with open-source big data products, e.g. Hadoop Hive, Pig, Impala or similar. Have experience with Open…
field. 5+ years of experience in software development with a strong focus on C# and .NET. Proven experience with AWS services such as EC2, S3, Lambda, RDS, and DynamoDB. Hands-on experience with AI and machine learning frameworks and libraries. Strong understanding of software development methodologies, tools, and processes.…
experience designing, creating and calling HTTP APIs. SQL databases, e.g. Postgres, MySQL, MariaDB. Experience in using AWS services - 3+ of the following: EC2, RDS, S3, Route 53, Elasticsearch, EKS, CloudWatch, CloudFront. Works well in a team and with minimal supervision. Desirable Requirements: Experience with: CI/CD pipelines…
with Java and Kotlin development, with a strong understanding of object-oriented programming principles. Proficiency in AWS services, including but not limited to EC2, S3, Lambda, and DynamoDB. Solid understanding of TypeScript/JavaScript for frontend development. Experience with microservices architecture and RESTful API design. Strong problem-solving skills…
London (city), London, England Hybrid / WFH Options
T Rowe Price
technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate would be considered particularly beneficial. Extensive experience with database technologies (SQL/NoSQL). Proven ability to work collaboratively…
s degree in Computer Science, Information Technology, or a related field. Master’s degree preferred. Technical Expertise: Proven experience with AWS data services including S3, Redshift, RDS, Glue, Lambda, Kinesis, and Athena. Strong understanding of data modeling, ETL processes, and data warehousing. Proficiency in SQL and experience with database…
Greater London, England, United Kingdom Hybrid / WFH Options
BJSS
projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You: You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services…
London, Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
years of experience as a Data Engineer or in a similar role. Strong proficiency in AWS services related to data engineering (e.g., S3, Redshift, Glue). Hands-on experience with Databricks for data processing and analytics. Proficient in Python programming for data manipulation and automation. Solid understanding of Apache…
Familiar with Python test automation. Experience with SQL and time-series databases. Familiar with Parquet, Arrow, Airflow, Databricks. Experience with AWS cloud services, such as S3, EC2, RDS etc. Quality engineering best practice and tooling, including TDD, BDD. This is an inside IR35 contract. If you're interested in this…
Greater London, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
such as Spring. Familiarity with Scala programming language and its ecosystem. Basic understanding of cloud computing concepts and experience with AWS services (e.g., EC2, S3, Lambda, DynamoDB). Strong problem-solving skills and ability to learn new technologies quickly. Excellent communication and teamwork skills. Benefits: Competitive salary and benefits…
HTML/CSS, Docker, GitHub Actions, AWS – CDK, ECS, ALB, VPC. APIs: Go, REST, CDK. AWS – CDK, Lambda, API Gateway, DynamoDB, EventBridge, S3, serverless and event-driven architecture. Platforms: Go, CDK, AWS. Skills and experience we’re looking for: Experience developing across a wide variety of programming languages with…
and NPPV3) Job Description: As an AWS DevOps Engineer, you will be responsible for designing, implementing, and maintaining scalable infrastructure solutions on the Amazon Web Services (AWS) platform. You will work closely with development teams to streamline the deployment and operation of applications, ensuring high availability, performance, and … related field. Proven experience as a DevOps Engineer or similar role, with a focus on AWS. Strong proficiency in AWS services such as EC2, S3, RDS, Lambda, IAM, etc. Experience with infrastructure-as-code (IaC) tools like AWS CloudFormation, Terraform, or Ansible. Hands-on experience with CI/CD…
integrations Requirements: Minimum 5 years of experience with Dataiku, demonstrating a deep understanding of its capabilities. Proven experience with AWS platform engineering, including EC2, S3, IAM, and security best practices. Comprehensive understanding of data security principles and best practices. Expert understanding of CI/CD principles and tools (e.g.…
/sprint goals. Strong experience with infrastructure as code on AWS using a wide range of AWS services; ECS and networking especially, but including S3, SQS, RDS, CloudWatch. Python application or Java expert, but keen polyglot - very confident in at least one other mainstream language. Comfortable with SQL and…
/DevOps Engineer to join an internationally fast-growing company to help develop their digital platform. Existing tech stack: Python, Django, Django Ninja, Lambase, S3, AWS with ECS, ECR, RDS, PostgreSQL, SQS, SNS, Terraform, GitHub + Actions, Cloudflare. The role will be working directly with the CTO and Technical…
using Kafka or equivalent distributed event store and stream-processing platform. Experience working with Redis or equivalent in-memory storage. Experience working with AWS S3, Athena, ECS, CloudFormation, Lambdas & CloudWatch. Experience with concurrent development source control (Git). Systems integration experience with networking, data migrations, API integration and…
process data for modeling • Experience with SQL • Experience in the data/BI space Preferred qualifications • Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift • Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets If you are interested…
such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/Fargate. Collaborative Approach: Proven ability to work effectively with both business and technical stakeholders, taking ownership of end-to-end…
and/or 5 EU working hours. Good written & verbal communication skills in English. Nice to have: Node.js (NPM), Composer, AWS EC2, S3, Route 53, CloudFront, Vite, TypeScript. Experience in agile Scrum methodology. Strengths in problem solving, attention to detail, ability to work in a deadline-driven work environment…
through the full product cycle from design to deployment. Experience with middleware technologies such as AMQP, Kafka, JMS. Knowledge of data storage systems like S3, Couchbase, MongoDB. Practical experience with CI/CD pipelines and DevOps engineering. Why Join the Client: Be at the forefront of building the next…