to share knowledge with your peers)

Nice To Have
Knowledge of systems design within a modern cloud-based environment (AWS, GCP), including AWS primitives such as IAM, S3, RDS, EMR, ECS and more. Advanced experience with, and an understanding of the tradeoffs of, at least one of the following data lake table/file formats: Delta Lake, Parquet, Iceberg, Hudi.
disaster-recovery drills for stream and batch environments.

Architecture & Automation
Collaborate with data engineering and product teams to architect scalable, fault-tolerant pipelines using AWS services (e.g., Step Functions, EMR, Lambda, Redshift) integrated with Apache Flink and Kafka. Troubleshoot and maintain Python-based applications. Harden CI/CD for data jobs: implement automated testing of data schemas, versioned Flink …
• Experience in a data-focused SRE, Data Platform, or DevOps role
• Strong knowledge of Apache Flink, Kafka, and Python in production environments
• Hands-on experience with AWS (Lambda, EMR, Step Functions, Redshift, etc.)
• Comfortable with monitoring tools, distributed-systems debugging, and incident response

Reference Number: BBBH259303
To apply for this role, or to be considered for further …
• Software engineering using a high-level language such as Go, Java, JavaScript, or Python
• Distributed software architecture exposure in high-volume production scenarios
• Working with Data Mesh and big-data technologies such as EMR, Spark, Databricks
• Designing, tracking, and testing to SLOs, and Chaos Engineering to error budgets
• Implementing Business Continuity (BCP) and Disaster Recovery (DRP) plans, including tracking RTO and RPO
• Cryptocurrency …
Leeds, West Yorkshire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
AWS Data Engineer - IT Consultancy - Leeds - Hybrid

Are you a data engineer with a passion for AWS and solving real-world business problems? I am working with a fast-growing IT consultancy based in Leeds that's delivering cutting-edge …
Falls Church, Virginia, United States Hybrid / WFH Options
Rackner
Object Relational Mapping (ORM) libraries and SQL, and implementing data validation business rules. Mature AWS cloud solutions with a specific focus on data services such as S3, RDS, and EMR.

Nice to have:
• M.S. in Computer Science or a related field
• AI/ML
• Kubernetes (Rancher RKE2, AWS EKS) and microservice architectures
• Data engineering and big-data technologies such as Apache Airflow, Trino, and AWS EMR
• NIST Risk Management Framework and the security accreditation process and tasks

What will make you successful:
• Using DevSecOps best practices to rapidly develop and deliver first-class solutions for a DoD customer
• Developing software using leading languages and frameworks including Python (FastAPI), AWS, and Rancher Kubernetes
• Leveraging Test Driven Development and User Centered Design best …
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Sannik
from the model, with a focus on AWS S3 storage.
• Collaborate with the development team to implement architectural changes and system tweaks.
• Work with AWS services such as Lambda, EMR, and DynamoDB to optimize data management processes.
• Utilize Spark for data processing and analysis tasks.
• Ensure the integrity and security of data during the extraction and processing stages.
• Troubleshoot …

programming is essential. Strong understanding of cloud technologies, including AWS services such as S3, Lambda, and DynamoDB. Demonstrated experience with database management and backend development. Familiarity with tools like EMR and Spark for data processing. Ability to work independently and as part of a collaborative team. Excellent problem-solving and analytical skills. Prior experience in the financial or mortgage industries …