implement new technologies to enhance network and compute performance, container management, workload orchestration, and cloud integration. Support the business by maintaining and managing storage requirements on NFS, CIFS, and S3-compatible systems within a Linux-based environment. Support and enhance highly available database architectures, ensuring systems are up to date with minimal downtime, following best practices, and coordinating with …
of AI, and AI ethics An understanding of data safety in the use of Large Language Models Knowledge and experience of either AWS or Azure: AWS (boto3, Bedrock, SageMaker, Lambda, S3, EC2) Azure (Azure OpenAI Service, Cosmos DB) Python LangGraph Neo4j/Cypher Other coding languages/frameworks e.g. Java/.NET AI RAG (retrieval-augmented generation) Graph RAG …
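By way of illustration of the boto3/Bedrock pairing named above, a minimal sketch using the Converse API (assumes a recent boto3; the region and model ID are placeholders, not from the listing):

```python
import boto3

# Region and model ID are illustrative placeholders
bedrock = boto3.client("bedrock-runtime", region_name="eu-west-2")

def ask(question: str) -> str:
    """Send a single-turn prompt to a Bedrock-hosted model and return its reply."""
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
        messages=[{"role": "user", "content": [{"text": question}]}],
    )
    return response["output"]["message"]["content"][0]["text"]
```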
modern Python features (asyncio, type hints, context managers) Deep experience using AWS SDKs in both Python (boto3) and Go (aws-sdk-go-v2) to interact with services like EC2, S3, IAM, CloudWatch, and Lambda. Experience with pytest, unittest, and mocking tools (unittest.mock, responses). …
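As a hedged sketch of the boto3-plus-mocking combination this excerpt names (the bucket name and helper function are hypothetical):

```python
from datetime import datetime, timezone
from unittest.mock import MagicMock, patch

import boto3

def latest_object_key(bucket: str) -> str | None:
    """Return the key of the most recently modified object in an S3 bucket."""
    s3 = boto3.client("s3")
    response = s3.list_objects_v2(Bucket=bucket)
    contents = response.get("Contents", [])
    if not contents:
        return None
    return max(contents, key=lambda obj: obj["LastModified"])["Key"]

# A pytest-style test that stubs out the AWS client with unittest.mock,
# so no real network call is made.
def test_latest_object_key():
    fake_s3 = MagicMock()
    fake_s3.list_objects_v2.return_value = {
        "Contents": [
            {"Key": "old.csv", "LastModified": datetime(2024, 1, 1, tzinfo=timezone.utc)},
            {"Key": "new.csv", "LastModified": datetime(2024, 6, 1, tzinfo=timezone.utc)},
        ]
    }
    with patch("boto3.client", return_value=fake_s3):
        assert latest_object_key("example-bucket") == "new.csv"
```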
services as required Essential Requirements 3+ years consulting/managed services experience Solid data lake experience with AWS data solutions (Redshift, Glue, Athena, Lake Formation) Advanced AWS skills (EC2, S3, VPC, IAM, Lambda) Infrastructure as Code experience (CloudFormation/Terraform) AWS Solutions Architect Associate certification Strong communication and presentation abilities Desirable Skills Azure fundamentals knowledge FinOps and cost management …
/ELT processes, automation Technical Requirements: Strong proficiency in SQL and Python programming Extensive experience with data modeling and data warehouse concepts Advanced knowledge of AWS data services, including: S3, Redshift, AWS Glue, AWS Lambda Experience with Infrastructure as Code using the AWS CDK Proficiency in ETL/ELT processes and best practices Experience with data visualization tools (QuickSight) Strong …
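For flavour, a minimal AWS CDK v2 (Python) sketch of the kind of S3-plus-Glue wiring such requirements describe; the stack, bucket, and database names are hypothetical:

```python
from aws_cdk import App, Stack, aws_glue as glue, aws_s3 as s3
from constructs import Construct

class DataPlatformStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Raw landing bucket for inbound files
        s3.Bucket(self, "RawBucket", versioned=True)
        # Glue catalog database that crawlers/jobs can register tables into
        glue.CfnDatabase(
            self,
            "AnalyticsDatabase",
            catalog_id=self.account,
            database_input=glue.CfnDatabase.DatabaseInputProperty(name="analytics"),
        )

app = App()
DataPlatformStack(app, "DataPlatformStack")
app.synth()
```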
engineering principles, REST APIs, and asynchronous programming Experience with modern frontend development, ideally with React Nice to Have: Experience with NestJS Cloud experience with AWS services (Lambda, API Gateway, S3, etc.) Exposure to payments or fintech environments This role is three days on-site in central London and offers a quick two- to three-stage interview process with interview slots …
with AI and predictive modeling experts to create applications that positively impact the lives of millions of people Drive innovation using Python, FastAPI and AWS infrastructure tools (EC2, S3, RDS, Lambda, etc.), SQL/NoSQL What We're Looking For: Experience working with autonomy and taking the lead on back-end development Experience improving the scalability and performance …
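A minimal sketch of the Python/FastAPI style of service such back-end roles centre on; the routes and model are illustrative only, with an in-memory dict standing in for a real store like RDS:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Prediction(BaseModel):
    subject_id: str
    score: float

# In-memory stand-in for a real database such as RDS or DynamoDB
_PREDICTIONS: dict[str, Prediction] = {}

@app.post("/predictions", response_model=Prediction)
async def create_prediction(prediction: Prediction) -> Prediction:
    _PREDICTIONS[prediction.subject_id] = prediction
    return prediction

@app.get("/predictions/{subject_id}", response_model=Prediction)
async def read_prediction(subject_id: str) -> Prediction:
    if subject_id not in _PREDICTIONS:
        raise HTTPException(status_code=404, detail="prediction not found")
    return _PREDICTIONS[subject_id]
```

Saved as main.py, this runs locally with `uvicorn main:app --reload`.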
for this role. Minimum Qualifications Production experience in operationalizing large-scale distributed, fault-tolerant, multi-tenant services Experience working with AWS services, including but not limited to: EC2, S3, EKS, DynamoDB, EBS, CloudFormation, Lambda, VPC, Route 53 Experience operating in core SDLC CI/CD processes, along with SRE concepts - monitoring, alerting, incident management Worked within DevOps …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
CD, infrastructure-as-code, and modern data tooling Introduce and advocate for scalable, efficient data processes and platform enhancements Tech Environment: Python, SQL, Spark, Airflow, dbt, Snowflake, Postgres, AWS (S3), Docker, Terraform Exposure to Apache Iceberg, streaming tools (Kafka, Kinesis), and ML pipelines is a bonus What We're Looking For: 5+ years in Data Engineering, including 2+ years …
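As an illustrative sketch of the Airflow-orchestrated Python tooling this stack implies (assumes Airflow 2.4+, where `schedule` replaced `schedule_interval`; the DAG id and task are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load() -> None:
    # Placeholder for e.g. pulling from Postgres and loading to Snowflake/S3
    print("extracting and loading")

with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```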
HTML, XML, SQL, Bash. Experience with build tools like Maven and Gradle. Knowledge of relational DBMSs such as Postgres. Experience using AWS services via Terraform (Secrets Manager, Kinesis Firehose, S3, OpenSearch, etc.). Familiarity with monitoring tools like New Relic and Kibana for metrics and alerts. Experience with Docker, containers, Helm, Kustomize, and Kubernetes management tools. Knowledge of …
engineered solutions Generative AI & ML: Integrate Gen AI solutions and explore machine learning for business applications AWS Cloud Expertise: Architect and deploy systems using AWS services (Lambda, API Gateway, S3, etc.) Security Mindset: Implement security best practices across development processes CI/CD & Automation: Set up CI/CD pipelines and automated testing to accelerate delivery Collaboration: Work with …
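A minimal sketch of the Lambda-behind-API-Gateway pattern such ads reference, using the standard proxy-integration event shape; the handler and response payload are illustrative:

```python
import json

def lambda_handler(event: dict, context) -> dict:
    """Handle an API Gateway (proxy integration) request."""
    # Query string parameters arrive as a dict, or None when absent
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```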
a focus on building scalable data solutions. Experience with data pipeline orchestration tools such as Dagster or similar. Familiarity with cloud platforms (e.g., AWS) and their data services (e.g., S3, Redshift, Snowflake). Understanding of data warehousing concepts and experience with modern warehousing solutions. Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows …
Server, MongoDB, DB2 Integration: Enterprise Integration Patterns, Apache Camel DevOps: GitLab CI/CD, Ansible, Observability/Telemetry Unix: Bash scripting, system and process monitoring Cloud: Experience with AWS, S3 Web: HTML5 frameworks: React, Redux, SSE/WebSockets, CSS/Bootstrap Career Growth and Learning Opportunities Great opportunity to be at the intersection of business and technology. Gaining in-depth knowledge …
direction, and collaborate effectively across business and technology functions Desirable skills Familiarity with machine learning pipelines and MLOps practices Additional experience with Databricks and specific AWS services such as Glue, S3, Lambda Proficient in Git, CI/CD pipelines, and DevOps tools (e.g., Azure DevOps) Hands-on experience with web scraping, REST API integrations, and streaming data pipelines Knowledge of …
scalable, and efficient data infrastructure. What you'll be doing - your accountabilities Lead the design and implementation of robust, scalable, and secure data solutions using AWS services such as S3, Glue, Lambda, Redshift, EMR, Kinesis, and more, covering data pipelines, warehousing, and lakehouse architectures. Drive the migration of legacy data workflows to lakehouse architectures, leveraging Apache Iceberg to enable …
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling) Familiarity with Airflow or similar orchestration tools Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure Experience using version control and CI/CD tools like Git and GitHub Actions Confident working independently and taking ownership of problems and solutions Comfortable …
communication skills. Strong proficiency in at least one backend programming language. Strong expertise in the following technologies: Kubernetes (k8s), Postgres, Redis, and AWS technologies like SQS/SNS, DynamoDB, S3, and more. Willingness to work hands-on as a developer focused on complex, high-impact problems. Why Personio Personio is an equal opportunities employer, committed to building an …
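A hedged sketch of the SQS side of that stack via boto3; the queue URL is a placeholder, not taken from the listing:

```python
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/example-queue"  # placeholder

def publish_event(payload: dict) -> None:
    """Enqueue a JSON-serialised event for asynchronous processing."""
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(payload))

def drain_once() -> list[dict]:
    """Receive up to ten messages with long polling, then delete them."""
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=5
    )
    events = []
    for message in response.get("Messages", []):
        events.append(json.loads(message["Body"]))
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
    return events
```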
and data governance processes to enable confident data self-service. Support monitoring, testing, and validation efforts to maintain data quality and reliability. Work with cloud infrastructure services, mainly AWS (S3, Glue, Lambda), and use Python scripting for automation where needed. Help identify opportunities to improve data processes, scalability, and usability. What You Bring 2-4 years' experience in analytics …
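For illustration, a short boto3 automation sketch in the spirit of the S3/Glue/Lambda stack above; the job name, bucket, and prefix are hypothetical:

```python
import boto3

glue = boto3.client("glue")
s3 = boto3.client("s3")

def run_job_if_input_present(job_name: str, bucket: str, prefix: str) -> str | None:
    """Start a Glue job only when objects exist under the given S3 prefix."""
    listing = s3.list_objects_v2(Bucket=bucket, Prefix=prefix, MaxKeys=1)
    if listing.get("KeyCount", 0) == 0:
        return None  # nothing to process
    run = glue.start_job_run(JobName=job_name)
    return run["JobRunId"]
```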
engineering, including: Snowflake (data warehousing and performance tuning) Informatica (ETL/ELT development and orchestration) - nice to have Python (data processing and scripting) - required AWS (data services such as S3, Glue, Redshift, Lambda) - required Cloud data practices and platforms - AWS required Basic knowledge of related disciplines such as data science, software engineering, and business analytics. Proven ability to independently …
Nest Database: Dynamo/PostgreSQL/MySQL/Mongo Experience with AWS: Lambda, DynamoDB & AppSync (GraphQL) Will be a plus: Amplify, API Gateway, CloudFormation, Cognito, SQS, Secrets Manager, S3, CloudWatch, KMS Experience with webhooks Why Join Us? Be part of a growing team at the intersection of real estate, Web3, and finance. Work on high-impact systems …
additional technologies (listed below) is advantageous: Kotlin Cloud Technologies (Kubernetes, OpenShift) Messaging Technologies (Kafka, Solace, TIBCO) Database/Data Store/Data Query Technologies (SQL Server, Trino, Mongo, S3) Observability Technologies (OpenTelemetry, Elastic Stack/ELK, Grafana) This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned …
in the organization Qualifications Understanding of data engineering (including SQL, Python, data warehousing, ETL, dimensional modelling, analytics) Understanding of cloud data infrastructure elements, ideally AWS (Redshift, Glue, Athena, S3), and of existing governance frameworks for data quality and its dimensions (DAMA, GOV.UK, etc.) Experience in exploratory testing, test automation, and managing time and attention across multiple …
You'll Do: Architect and Build: Design and implement a robust, cloud-native data analytics platform spanning AWS, GCP, and other emerging cloud environments. You'll leverage services like S3/GCS, Glue, BigQuery, Pub/Sub, SQS/SNS, MWAA/Composer, and more to create a seamless data experience. (Required) Data Lake, Data Zone, Data Governance: Design …