London, England, United Kingdom Hybrid / WFH Options
Jobgether
… culture of transparency, innovation, and inclusive collaboration
5+ years of experience developing scalable backend systems
Strong proficiency in Python, TypeScript, and cloud platforms (especially AWS)
Hands-on experience with tools such as CDK, ECS, Step Functions, Lambda, and S3
Proficiency with GitHub or similar version control systems
… is posted on behalf of one of our partner companies. If you choose to apply, your application will go through our AI-powered 3-step screening process, where we automatically select the 5 best candidates. Our AI thoroughly analyzes every line of your CV and LinkedIn profile to assess …
London, England, United Kingdom Hybrid / WFH Options
Augusta Hitech
Key Responsibilities: Design and implement efficient data ingestion pipelines to collect and process large volumes of data from various sources. Hands-on experience with AWS Database Migration Service for seamless data migration between databases. Develop and maintain scalable data processing systems, ensuring high performance and reliability. Utilize advanced data … with cross-functional teams to understand data needs and deliver solutions that meet business requirements. Manage and optimize cloud-based infrastructure, particularly within the AWS ecosystem, including services such as S3, Step Functions, EC2, and IAM. Experience with cloud platforms and understanding of cloud architecture. Knowledge of SQL … to ensure optimal efficiency. Stay updated with emerging trends and technologies in data engineering and propose adaptations to existing systems as needed. Proficient in AWS Glue for ETL (Extract, Transform, Load) processes and data cataloging. Hands-on experience with AWS Lambda for serverless computing in data workflows. Knowledge …
London, England, United Kingdom Hybrid / WFH Options
DATAPAO
… by leading small delivery teams. Our projects are fast-paced, typically 2 to 4 months long, and primarily use Apache Spark/Databricks on AWS/Azure. You will manage customer relationships either alone or with a Project Manager, and support our pre-sales, mentoring, and hiring efforts. What … does it take to fit the bill? Technical Expertise: 5+ years in Data Engineering, focusing on cloud platforms (AWS, Azure, GCP); Proven experience with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); Extensive ETL/ELT and data pipeline orchestration experience (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions); Proficiency in SQL and Python for data transformation and optimization; Knowledge of CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation, Bicep); Hands-on experience with Databricks integration with BI tools (Power BI, Tableau, Looker). Consulting & Client-Facing Skills: Experience in consulting or product companies …
… analysis/debugging/diagnostic skills. Experience and understanding of testing frameworks (Jest, Mocha, Pytest, Unittest, etc.). Experience and knowledge of cloud services, preferably AWS (Lambda, Step Functions, DynamoDB, CloudFormation, etc.). Understanding of agile development methodologies. Much preferred: experience working with and understanding of high-level languages (Java …
… integrating with third-party services. Ability to read and review code, pair programming, and debugging code-related performance issues, SQL tuning, etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka). Experience with batch processing/ETL using Glue Jobs, AWS Lambda, and Step …
London, England, United Kingdom Hybrid / WFH Options
Datapao
… projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project Manager. Additionally, at this seniority level … it take to fit the bill? Technical Expertise: You (ideally) have 5+ years of experience in Data Engineering, with a focus on cloud platforms (AWS, Azure, GCP); You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); You’re proficient in SQL and Python, using them to transform and optimize data like a pro; You know your way around CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation, or …
Manchester, England, United Kingdom Hybrid / WFH Options
Arrow
… and pandas (or Polars), with a proven track record of delivering data systems. Experience with JSON, XML, CSV, and transforming unstructured data. Familiarity with AWS cloud-native tools like Lambda and Step Functions. Interest or experience with LLMs for data tasks. View pipelines as products: robust, debuggable, and …