London, England, United Kingdom Hybrid / WFH Options
Maclean Moore Ltd
* … pipelines using Apache Airflow
* Proficiency in Git-based version control tools
* Proficiency with Linux commands and Bash scripting
* Working experience with AWS big data services (EMR, Glue, Data Pipeline, Athena, S3, Step Functions, etc.) & the AWS CLI
* Experience with CI/CD tools such as Jenkins
* Experience working with relational and …
Chicago, Illinois, United States Hybrid / WFH Options
Request Technology - Robyn Honquest
* … AWS CLI and IAM, etc. (required)
* Experience with distributed message brokers using Kafka (required)
* Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required)
* Experience working with various types of databases, such as relational and NoSQL …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
… English
* Bachelor's degree in Computer Science or a related field (Master's degree preferred)
* Nice to have: experience with LLMs, vector databases, AWS EMR, Spark, and Python
Our commitment: Equal opportunities are important to us. We believe that diversity and inclusion at The Stepstone Group are critical to …
… role, you'll spearhead backend and data engineering and mentor team members.
Tech stack: Python, Flask, Redis, Postgres, React, Plotly, Docker, SQL, Athena & EMR Spark, ECS, and Temporal.
This is a 60/40 split between tech and leadership.
Your background: 8+ years' coding experience, 4+ years Python …
… specialise in using the latest frameworks, reference architectures and technologies on AWS, Azure and GCP.
Essential Skills and Experience:
* AWS (e.g., Athena, Redshift, Glue, EMR)
* Strong AWS Data Solution Architect experience on data-related projects
* Java, Scala, Python, Spark, SQL
* Experience of developing enterprise-grade ETL/ELT data …
* Applying Data Engineering best practices (coding practices to DS, unit testing, version control, code review)
* Big Data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc or GCP Cloud Data Fusion
* NoSQL databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore
* BigQuery and Data Studio/Looker …
Greater London, England, United Kingdom Hybrid / WFH Options
Humand Talent
* … 8+, with experience in Laravel/Symfony desired
* JavaScript/TypeScript proficiency with Node.js and React.js
* Familiarity with AWS services (Aurora, MSK Kafka, ECS, EMR)
* 3+ years of database experience, ideally with MySQL
* Hands-on experience with various data storage paradigms (e.g., RDBMS + document + KV stores …
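The "data storage paradigms" requirement above contrasts relational, key-value, and document models. A minimal stdlib-only sketch of the three access patterns (the table, keys, and record shapes are hypothetical stand-ins, with a dict playing the role of a KV/document store):

```python
import sqlite3
import json

# Relational (RDBMS): typed rows in tables, queried with SQL.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO users (id, name) VALUES (1, 'Ada')")
row = db.execute("SELECT name FROM users WHERE id = 1").fetchone()

# Key-value store: opaque values looked up by a single key.
kv = {}
kv["user:1"] = "Ada"

# Document store: nested, schema-flexible records, often JSON.
doc_store = {}
doc_store["user:1"] = json.dumps({"name": "Ada", "roles": ["engineer"]})
doc = json.loads(doc_store["user:1"])

print(row[0], kv["user:1"], doc["roles"][0])
```

The trade-off the bullet hints at: the relational version enforces a schema and supports ad-hoc queries, while the KV and document versions trade that for flexible shapes and simple key lookups.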
* PHP 8+
* Experience with the Laravel/Symfony framework
* Strong grasp of JavaScript/TypeScript
* Familiarity with AWS services including Aurora, MSK Kafka, ECS, and EMR
* Solid understanding of SQL
Key Responsibilities:
* Collaborate with a talented team to conceptualise, develop, and deploy scalable web applications
* Write clean, efficient, and maintainable …
… massively distributed cloud-hosted data platform that would be used by the entire firm.
Tech stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster.
Cloud: AWS, Lambdas, ECS services
This role would focus on various areas of Data Engineering, including: End to End …
* Strong Python/Java software engineering skills
* Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake
* Previous experience with Dremio, dbt, EMR or Dagster
* Good Computer Science fundamentals with strong knowledge of software and data architecture
If you would like more information on the above …