South East London, England, United Kingdom Hybrid / WFH Options
NatPower Marine
SQL and NoSQL databases, particularly Postgres.
· Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
· Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
· Experience with multiple data architecture paradigms (relational, non-structured, streaming)
· Knowledge of various data communication protocols (Rest API, GraphQL, RPCs, MQTT, AMQP more »
pipelines using Apache Airflow Proficiency in Git-based version control tools Proficiency with Linux commands and Bash scripting Working experience in AWS Big Data services (EMR, Glue, Data Pipelines, Athena, S3, Step Functions etc.) & AWS CLI Experience with CI/CD tools such as Jenkins Experience working with relational and more »
London (city), London, England Hybrid / WFH Options
T Rowe Price
years of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with Cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate would be considered particularly beneficial. Extensive more »
Chicago, Illinois, United States Hybrid / WFH Options
Request Technology - Robyn Honquest
AWS CLI and IAM etc. (required) Experience with distributed message brokers using Kafka (required) Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink etc. (required) Experience working with various types of databases like Relational, NoSQL more »
Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation, EMR, EventBridge, Athena, etc.) & metadata management tools (such as Amundsen, Atlas, DataHub, OpenDataDiscovery, Marquez, etc.). Experience with RDBMS like PostgreSQL would be a plus. Experience more »
Appomattox, Virginia, United States Hybrid / WFH Options
Maxar Technologies
and the Joint Enterprise Modeling and Analytics (JEMA) framework or Apache NiFi. Experience in AWS services including but not limited to EC2, SageMaker, S3, EMR, and IAM. Our salary ranges are market-driven and set to allow for flexibility. Individual pay will be competitive based on a candidate's more »
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
English Bachelor's degree in Computer Science or a related field (Master's degree preferred) Nice to have: experience with LLMs, Vector Databases, AWS EMR, Spark, and Python Our commitment: Equal opportunities are important to us. We believe that diversity and inclusion at The Stepstone Group are critical to more »
specialise in using the latest frameworks, reference architectures and technologies using AWS, Azure and GCP. Essential Skills and Experience:
* AWS (e.g., Athena, Redshift, Glue, EMR)
* Strong AWS Data Solution Architect Experience on Data Related Projects
* Java, Scala, Python, Spark, SQL
* Experience of developing enterprise grade ETL/ELT data … applying Data Engineering best practices (coding practices to DS, unit testing, version control, code review).
* Big Data Eco-Systems, Cloudera/Hortonworks, AWS EMR, GCP DataProc or GCP Cloud Data Fusion.
* NoSQL Databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore.
* BigQuery and Data Studio/Looker. more »
Greater London, England, United Kingdom Hybrid / WFH Options
Humand Talent
8+ with experience in Laravel/Symfony desired. JavaScript/TypeScript proficiency with Node.js and React. Familiarity with AWS Services (Aurora, MSK Kafka, ECS, EMR). 3+ years of database experience, ideally with MySQL. Hands-on experience with various data storage paradigms (e.g., RDBMS + Document + KV Stores more »
PHP 8+ Experience with Laravel/Symfony framework Strong grasp of JavaScript/TypeScript Familiarity with AWS services including Aurora, MSK Kafka, ECS, and EMR Solid understanding of SQL Key Responsibilities: Collaborate with a talented team to conceptualise, develop, and deploy scalable web applications Write clean, efficient, and maintainable more »
South East London, England, United Kingdom Hybrid / WFH Options
Durlston Partners
a massively distributed cloud-hosted data platform that would be used by the entire firm.
Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster.
Cloud: AWS, Lambdas, ECS services
This role would focus on various areas of Data Engineering including: End to End ETL pipeline … quality tooling.
Requirements:
Strong Python/Java Software Engineering skills
Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake
Previous experience with Dremio, dbt, EMR or Dagster
Good Computer Science fundamentals knowledge with strong knowledge of software and data architecture.
If you would like more information on the above, please apply more »