specialise in using the latest frameworks, reference architectures and technologies on AWS, Azure and GCP. Essential Skills and Experience: • AWS (e.g., Athena, Redshift, Glue, EMR) • Strong AWS Data Solution Architect experience on data-related projects • Java, Scala, Python, Spark, SQL • Experience developing enterprise-grade ETL/ELT data … applying Data Engineering best practices (coding practices to DS, unit testing, version control, code review). • Big Data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc or GCP Cloud Data Fusion. • NoSQL databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore. • BigQuery and Data Studio/Looker.
delivering projects using relevant data architecture paradigms such as Kimball, Data Vault, Data Mesh and Lakehouse. Demonstrable knowledge of key AWS services such as S3, EMR, Redshift, Athena and Lambda. Excellent stakeholder management skills. To be successfully appointed to this role, it is a requirement to obtain Security Check (SC) clearance.
Senior Data Engineer Hybrid - London £70,000 - £80,000 We're currently partnered with a leading technology-driven company specialising in digital solutions that enhance user experience across various platforms. With over 15,000 5-star reviews and partnerships with …
estate of cloud-native software products and services. The DevOps Engineer will have deep knowledge of the AWS data engineering stack: Kinesis, Redshift, Glue, EMR and be passionate about writing high-quality code that not only solves problems but improves the overall estate. The role will involve operating and … Required: DevOps Engineer experience. AWS skills. Docker and Kubernetes skills. Knowledge across the AWS data engineering stack including but not limited to Kinesis, Redshift, Glue, EMR and Databricks. Knowledge of the software development lifecycle and the role DevOps plays within it. Experience working in autonomous agile teams and environments. Educated to degree level or …
onto the cloud platforms, one of the key strategies for the division, in which you’ll get exposure to technologies like AWS S3, Snowflake, EMR etc. These are great roles on some really big, interesting projects.
similar). Experience using modern build tools such as Maven, Jenkins, GitHub, etc. Experience with Amazon Web Services is a strong plus - CloudFormation, EMR, S3, EC2, Athena etc. Experience with scheduling services such as Airflow and Oozie. Experience with data ETL and data modelling. Experience with building large-scale …
pipelines using Apache Airflow. Proficiency in Git-based version control tools. Proficiency with Linux commands and Bash scripting. Working experience in AWS Big Data services (EMR, Glue, Data Pipeline, Athena, S3, Step Functions etc.) and the AWS CLI. Experience with CI/CD tools such as Jenkins. Experience working with relational and …
Greater London, England, United Kingdom Hybrid / WFH Options
Humand Talent
8+ with experience in Laravel/Symfony desired. JavaScript/TypeScript proficiency with Node.js and React.js. Familiarity with AWS services (Aurora, MSK Kafka, ECS, EMR). 3+ years of database experience, ideally with MySQL. Hands-on experience with various data storage paradigms (e.g., RDBMS, document and KV stores) …
massively distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including: End to End … Strong Python/Java software engineering skills. Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake. Previous experience with Dremio, dbt, EMR or Dagster. Good Computer Science fundamentals with strong knowledge of software and data architecture. If you would like more information on the above …
Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster and dbt. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation, EMR, EventBridge, Athena, etc.) and metadata management tools (such as Amundsen, Atlas, DataHub, OpenDataDiscovery, Marquez, etc.). Experience with RDBMS like PostgreSQL would be a plus. Experience …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
English. Bachelor's degree in Computer Science or a related field (Master's degree preferred). Nice to have: experience with LLMs, Vector Databases, AWS EMR, Spark, and Python. Our commitment: Equal opportunities are important to us. We believe that diversity and inclusion at The Stepstone Group are critical to …