Cloud Data Fusion. NoSQL databases: DynamoDB, Neo4j, Elastic, Google Cloud Datastore. Snowflake data warehouse/platform. Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming. Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc. Experience and more »
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Experian Ltd
using Python and the AWS Cloud Development Kit. You'll use a variety of services such as SNS, SQS and Lambda for event streams, Amazon Redshift, S3 and AWS Glue for data storage & reporting and Amazon SageMaker for our pioneering machine learning products. We are a growing team … products and the communication skills to work with our colleagues across ECS to help them use our services. Python and AWS services such as S3, Kinesis, Lambda, Redshift, DynamoDB, Glue and SageMaker Infrastructure-as-Code tools and approaches (we use the AWS CDK with CloudFormation) Data processing frameworks such as more »
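The event-stream pattern the listing above describes (SNS fanning out to SQS, consumed by Lambda) can be sketched as a minimal Python handler. This is an illustrative sketch, not code from the listing: the function name and the `order_id` payload field are assumptions, and only the standard SNS-over-SQS envelope shape is relied on.

```python
import json

def handler(event, context):
    """Hypothetical Lambda handler: unwraps SNS notifications that were
    delivered to the function through an SQS event-source mapping."""
    payloads = []
    for record in event.get("Records", []):
        # SQS puts the full SNS notification JSON in the record body
        notification = json.loads(record["body"])
        # The SNS notification carries the original message as a string
        payloads.append(json.loads(notification["Message"]))
    return payloads
```

A handler like this would be wired up in the CDK by granting the queue as an event source for the function; the double `json.loads` reflects the two layers of wrapping (SQS body, then SNS `Message`).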
Data Architect will design and implement solutions using a range of AWS infrastructure, including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, RDS, Athena, Kinesis, QuickSight. We also widely use other tech such as Snowflake, DBT, Databricks, Informatica, Matillion, Airflow, Tableau, Power BI etc. The Lead Data Architect will … aligned to the functional and non-functional requirements Work closely with other members of agile deployment team Selection & configuration of appropriate, base technologies (e.g. Amazon Redshift, RDS) Selection & application of appropriate standards & principles Capture & implementation of functional & non-functional requirements Expert in data modelling and latest data trends including more »
certifications on the training platform of your choice. As a Senior Solution Architect, you'll be implementing and driving large-scale cloud implementations on Amazon AWS. Other duties include: Architecting and implementing cloud solutions for different applications, automating cloud operations and cloud deployments Assume responsibility for the architecture of … consulting with application development teams on same Cloud technology, ideally AWS including PaaS services such as S3, Lambda, Step Functions, ECS, Docker, SNS, SQS, Kinesis, SageMaker etc. Understanding of distributed application design, patterns, and experience producing design diagrams and documenting standards Qualifications Security Clearance eligible. You must have the right to more »
data integration, data platform) using cloud-based hyperscalers, with emphasis on AWS services including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, RDS, Athena, Kinesis, QuickSight. Hands-on experience designing solutions using Snowflake and DBT. Familiarity with Databricks or Informatica. Hands-on experience in designing data models aligned more »
Diversity and Inclusion. Role: Machine Learning Engineer Key Responsibilities Work with our customers and focus on our AWS Analytics and ML service offerings such as Amazon Kinesis, AWS Glue, Amazon Redshift, Amazon EMR, Amazon Athena, Amazon SageMaker and more. Help our customers to remove the … Solution Architects, Data Scientists and Service Engineering teams Experience with design, development and operations that leverages deep knowledge in the use of services like Amazon Kinesis, Apache Kafka, Apache Spark, Amazon SageMaker, Amazon EMR, NoSQL technologies and other 3rd parties Develop and define key business questions … Science, Engineering, Mathematics or a related field Experience of Data platform implementation, including 3+ years of hands-on experience in implementation and performance tuning of Kinesis/Kafka/Spark/Storm implementations Experience with analytic solutions applied to the Marketing or Risk needs of enterprises Basic understanding of machine more »
services (Apache Spark, Beam or equivalent). • In-depth knowledge of key technologies like BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs/Kafka, Dataflow/Airflow/ADF etc. • Excellent consulting experience and ability to design and build solutions, actively contribute more »
you build it, you run it. It’s a full end-to-end role within the scope of backend services. Tech: Java, AWS (S3, Kinesis Streams, SQS, SNS, Lambda, ECS Fargate), NoSQL, Micronaut + more. About you The ideal candidate will have demonstrable experience with Java and AWS services. more »
Manchester Area, United Kingdom Hybrid / WFH Options
Burns Sheehan
DevOps Engineer Location: 🌐 Primarily home based with monthly in person meetings in Manchester 🌐 Salary range: 💷 £70,000 to £80,000 💷 Technology: AWS (EC2, S3, Kinesis, Lambda), Terraform, Prometheus Burns Sheehan is working with a thriving SaaS payments business to help identify their next Senior DevOps Engineer. Reporting to the more »
AWS (ideally in both) Knowledge of MPP systems such as Athena, BigQuery, EMR, Hive, Iceberg Exposure to data streaming technologies such as Kafka or Kinesis more »
Implement Data Solutions: Develop scalable and efficient data architecture solutions on AWS. Create and maintain data pipelines using AWS services such as Glue, Lambda, Kinesis, and Data Pipeline. Design data models and databases using AWS RDS, Redshift, DynamoDB, and other relevant technologies. Data Integration and Management: Integrate various data … Information Technology, or a related field. Master’s degree preferred. Technical Expertise: Proven experience with AWS data services including S3, Redshift, RDS, Glue, Lambda, Kinesis, and Athena. Strong understanding of data modeling, ETL processes, and data warehousing. Proficiency in SQL and experience with database management systems. Familiarity with data more »
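As a hedged illustration of the Kinesis-to-Lambda leg of the pipelines described above: Lambda delivers Kinesis records with base64-encoded payloads, so a minimal transform step might look like the sketch below. The `ts`/`event_time` field renaming is a hypothetical cleaning step, not something stated in the listing.

```python
import base64
import json

def transform(record: dict) -> dict:
    """Decode one Kinesis record from a Lambda event and apply a
    hypothetical normalisation step before loading downstream."""
    # Kinesis payloads arrive base64-encoded inside the Lambda event
    raw = base64.b64decode(record["kinesis"]["data"])
    doc = json.loads(raw)
    # Illustrative transform: rename a timestamp field to a standard name
    doc["event_time"] = doc.pop("ts", None)
    return doc

def handler(event, context):
    # A Kinesis event batches multiple records per invocation
    return [transform(r) for r in event["Records"]]
```

In a real pipeline the handler would write the transformed documents to Redshift, S3 or DynamoDB rather than returning them; returning them here keeps the sketch self-contained.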
features across their estate of cloud native software products and services. The DevOps Engineer will have deep knowledge of the AWS data engineering stack: Kinesis, Redshift, Glue, EMR and be passionate about writing high-quality code that not only solves problems but improves the overall estate. The role will … organisation going forward. Required DevOps Engineer experience. AWS skills. Docker, Kubernetes skills. knowledge across the AWS data engineering stack including but not limited to: Kinesis, Redshift, Glue, EMR, Databricks. of the software development lifecycle and the role DevOps plays within it. working in autonomous agile teams and environments. to more »
Monitoring and optimising applications, especially those written for the JVM Relational and NoSQL databases, particularly PostgreSQL, Aurora, & DynamoDB with experience modelling & optimising query performance Kinesis or any other streaming data Excellent communication skills, specifically in understanding, framing and simplifying both technical & business requirements Comfort with ambiguity and leading conversations more »
the following to build high-throughput, low-latency, near real-time products: Java- and Scala-based web services, Databricks data lakes (Delta Lakes), AWS Kinesis and MSK, AWS Elasticsearch, AWS RDS, Apache Flink & Spark, scripting using Python, and Terraform for infrastructure as code. The interview process Our interview aims more »
Technology Stack: We are cloud-native, born in AWS, and embrace their services across our platform. Our technology stack includes Python, Django, ECS, Postgres, Kinesis, CloudFront, Flink, Elasticsearch, Lambda, AmazonMQ, and Terraform. Tooling includes Datadog, Linear, Slack, Notion and CircleCI. If you think you tick the boxes more »
and Snowflake. Experience with various programming languages, such as Python and Java. Familiarity with streaming/event-driven architecture, including tools like Kafka and Kinesis/Firehose. Expertise in data ingestion techniques, encompassing both stream and batch processing. Proficiency using tools like Terraform for Infrastructure-as-Code and AWS more »
reporting. • Specialised in AWS cloud technologies for ETL, data warehouse, and data lake design. • Hands-on experience with AWS services like EMR, Glue, Redshift, Kinesis, Lambda, DynamoDB. • Capable of processing large volumes of structured and unstructured data on AWS. • Familiarity with AWS best practices in data engineering, data science more »
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Engineer; Adeptness in utilising AWS and Azure data and analytics services (or their open-source equivalents), including but not limited to EMR, Glue, Redshift, Kinesis, Lambda, Data Factory, and Databricks. Strong experience in data engineering, data science, and product development, encompassing experience in both stream and batch processing. Proficiency more »
Master's preferred) 7+ years as a Network Engineer/Architect, with focus on SDN, AWS, Azure, and security Hands-on with AWS (Lambda, Kinesis, DynamoDB, S3) and Azure for serverless apps Deep knowledge of network security (firewalls, IDPS), reducing security incidents by 70% Certifications: CCNP, CCIE, AWS Certified more »
weight from the legacy system without actively changing it, to move to a cloud-native, event-driven architecture Tech Stack: JavaScript, Node, TypeScript, AWS (Kinesis, EventBridge, DynamoDB, Lambda). What will you be doing? Working alongside the team on the marketplace platform Working towards OKRs Modernising the more »
Grabyo is the video platform built for live, social and mobile. The Amazon Web Services-hosted platform removes the traditional complexities of professional video production and distribution. Grabyo is trusted by major sports teams, federations and broadcasters such as ITV, the Premier League and UFC to name only a … the team architecting our platform to improve our path to microservices. Java, JavaScript, Go, NoSQL, serverless architectures, microservices, auto-scaling and auto-recovery, AWS Kinesis, ECS Fargate, DynamoDB, media processing, Terraform - these are a few of the state-of-the-art techs you will be working with. To make more »
London, England, United Kingdom Hybrid / WFH Options
Jobleads-UK
requirements are well understood and stakeholders have accurate expectations on dates and functionality Requirements Experience building a data platform with streaming tech (Pulsar, Kafka, Kinesis) Design, build and launch highly available and distributed systems data extraction, transformation and loading of datasets Excellent communication and leadership skills. Including the ability … custom data frameworks and platforms Agile and Continuous Delivery Continuous integration pipelines Strong Python AWS or Azure with large-scale streaming data (Pulsar, Kafka, Kinesis, etc) ETL management; structured or custom (Airflow, Luigi, etc) Bonus Robust experience managing and developing an engineering team Delta Lake or Iceberg Trino or more »
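The "extraction, transformation and loading of datasets" over streaming tech mentioned above typically reduces to micro-batching: grouping an unbounded event stream into fixed-size chunks before each load. A minimal sketch in plain Python (the function name and batch size are illustrative; real platforms like Spark Structured Streaming or Flink manage this windowing for you):

```python
from itertools import islice
from typing import Iterable, Iterator, List

def micro_batches(stream: Iterable[dict], size: int) -> Iterator[List[dict]]:
    """Group an unbounded event stream into fixed-size micro-batches,
    yielding each batch as soon as it fills (or the stream ends)."""
    it = iter(stream)
    while True:
        batch = list(islice(it, size))
        if not batch:
            # Stream exhausted: stop yielding
            return
        yield batch
```

Each yielded batch would then be handed to a transform-and-load step; tuning `size` trades latency (small batches) against per-load overhead (large batches).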