structure organization. Preferred qualifications, capabilities and skills: Prior experience with customer-centric, multi-channel applications. Experience running applications on Kubernetes. Familiarity with event-based streaming systems like Kafka, Kinesis or Pulsar. Familiarity with non-relational databases (DynamoDB, MongoDB, Bigtable). Familiarity with authn/z protocols like OAuth2/OIDC and IdPs like Auth0, Forgerock more »
certifications on the training platform of your choice. As a Senior Solution Architect, you'll be implementing and driving large-scale cloud implementations on Amazon AWS. Other duties include: Architecting and implementing cloud solutions for different applications, automating cloud operations and cloud deployments. Assume responsibility for the architecture of … consulting with application development teams on same. Cloud technology, ideally AWS, including PaaS services such as S3, Lambda, Step Functions, ECS, Docker, SNS, SQS, Kinesis, SageMaker etc. Understanding of distributed application design, patterns, and experience producing design diagrams and documenting standards. Qualifications: Security Clearance eligible. You must have the right to more »
operates at peak performance, scalability, and reliability. Technical Skills required: Redshift, Glue (inc. Glue Studio, Glue Data Quality, Glue DataBrew), Step Functions, Athena, Lambda, Kinesis, Python, Spark, PySpark, SQL. Your contributions as a Data Engineer will directly impact the organization's operations and revenue. In addition to a competitive more »
Diversity and Inclusion. Role: Machine Learning Engineer. Key Responsibilities: Work with our customers and focus on our AWS Analytics and ML service offerings such as Amazon Kinesis, AWS Glue, Amazon Redshift, Amazon EMR, Amazon Athena, Amazon SageMaker and more. Help our customers to remove the … Solution Architects, Data Scientists and Service Engineering teams. Experience with design, development and operations that leverages deep knowledge in the use of services like Amazon Kinesis, Apache Kafka, Apache Spark, Amazon SageMaker, Amazon EMR, NoSQL technologies and other 3rd parties. Develop and define key business questions … Science, Engineering, Mathematics or a related field. Experience of Data platform implementation, including 3+ years of hands-on experience in implementation and performance tuning of Kinesis/Kafka/Spark/Storm implementations. Experience with analytic solutions applied to the Marketing or Risk needs of enterprises. Basic understanding of machine more »
services (Apache Spark, Beam or equivalent). • In-depth knowledge on key technologies like Big Query/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka Dataflow/Airflow/ADF etc. • Excellent consulting experience and ability to design and build solutions, actively contribute more »
you build it, you run it. It’s a full end-to-end role within the scope of backend services. Tech: Java, AWS (S3, Kinesis Streams, SQS, SNS, Lambda, ECS Fargate), NoSQL, Micronaut + more. About you: The ideal candidate will have demonstrable experience with Java and AWS services. more »
AWS (ideally in both). Knowledge of MPP systems such as Athena, BigQuery, EMR, Hive, Iceberg. Exposure to data streaming technologies such as Kafka or Kinesis. more »
Implement Data Solutions: Develop scalable and efficient data architecture solutions on AWS. Create and maintain data pipelines using AWS services such as Glue, Lambda, Kinesis, and Data Pipeline. Design data models and databases using AWS RDS, Redshift, DynamoDB, and other relevant technologies. Data Integration and Management: Integrate various data … Information Technology, or a related field. Master’s degree preferred. Technical Expertise: Proven experience with AWS data services including S3, Redshift, RDS, Glue, Lambda, Kinesis, and Athena. Strong understanding of data modeling, ETL processes, and data warehousing. Proficiency in SQL and experience with database management systems. Familiarity with data more »
So each team leverages the technology that fits their needs best. You’ll see us working with data processing/streaming like Kinesis; application technologies like PostgreSQL, Redis & DynamoDB; and breaking things using in-house chaos principles and tools such as Gatling to drive load all deployed … especially those written for the JVM. Relational and NoSQL databases, particularly PostgreSQL, Aurora, & DynamoDB with experience modelling & optimising query performance. Kinesis or any other streaming data. Excellent communication skills, specifically in understanding, framing and simplifying both technical & business requirements. Comfort with more »
features across their estate of cloud native software products and services. The DevOps Engineer will have deep knowledge of the AWS data engineering stack: Kinesis, Redshift, Glue, EMR, and be passionate about writing high-quality code that not only solves problems but improves the overall estate. The role will … organisation going forward. Required: DevOps Engineer experience; AWS skills; Docker, Kubernetes skills; knowledge across the AWS data engineering stack including but not limited to Kinesis, Redshift, Glue, EMR, Databricks; knowledge of the software development lifecycle and the role DevOps plays within it; working in autonomous agile teams and environments; to more »
At Wave Talent, we don't want to consume any more of your time trying to decipher job descriptions to identify the information you need. Instead, we spoke with you all to understand the key information you'd like to more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP-DM. more »
the following to build high throughput, low latency, near real-time products: Java- and Scala-based Web Services, Databricks Data Lakes (Delta Lakes), AWS Kinesis and MSK, AWS Elasticsearch, AWS RDS, Apache Flink & Spark, scripting using Python, Terraform for infrastructure as code. The interview process: Our interview aims more »
working knowledge of AWS services (EMR, ECS, IAM, EC2, S3, DynamoDB, MSK). Our Technology Stack: Scala and Python; Kafka, Spark, Kafka Streams, Kinesis, Akka and KSQL; AWS, S3, Iceberg, Parquet, Glue and Spark/EMR for our Data Lake; Elasticsearch, DynamoDB and Redis; Starburst and Athena; Airflow more »
Monitoring and optimising applications, especially those written for the JVM Relational and NoSQL databases, particularly PostgreSQL, Aurora, & DynamoDB with experience modelling & optimising query performance Kinesis or any other streaming data Excellent communication skills, specifically in understanding, framing and simplifying both technical & business requirements Comfort with ambiguity and leading conversations more »
energetic and dynamic environment. Competent and comfortable working with TypeScript and serverless technologies. You should have extensive experience working with AWS (API Gateway, Lambda, DynamoDB, SQS, Kinesis, S3, ELK + more). Great experience in implementing best practices like TDD, SOLID and clean code. Experience in developing REST APIs, microservices, relational databases and more »
as Adobe Campaign, Braze or MoEngage. Experience in integrating and working with marketing technologies and third-party platforms. Familiarity with event-based streaming systems like Kafka, Kinesis or Pulsar. #ICBcareers #ICBEngineering J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world’s most prominent corporations more »
our objectives. So each team leverages the technology that fits their needs best. You’ll see us working with data processing/streaming like Kinesis; application technologies like PostgreSQL, Redis & DynamoDB; and breaking things using in-house chaos principles and tools such as Gatling to drive load… all deployed … Monitoring and optimising applications, especially those written for the JVM. Relational and NoSQL databases, particularly PostgreSQL, Aurora, & DynamoDB with experience modelling & optimising query performance. Kinesis or any other streaming data. Excellent communication skills, specifically in understanding, framing and simplifying both technical & business requirements. Comfort with ambiguity and leading conversations more »
availability. Data integration: Familiarity with data integration tools and techniques, including ETL (Extract, Transform, Load) processes and real-time data streaming (e.g., using Apache Kafka, Kinesis, or Pub/Sub), exposing data sets via GraphQL. Cloud platforms expertise: Deep understanding of GCP/AWS services, architectures, and best practices, with hands more »
a technical leadership role. Experience in one of the main cloud services (AWS, Google Cloud or Azure) and Big Data services (EMR, Databricks, Synapse, HDInsight, Kinesis, Snowflake, etc.). Experience: Skilled in use of the Power Platform, including Power Apps, Power Automate and Power BI. Advanced knowledge of Microsoft Excel and other more »
Technology Stack: We are cloud-native, born in AWS, and embrace their services across our platform. Our technology stack includes Python, Django, ECS, Postgres, Kinesis, CloudFront, Flink, Elasticsearch, Lambda, Amazon MQ, and Terraform. Tooling includes Datadog, Linear, Slack, Notion and CircleCI. If you think you tick the boxes more »
and Snowflake. Experience with various programming languages, such as Python and Java. Familiarity with streaming/event-driven architecture, including tools like Kafka and Kinesis/Firehose. Expertise in data ingestion techniques, encompassing both stream and batch processing. Proficiency using tools like Terraform for Infrastructure-as-Code and AWS more »