Java Spring - SQL, JPA, Hibernate - AWS services: Lambda, S3, deployment in EKS, API Gateway, load balancer - REST API design - OpenAPI/Swagger (definition of the YAML) - Postman or similar API testing - API First. Knowledge appreciated: - Python - Knowledge of deployment pipelines within AWS. Soft skills: - Teamwork - Proactivity - Engagement
your skillset. We are looking for someone who can demonstrate an aptitude or willingness to learn some or all of the following technologies: AWS (S3, IAM, RDS, EMR, EC2, etc.), Linux commands, Trino, Apache Spark, Node.js, JavaScript, Preact.js, Postgres, MySQL, HTML, CSS. Target salary range is $125k-$150k or more
front-runner for this position. Nice to haves (in order of priority – at least one or two are needed): Python (commercial experience); AWS services – S3, ECS, AppSync; GraphQL or REST APIs; Kafka or message queue systems; PostgreSQL or NoSQL database experience. To apply – click the link or for a
years' data engineering experience. Proficient using both SQL & Python. Commercial experience using AWS. Further knowledge of AWS services, including some of the following: S3, EC2, RDS, Cognito, CloudWatch, EFS, Aurora, Athena, Redshift. What's in it for you: 📍Location: Somerset (2/3 days a week onsite) ⭐Annual discretionary
of relevant experience in data engineering. Experience in data modelling and end-to-end data pipelines. Knowledge of AWS core services (EC2, ECS, RDS, S3). Strong communication skills and ability to work in a collaborative environment. Please note that you will be required to work in their Bristol
Scala (with a focus on functional programming), and Spark; Familiarity with Spark APIs, including RDD, DataFrame, MLlib, GraphX, and Streaming; Experience working with HDFS, S3, Cassandra, and/or DynamoDB; Deep understanding of distributed systems; Experience with building or maintaining cloud-native applications; Familiarity with serverless approaches using AWS
environments, the capability to continuously integrate, containerise functionality, and automate testing. Requirements: Must be able to build new DevOps pipelines. AWS: S3, RDS, Route 53, IAM, EKS, Secrets Manager, ECR. Kubernetes: Helm, Kops, Ingress/Egress. Terraform: deployment of AWS resources, pipelines. OCI. Observability: ELK, Dynatrace
Node.js, Web framework: hapi (but equivalent experience with express.js is fine), Postgres. Frontend - SASS, GDS, Vanilla JS or other frameworks. AWS - CloudFront, Elastic Beanstalk, RDS, S3, SES, SQS, etc. API gateways, Lambda. Fullstack Developer - Remote - £500-£550 per day - 5 months. Damia Group Limited acts as an employment agency for
app development, SQL, ETL or data pipelines, and data analysis. You have experience with cloud data warehouses/lakes including Snowflake, Databricks, BigQuery, Redshift, S3, and ADLS. You have experience with AWS, GCP, and/or Azure cloud services. You have strong technical skills and experience with data modeling
process data for modeling. Experience with SQL. Experience in the data/BI space. Preferred qualifications: Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets. If you are interested
to consume complex REST APIs. • Demonstrated experience with tools, languages, and frameworks: Python, PySpark, JavaScript, Vue, Nuxt.js, and Viz.js • Demonstrated experience with AWS services: S3 and EC2. • Demonstrated experience with databases: relational, graph (Neo4J/Graph-Tool), and NoSQL/document (MongoDB). • Demonstrated experience deploying software using: Linux
with security and user access. DBT/general data modelling, with data vault experience being desirable. Airflow and Python experience. Proficient with AWS: Lambda, S3, SNS, CDK; DevOps. Need to be able to build, deploy and use Terraform. Benefits: Bonus opportunity - 10% of annual salary. Actual amount depends on
websites. You should be proficient in JavaScript, jQuery, HTML, CSS/SCSS, Shopify (working with liquid files), and related technologies. Experience in AWS (Lambda, S3, API Gateway), Node/NPM, and with SQL and/or NoSQL databases is also desirable. A working knowledge of CMS implementation and server
Vault modelling (strong experience required). Building data pipelines with DBT & PySpark. Snowflake. Airflow. JSON, Parquet. Data processing in an AWS cloud environment: AWS services (S3, Glue, Athena, AppFlow, DMS) & IAM. Agile delivery, CI/CD, Git. Experience with, notions of, or profound interest in ("nice to have"): Qlik Data
SAS, or MATLAB. Proficiency in SQL and scripting (Python) for data processing and modeling. Preferred qualifications: Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, and working with large, complex datasets in a business environment. Client Description: Our client is a FTSE
Cambridge, England, United Kingdom Hybrid / WFH Options
Intellect Group
Familiarity with tools and frameworks such as: Databricks, PySpark, Pandas, Airflow or dbt. Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda). What’s On Offer: Fully remote working with the flexibility to work from anywhere in the UK. Optional weekly in-person collaboration in
applications using modern javascript frameworks (ideally Vue/Nuxt.js). Has experience of working with both MongoDB and AWS hosting and backend services (e.g. Cognito, S3, SES). Has experience of designing, building and maintaining automated continuous integration tests. Has experience of working with other developers in an agile set-up
Greater London, England, United Kingdom Hybrid / WFH Options
Aquent
process data for modeling Experience with SQL Experience in the data/BI space. Preferred qualifications Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets. Client Description: Our Client