environments, with the ability to continuously integrate, containerise, and automate testing. Requirements: Must be able to build new DevOps pipelines. AWS: S3, RDS, Route 53, IAM, EKS, Secrets Manager, ECR. Kubernetes: Helm, Kops, Ingress/Egress. Terraform: deployment of AWS resources, pipelines. OCI. Observability: ELK, Dynatrace.
Node.js, Web framework: hapi (but equivalent experience with express.js is fine), Postgres. Frontend - SASS, GDS, Vanilla JS or other frameworks. AWS - CloudFront, Elastic Beanstalk, RDS, S3, SES, SQS, etc. API gateways, Lambda. Fullstack Developer - Remote - £500-£550 per day - 5 months. Damia Group Limited acts as an employment agency for
app development, SQL, ETL or data pipelines, and data analysis. You have experience with cloud data warehouses/lakes including Snowflake, Databricks, BigQuery, Redshift, S3, and ADLS. You have experience with AWS, GCP, and/or Azure cloud services. You have strong technical skills and experience with data modeling
all of the below skills, but must be willing to learn and upskill. Systems: Windows & Linux admin, AD, GPO, DFS. Cloud: AWS (EC2, RDS, S3, IAM, VPC, CloudWatch, etc.). Automation: Terraform, Vagrant, Ansible, Shell, Python, PowerShell. Monitoring: Grafana, Prometheus, Node Exporter. Security & Tools: Juniper Firewalls, Nexus scanning, Git, Jira
process data for modeling. Experience with SQL. Experience in the data/BI space. Preferred qualifications: Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets. If you are interested
to consume complex REST APIs. • Demonstrated experience with tools, languages, and frameworks: Python, PySpark, JavaScript, Vue, Nuxt.js, and Viz.js • Demonstrated experience with AWS services: S3 and EC2. • Demonstrated experience with databases: relational, graph (Neo4J/Graph-Tool), and NoSQL/document (MongoDB). • Demonstrated experience deploying software using: Linux
with security, and user access. DBT/general data modelling, with data vault experience being desirable. Airflow and Python experience. Proficient with AWS - Lambda, S3, SNS, CDK - DevOps. Need to be able to build, deploy and use Terraform. Benefits: Bonus opportunity - 10% of annual salary. Actual amount depends on
websites. You should be proficient in JavaScript, jQuery, HTML, CSS/SCSS, Shopify (working with liquid files), and related technologies. Experience in AWS (Lambda, S3, API Gateway), Node/NPM, and with SQL and/or NoSQL databases is also desirable. A working knowledge of CMS implementation and server
Vault modelling (strong experience required). Building data pipelines with DBT & PySpark. Snowflake. Airflow. JSON, Parquet. Data processing in an AWS cloud environment: AWS services (S3, Glue, Athena, AppFlow, DMS, …) & IAM. Agile delivery, CI/CD, Git. Experience with, notions of, or profound interest in ("nice to have"): Qlik Data
SAS, or Matlab. Proficiency in SQL and scripting (Python) for data processing and modeling. Preferred qualifications: Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, and working with large, complex datasets in a business environment. Client Description: Our client is a FTSE
Cambridge, England, United Kingdom Hybrid / WFH Options
Intellect Group
Familiarity with tools and frameworks such as: Databricks, PySpark, Pandas, Airflow or dbt. Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda). What’s On Offer: Fully remote working with the flexibility to work from anywhere in the UK. Optional weekly in-person collaboration in
with ETL/ELT. 3+ years of experience with SQL. Experience with Python, C#, or similar high-level language. Experience with AWS, AWS Lambda, S3, Airflow, and related technologies. Professional work experience with Databricks or PySpark. Demonstrated team player who works well in a fast-paced environment and can
applications using modern JavaScript frameworks (ideally Vue/Nuxt.js). Has experience of working with both MongoDB and AWS hosting and backend services (e.g. Cognito, S3, SES). Has experience of designing, building and maintaining automated continuous integration tests. Has experience of working with other developers in an agile set-up
Greater London, England, United Kingdom Hybrid / WFH Options
Aquent
process data for modeling. Experience with SQL. Experience in the data/BI space. Preferred qualifications: Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets. Client Description: Our Client
Do: Design, develop, and optimize scalable ELT pipelines, ensuring reliable data delivery from diverse sources such as APIs, transactional databases, file-based endpoints, and S3 buckets. Build and maintain a robust Data Platform using tools like Airbyte, dbt, and Snowflake. Collaborate with product and regional teams to design data