Fullstack Engineer - Up to £60k - 2 Days Onsite - Fulham. I am currently working with an exciting SaaS client who has built a risk management solutions platform. Their primary focus is to provide businesses with advanced tools and platforms that …
and management of errors in Go. Hands-on experience of packages & modules, plus the Go testing packages. Understanding of RESTful APIs, microservices and distributed systems. AWS experience using EC2, Lambda, S3, EBS, RDS/DynamoDB and IAM/KMS. Understanding of GitLab CI/CD, Git merging, rebasing & conflict resolution. Experience in containerization (Docker images) and understanding of the container lifecycle …
Central London, London, United Kingdom Hybrid / WFH Options
Halian Technology Limited
recommend appropriate technologies, frameworks, and tools for specific use cases. Cloud Architecture (AWS): Lead the design and implementation of cloud-native applications on AWS, leveraging services such as EC2, S3, RDS, Lambda, and others. Ensure that security, performance, and cost optimization principles are applied in cloud-based solutions. Documentation & Governance: Maintain comprehensive technical documentation for architecture designs, patterns, and … not required. Cloud Expertise: Solid experience with Amazon Web Services (AWS), including architecture, deployment, and optimization of cloud applications. Familiarity with AWS services such as EC2, Lambda, S3, RDS, and API Gateway. Architecture & Design Patterns: Strong grasp of software design patterns and best practices for large-scale distributed systems. Ability to design solutions that are highly …
fast-moving agile squads. Experience in commodities or trading (preferred but not essential). Nice to haves (in order of priority – at least one or two is needed): AWS services – S3, ECS, AppSync; PostgreSQL or NoSQL database experience; GraphQL or REST APIs; TypeScript; React or Svelte (not essential, but helpful for collaboration). To apply – click the link or for a …
this will make you a front runner for this position. Nice to haves (in order of priority – at least one or two is needed): Python (commercial experience); AWS services – S3, ECS, AppSync; GraphQL or REST APIs; Kafka or message queue systems; PostgreSQL or NoSQL database experience. To apply – click the link or for a faster response, email Barry.Ansell@HarringtonStarr.com …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
leader with deep AWS expertise and a consulting mindset as an experienced Principal Data Engineer. Key Responsibilities • Lead and deliver enterprise-grade data engineering solutions using core AWS services (S3, Glue, Lambda, Redshift, Matillion, etc.). • Define architecture, mentor large technical teams, and engage directly with senior client stakeholders. • Own technical responses for RFI/RFP processes, partnering with …
working with ITSM systems such as ServiceNow and understands priorities and SLAs. • Has experience of the AWS product set such as EC2, Lambda, EKS, ECR, WorkSpaces, EBS, EFS, S3, RDS MSSQL, Route 53, VPC, CloudFormation. • Has strong documentation and technical writing skills. Desired • Has experience working in the financial services sector. • Has experience of Qualys and Ivanti RiskSense. …
in Informatica Axon Administration · 3 years' experience with the UNIX operating system · 3 years' experience with the AWS platform and services (EC2, RDS, Security, Load Balancing, Route 53, jump box, S3, etc.) · 3 years' experience of UNIX scripting (UNIX shell or Perl) · 5 years' experience with SQL on Oracle and MS SQL Server · Experience of working across multiple stakeholder levels in …
City of London, London, United Kingdom Hybrid / WFH Options
Intec Select
from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data to meet evolving business requirements. Perform customer behavior analysis, gaming analytics, and create actionable insights to enhance customer … experience in data engineering roles, with a proven ability to lead and mentor a team. Expertise in SQL, Python, and R. Strong proficiency in AWS technologies such as Redshift, S3, EC2, and Lambda. Experience with Kafka and real-time data streaming technologies. Advanced skills in building ETL pipelines and integrating data from APIs. Familiarity with data visualization and reporting …
solving, thrives in greenfield project environments, and enjoys working both independently and collaboratively. Key Responsibilities as a Principal Data Engineer: Propose and implement data solutions using AWS services including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, and Matillion. Work directly with clients to define requirements, refine solutions, and ensure successful handover to internal teams. Design and implement … principles. Contribute to a collaborative, knowledge-sharing team culture. Required Qualifications & Skills: Strong experience in ETL processes and cloud data warehouse patterns. Hands-on expertise with AWS services (S3, Glue, Redshift). Proficiency with Matillion for data transformation. Experience working with various relational databases. Familiarity with data visualization tools such as QuickSight, Tableau, Looker, or QlikSense. …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
tooling. It suits someone who thrives in greenfield environments, enjoys client engagement, and values clean, scalable, well-documented engineering. Key Responsibilities: Design and build robust data pipelines using AWS (S3, Redshift, Glue, Lambda, Step Functions, DynamoDB). Deliver ETL/ELT solutions with Matillion and related tooling. Work closely with client teams to define requirements and hand over production … ready solutions. Own infrastructure and deployment via CI/CD and IaC best practices. Contribute to technical strategy and mentor junior engineers. Requirements: Strong hands-on AWS experience – S3, Redshift, Glue essential. Proven experience building ETL/ELT pipelines in cloud environments. Proficient in working with structured/unstructured data (JSON, XML, CSV, Parquet). Skilled in working with …
their organisation are looking to utilise the newest technologies on the market. Your role will include: Responsibility for designing and implementing effective architectural solutions around the AWS Serverless technologies (S3, Lambda, Athena, Kafka) and Databricks, including Data Lake and Data Warehousing. Assess database implementation procedures to ensure they comply with GDPR and data compliance. Guide, influence and challenge the … have extensive knowledge of writing code, building data pipelines and delivering digital transformation and ingestion within a given tech suite. Extensive experience in implementing solutions around the AWS cloud environment (S3, Databricks, Athena, Glue). In-depth understanding of Workflows, Asset Bundles, SQS, EKS, Terraform. Excellent understanding of Data Modelling & Kinesis. An understanding of SQL/database management. Strong hands-on …
QA processes and supporting the business with both regular and ad hoc data deliverables 🛠 Tech you’ll work with: SQL Server (SSIS, SSRS, SSAS) Python AWS stack – Glue, Lambda, S3, EC2, Jupyter Power BI or Tableau (bonus) Excel (PowerPivot, VBA, lookups, advanced formulas) 🌱 You’ll also: Collaborate closely with our Data Engineers and Product Owner Own your solutions end …
Implement and manage DBT models for data transformation and modeling in a modern data stack. Proficiency in SQL, Python, and PySpark. Experience with AWS services such as S3, Athena, Redshift, Lambda, and CloudWatch. Familiarity with data warehousing concepts and modern data stack architectures. Experience with CI/CD pipelines and version control (e.g., Git). Collaborate with …
using Java and Python. Ensure code is optimized, scalable, and maintainable. Azure Expertise: Leverage Azure services extensively, particularly Azure Storage, for scalable cloud solutions. Ensure seamless integration with AWS S3 and implement secure data encryption/decryption practices. Team Leadership: Mentor a team of 3 engineers, fostering best practices in software development and code quality. Vendor Collaboration: Work closely …
model layer to drive consistency across Looker, Power BI, and other downstream consumers. Work closely with Data Engineers responsible for ingestion (from source systems to raw layers such as S3 or cloud storage), but focus your efforts on the modelling and transformation stage. Collaborate with the Data Product team to ensure the semantic layer serves evolving business and analytical …
Job Description: AWS stack, with data landed in S3, Lambda triggers, data quality checks, and data written back out to AWS S3 (Parquet formats), with Snowflake for the dimensional model. Design and build the data pipelines; work with someone on understanding data transformation, supported by BAs, building out the data pipelines and moving data into layers in the data architecture (Medallion … architecture). Requirements: 5+ years' experience with AWS. 5-8 years' overall exposure to data. Technical Lead experience. Proficient in Snowflake, S3, Lambda triggers. Hybrid London, 1-2 days a week. £(Apply online only) (Outside IR35). The reason they're hiring now: they have a client that is entirely on a legacy, on-prem platform, and …
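For context on the Medallion architecture the ad describes (raw data landed, quality-checked, then shaped into a dimensional model), here is a minimal illustrative sketch in plain Python. The layer names follow the common bronze/silver/gold convention; the fields, cleaning rules, and aggregation are hypothetical, not taken from the role:

```python
# Hypothetical medallion-style flow: bronze = raw landing, silver = quality-
# checked and typed, gold = dimensional aggregate. All names are illustrative.

def bronze(raw_rows):
    """Land raw records as-is (bronze layer)."""
    return list(raw_rows)

def silver(bronze_rows):
    """Apply basic data-quality rules: drop rows missing an id, normalise types."""
    clean = []
    for row in bronze_rows:
        if row.get("order_id") is None:
            continue  # data-quality gate: reject incomplete records
        clean.append({
            "order_id": int(row["order_id"]),
            "customer": str(row.get("customer", "")).strip().lower(),
            "amount": float(row.get("amount", 0)),
        })
    return clean

def gold(silver_rows):
    """Aggregate into a simple fact: total amount per customer."""
    facts = {}
    for row in silver_rows:
        facts[row["customer"]] = facts.get(row["customer"], 0.0) + row["amount"]
    return facts

raw = [
    {"order_id": "1", "customer": " Acme ", "amount": "10.5"},
    {"order_id": None, "customer": "bad", "amount": "1"},  # dropped in silver
    {"order_id": "2", "customer": "acme", "amount": "4.5"},
]
result = gold(silver(bronze(raw)))  # {"acme": 15.0}
```

In the role as advertised, the bronze landing would be S3 objects, the silver step would run via Lambda triggers, and the gold layer would be Snowflake dimensional tables; this sketch only shows the layering logic itself.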
and scripting experience (Python) to process data for modeling Experience with SQL Experience in the data/BI space Preferred qualifications Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets If you are interested please apply or send your …
Proficiency in SQL for complex querying, optimization, and stored procedures. Strong coding skills in Python for data transformation, scripting, and automation. Experience with cloud platforms such as AWS (e.g., S3, Redshift, Lambda), Azure (e.g., Data Factory, Synapse), or GCP (e.g., BigQuery, Cloud Functions). Familiarity with data orchestration tools (e.g., Airflow, dbt) and version control (Git). Solid understanding …
Points Experience in data-heavy creative or media industries Passion for the arts Familiarity with structured metadata standards (e.g., DDEX ERN or RDR) Experience with PostgreSQL, AWS RDS, and S3 Perks & Benefits 25 days holiday, plus bank holidays and a Christmas office closure 6% employer pension contribution (min. 3% employee) Access to ClassPass, Headspace, and regular creative perks Industry …
stack consists of: Websites built using Next.js and Node.js Headless CMS using Drupal APIs written in PHP Hosting using Heroku Apps written in React Native AWS for other services – S3, WAF, CloudFront, RDS, EC2, Route 53 You Have These: Interest in the Arts Strong leadership and influencing skills across all areas of a business Strong communicator who is clear and …
City of London, London, United Kingdom Hybrid / WFH Options
Reward
A degree in STEM, Data Science, Operational Research, or a related field Nice to have Power BI advanced features or Excel (charts, pivots, modelling) Familiarity with AWS tools: Athena, S3, Glue, Lambda The Benefits We take care of our people with a benefits package that’s designed for long-term balance, wellbeing, and opportunity: 25 days annual leave + …
within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4 Data Lake & Storage: Databricks Delta Lake, Amazon S3 Data Transformation: dbt Cloud Data Warehouse: Snowflake Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API Advanced Analytics: Databricks (AI & Machine Learning) Governance & Infrastructure: Centralised Data Catalogue & Access Control …
experience in Python and SQL. Experience working with large datasets. Exceptional communication and collaboration skills. Advantageous: Any operational experience supporting real-time systems. Working knowledge of Amazon S3, Airflow, Kafka, Docker. …
be responsible for deploying tooling through cloud platforms, utilising new LLM models and building out new pipelines. Desired Skills ⚙️ Python, SQL Tableau, Power BI Terraform, Bedrock AWS (DMS, Redshift, S3) Azure (Synapse, Microsoft AI) If you are a skilled Engineer (Python, SQL, Tableau, AWS, Azure) who is interested in this role then please apply below and I will be …