City of London, London, United Kingdom Hybrid/Remote Options
Noir
Java Developer – Financial Technology – London/Hybrid (INSIDE IR35) (Key skills: Java, Spring Boot, Kubernetes, AWS EKS, Amazon Redshift, PostgreSQL, RESTful APIs, CI/CD, Microservices, iOS/Android Native, Agile, Financial Services) Are you a highly skilled Java Developer with a passion for building scalable, high-performance financial systems? Do you enjoy working with cutting-edge technologies … integrated with modern mobile and analytics platforms. As part of a collaborative, Agile development team, you will be responsible for building robust microservices, managing data pipelines across PostgreSQL and Amazon Redshift, and delivering performant, secure RESTful APIs used across web and mobile platforms. You’ll work closely with iOS and Android engineers to ensure seamless end-to-end …
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
experience across data engineering, cloud computing, or data warehousing Minimum 2 years in a hands-on development capacity Expertise in one or more modern cloud data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar Understanding of data modelling principles, dimensional modelling, and database design Proficiency in SQL and query optimisation Comprehensive knowledge of ETL/ELT processes and data …
or similar Cloud data ecosystem (AWS): hands-on experience with core AWS data services. Key services include: S3 for data lake storage, AWS Glue for ETL and data cataloguing, Amazon Redshift or Athena for data warehousing and analytics, and Lambda for event-driven data processing. ETL/ELT pipeline development: experience in designing, building, and maintaining robust, automated data …
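For context on the event-driven pattern this listing describes, here is a minimal Python sketch of an AWS Lambda handler that reacts to an S3 upload and writes a cleaned copy back to a staging prefix. The bucket names, the "amount" field, and the JSON layout are hypothetical; a real pipeline would typically register outputs in the Glue catalog and load them into Redshift or query them with Athena.

```python
import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Triggered by an S3 ObjectCreated event; writes a cleaned copy."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = json.loads(body)
        # Drop records missing the (hypothetical) "amount" field.
        cleaned = [r for r in rows if r.get("amount") is not None]
        s3.put_object(
            Bucket="example-curated-bucket",  # hypothetical target bucket
            Key=f"curated/{key}",
            Body=json.dumps(cleaned).encode("utf-8"),
        )
```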
City of London, London, United Kingdom Hybrid/Remote Options
Az-Tec Talent
data engineering experience, including at least 2 years hands-on with modern data platforms. Strong proficiency in SQL, data modelling, and query optimisation. Practical experience with Snowflake, Databricks, AWS Redshift, or Microsoft Fabric. Solid understanding of ETL/ELT pipelines and data warehousing principles. Strong communication and problem-solving skills. Ability to work both independently and collaboratively …
City of London, London, United Kingdom Hybrid/Remote Options
Client Server
scale that to 500 million events per day. You'll be using Kafka for real-time data streaming and AWS to build, optimise and monitor data warehousing solutions in Amazon Redshift, owning and managing AWS based systems to ensure cost-effective, secure and high-performance data operations. Location/WFH: You can work from home from anywhere in … data engineering skills and experience, having been through multiple end-to-end data pipeline builds You have a deep knowledge of AWS cloud services including: S3, EC2, Lambda, RDS, Redshift, Glue as well as Kinesis, MongoDB and PostgreSQL You have experience of working in environments with high throughput data (millions of events per hour) You have strong Kafka experience …
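As an illustration of the streaming-to-warehouse pattern this role describes (Kafka in, Redshift out), here is a minimal sketch using the kafka-python client that micro-batches events to S3, from where Redshift would ingest them with a COPY command. Topic, broker, and bucket names are all hypothetical.

```python
import json
import uuid

import boto3
from kafka import KafkaConsumer  # kafka-python

s3 = boto3.client("s3")


def flush_to_s3(batch):
    # Stage newline-delimited JSON; Redshift ingests staged files
    # via a COPY command (not shown here).
    body = "\n".join(json.dumps(e) for e in batch).encode("utf-8")
    s3.put_object(
        Bucket="example-staging-bucket",       # hypothetical bucket
        Key=f"events/{uuid.uuid4()}.json",
        Body=body,
    )


consumer = KafkaConsumer(
    "clickstream-events",                      # hypothetical topic
    bootstrap_servers=["broker1:9092"],        # hypothetical broker
    group_id="warehouse-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 10_000:                   # micro-batch threshold
        flush_to_s3(batch)
        batch.clear()
```

Batching before staging keeps S3 request counts and Redshift COPY overhead manageable at millions of events per hour.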
years of experience in data engineering or analytics, including designing and delivering enterprise-grade data solutions Strong hands-on experience with Snowflake (or comparable cloud data warehouses like BigQuery, Redshift, Synapse) including data modelling, performance tuning and cost management Familiarity with SQL best practices, ELT patterns, and modern data transformation frameworks such as dbt Competence in at least one …
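To make the ELT pattern mentioned here concrete: rather than transforming data before loading it, raw tables are loaded first and reshaped inside the warehouse. A minimal sketch with the Snowflake Python connector follows; all identifiers and credentials are hypothetical, and in practice a dbt model would typically own this SQL.

```python
import snowflake.connector

# Hypothetical account, credentials, and object names.
conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",
    user="ANALYTICS_SVC",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)

# ELT: raw data is already loaded; the transformation runs
# inside the warehouse, close to the data.
conn.cursor().execute("""
    CREATE OR REPLACE TABLE MARTS.DAILY_ORDERS AS
    SELECT order_date,
           COUNT(*)    AS order_count,
           SUM(amount) AS revenue
    FROM RAW.ORDERS
    GROUP BY order_date
""")
conn.close()
```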
City of London, London, United Kingdom Hybrid/Remote Options
Harrington Starr
sports betting, gaming, or a similar high-volume data environment. Strong coding skills in Python and advanced SQL expertise. Hands-on experience with AWS data tools (S3, Glue, Lambda, Redshift, Kinesis, etc.). Proficiency with Power BI (or equivalent data visualisation tools). Solid experience building and supporting data models for forecasting, pricing, or predictive analytics. Strong communication and …
and risk workflows. What you’ll bring Proven experience as a Data Engineer within financial services, ideally hedge funds or buy-side. Strong expertise in AWS (S3, Glue, Lambda, Redshift, Kinesis, etc.). Advanced skills in Python and SQL for building pipelines and optimising performance. Hands-on experience with Power BI for data visualisation and reporting. Strong understanding of …
branching, merging, pull requests). Preferred Qualifications (A Plus) Experience with a distributed computing framework like Apache Spark (using PySpark). Familiarity with cloud data services (AWS S3/Redshift, Azure Data Lake/Synapse, or Google BigQuery/Cloud Storage). Exposure to workflow orchestration tools (Apache Airflow, Prefect, or Dagster). Bachelor's degree in Computer Science …
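For readers unfamiliar with Spark via PySpark, here is a minimal sketch of the kind of distributed aggregation such roles involve: reading Parquet from S3, grouping, and writing partitioned output. Paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-aggregates").getOrCreate()

# Hypothetical S3 path and columns, for illustration only.
events = spark.read.parquet("s3a://example-bucket/events/")

daily = (
    events
    .withColumn("day", F.to_date("event_ts"))
    .groupBy("day", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Partitioned Parquet output keeps downstream scans cheap.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3a://example-bucket/aggregates/"
)
```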
Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of data modelling, governance, and architecture best practices. Excellent leadership, communication, and stakeholder management skills. NICE …
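Orchestration frameworks like Airflow express pipelines as dependency graphs of tasks. A minimal, hypothetical Airflow 2.x DAG wiring an extract step ahead of a transform step, as a sketch of what these tools look like in practice:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from source")   # placeholder step


def transform():
    print("reshape and load")        # placeholder step


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform          # transform runs only after extract succeeds
```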
City of London, London, United Kingdom Hybrid/Remote Options
Intec Select
Engineer, particularly in data modelling. Strong SQL and hands-on dbt experience. Ability to convert business requirements into logical, scalable data models. Knowledge of cloud data platforms (e.g., Snowflake, Redshift, BigQuery). Strong communication and documentation skills. Structured, detail-oriented mindset. Desirable: Experience with semantic modelling tools (e.g., dbt SL, LookML). Familiarity with workflow orchestration and BI tooling. …
Familiarity with Airflow, Dagster or similar data orchestration frameworks Strong understanding of RESTful APIs as well as experience working with both synchronous and asynchronous endpoints Experience with Snowflake or Redshift with a strong understanding of SQL. Proficient in Python and Pandas Experience working with JSON and XML Strong understanding of cloud computing concepts and services (AWS preferably) Experience with …
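On synchronous versus asynchronous endpoints: a synchronous client blocks per request, while an async client can fan out many requests concurrently. A minimal Python sketch of both styles against a hypothetical API, normalising the JSON response into Pandas; the base URL and paths are placeholders.

```python
import asyncio

import aiohttp
import pandas as pd
import requests

BASE = "https://api.example.com"   # hypothetical endpoint

# Synchronous: simple, blocks per request.
resp = requests.get(f"{BASE}/accounts", timeout=30)
resp.raise_for_status()
accounts = pd.json_normalize(resp.json())   # flatten nested JSON into columns


# Asynchronous: fan out many requests concurrently.
async def fetch_all(paths):
    async with aiohttp.ClientSession(base_url=BASE) as session:
        async def one(path):
            async with session.get(path) as r:
                return await r.json()
        return await asyncio.gather(*(one(p) for p in paths))


payloads = asyncio.run(fetch_all(["/trades", "/positions"]))
```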
business impact. Skills Excellent communication skills (written, verbal, and presentation). Strong organisational and planning abilities. Advanced technical expertise across Tableau, MySQL, Python, and modern BI tools (e.g. AWS Redshift). Ability to translate complex data into clear, actionable insights for business leaders. Knowledge Advanced knowledge of statistics and data modelling. Data visualisation and storytelling best practice. Data pipelines …
libraries for data wrangling, such as Pandas, NumPy, and SQLAlchemy. Experience working with traditional SQL databases (e.g. PostgreSQL, MySQL, SQL Server) and cloud data warehouses (e.g. Snowflake, Databricks, BigQuery, Redshift). Experience with time-series data, and/or implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, Factset…) SDLC and DevOps: Git, Docker, Jenkins/TeamCity …
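A minimal sketch of the SQL-to-Pandas time-series workflow such roles describe: pull prices through SQLAlchemy, then downsample with Pandas. The connection string, table, and column names are hypothetical.

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connection string, table, and columns.
engine = create_engine("postgresql+psycopg2://user:pass@host:5432/marketdata")

prices = pd.read_sql(
    text("SELECT ts, symbol, px_last FROM eod_prices WHERE symbol = :sym"),
    engine,
    params={"sym": "VOD LN"},
    parse_dates=["ts"],
)

# Downsample daily closes to weekly (last observation per week).
weekly = (
    prices.set_index("ts")
          .groupby("symbol")["px_last"]
          .resample("W")
          .last()
)
```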
City of London, London, United Kingdom Hybrid/Remote Options
Areti Group | B Corp™
sources. Ensure the pipelines are optimised for performance, reliability, and maintainability. 🌳 AWS Expertise: Utilise AWS services for data storage, processing, and analytics. Leverage AWS technologies such as S3, Glue, Redshift, and others to build and maintain data solutions. Implement best practices for security and compliance in an AWS environment. 🌳 Python Programming: Demonstrate proficiency in Python programming for data processing …
City of London, London, United Kingdom Hybrid/Remote Options
Advanced Resource Managers
monitoring and observability toolsets inc. Splunk, Datadog Experience using GitHub Actions Experience using AWS RDS/SQL based solutions Experience using containerization in AWS Working data warehouse knowledge, Redshift and Snowflake preferred Working with IaC – Terraform and CloudFormation Working understanding of scripting languages including Python and Shell Experience working with streaming technologies inc. Kafka, Apache Flink Experience …
your team. Lead by example — fostering collaboration, accountability, and agile delivery in every sprint. 🧠 What You Bring Expertise in AWS: Hands-on experience with Python, Glue, S3, Airflow, dbt, Redshift, and RDS. Proven success in end-to-end data engineering — from ingestion to insight. Strong leadership and communication skills, with a collaborative, solution-driven mindset. Experience working in Agile …
Java) and SQL Deep experience with data pipeline orchestration tools (Airflow, dbt, Dagster, Prefect) Strong knowledge of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift) Hands-on experience with streaming technologies (Kafka, Kinesis, or similar) Solid understanding of data modelling, governance, and architecture best practices Familiarity with machine learning pipelines or AI model integration …
will: Architect, optimise, and deliver ELT pipelines and data workflows (using dbt, Airflow, SQL, Python). Own and evolve large-scale analytics platforms and data warehouse solutions (Snowflake, BigQuery, Redshift). Mentor engineers across analytics engineering and data engineering disciplines. Ensure data solutions meet best practices for governance, integrity, modelling, and scalability. Drive innovation in data orchestration, schema design …
techniques. What You’ll Bring Proven experience in DevOps or a similar engineering role. Proven experience deploying and managing AWS infrastructure (EC2, S3, RDS, Lambda, Glue, EMR, VPC, IAM, Redshift, etc.). Strong background in Terraform for infrastructure as code. Expertise in CI/CD automation using GitHub Actions. Hands-on experience managing Linux environments and scripting (Python, Bash …
efficiency and deliver scalable, data-driven solutions. What you’ll bring: Strong MLOps experience. Proficiency in Python and SQL. Hands-on cloud engineering experience with AWS & Azure (Redshift, Synapse). Familiarity with Terraform, Power BI, and Excel. A proactive, problem-solving mindset with strong collaboration skills. This role offers the chance to work within a progressive …
data strategy for a high-growth Financial Services or FinTech firm. A Data Engineering background (or similar) with excellent hands-on experience and understanding of Data Warehouses (e.g. BigQuery, Redshift, Snowflake, Databricks), orchestration, data modelling and cloud technology (AWS, GCP, Azure) Strong AI literacy, including leveraging AI to pull insights from structured and unstructured data Excellent communication skills with …
City of London, London, United Kingdom Hybrid/Remote Options
Harnham
will have the following skills and experience: Strong Python (Python frameworks), backend development experience (3+ years) Experience working with AI/ML or big data systems. Familiarity with AWS, Redshift, or other cloud data platforms (GCP/Azure also welcome). Background in AdTech, MarTech or experience with Meta Marketing API or Google Ads API is a plus. Ability …
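As a sketch of the Python backend work described here (the ad does not name a framework, so FastAPI is our choice for illustration), a minimal service exposing campaign-metrics endpoints of the sort an AdTech or MarTech integration might feed:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class CampaignMetrics(BaseModel):
    campaign_id: str
    impressions: int
    clicks: int


# In-memory store standing in for a real warehouse or marketing API.
_metrics: dict[str, CampaignMetrics] = {}


@app.post("/metrics")
def ingest(m: CampaignMetrics) -> CampaignMetrics:
    _metrics[m.campaign_id] = m
    return m


@app.get("/metrics/{campaign_id}")
def read_metrics(campaign_id: str) -> CampaignMetrics:
    if campaign_id not in _metrics:
        raise HTTPException(status_code=404, detail="unknown campaign")
    return _metrics[campaign_id]
```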