City of London, London, United Kingdom Hybrid / WFH Options
Noir
Java Developer – Financial Technology – London/Hybrid (INSIDE IR35) (Key skills: Java, Spring Boot, Kubernetes, AWS EKS, Amazon Redshift, PostgreSQL, RESTful APIs, CI/CD, Microservices, iOS/Android Native, Agile, Financial Services) Are you a highly skilled Java Developer with a passion for building scalable, high-performance financial systems? Do you enjoy working with cutting-edge technologies … integrated with modern mobile and analytics platforms. As part of a collaborative, Agile development team, you will be responsible for building robust microservices, managing data pipelines across PostgreSQL and Amazon Redshift, and delivering performant, secure RESTful APIs used across web and mobile platforms. You’ll work closely with iOS and Android engineers to ensure seamless end-to-end More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Billigence
experience across data engineering, cloud computing, or data warehousing Minimum 2 years in hands-on development capacity Expertise in one or more modern cloud data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar Understanding of data modelling principles, dimensional modelling, and database design Proficiency in SQL and query optimization Comprehensive knowledge of ETL/ELT processes and data More ❯
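Several listings here pair dimensional modelling with SQL and query optimisation. As a minimal sketch of what a star schema looks like in practice, the example below builds one fact table and one dimension table with the stdlib sqlite3 module; the table and column names are invented for illustration, not taken from any specific platform named above.

```python
# Minimal star-schema sketch (dimensional modelling) using stdlib sqlite3.
# Table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes of each product.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact table: numeric measures, keyed to dimensions by foreign key.
cur.execute("CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, product_id INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 10.0), (2, 1, 5.0), (3, 2, 7.5)])

# A typical analytical query: aggregate facts grouped by a dimension attribute.
cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""")
totals = dict(cur.fetchall())
print(totals)  # {'Gadget': 7.5, 'Widget': 15.0}
```

The same join-a-fact-to-its-dimensions shape carries over to Snowflake, Databricks, Redshift, or Fabric; only the DDL dialect changes.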
or similar Cloud data ecosystem (AWS): hands-on experience with core AWS data services. Key services include: S3 for data lake storage; AWS Glue for ETL and data cataloging; Amazon Redshift or Athena for data warehousing and analytics; Lambda for event-driven data processing. ETL/ELT pipeline development: experience in designing, building, and maintaining robust, automated data More ❯
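The S3 → Glue → Redshift flow above cannot run locally, but the extract-transform-load pattern it implements can be sketched with the stdlib alone. In the hypothetical example below, a CSV string stands in for an S3 object, a cleaning function for a Glue job, and an in-memory list for the warehouse load.

```python
# Dependency-free ETL sketch. The AWS stand-ins are noted per stage;
# the data and data-quality rule are made up for illustration.
import csv
import io

RAW = "id,amount\n1,10.50\n2,3.25\n3,-1.00\n"

def extract(raw: str):
    """Extract: parse raw CSV rows (stand-in for an S3 GetObject)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types and drop invalid records (a Glue-job stand-in)."""
    out = []
    for row in rows:
        amount = float(row["amount"])
        if amount >= 0:  # simple data-quality rule: reject negative amounts
            out.append({"id": int(row["id"]), "amount": amount})
    return out

def load(rows, warehouse):
    """Load: append to the target store (stand-in for a Redshift COPY)."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
print(loaded)  # 2 rows survive the quality rule
```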
City of London, London, United Kingdom Hybrid / WFH Options
Az-Tec Talent
data engineering experience, including at least 2 years hands-on with modern data platforms. Strong proficiency in SQL, data modelling, and query optimisation. Practical experience with Snowflake, Databricks, AWS Redshift, or Microsoft Fabric. Solid understanding of ETL/ELT pipelines and data warehousing principles. Strong communication and problem-solving skills. Ability to work both independently and collaboratively More ❯
years of experience in data engineering or analytics, including designing and delivering enterprise-grade data solutions Strong hands-on experience with Snowflake (or comparable cloud data warehouses like BigQuery, Redshift, Synapse) including data modelling, performance tuning and cost management Familiarity with SQL best practices, ELT patterns, and modern data transformation frameworks such as dbt Competence in at least one More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Harrington Starr
sports betting, gaming, or a similar high-volume data environment. Strong coding skills in Python and advanced SQL expertise. Hands-on experience with AWS data tools (S3, Glue, Lambda, Redshift, Kinesis, etc.). Proficiency with Power BI (or equivalent data visualisation tools). Solid experience building and supporting data models for forecasting, pricing, or predictive analytics. Strong communication and More ❯
and risk workflows. What you’ll bring Proven experience as a Data Engineer within financial services, ideally hedge funds or buy-side. Strong expertise in AWS (S3, Glue, Lambda, Redshift, Kinesis, etc.). Advanced skills in Python and SQL for building pipelines and optimising performance. Hands-on experience with Power BI for data visualisation and reporting. Strong understanding of More ❯
. Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of data modelling, governance, and architecture best practices . Excellent leadership, communication, and stakeholder management skills. NICE More ❯
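Streaming technologies (Kafka, Kinesis) recur in these listings. The core of any such consumer is an event-at-a-time loop that updates incremental state; the sketch below shows that loop with a plain Python list standing in for the broker, and made-up click/view events as the payload.

```python
# Consume-and-aggregate loop, the shape of a Kafka/Kinesis consumer.
# The in-memory "stream" and event schema are hypothetical stand-ins.
from collections import defaultdict

def consume(events):
    """Stand-in for a consumer loop: keep a running count per event key."""
    counts = defaultdict(int)
    for event in events:  # in production this would be poll()-ing a broker
        counts[event["key"]] += 1
    return dict(counts)

stream = [{"key": "click"}, {"key": "view"}, {"key": "click"}]
print(consume(stream))  # {'click': 2, 'view': 1}
```

A real consumer adds offset commits, batching, and error handling, but the incremental-state idea is the same.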
City of London, London, United Kingdom Hybrid / WFH Options
Intec Select
Engineer, particularly in data modelling. Strong SQL and hands-on dbt experience. Ability to convert business requirements into logical, scalable data models. Knowledge of cloud data platforms (e.g., Snowflake, Redshift, BigQuery). Strong communication and documentation skills. Structured, detail-oriented mindset. Desirable: Experience with semantic modelling tools (e.g., dbt SL, LookML). Familiarity with workflow orchestration and BI tooling. More ❯
optimize data pipelines and workflows using modern tools (e.g., dbt, Airflow, SQL, Python). Take ownership of large-scale analytics platforms and data warehouse solutions (such as Snowflake, BigQuery, Redshift). Mentor and guide engineers across analytics and data engineering disciplines. Ensure solutions adhere to best practices for data governance, integrity, and scalability. Drive continuous innovation in data orchestration More ❯
Familiarity with Airflow, Dagster or similar data orchestration frameworks Strong understanding of RESTful APIs as well as experience working with both synchronous and asynchronous endpoints Experience with Snowflake or Redshift, with a strong understanding of SQL. Proficient in Python and Pandas Experience working with JSON and XML Strong understanding of cloud computing concepts and services (AWS preferably) Experience with More ❯
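The JSON-and-XML requirement above boils down to extracting the same record from two wire formats. A stdlib sketch, using a made-up trade payload (the field names are illustrative, not from the listing):

```python
# Parse the same hypothetical record from JSON and XML with the stdlib.
import json
import xml.etree.ElementTree as ET

json_payload = '{"trade": {"symbol": "ABC", "qty": 100}}'
xml_payload = '<trade><symbol>ABC</symbol><qty>100</qty></trade>'

# JSON maps directly onto dicts and lists.
j = json.loads(json_payload)["trade"]

# XML needs explicit tree navigation, and text nodes must be cast by hand.
root = ET.fromstring(xml_payload)
x = {"symbol": root.findtext("symbol"), "qty": int(root.findtext("qty"))}

print(j == x)  # True: both formats yield the same record
```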
business impact. Skills Excellent communication skills (written, verbal, and presentation). Strong organisational and planning abilities. Advanced technical expertise across Tableau, MySQL, Python, and modern BI tools (e.g. AWS Redshift). Ability to translate complex data into clear, actionable insights for business leaders. Knowledge Advanced knowledge of statistics and data modelling. Data visualisation and storytelling best practice. Data pipelines More ❯
libraries for data wrangling, such as Pandas, NumPy, and SQLAlchemy. Experience working with traditional SQL databases (e.g. PostgreSQL, MySQL, SQL Server) and cloud data warehouses (e.g. Snowflake, Databricks, BigQuery, Redshift). Experience with time-series data, and/or implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, FactSet…) SDLC and DevOps: Git, Docker, Jenkins/TeamCity More ❯
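Time-series work of the kind this listing describes is usually done with Pandas rolling windows; a dependency-free sketch of the same operation, a trailing moving average over daily prices, is below. The prices and dates are invented sample data.

```python
# Trailing moving average over a daily series, stdlib only.
# (In Pandas this would be Series.rolling(window).mean().)
from datetime import date, timedelta

prices = [100.0, 102.0, 101.0, 105.0, 107.0]
start = date(2024, 1, 1)
series = [(start + timedelta(days=i), p) for i, p in enumerate(prices)]

def moving_average(series, window):
    """Emit (date, mean) once the trailing window is full."""
    out = []
    for i in range(window - 1, len(series)):
        window_vals = [p for _, p in series[i - window + 1 : i + 1]]
        out.append((series[i][0], sum(window_vals) / window))
    return out

ma3 = moving_average(series, 3)
print(ma3[0])  # (datetime.date(2024, 1, 3), 101.0)
```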
City of London, London, United Kingdom Hybrid / WFH Options
Areti Group | B Corp™
sources. Ensure the pipelines are optimized for performance, reliability, and maintainability. 🌳 AWS Expertise: Utilize AWS services for data storage, processing, and analytics. Leverage AWS technologies such as S3, Glue, Redshift, and others to build and maintain data solutions. Implement best practices for security and compliance in an AWS environment. 🌳 Python Programming: Demonstrate proficiency in Python programming for data processing More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Advanced Resource Managers
monitoring and observability toolsets incl. Splunk, Datadog Experience using GitHub Actions Experience using AWS RDS/SQL based solutions Experience using containerization in AWS Working data warehouse knowledge Redshift and Snowflake preferred Working with IaC – Terraform and CloudFormation Working understanding of scripting languages including Python and Shell Experience working with streaming technologies incl. Kafka, Apache Flink Experience More ❯
your team. Lead by example — fostering collaboration, accountability, and agile delivery in every sprint. 🧠 What You Bring Expertise in AWS: Hands-on experience with Python, Glue, S3, Airflow, dbt, Redshift, and RDS. Proven success in end-to-end data engineering — from ingestion to insight. Strong leadership and communication skills, with a collaborative, solution-driven mindset. Experience working in Agile More ❯
Java) and SQL Deep experience with data pipeline orchestration tools (Airflow, dbt, Dagster, Prefect) Strong knowledge of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift) Hands-on experience with streaming technologies (Kafka, Kinesis, or similar) Solid understanding of data modelling, governance, and architecture best practices Familiarity with machine learning pipelines or AI model integration More ❯
City of London, London, England, United Kingdom Hybrid / WFH Options
Ada Meher
Engineering projects Strong experience in Python/PySpark, Databricks & Apache Spark Hands-on experience with both batch & streaming pipelines Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform, etc.) Experience designing Data Engineering platforms from scratch Alongside their commitment to work-life balance the business also has an extremely competitive benefits package including, but not More ❯
like dbt, Airflow, Matillion, Fivetran, or Informatica. Establish architecture blueprints for multi-region, multi-tenant, and secure Snowflake deployments. Lead migration from legacy data warehouses (Teradata, Oracle, SQL Server, Redshift, BigQuery, etc.) to Snowflake. Qualifications we seek in you! Minimum qualifications Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field. Preferred qualifications BE More ❯
techniques. What You’ll Bring Proven experience in DevOps or a similar engineering role. Proven experience deploying and managing AWS infrastructure (EC2, S3, RDS, Lambda, Glue, EMR, VPC, IAM, Redshift, etc.). Strong background in Terraform for infrastructure as code. Expertise in CI/CD automation using GitHub Actions. Hands-on experience managing Linux environments and scripting (Python, Bash More ❯
efficiency and deliver scalable, data-driven solutions. What you’ll bring: Strong MLOps experience. Proficiency in Python and SQL. Hands-on cloud engineering experience with AWS & Azure (Redshift, Synapse). Familiarity with Terraform, Power BI, and Excel. A proactive, problem-solving mindset with strong collaboration skills. This role offers the chance to work within a progressive More ❯
data strategy for a high-growth Financial Services or FinTech firm. A Data Engineering background (or similar) with excellent hands-on experience and understanding of Data Warehouse (e.g. BigQuery, Redshift, Snowflake, Databricks), orchestration, data modelling and cloud technology (AWS, GCP, Azure) Strong AI literacy, including leveraging AI to pull insights from structured and unstructured data Excellent communication skills with More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Harnham
will have the following skills and experience: Strong Python (Python frameworks), backend development experience (3+ years) Experience working with AI/ML or big data systems. Familiarity with AWS, Redshift, or other cloud data platforms (GCP/Azure also welcome). Background in AdTech, MarTech or experience with Meta Marketing API or Google Ads API is a plus. Ability More ❯
in either Java, Golang (Go), Python or Scala – multiple would be even better! • Experience with cloud platforms e.g. AWS, Azure, GCP - Google Cloud • Experience with data warehouses e.g. BigQuery, Redshift, Snowflake • Ability to work collaboratively • Open to new AI tools and technologies • Could come from roles like Senior Backend Engineer or Senior Full Stack Engineer They offer a great More ❯