City of London, London, United Kingdom Hybrid / WFH Options
Crimson
Lead Data Engineer (x2) - London - Permanent - Hybrid. This role requires hybrid working based in Moorgate, London. The salary for this role ranges between £65,000 and £74,000. Lead Data Engineer required for a leading client based in Moorgate, London.
or Angular good but not necessary) Agile. The following is DESIRABLE, not essential: AWS or GCP; buy-side data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income … work is largely down to you. It can be entirely Back End. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech …
maintaining data-driven microservices and distributed systems. Expertise in managing data across relational and non-relational databases (MySQL, MongoDB). Strong experience with FastAPI, Airflow, Kafka, RabbitMQ. Proven ability to write clean, maintainable code and architect scalable systems using patterns like event-sourcing and CQRS. Bonus Points If You …
Proficiency in effective code management, collaboration and version control: Git. Adequate level of knowledge and experience with some of the following: API, YAML, Kafka, Airflow, JSON, Avro, Parquet. Professional Experience & Education: 7+ years of experience in data engineering; STEM, Finance, or Economics degree preferred, Master's degree a bonus; relevant certification …
the perfect role for you. What You’ll Do: Maintain Stability: Ensure the smooth daily production of cloud data pipelines using tools like Kubernetes, Airflow, and AWS. Create Dashboards: Develop dashboards for risk analytics, signal generation, and trade simulations, used by both the quant team and the wider desk.
data pipelines and products. At least 2 years' experience with SQL, including data modelling (DBT/Dataform). Experience with streaming and batch ETL solutions using Airflow for orchestration. Familiarity with cloud-based services such as GCP, AWS or Azure. Deep understanding of data best practices, e.g. CI/CD and …
Your Profile Key skills/knowledge/experience: Advanced Data Engineering Skills: Proficiency in designing and managing ETL processes using DBT, Python, Terraform and Airflow. Expertise in Cloud Platforms: In-depth knowledge of Snowflake and Azure, with experience in leveraging these platforms for scalable data solutions. Data Architecture and …
with AWS. Experience with modern build tools - Jenkins, GitHub, etc. Experience with Spring Boot or similar API framework. Experience with scheduling services such as Airflow, Oozie. Responsibilities: Developing Data Pipelines: Create and manage robust data pipelines to ensure seamless data flow across our platforms. Optimising Data Storage and Retrieval …
deployment architectures. Tooling & Best Practices: Develop and implement best practices for model development, deployment, and management, leveraging modern MLOps tools like Docker, Kubernetes, MLflow, Airflow, etc. Qualifications: Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field. Experience: 3+ years of …
Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using DBT, Snowflake, Python, SQL, Terraform and Airflow. Experience in designing and implementing data products and solutions on cloud-based architectures. Cloud Platforms: Experience working with cloud data warehouses and analytics platforms …
related to a wider Data Strategy 🔧 Enhance your skills and be part of a function using your expertise in Python, Spark and AWS (Kinesis, Apache Airflow) 🔧 Create pipelines for model evaluations including interactive dashboards, tables, and plots to display insights and projections to non-technical project stakeholders 🔧 Focus …
learn a lot about the world of finance whilst working with the latest technologies. Key Requirements: Expert knowledge of data streaming technologies such as Apache Kafka and Apache Flink. Extensive experience with distributed programming and building distributed systems (Spark, Dask, Airflow, etc.). Strong experience with one or more …
Job title: Platform Engineer – HPC & Kubernetes Client: Elite FinTech Salary: £80,000-£140,000 + Bonus Location: London Skills: HPC, Kubernetes, Slurm, Airflow, Python, Git, Scheduling, Orchestration The role: My client is seeking an Engineer to join a highly technical team focused on the Scheduling side of technology. You … Kubernetes. The role is broad and interacts with many different technical teams throughout the organisation. Core skills required: HPC, Kubernetes, GPU, Slurm. Desirable skills: Airflow, Prefect, GitLab, Docker, ELK, Ansible, Terraform. Please apply ASAP for more information.
the lead in optimising our data infrastructure. With a strong focus on PostgreSQL, PostGIS, and TimescaleDB, you will design APIs, manage data orchestration with Airflow and SQLMesh, and prepare our systems for key security accreditations. This role is ideal for someone passionate about data engineering and excited to work … Development: Design and build secure, scalable APIs to streamline data flow between frontend and backend systems. Data Orchestration: Manage and optimise data pipelines using Airflow and SQLMesh for smooth workflow automation. Off-Cloud Deployment: Use Kubernetes (Talos, RKE2) and storage solutions like Ceph for deploying applications and databases in …/PostGIS, TimescaleDB, and API design. Off-Cloud Architecture Skills: Experience with Kubernetes (Talos, RKE2) and storage solutions like Ceph. Data Orchestration: Proficiency with Airflow and SQLMesh. Strong Security & Compliance Knowledge: Experience with data encryption, RBAC, and secure systems design. Performance Optimisation: Expertise in query optimisation, backup strategies, and …
solutions meet standards for accessibility, security, data quality, and ethical handling. Location: London - can work fully remote. Tech Stack: Python, FastAPI/Flask, Spark, Airflow, SQL, PostgreSQL, SQL Server, Azure Data Lakes, Kubernetes. Salary: Up to £75k (plus benefits: 13% bonus, 12% pension contribution, 30 days' holiday). If this …
are looking for several skilled DV-cleared Data Engineers for a long-term outside-IR35 contract with a Defence customer of mine. Skillset: Python, Apache Spark, Apache Airflow, Apache NiFi, Git/GitHub, object storage servers (i.e. MinIO). Details: Active DV clearance (used within last …
Platforms: Kubernetes (AWS EKS), Helm, Docker Observability: EFK stack, Prometheus, Grafana Persistence: PostgreSQL, MongoDB, Redis, Kafka CI/CD: GitHub Actions, AWS CodeBuild Workflows: Airflow, Kubernetes Languages: Python, Node.js, Go What's on Offer Key role in a growing company: Be a major contributor in a fast-growing startup
years of professional experience in a computer science/computational role Experience working in a technical environment with DevOps functions (Google Cloud, Airflow, InfluxDB, Grafana) Design and implementation of front-office systems for quant trading Highly Valued Relevant Experience Knowledge of machine learning and statistical techniques and related libraries …
tools (e.g., Glue, QuickSight) Proficiency in Python and Java Experience with data lakes and data pipelines Knowledge of ETL processes and tools such as Apache Airflow (experience with Kafka is a plus) Familiarity with data governance, particularly GDPR compliance Strong problem-solving skills and the ability to work …
junior engineers and contribute to continuous learning within the team Technical Stack: Frontend: React.js, Redux Backend: Python Databases: Hive, MongoDB, SQL Server ETL Pipelines: Airflow, Spark, dbt Other: Docker, Git, Test-driven development Requirements: 5+ years of full-stack development experience in Python 5+ years of experience with SQL …
decision-making applications. Cloud Platform Experience: Proficiency with cloud-based platforms like Databricks, Snowflake, and Google BigQuery, and experience deploying workflows with tools like Airflow or Databricks. Product Deployment Track Record: Proven experience taking data science products from conception to production deployment. Commitment to Quality: Strong focus on accuracy …
Terraform would be great. AI/ML techniques are becoming more prominent - SageMaker etc. would be a nice-to-have. Data Platform technologies (Redshift/Airflow etc.) - also a nice-to-have. The interview process will consist of 3 stages: an initial call with the hiring manager followed by a …
It will require 2/3 days onsite in London. The successful candidate will be highly skilled in PySpark, Python, Git, PostgreSQL, SQL and Apache Airflow, Apache Spark, Apache NiFi. Any exposure to Palantir would be greatly desired. Please note: Active DV and sole British nationality …
collaborative design & development Shared code ownership & cross-functional teams Bonus points if you: Have experience with Serverless architectures; are experienced with job orchestration frameworks (e.g. Airflow, MWAA on AWS); have MLOps knowledge and a grasp of basic concepts; have a strong interest in health/fitness technologies. Our tech stack: Below …