Some front-end ability (Vue, React or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer / Programmer / Developer, Python, Fixed Income, JavaScript, Node) … requires the team to be in the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech attitude towards technology.
London, England, United Kingdom Hybrid / WFH Options
Fitch Ratings
Agile/SCRUM environment What Would Make You Stand Out Experience working with Kafka data streams, big data, Postgres and data lakes is a plus Experience working with Amazon AWS services like Athena, S3, Glue, Lambda, Starburst, etc. is a plus Understanding of Continuous Integration/Continuous Delivery (CI/CD) concepts and CI tools like Jenkins/Bamboo
such as Bootstrap, Material UI, and Tailwind CSS. Strong understanding of API protocols and standards, including REST and GraphQL. Hands-on experience with AWS services such as S3, Lambda, Athena, EC2, SQS, RDS, DynamoDB, etc. Experience with CI/CD pipelines, automated testing, Git and GitHub, containerization, and infrastructure as code (IaC) tools like Terraform. Solid understanding of agile
field. At least 10 years of experience in data engineering, data architecture, or software engineering. Proficiency in Python and SQL Proficient in AWS data services such as S3, Glue, Athena, Redshift, EMR, Kinesis, Lambda, etc. Strong knowledge of data lake concepts, architectures, and design patterns. Experience in building and managing data pipelines using tools such as Airflow, Spark, Kinesis
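Several of the listings above pair S3, Glue and Athena with data-lake design patterns. One common building block behind that combination is Hive-style partitioning, where the S3 key itself encodes partition columns so Athena can prune scans. A minimal sketch; the bucket and table names are placeholders, not taken from any listing:

```python
from datetime import date

def partition_path(bucket: str, table: str, d: date) -> str:
    """Build a Hive-style S3 prefix (year=/month=/day=) so that Glue
    crawlers and Athena can treat the date parts as partition columns
    and prune scans to only the matching prefixes."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={d.year}/month={d.month:02d}/day={d.day:02d}/"
    )

print(partition_path("my-lake", "trades", date(2024, 3, 7)))
# s3://my-lake/trades/year=2024/month=03/day=07/
```

Zero-padding the month and day keeps the prefixes lexicographically sortable, which is what makes range predicates on the partition columns cheap.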
Hands-on practical experience in system design, application development, testing, and operational stability Cloud implementation experience with AWS including: AWS Data Services: Glue ETL (or) EMR, S3, Glue Catalog, Athena, Lambda + Step Functions + EventBridge, ECS Data De/Serialization: Parquet and JSON formats AWS Data Security: good understanding of security concepts such as IAM, service roles
Functions, and Kinesis. Work with structured and unstructured data from multiple sources, ensuring efficient data ingestion, transformation, and storage. Develop and optimize data lake and data warehouse solutions using Amazon S3, Redshift, Athena, and Lake Formation. Implement data governance, security, and compliance best practices, including IAM roles, encryption, and access controls. Monitor and optimize performance of data workflows … engineering with a strong focus on AWS cloud technologies. Proficiency in Python, PySpark, SQL, and AWS Glue for ETL development. Hands-on experience with AWS data services, including Redshift, Athena, Glue, EMR, and Kinesis. Strong knowledge of data modeling, warehousing, and schema design. Experience with event-driven architectures, streaming data, and real-time processing using Kafka or Kinesis. Expertise
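The ingestion-and-transformation responsibilities described in this listing usually come down to per-record normalisation before data lands in S3. A hedged sketch in plain Python; every field name here is hypothetical, and a real Glue/PySpark job would express the same logic over a DataFrame rather than one dict at a time:

```python
from datetime import datetime, timezone

def transform_trade(record: dict) -> dict:
    """Normalise one raw record before writing it to the data lake.
    All field names are illustrative, not from any real pipeline."""
    return {
        "trade_id": str(record["id"]),
        # canonicalise tickers so downstream joins don't miss on case/whitespace
        "symbol": record["symbol"].strip().upper(),
        "notional": float(record["notional"]),
        # store timestamps as UTC ISO-8601 strings, which Athena parses natively
        "executed_at": datetime.fromtimestamp(record["ts"], tz=timezone.utc).isoformat(),
    }

print(transform_trade({"id": 42, "symbol": " aapl ", "notional": "1500.5", "ts": 0}))
```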
commodities Familiarity with quantitative finance and electronic trading concepts Experience developing dashboards and data visualization applications with Plotly, Matplotlib, Bokeh, Dash, etc. Experience with AWS technologies such as S3, Athena, SQS, Batch, Lambda Experience with DevOps practices using containerization and orchestration tools like Docker and Kubernetes
to design innovative data solutions that address complex business requirements and drive decision-making. Your skills and experience Proficiency with AWS Tools: Demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS and AWS Step Functions. Programming Skills: Strong experience with modern programming languages
forward. It's an exciting time, and to continue our growth, we are recruiting a Senior Software Engineer focusing on Python for our Software Team. Our Tech Stack: AWS, Athena SQL, Athena Spark, ECS, Azure, Azure Synapse SQL & Spark, Python, Flask, FastAPI, Redis, Postgres, React, Plotly, Docker. We will potentially add GCP and on-premise in the
common databases (RDBMS and NoSQL), Graph Databases (such as GraphDB), and storage solutions. Knowledge of cloud development practices and API development utilising technologies such as AWS Lambda functions, AWS Athena, AWS Glue, AWS Step Functions etc. Software engineering best practices, including DevOps, CI/CD, Agile, and infrastructure-as-code (particularly Terraform). Knowledge of search tooling (OpenSearch and
Experience working with big-data stack, including (but not limited to) Spark/Hadoop, Kafka, Aerospike/DynamoDB Experience with AWS tech stack, including but not limited to EMR, Athena, EKS Expert knowledge of multi-threading, memory model, etc. Understanding of database fundamentals and MySQL knowledge Experience with CI/CD and observability tools such as Jenkins, Graphite, Grafana and Docker Knowledge of
of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory, Fivetran) Desirable: - Experience deploying AI/ML models in production environments - Familiarity with AWS data services (e.g., S3, Glue, Kinesis, Athena) - Exposure to real-time data streaming and analytics paradigms Skills: Data Engineering Snowflake What you can expect from us: Together, as owners, let’s turn meaningful insights
end delivery of complex features, ideally having worked with peers of different levels to complete projects collaboratively. Our technology stack: Python (including FastAPI, OpenTelemetry, procrastinate, SQLAlchemy, Uvicorn), Postgres, MySQL, Athena, Liquibase, Retool, Docker, AWS Who you are: A professional history in software engineering with a deep knowledge of the technologies in our stack Proven experience in making technology decisions
and customise them for different use cases. Develop data models and Data Lake designs around stated use cases to capture KPIs and data transformations. Identify relevant AWS services – Amazon EMR, Redshift, Athena, Glue, Lambda – to design an architecture that can support client workloads/use-cases; evaluate pros/cons among the identified options to arrive at
Use Terraform to automate infrastructure provisioning, deployment, and configuration, ensuring efficiency and repeatability in cloud environments. Database Design & Optimisation: Design and optimise complex SQL queries, and relational databases (e.g., Amazon Redshift, PostgreSQL, MySQL) to enable fast, efficient data retrieval and analytics. Data Transformation: Apply ETL/ELT processes to transform raw financial data into usable insights for business intelligence … understanding of data engineering concepts, including data modelling, ETL/ELT processes, and data warehousing. Proven experience with AWS services (e.g., S3, Redshift, Lambda, ECS, ECR, SNS, EventBridge, CloudWatch, Athena etc.) for building and maintaining scalable data solutions in the cloud. Technical Skills (must have): Python: Proficient in Python for developing custom ETL solutions, data processing, and integration with
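The ELT pattern this listing describes (land raw rows first, then transform with SQL inside the warehouse) can be sketched with sqlite3 standing in for Amazon Redshift/PostgreSQL so the example runs anywhere; the table and column names are made up for illustration:

```python
import sqlite3

# sqlite3 is a stand-in for Redshift/PostgreSQL purely to keep the
# sketch self-contained; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_trades (symbol TEXT, notional REAL)")
conn.executemany(
    "INSERT INTO raw_trades VALUES (?, ?)",
    [("AAPL", 100.0), ("AAPL", 50.0), ("MSFT", 200.0)],
)
# the 'T' of ELT: aggregate inside the database rather than in app code
rows = conn.execute(
    "SELECT symbol, SUM(notional) AS total"
    " FROM raw_trades GROUP BY symbol ORDER BY symbol"
).fetchall()
print(rows)  # [('AAPL', 150.0), ('MSFT', 200.0)]
```

Pushing the aggregation into the SQL engine rather than looping in Python is exactly the trade-off that makes columnar warehouses like Redshift fast at analytics.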
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
Key Responsibilities Design, build, and maintain robust data pipelines using AWS services (Glue, Lambda, Step Functions, S3, etc.) Develop and optimize data lake and data warehouse solutions using Redshift, Athena, and related technologies Collaborate with data scientists, analysts, and business stakeholders to understand data requirements Ensure data quality, governance, and compliance with financial regulations Implement CI/CD pipelines … Proven experience as a Data Engineer working in cloud environments (AWS) Strong proficiency with Python and SQL Extensive hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation) Solid understanding of data modeling, ETL frameworks, and
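Pipelines built from Lambda and Step Functions, as in the responsibilities above, are typically composed of small handlers that validate and route records. A minimal, locally runnable sketch; the event shape and field names are assumptions for illustration, not any employer's contract:

```python
import json

def handler(event: dict, context: object = None) -> dict:
    """One Lambda-style step: keep well-formed records, count rejects.
    In a real pipeline the rejects would be routed to a dead-letter
    queue (e.g., SQS) instead of just being counted."""
    good, bad = [], []
    for rec in event.get("records", []):
        if rec.get("id") and isinstance(rec.get("amount"), (int, float)):
            good.append(rec)
        else:
            bad.append(rec)
    return {
        "statusCode": 200,
        "body": json.dumps({"accepted": len(good), "rejected": len(bad)}),
    }
```

Because the handler is a plain function of its event, it can be unit-tested locally before it is ever deployed, which is where the CI/CD requirement in these listings bites.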
case to adopt new technologies Develop new tools and infrastructure using Python (Flask/Fast API) or Java (Spring Boot) and relational data backend (AWS - Aurora/Redshift/Athena/S3) Support users and operational flows for quantitative risk, senior management and portfolio management teams using the tools developed Qualifications/Skills Required Advance degree in computer science More ❯
a core hub—making it easy and safe for teams to use and contribute to data systems. You'll work with services like Lambda, S3, LakeFormation, Glue, Step Functions, Athena, EventBridge, SNS, SQS, and DynamoDB, and will be expected to navigate and manage data systems with a high degree of rigour and compliance. Familiarity with additional tools such as More ❯
structured queries Hands-on experience with DBT, building and maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance technical
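The dimensional modelling mentioned above is easy to show in miniature: split denormalised rows into a dimension keyed by surrogate ids plus a fact table that references them. In practice this would live in DBT models over a warehouse; the Python below is only a sketch with made-up field names:

```python
def build_star_schema(rows: list[dict]) -> tuple[dict, list[dict]]:
    """Split flat rows into a customer dimension (natural key ->
    surrogate key) and a fact table referencing the surrogate keys.
    Field names are illustrative only."""
    dim_customer: dict[str, int] = {}
    facts: list[dict] = []
    for row in rows:
        # assign the next surrogate key the first time a customer appears
        sk = dim_customer.setdefault(row["customer"], len(dim_customer) + 1)
        facts.append({"customer_sk": sk, "amount": row["amount"]})
    return dim_customer, facts
```

Keeping descriptive attributes in the dimension and only the surrogate key in the fact table is what keeps fact tables narrow and joins cheap, the core idea behind a star schema.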