Some front-end ability (Vue, React or Angular is good but not necessary); Agile. The following is DESIRABLE, not essential: AWS or GCP; buy-side experience; data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer (Software Engineer / Programmer / Developer, Python, Fixed Income, JavaScript, Node) … requires the team to be in the office 1-2 times a week. The tech environment is very new and will likely soon include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours …
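To illustrate the kind of work the data tools above imply, here is a minimal sketch of querying Athena from Python with boto3. The database, table, column names, region and S3 output location are hypothetical placeholders, not details from the role.

```python
# Minimal sketch: run an Athena query from Python with boto3 and poll for the result.
# The database, table, region and S3 output location are hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="eu-west-2")  # assumed region

def run_query(sql: str, database: str, output_s3: str) -> list[dict]:
    """Submit a query, wait for completion, and return the raw result rows."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {qid} finished in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

if __name__ == "__main__":
    rows = run_query(
        "SELECT isin, duration_years, yield_to_maturity FROM bond_analytics LIMIT 10",
        database="fixed_income",            # hypothetical database
        output_s3="s3://example-athena-results/",  # hypothetical bucket
    )
    print(rows)
```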
London, England, United Kingdom Hybrid / WFH Options
Fitch Ratings
Agile/SCRUM environment. What Would Make You Stand Out: Experience working with Kafka data streams, big data, Postgres DB and data lakes is a plus; experience working with Amazon AWS services like Athena, S3, Glue, Lambda, Starburst etc. is a plus; understanding of Continuous Integration / Continuous Delivery/Deployment concepts and CI tools like Jenkins/Bamboo …
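As a rough illustration of the Kafka-to-Postgres work this listing alludes to, here is a minimal sketch using kafka-python and psycopg2; the topic, table, credentials and event fields are invented placeholders.

```python
# Minimal sketch: consume a Kafka data stream and land records in Postgres.
# Topic, connection details, table and event fields are hypothetical placeholders.
import json

import psycopg2
from kafka import KafkaConsumer  # kafka-python

consumer = KafkaConsumer(
    "ratings-events",                      # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="ratings-loader",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

conn = psycopg2.connect("dbname=analytics user=loader password=secret host=localhost")
cur = conn.cursor()

for message in consumer:
    event = message.value
    # Insert each event into a staging table; commit per message for simplicity.
    cur.execute(
        "INSERT INTO rating_events (entity_id, rating, observed_at) VALUES (%s, %s, %s)",
        (event["entity_id"], event["rating"], event["observed_at"]),
    )
    conn.commit()
```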
such as Bootstrap, Material UI, and Tailwind CSS. Strong understanding of API protocols and standards, including REST and GraphQL. Hands-on experience with AWS services such as S3, Lambda, Athena, EC2, SQS, RDS, DynamoDB, etc. Experience with CI/CD pipelines, automated testing, Git and GitHub, containerization, and infrastructure as code (IaC) tools like Terraform. Solid understanding of agile …
London, England, United Kingdom Hybrid / WFH Options
Fitch Group
Agile/SCRUM environment. What Would Make You Stand Out: Experience working with Kafka data streams, big data, Postgres DB and data lakes is a plus; experience working with Amazon AWS services like Athena, S3, Glue, Lambda, Starburst etc. is a plus; understanding of Continuous Integration / Continuous Delivery/Deployment concepts and CI tools like Jenkins/Bamboo …
field. At least 10 years of experience in data engineering, data architecture, or software engineering. Proficiency in Python and SQL. Proficiency in AWS data services such as S3, Glue, Athena, Redshift, EMR, Kinesis, Lambda, etc. Strong knowledge of data lake concepts, architectures, and design patterns. Experience in building and managing data pipelines using tools such as Airflow, Spark, Kinesis …
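A minimal sketch of the kind of pipeline orchestration mentioned above, written as an Airflow DAG with placeholder task logic; the DAG id, schedule and task bodies are assumptions for illustration only.

```python
# Minimal sketch: a daily Airflow DAG with extract -> transform -> load steps.
# The DAG id, schedule and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Pull the day's raw files from S3 (placeholder logic).
    print("extracting", context["ds"])

def transform(**context):
    # Clean and reshape the extracted data (placeholder logic).
    print("transforming", context["ds"])

def load(**context):
    # Load the transformed data into the warehouse (placeholder logic).
    print("loading", context["ds"])

with DAG(
    dag_id="daily_market_data",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```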
Hands-on practical experience in system design, application development, testing, and operational stability. Cloud implementation experience with AWS, including: AWS data services: Glue ETL (or EMR), S3, Glue Catalog, Athena, Lambda + Step Functions + EventBridge, ECS; data de/serialization: Parquet and JSON formats; AWS data security: good understanding of security concepts such as IAM and service roles …
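To make the Parquet/JSON de/serialization point concrete, here is a minimal PySpark sketch that reads JSON from S3 and writes partitioned Parquet. Bucket names and columns are hypothetical, and the same logic could be packaged as a Glue ETL or EMR job rather than run standalone.

```python
# Minimal sketch: read raw JSON from S3, write it back out as partitioned Parquet.
# Bucket names and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("json-to-parquet").getOrCreate()

# Read newline-delimited JSON events from a raw zone.
raw = spark.read.json("s3://example-raw-bucket/trades/2024/")

# Light cleanup: derive a partition column and drop duplicate trade ids.
cleaned = (
    raw.withColumn("trade_date", F.to_date("trade_timestamp"))
       .dropDuplicates(["trade_id"])
)

# Serialize to Parquet, partitioned by date, in a curated zone.
(cleaned.write
    .mode("overwrite")
    .partitionBy("trade_date")
    .parquet("s3://example-curated-bucket/trades/"))
```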
Life Cycle, agile methodologies such as CI/CD, Application Resiliency, and Security. Cloud implementation experience with AWS, including: AWS data services: Glue ETL (or EMR), S3, Glue Catalog, Athena, Lambda + Step Functions + EventBridge, ECS; data de/serialization: Parquet and JSON formats; AWS data security: good understanding of security concepts such as IAM and service roles …
Functions, and Kinesis. Work with structured and unstructured data from multiple sources, ensuring efficient data ingestion, transformation, and storage. Develop and optimize data lake and data warehouse solutions using Amazon S3, Redshift, Athena, and Lake Formation. Implement data governance, security, and compliance best practices, including IAM roles, encryption, and access controls. Monitor and optimize performance of data workflows … engineering with a strong focus on AWS cloud technologies. Proficiency in Python, PySpark, SQL, and AWS Glue for ETL development. Hands-on experience with AWS data services, including Redshift, Athena, Glue, EMR, and Kinesis. Strong knowledge of data modeling, warehousing, and schema design. Experience with event-driven architectures, streaming data, and real-time processing using Kafka or Kinesis. Expertise …
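A minimal sketch of the event-driven pattern described above: a Lambda handler that unpacks Kinesis records and lands them in S3 with KMS encryption. The bucket name, key scheme and payload fields are hypothetical.

```python
# Minimal sketch: an AWS Lambda handler that unpacks Kinesis records and stores
# them in S3 with KMS encryption. Bucket name and payload fields are hypothetical.
import base64
import json
import uuid

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake-raw"  # hypothetical bucket

def handler(event, context):
    """Triggered by a Kinesis event source mapping."""
    records = event.get("Records", [])
    for record in records:
        # Kinesis payloads arrive base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        key = f"events/{payload.get('event_type', 'unknown')}/{uuid.uuid4()}.json"
        s3.put_object(
            Bucket=BUCKET,
            Key=key,
            Body=json.dumps(payload).encode("utf-8"),
            ServerSideEncryption="aws:kms",  # encrypt at rest with KMS
        )
    return {"processed": len(records)}
```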
commodities. Familiarity with quantitative finance and electronic trading concepts. Experience developing dashboards and data visualization applications with Plotly, Matplotlib, Bokeh, Dash, etc. Experience with AWS technologies such as S3, Athena, SQS, Batch, Lambda. Experience with DevOps practices using containerization and orchestration tools like Docker and Kubernetes.
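For the dashboarding side, here is a minimal Plotly Dash sketch over synthetic data; in the role described above the series would come from S3/Athena rather than being generated in memory.

```python
# Minimal sketch: a small Dash app plotting a synthetic intraday price series.
# In practice the data would be pulled from S3/Athena instead of generated here.
import numpy as np
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

# Synthetic data standing in for real market data.
prices = pd.DataFrame({
    "time": pd.date_range("2024-01-02 08:00", periods=390, freq="min"),
    "price": 100 + np.cumsum(np.random.normal(0, 0.05, 390)),
})

app = Dash(__name__)
app.layout = html.Div([
    html.H3("Intraday price"),
    dcc.Graph(figure=px.line(prices, x="time", y="price")),
])

if __name__ == "__main__":
    app.run(debug=True)
```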
Security. In-depth knowledge of the financial services industry and its IT systems. Cloud implementation experience with AWS, including: AWS data services: Glue ETL (or EMR), S3, Glue Catalog, Athena, Lambda + Step Functions + EventBridge, ECS; data de/serialization: Parquet and JSON formats; AWS data security: good understanding of security concepts such as IAM and service roles …
to design innovative data solutions that address complex business requirements and drive decision-making. Your skills and experience: Proficiency with AWS tools: demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS and AWS Step Functions. Programming skills: strong experience with modern programming languages …
forward. It's an exciting time, and to continue our growth, we are recruiting a Senior Software Engineer focusing on Python for our Software Team. Our Tech Stack: AWS, Athena SQL, Athena Spark, ECS, Azure, Azure Synapse SQL & Spark, Python, Flask, FastAPI, Redis, Postgres, React, Plotly, Docker. We will potentially add GCP and on-premise in the …
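A minimal sketch combining two items from this stack, FastAPI and Redis, as a read-through cache; the endpoint, key scheme, downstream query and sample data are invented for illustration.

```python
# Minimal sketch: a FastAPI endpoint with a Redis read-through cache.
# Host names, the route, and the downstream query are hypothetical placeholders.
import json

import redis
from fastapi import FastAPI

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_positions_from_warehouse(book: str) -> list[dict]:
    # Placeholder for an Athena/Postgres query.
    return [{"book": book, "isin": "XS0000000000", "quantity": 1_000_000}]

@app.get("/positions/{book}")
def get_positions(book: str):
    key = f"positions:{book}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)          # serve from cache when present
    positions = load_positions_from_warehouse(book)
    cache.set(key, json.dumps(positions), ex=60)  # cache for 60 seconds
    return positions
```

Run with `uvicorn app:app` (assuming the file is named app.py).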
Washington, Washington DC, United States Hybrid / WFH Options
Metronome LLC
and services that expose AI capabilities to internal and external consumers. Data Tool Development: design and implement data pipelines, dashboards, and analytics tools using AWS services such as Glue, Athena, Redshift, and QuickSight; automate data ingestion, transformation, and visualization workflows. Cloud Engineering (AWS): deploy and manage applications using AWS services, including Lambda, ECS, S3, CloudFormation, and CDK. Implement CI …
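On the CloudFormation/CDK side, here is a minimal AWS CDK v2 (Python) sketch defining an S3 bucket and a Lambda function; the stack name, construct names and asset path are hypothetical.

```python
# Minimal sketch: AWS CDK v2 (Python) stack with an S3 bucket and a Lambda function.
# Stack, construct names and the local asset path are hypothetical placeholders.
from aws_cdk import App, Stack, aws_lambda as _lambda, aws_s3 as s3
from constructs import Construct

class DataToolsStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Bucket for curated data, with versioning enabled.
        bucket = s3.Bucket(self, "CuratedDataBucket", versioned=True)

        # Ingest Lambda packaged from a local "lambda" folder containing ingest.py.
        ingest_fn = _lambda.Function(
            self, "IngestFunction",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="ingest.handler",
            code=_lambda.Code.from_asset("lambda"),
        )
        bucket.grant_read_write(ingest_fn)

app = App()
DataToolsStack(app, "DataToolsStack")
app.synth()
```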
solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Leverage AWS services (e.g., S3, EC2, Athena, Lambda, Glue) to build scalable and secure cloud solutions. Produce architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code …
common databases (RDBMS and NoSQL), graph databases (such as GraphDB), and storage solutions. Knowledge of cloud development practices and API development utilising technologies such as AWS Lambda functions, AWS Athena, AWS Glue, AWS Step Functions, etc. Software engineering best practices, including DevOps, CI/CD, Agile, and infrastructure-as-code (particularly Terraform). Knowledge of search tooling (OpenSearch and …
Pandas or PySpark) and SQL, with exposure to ETL/orchestration tools such as Airflow or dbt. Strong knowledge of cloud-native services on AWS (e.g., S3, Glue, Lambda, Athena) and Azure (Data Factory, Data Lake). Track record of collaborating with scientific teams and translating research needs into scalable data solutions. Preferred qualifications: experience with cheminformatics libraries (e.g. …
Experience working with a big-data stack, including (but not limited to) Spark/Hadoop, Kafka, Aerospike/DynamoDB. Experience with the AWS tech stack, including but not limited to EMR, Athena, EKS. Expert knowledge of multi-threading, memory models, etc. Understanding of database fundamentals and MySQL knowledge. Experience with CI/CD tools such as Jenkins, Graphite, Grafana and Docker. Knowledge of …
of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory, Fivetran). Desirable: experience deploying AI/ML models in production environments; familiarity with AWS data services (e.g., S3, Glue, Kinesis, Athena); exposure to real-time data streaming and analytics paradigms. Skills: Data Engineering, Snowflake. What you can expect from us: together, as owners, let’s turn meaningful insights …
Penryn, England, United Kingdom Hybrid / WFH Options
Aspia Space
data engineering, data architecture, or similar roles. • Expert proficiency in Python, including popular data libraries (Pandas, PySpark, NumPy, etc.). • Strong experience with AWS services, specifically S3, Redshift, Glue (Athena a plus). • Solid understanding of applied statistics. • Hands-on experience with large-scale datasets and distributed systems. • Experience working across hybrid environments: on-premise HPCs and cloud platforms.
Fairfax, Virginia, United States Hybrid / WFH Options
Metronome LLC
and services that expose AI capabilities to internal and external consumers. Data Tool Development: design and implement data pipelines, dashboards, and analytics tools using AWS services such as Glue, Athena, Redshift, and QuickSight; automate data ingestion, transformation, and visualization workflows. Cloud Engineering (AWS): deploy and manage applications using AWS services, including Lambda, ECS, S3, CloudFormation, and CDK. Implement CI …
end delivery of complex features, ideally having worked with peers of different levels to complete projects collaboratively. Our technology stack: Python (including FastAPI, OpenTelemetry, procrastinate, SQLAlchemy, Uvicorn), Postgres, MySQL, Athena, Liquibase, Retool, Docker, AWS. Who you are: a professional history in software engineering with deep knowledge of the technologies in our stack; proven experience in making technology decisions …
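A minimal sketch of the SQLAlchemy/Postgres portion of this stack: one declarative model and a session round-trip. The table, columns and connection string are invented for illustration.

```python
# Minimal sketch: a SQLAlchemy model and session round-trip against Postgres.
# The table, columns and connection string are hypothetical placeholders.
from sqlalchemy import Column, DateTime, Integer, String, create_engine, func
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Feature(Base):
    __tablename__ = "features"

    id = Column(Integer, primary_key=True)
    name = Column(String(128), nullable=False, unique=True)
    created_at = Column(DateTime, server_default=func.now())

engine = create_engine("postgresql+psycopg2://app:secret@localhost/appdb")
Base.metadata.create_all(engine)  # create the table if it does not exist

with Session(engine) as session:
    session.add(Feature(name="complex-feature-flag"))
    session.commit()
    print(session.query(Feature).count(), "features stored")
```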
flows using Apache Kafka, Apache NiFi and MySQL/PostgreSQL. Develop within the components in the AWS cloud platform using services such as Redshift, SageMaker, API Gateway, QuickSight, and Athena. Communicate with data owners to set up and ensure configuration parameters. Document SOPs related to streaming configuration, batch configuration or API management, depending on role requirements. Document details of … and problem-solving skills. Experience in instituting data observability solutions using tools such as Grafana, Splunk, AWS CloudWatch, Kibana, etc. Experience in container technologies such as Docker, Kubernetes, and Amazon EKS. Qualifications: ability to obtain an active Secret clearance or higher; Bachelor's degree in Computer Science, Engineering, or other technical discipline required, or a minimum of 8 years equivalent …
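For the data observability point, here is a minimal sketch that publishes a custom CloudWatch metric with boto3 so pipeline health can be monitored; the namespace, metric name, dimension and region are hypothetical.

```python
# Minimal sketch: publish a custom CloudWatch metric from a streaming/batch job.
# The namespace, metric name, dimension value and region are hypothetical placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def report_records_processed(pipeline: str, count: int) -> None:
    """Emit a RecordsProcessed data point tagged with the pipeline name."""
    cloudwatch.put_metric_data(
        Namespace="DataPipelines",
        MetricData=[{
            "MetricName": "RecordsProcessed",
            "Dimensions": [{"Name": "Pipeline", "Value": pipeline}],
            "Value": count,
            "Unit": "Count",
        }],
    )

if __name__ == "__main__":
    report_records_processed("kafka-to-redshift", 1_250)
```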
will be working on complex data problems in a challenging and fun environment, using some of the latest big data open-source technologies like Apache Spark, as well as Amazon Web Services technologies including Elastic MapReduce, Athena and Lambda to develop scalable data solutions. Key Responsibilities: Adhering to Company Policies and Procedures with respect to Security, Quality and …
and customise them for different use cases. Develop data models and Data Lake designs around stated use cases to capture KPIs and data transformations. Identify relevant AWS services, such as Amazon EMR, Redshift, Athena, Glue, Lambda, to design an architecture that can support client workloads/use cases; evaluate pros/cons among the identified options to arrive at …