Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. You MUST have the … The application allows the …
City of London, London, United Kingdom Hybrid / WFH Options
GCS Ltd
harnessing diverse AWS services. Key Requirements: High level of experience in both SQL and Python programming (10+ years) Experience managing data engineering pipelines using Apache Airflow Proficiency in CI/CD pipelines and automation Git proficiency for version control (branching strategies and repo management) Competent in monitoring tools such as …
science or other related engineering fields Plusses: • Experience with React • Experience with MongoDB • Experience working on streaming technologies like Kafka and distributed technologies like Apache Ignite • Experience working on AWS, GCP, Kubernetes, IaC • Experience working with C# • Financial industry experience • Experience with cloud like AWS, GCP, Azure (ideally GCP) …
quickly and apply new skills. Desirable: Solid understanding of microservices development. SQL and NoSQL databases working set. Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, KeyCloak, Serverless Computing, GraphQL, APIs, APIM. Good skills working with JSON, XML, YAML files. Experience working in …
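The listing above asks for comfort working with JSON and XML files. A minimal stdlib-only sketch of that kind of task (merging a JSON config fragment with settings from an XML fragment; all names and structure here are hypothetical, purely for illustration):

```python
import json
import xml.etree.ElementTree as ET

def load_service_config(json_text: str, xml_text: str) -> dict:
    """Merge a JSON config fragment with values from an XML fragment.

    Hypothetical helper: the key names and XML shape are invented for
    illustration, not taken from any real system.
    """
    config = json.loads(json_text)
    root = ET.fromstring(xml_text)
    # Overlay each <setting name="..." value="..."/> element onto the JSON config
    for setting in root.findall("setting"):
        config[setting.get("name")] = setting.get("value")
    return config

if __name__ == "__main__":
    cfg = load_service_config(
        '{"timeout": 30, "retries": 3}',
        '<settings><setting name="endpoint" value="https://example.local/api"/></settings>',
    )
    print(cfg)
```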
developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or PowerBI Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds …
field (STEM) Technical proficiency in cloud-based data solutions (AWS, Azure or GCP), engineering languages including Python, SQL, Java, and pipeline management tools e.g., Apache Airflow. Familiarity with big data technologies, Hadoop, or Spark. If this opportunity is of interest, or you know anyone who would be interested in …
mindset with a desire to solve technical problems and model/forecast intricate real-life systems • Good knowledge of parallel computing techniques (Python multiprocessing, Apache Spark), and performance profiling and optimisation • Good understanding of data structures and algorithms • The ability to communicate complex technical concepts to those with little …
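The parallel-computing requirement above names Python multiprocessing. A minimal sketch of the fan-out pattern it refers to, using a worker pool to run independent scenarios in parallel (the `simulate` workload is a hypothetical stand-in):

```python
from multiprocessing import Pool

def simulate(seed: int) -> float:
    """Toy stand-in for an independent per-scenario computation."""
    total = 0.0
    for i in range(1, 1000):
        total += (seed * i) % 7
    return total

def run_parallel(seeds, workers: int = 4) -> list:
    # Fan the independent scenarios out across worker processes;
    # Pool.map preserves input order in the returned results.
    with Pool(processes=workers) as pool:
        return pool.map(simulate, seeds)

if __name__ == "__main__":
    results = run_parallel(range(8))
    print(len(results))
```

This pattern only pays off when each task is CPU-bound and independent; for shared state or distributed data, the listing's alternative, Apache Spark, is the more common choice.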
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google Cloud …
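The listing above mentions the `csv` standard-library module among the baseline Python skills. A minimal sketch of the kind of row-filtering transform such roles involve (the `status` column and its values are hypothetical):

```python
import csv
import io

def filter_active_rows(csv_text: str) -> list:
    """Return only the rows whose 'status' column equals 'active'.

    Illustrative transform: the column name and values are invented
    for this example, not taken from any real schema.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row["status"] == "active"]

if __name__ == "__main__":
    sample = "id,status\n1,active\n2,closed\n3,active\n"
    print(filter_active_rows(sample))
```

In practice the same step would usually be one task in an Airflow DAG or a pandas/BigQuery operation, but the stdlib version shows the transform itself with no dependencies.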
management and data governance open source platform that we will teach you. Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, BigData, HTML 5, AngularJs/ReactJs, IntelliJ, Gitlab, Jira. Cloud Technologies: You’ll be involved in building the next generation of finance systems …
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
and practices and tools like Jira and Confluence. What technical skills you will have Experience with general Cloud products (Cloud SQL, BigQuery, RedShift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, DBT, Kafka etc. Awareness of data visualisation tools …
Azure Data Lake , Azure Databricks or GCP Cloud Dataproc . Familiarity with big data technologies and distributed computing frameworks, such as Hadoop, Spark, or Apache Flink. Experience scaling an “API-Ecosystem”, designing, and implementing “API-First” integration patterns. Experience working with authentication and authorisation protocols/patterns. Other Information …
structures. Experience of API (REST) development, Docker, and Kubernetes. Familiarity with IntelliJ, Subversion and Maven. Exposure to one or more of the following technologies: Apache Storm, OpenSearch, Cassandra and Kafka. Ability to work within a hybrid Agile methodology. Understand the design and development approaches required to build a scalable …
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool …
is data analysis; experience building an enterprise data platform within asset management or Financial Services is required. Expert-level Java/Python development skills. AWS expert: Apache Iceberg/Spark and Airflow. Please apply for immediate consideration.
Luton, England, United Kingdom Hybrid / WFH Options
Ventula Consulting
models and following best practices. The ability to develop pipelines using SageMaker, MLFlow or similar frameworks. Strong experience with data programming frameworks such as Apache Spark. Understanding of common Data Science and Machine Learning models, libraries and frameworks. This role provides a competitive salary plus excellent benefits package. In …