Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Hypercube Consulting
Azure, GCP) Docker, Kubernetes and container services CI/CD, DevOps Additional experience with the following would be beneficial but not essential: Orchestration tools - Apache Airflow/Prefect/Azure Data Factory Containers and related services (AKS, Container Registry) Other desirable skills and experience Ability to get stuck in …
Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. You MUST have the … The application allows the …
City of London, London, United Kingdom Hybrid / WFH Options
GCS Ltd
harnessing diverse AWS services. Key Requirements: High level of experience in both SQL and Python programming (10+ years) Experience managing data engineering pipelines using Apache Airflow Proficiency in CI/CD pipelines and automation Git proficiency for version control (branching strategies and repo management) Competent in monitoring tools such …
science or other related engineering fields Plusses: • Experience with React • Experience with MongoDB • Experience working on streaming technologies like Kafka and distributed technologies like Apache Ignite • Experience working on AWS, GCP, Kubernetes, IaC • Experience working with C# • Financial industry experience • Experience with cloud like AWS, GCP, Azure (ideally GCP …
team has a strong appetite to embrace the latest tech stacks offered in the industry. We need passionate software engineers who use Java, JavaScript, Python, React, Apache Spark and open-source data science toolkits to build solutions that offer a best-in-class experience to our risk managers and analysts. Our technology …
quickly and apply new skills. Desirable: Solid understanding of microservices development. SQL and NoSQL databases working set. Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, KeyCloak, Serverless Computing, GraphQL, APIs, APIM. Good skills working with JSON, XML, YAML files. Experience working in …
developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or PowerBI Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds …
field (STEM) Technical proficiency in cloud-based data solutions (AWS, Azure or GCP), engineering languages including Python, SQL, Java, and pipeline management tools e.g., Apache Airflow. Familiarity with big data technologies, Hadoop, or Spark. If this opportunity is of interest, or you know anyone who would be interested in …
mindset with a desire to solve technical problems and model/forecast intricate real-life systems • Good knowledge of parallel computing techniques (Python multiprocessing, Apache Spark), and performance profiling and optimisation • Good understanding of data structures and algorithms • The ability to communicate complex technical concepts to those with little …
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google Cloud …
management and data governance open source platform that we will teach you. Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, BigData, HTML 5, AngularJs/ReactJs, IntelliJ, Gitlab, Jira. Cloud Technologies: You’ll be involved in building the next generation of finance systems …
Terraform/Docker/Kubernetes. Write software using either Java/Scala/Python. The following are nice to have, but not required - Apache Spark jobs and pipelines. Experience with any functional programming language. Database design concepts. Writing and analysing SQL queries. Application over VIOOH Our recruitment team will …
Modelling. Experience with at least one or more of these programming languages: Python, Scala/Java Experience with distributed data and computing tools, mainly Apache Spark & Kafka Understanding of critical path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with …
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
and practices and tools like Jira and Confluence. What technical skills you will have Experience with general Cloud products (Cloud SQL, BigQuery, RedShift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, DBT, Kafka etc. Awareness of data visualisation tools …
with impressive visualization (Power BI) · Experience in building large scale DW/BI systems for B2B SAAS companies · Experience with open-source tools like Apache Flink and AWS tools like S3, Redshift, EMR and RDS. · Experience with AI/Machine Learning and Predictive Analytics · Experience in developing global products …