City of London, London, United Kingdom Hybrid / WFH Options
Winston Fox
in a Hedge Fund, Trading Firm, and/or working with Quant Trading Technology. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other cloud data warehouses, preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, FactSet, ...). SDLC and DevOps …
City of London, London, United Kingdom Hybrid / WFH Options
Winston Fox
… years of experience gained in a Hedge Fund, Investment Bank, FinTech or similar. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other cloud data warehouses, preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, FactSet, ...). SDLC and DevOps: Git …
Maintain and develop data warehouses. Provide guidance and mentorship to junior team members. To be successful in this role you will have: extensive experience with Snowflake; experience with dbt, Airflow or Python; cloud data engineering experience with AWS/Azure/GCP. This is a hybrid role based out of the company's London office, with some of the benefits including …
London, England, United Kingdom Hybrid / WFH Options
Plutus
experience advantageous. Knowledge of advanced machine learning algorithms and statistics. Experience in Natural Language Processing. CI/CD knowledge. Experience with JIRA/Asana for project management. Familiarity with Airflow, Fivetran, Matillion, or other ETL/ELT tools. Agile working methodology. Why Choose BI:PROCSI? BI:PROCSI offers a unique work environment focused on innovation and personal growth. As …
warehousing concepts and data modeling. Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions. Understanding of and hands-on experience with orchestration solutions such as Airflow. Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability …
in data engineering or a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow or Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL …
implementing complex data pipelines at scale. Strong knowledge of distributed computing frameworks (Spark, Hadoop ecosystem). Experience with cloud-based data platforms (AWS, Azure, GCP). Proficiency in data orchestration tools (Airflow, Prefect, Dagster, or similar). Solid programming skills in Python, Scala, or Java. Experience integrating ML workflows into production data systems. Strong understanding of data modeling, ETL processes, and database …
/Haskell/F# etc.). Nice to have: a CompSci degree from a top-rated uni; experience in a fast-paced startup environment; task orchestration frameworks (e.g. Luigi, Dask, Airflow + Celery, etc.); experience owning or being involved longer-term in an open-source project; demonstrable Rust experience or keen interest; data pipelines and big data tech; Docker: both …
data pipeline development and management. Hands-on experience with data processing tools such as Spark or ClickHouse. Proficiency in Pandas or other DataFrame libraries. Familiarity with orchestration tools like Airflow, Luigi, or Prefect. Our interview process … Company benefits: after passing probation, employees are entitled to critical illness insurance, life insurance, and private medical insurance. The company offers 5 WFA …
Sunderland, England, United Kingdom Hybrid / WFH Options
Client Server
segmentation techniques. You have coding skills with Python (or C#) and SQL. You have experience with SQL databases (e.g. Amazon Redshift, PostgreSQL). You have experience with data tooling (e.g. Airflow, dbt, AWS Kinesis). You have strong analysis and problem-solving skills. You're collaborative with excellent communication skills. What's in it for you: competitive salary to £55k depending …
similar role. Strong Python development skills, particularly for developing and testing data pipelines. Experience with cloud-based data warehousing (BigQuery preferred). Hands-on experience with orchestration tools (preferably Airflow). Proficient in SQL and data modelling best practices. Experience with DBT or other modern data transformation frameworks. Ability to use a version control system (e.g. git) for code …
services. Take care of the Data Platform. Write dbt models for Core Datamarts. You have: strong knowledge of Python and SQL (any syntax), preferably BigQuery, ClickHouse, Postgres. Production experience with Airflow, ClickHouse or BigQuery, dbt, Git. General understanding of and experience with GCP or AWS. Friendliness and willingness to help colleagues. English level: B2+. At P2P.org, we have a team of …
as well as partner and customer organisations. Requirements: experience with ETL pipeline solutions for the ingestion, transformation, and serving of data, utilising technologies such as AWS Step Functions or Apache Airflow. Good understanding of data modelling, algorithms, and data transformation techniques to work with data platforms. Good knowledge of common databases (RDBMS and NoSQL), graph databases (such as GraphDB …
Familiarity with monitoring and logging tools (e.g., Prometheus, Loki, Grafana) in application and data-intensive environments. Proficiency in Configuration Management tools (Chef, Puppet, Ansible) and data orchestration tools (e.g., Airflow, Prefect). Strong background in containerization using Docker and orchestration with Kubernetes. In-depth knowledge of Linux, SQL, cloud security, scripting for automation (Python, Bash), load balancing technologies, and …
data analyst. Knowledge of a variety of financial instruments, in particular exposure to derivative instruments. Experience working with SQL. Experience with cloud storage solutions. Experience with workflow management tools (Airflow/Argo). Prior experience writing documentation for senior stakeholders; the ability to accurately abstract and summarize technical information is critical. Python programming skills: PySpark, Pandas, Jupyter Notebooks (3+ years …
implementations or large-scale martech/data projects. Comfortable leading client-facing technical engagements, from roadmap to delivery. Strong working knowledge of SQL, Python, and modern ETL tooling (e.g. Airflow, dbt, Spark). Familiar with cloud platforms (AWS, GCP, or Azure) and modern data stack components. Experience in agile delivery environments, comfortable managing sprints and backlogs. A curious mindset with …
building SQL-based transformation flows in dbt or similar tools. Good understanding of cloud platforms such as GCP, AWS or Azure. Experience configuring orchestration of SQL and Python via Airflow or similar tools. Experience working with data pipelines, defining problems, crafting and launching solutions, and practicing continuous improvement. Experience with process improvement frameworks and/or project management frameworks …
Science, Computer Science, or a related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Allica Bank Limited
production-grade tools, have a test-driven approach, consistent and well documented code). You have strong SQL skills. Deployed applications on cloud services. Experience in using orchestration tools (Airflow, Dagster or Prefect). Experience with container technology (Docker, Kubernetes). Experience with CI/CD pipelines (preferably Azure DevOps). Experience with application deployment to Cloud services (GCP …
managing databases (we use Elasticsearch/MongoDB/PostgreSQL). Experience with SQL. Experience with data versioning tools. Experience developing and maintaining data infrastructure for ETL pipelines, such as Apache Airflow. EPIC JOB + EPIC BENEFITS = EPIC LIFE. We pay 100% for benefits except for PMI (for dependents). Our current benefits package includes pension, private medical insurance, health …