work. 6-month contract, likely to extend. Hybrid: 2 days a week onsite in London (likely to go remote). Active SC clearance required. Tech stack: Python, SQL, Jinja, AWS or Azure, Airflow, GitHub Actions, Terraform. You'll be designing scalable pipelines, building CI/CD workflows, and collaborating with cross-functional teams. Apply now with your CV and availability.
of data warehousing concepts and SQL (Snowflake experience a plus). Experience with or willingness to learn CI/CD tooling (e.g. GitHub Actions), containerization (Docker), and workflow orchestration tools (Airflow/AstroCloud). Strong debugging and troubleshooting skills for data pipelines and ML systems. Experience writing tests (unit, integration) and implementing monitoring/alerting for production systems.
SRE, and product teams. We'd Love to See Experience with semantic technologies: ontologies, RDF, or graph databases (e.g., Neo4j, RDF4J). Familiarity with ETL or EIS platforms like Apache Camel or Airflow. Knowledge of financial market data, especially around latency, availability, and correctness. Experience building or contributing to observability platforms or knowledge graph tooling.
pipelines. 1-2 years of hands-on experience with Azure services such as Data Factory, Databricks, Synapse (DWH), Azure Functions, and other data analytics tools, including streaming. Experience with Airflow and Kubernetes. Programming skills in Python (PySpark) and scripting languages like Bash. Knowledge of Git, CI/CD operations, and Docker. Basic Power BI knowledge is a plus. Experience deploying …
with an ability to work on multiple projects simultaneously. Strong interpersonal and communication skills. Quick self-learning capabilities and creativity in problem-solving. Preferred: familiarity with Python; familiarity with Airflow, ETL tools, Snowflake, and MSSQL; hands-on with VCS (Git).
Bloomberg is an equal opportunity employer.
development of a new global data platform (PaaS). Ensure scalable storage solutions (data lakes, data warehouses) to handle structured and unstructured data. Implement ETL/ELT pipelines using Dagster, Airflow, or similar tools. Optimize performance and scalability for large data volumes. Govern data security, compliance, and access controls. Development & DevOps: Strong programming and scripting skills in Python. Knowledge of …
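As an illustrative sketch of the kind of extract/transform/load step such orchestrators schedule (all table, column, and path names here are hypothetical, and plain sqlite3 stands in for the warehouse):

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV rows into dictionaries.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    # Transform: normalise types and drop records missing an amount.
    return [(r["id"], float(r["amount"])) for r in rows if r.get("amount")]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    # Load: idempotent upsert into the target table, then report row count.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id TEXT PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]

# One pipeline run over a tiny in-memory sample; row "b" is dropped in transform.
raw = "id,amount\na,10.5\nb,\nc,3.0\n"
conn = sqlite3.connect(":memory:")
count = load(transform(extract(raw)), conn)  # → 2
```

In Dagster or Airflow each of these functions would typically become its own task or op, so failures and retries are scoped to one stage of the pipeline.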
Proficiency in data modelling, data management processes, and data profiling. Experience in developing APIs and working with WebSockets. Knowledge of React, Django, FastAPI, or equivalent technologies. Previous experience with Airflow, Linux shell commands, and setting up websites on IIS. What we would like from you: Bachelor’s or Master’s degree in a relevant field. Creative and solutions-oriented, with …
/CD processes. GitHub experience including Actions and CI/CD. Data Modeling experience, including extensive experience designing dimensional models based on business use cases and reporting needs. Airflow experience (task scheduler and orchestrator). Python experience (programming language). Soft Skills: Interpersonal skills to engage and communicate effectively with customers and audiences of different backgrounds within the organization. Please …
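For context, a dimensional model of the kind described pairs a fact table with descriptive dimension tables. A minimal star-schema sketch (table and column names are hypothetical, using sqlite3 for portability):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: one row per customer, holding descriptive attributes.
conn.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT)")
# Fact table: one row per order, foreign-keyed to the dimension.
conn.execute("""CREATE TABLE fact_orders (
    order_key INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL)""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex')")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)",
                 [(10, 1, 99.0), (11, 1, 25.0), (12, 2, 40.0)])
# Typical reporting query: aggregate the fact, slice by a dimension attribute.
rows = conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_orders f JOIN dim_customer d USING (customer_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()  # → [('Acme', 124.0), ('Globex', 40.0)]
```

The design choice the bullet alludes to is that reporting needs drive the grain of the fact table and which attributes land in dimensions.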
role. A strong understanding of data quality concepts, methodologies, and best practices. Proficiency in SQL and data querying for data validation and testing purposes. Hands-on experience with Snowflake, Airflow, or Matillion would be ideal. Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases, and data modelling concepts. Excellent analytical and …
similarly complex operational businesses. Experience using LLMs or AI tools to structure and extract meaning from unstructured data. Experience automating workflows and deploying model pipelines (e.g. MLflow, GCP Vertex, Airflow, dbt, or similar). Exposure to business planning, pricing, or commercial decision-making. Familiarity with geospatial data. Experience in fast-scaling startups or operational teams. We're flexible on experience …
sensitive and global datasets. Hands-on experience integrating advanced AI/ML capabilities into operational and analytical data platforms. Extensive knowledge of modern data orchestration and workflow technologies (e.g., Airflow, Kubeflow), and infrastructure automation frameworks (Terraform, CloudFormation). Demonstrated leadership in managing technical product roadmaps, agile delivery practices, and stakeholder management in complex environments. Boston Consulting Group is an equal opportunity employer.
.NET framework backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for: end-to-end data migration …
software development and code quality. Vendor Collaboration: Work closely with third-party vendors to integrate their solutions, ensuring they meet our high standards for production environments. Workflow Automation: Utilize Airflow to automate and optimize workflows, ensuring efficient and reliable operations. Required: 5-7 years of experience in software development with a focus on production-grade code. Proficiency in Java …
Are you a problem-solver with a passion for data, performance, and smart engineering? This is your opportunity to join a fast-paced team working at the forefront of data platform innovation in the financial technology space. You'll tackle More ❯
Job Title: Data Modeller Salary: £85,000 - £95,000 + Benefits and Bonus Location: London (3 days a week onsite) The Role: As a Data Modeller, you will be responsible for designing and implementing data models to support complex data …
ML Pipelines: Engage with data and ML scientists to plan the architecture for end-to-end machine learning workflows. Implement scalable training and deployment pipelines using tools such as Apache Airflow and Kubernetes. Perform comprehensive testing to ensure reliability and accuracy of deployed models. Develop instrumentation and automated alerts to manage system health and detect issues in real time.
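By way of illustration, the ordering an orchestrator such as Airflow enforces over a workflow like this can be sketched as a dependency graph; the task names below are hypothetical, and the standard-library `graphlib` stands in for the scheduler:

```python
from graphlib import TopologicalSorter

# Hypothetical ML workflow: each task maps to the set of tasks it
# depends on, mirroring how an Airflow DAG declares task ordering.
dag = {
    "ingest": set(),
    "validate": {"ingest"},
    "train": {"validate"},
    "evaluate": {"train"},
    "deploy": {"evaluate"},
    "alert": {"deploy"},
}

# Resolve a valid execution order: every task runs after its dependencies.
order = list(TopologicalSorter(dag).static_order())
# → ['ingest', 'validate', 'train', 'evaluate', 'deploy', 'alert']
```

In a real deployment each node would be an Airflow task (e.g. a KubernetesPodOperator for the training step), and the scheduler, rather than this linear walk, would run independent branches in parallel.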
features. Rapid Prototyping: Create interactive AI demos and proofs-of-concept with Streamlit, Gradio, or Next.js for stakeholder feedback. MLOps & Deployment: Implement CI/CD pipelines (e.g., GitLab CI/CD, Apache Airflow), experiment tracking (MLflow), and model monitoring for reliable production workflows. Cross-Functional Collaboration: Participate in code reviews, architectural discussions, and sprint planning to deliver features end-to-end.