start, 12 month contract. Essential: Been to school in the UK; Data Ingestion of APIs; GCP based (Google Cloud Platform); Snowflake; BigQuery; DBT; Semantic layer (Cube/Looker). Desirable: Airflow (Apache Airflow) …
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd
(Hybrid) (3 days onsite & 2 days remote) Role Type: 6 Months Contract with possibility of extensions Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in Python. Exposed to Python …
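For illustration, a minimal sketch of the kind of pipeline this posting describes — an Airflow DAG landing a CSV from S3 into Postgres. The bucket, key, table, and connection string are hypothetical placeholders, not the client's actual setup:

```python
import csv
import io

import boto3
import pendulum
import psycopg2
from airflow import DAG
from airflow.operators.python import PythonOperator


def s3_to_postgres():
    # Pull the raw CSV from S3 (hypothetical bucket and key).
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket="trades-raw", Key="eod/trades.csv")["Body"].read()
    rows = list(csv.reader(io.StringIO(body.decode("utf-8"))))

    # Load the rows into Postgres (hypothetical DSN and table).
    with psycopg2.connect("postgresql://etl@db:5432/trading") as conn:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO trades (trade_id, symbol, qty) VALUES (%s, %s, %s)",
                rows[1:],  # skip the header row
            )


with DAG(
    dag_id="s3_trades_to_postgres",
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="load_trades", python_callable=s3_to_postgres)
```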
frameworks, and clear documentation within your pipelines Experience in the following areas is not essential but would be beneficial: Data Orchestration Tools: Familiarity with modern workflow management tools like Apache Airflow, Prefect, or Dagster Modern Data Transformation: Experience with dbt (Data Build Tool) for managing the transformation layer of the data warehouse BI Tool Familiarity: An understanding of …
research and technology teams. Exposure to low-latency or real-time systems. Experience with cloud infrastructure (AWS, GCP, or Azure). Familiarity with data engineering tools such as Kafka, Airflow, Spark, or Dask. Knowledge of equities, futures, or FX markets. Company: Rapidly growing hedge fund with offices globally, including London. Salary & Benefits: The salary range/rates of pay is …
architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we're looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused …
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we're looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused …
Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting. Direct experience with Google Cloud Platform, BigQuery, and associated tooling. Experience with workflow tools like Airflow or Kubeflow. Familiarity with dbt (Data Build Tool). Please send your CV for more information on these roles. Reasonable Adjustments: Respect and equality are core values to us. …
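As a hedged illustration of the GCP/BigQuery tooling this role mentions — pulling feature data from BigQuery for a production ML pipeline. The project, dataset, table, and columns are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client(project="analytics-prod")  # hypothetical project

# Parameterised query: fetch one day's feature snapshot for model scoring.
query = """
    SELECT user_id, feature_a, feature_b
    FROM `analytics-prod.features.daily_snapshot`
    WHERE snapshot_date = @run_date
"""
job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("run_date", "DATE", "2025-01-01")
        ]
    ),
)
df = job.result().to_dataframe()  # requires the pandas/db-dtypes extras
print(df.head())
```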
CD. Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
practices (testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, Datahub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS …
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
for individuals with: Experience: Proven background as a Machine Learning Engineer. Technical Skills: Strong in SQL and Python (Pandas, Scikit-learn, Jupyter, Matplotlib). Data transformation & manipulation: experience with Airflow, DBT and Kubeflow Cloud: Experience with GCP and Vertex AI (developing ML services). Expertise: Solid understanding of computer science fundamentals and time-series forecasting. Machine Learning: Strong grasp …
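A minimal sketch of the time-series forecasting skills listed above, using pandas and scikit-learn with lag features; the file and column names are hypothetical, not the employer's stack:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("sales.csv", parse_dates=["date"]).set_index("date")

# Build simple lag features so a standard regressor can forecast one step ahead.
for lag in (1, 7, 28):
    df[f"lag_{lag}"] = df["sales"].shift(lag)
df = df.dropna()

# Chronological split: never shuffle a time series.
cutoff = int(len(df) * 0.8)
train, test = df.iloc[:cutoff], df.iloc[cutoff:]
feature_cols = [c for c in df.columns if c.startswith("lag_")]

model = GradientBoostingRegressor().fit(train[feature_cols], train["sales"])
preds = model.predict(test[feature_cols])
print("MAE:", mean_absolute_error(test["sales"], preds))
```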
large scale bioinformatics datasets Experience using Nextflow pipelines Knowledge of NLP techniques and experience of processing unstructured data, using vector stores, and approximate retrieval Familiarity with orchestration tooling (e.g. Airflow or Google Workflows) Experience with AI/ML powered applications Experience with Docker or containerized applications
computer vision. Big Data Tools: Experience with big data platforms like Spark (PySpark) for handling large-scale datasets. MLOps: Familiarity with MLOps tools and concepts (e.g., Docker, Kubernetes, MLflow, Airflow) for model deployment and lifecycle management. Financial Domain Knowledge: Direct experience with at least two of the following domains: Credit Risk Modeling, Fraud Detection, Anti-Money Laundering (AML), Know Your Customer (KYC) …
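An illustrative PySpark aggregation of the sort this posting implies for large-scale transaction data — the paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-aggregates").getOrCreate()

txns = spark.read.parquet("s3a://datalake/transactions/")  # hypothetical path

# Daily spend per customer: a typical feature for credit-risk or fraud models.
daily = (
    txns.groupBy("customer_id", F.to_date("ts").alias("day"))
    .agg(F.sum("amount").alias("daily_spend"), F.count("*").alias("txn_count"))
)
daily.write.mode("overwrite").parquet("s3a://datalake/features/daily_spend/")
```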
that this Data Engineer position will require 2 days per week in Leeds city centre. The key skills required for this Data Engineer position are: Snowflake Python AWS DBT Airflow If you do have the relevant experience for this Data Engineer position, please do apply.
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
skills for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus. A solid understanding of data best practices, version control (Git), and CI/CD. Company: Rapidly growing, cutting-edge AI organisation. Remote working - Offices in …
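For illustration, a small sketch using the snowflake-connector-python package to run an aggregate from Python; the account, credentials, and table are hypothetical placeholders:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="...",         # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="PROD",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT order_date, SUM(amount) AS revenue
        FROM fct_orders
        GROUP BY order_date
        ORDER BY order_date
        """
    )
    for order_date, revenue in cur.fetchmany(5):
        print(order_date, revenue)
finally:
    conn.close()
```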
checks Understanding of agile software delivery and collaborative development Nice to Have: Experience with bioinformatics or large-scale biological data (e.g., genomics, proteomics) Familiarity with orchestration tools such as Airflow or Google Workflows Experience with containerisation (Docker) Exposure to NLP, unstructured data processing, or vector databases Knowledge of ML and AI-powered data products What You'll Bring: Strong …
as pytest. Preferred Qualifications Experience working with biological or scientific datasets (e.g. genomics, proteomics, or pharmaceutical data). Knowledge of bioinformatics or large-scale research data. Familiarity with Nextflow, Airflow, or Google Workflows. Understanding of NLP techniques and processing unstructured data. Experience with AI/ML-powered applications and containerised development (Docker). Contract Details Day Rate: £750 (Inside IR35) …
City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
on development/engineering background Machine Learning or Data Background Technical Experience: PySpark, Python, SQL, Jupyter Cloud: AWS, Azure (Cloud Environment) - Moving towards Azure Nice to Have: Astro/Airflow, Notebook Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people …
Job Title: Airflow/AWS Data Engineer Location: Manchester Area (3 days per week in the office) Rate: Up to £400 per day inside IR35 Start Date: 03/11/2025 Contract Length: Until 31st December 2025 Job Type: Contract Company Introduction: An exciting opportunity has become available with one of our sector-leading financial services clients. They … to join their growing data engineering function. This role will play a key part in designing, deploying, and maintaining modern cloud infrastructure and data pipelines, with a focus on Airflow, AWS, and data platform automation. Key Responsibilities: Deploy and manage cloud infrastructure across Astronomer Airflow and AccelData environments. Facilitate integration between vendor products and core systems, including data … Establish and enforce best practices for cloud security, scalability, and performance. Configure and maintain vendor product deployments, ensuring reliability and optimized performance. Ensure high availability and fault tolerance for Airflow clusters. Implement and manage monitoring, alerting, and logging solutions for Airflow and related components. Perform upgrades, patches, and version management for platform components. Oversee capacity planning and resource …
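A hedged sketch of the monitoring and alerting responsibility this role describes: an Airflow DAG with task retries and a failure callback. The webhook URL and task contents are hypothetical:

```python
from datetime import timedelta

import pendulum
import requests
from airflow import DAG
from airflow.operators.bash import BashOperator


def notify_failure(context):
    # Airflow invokes this callback whenever a task instance fails.
    ti = context["task_instance"]
    requests.post(
        "https://hooks.slack.com/services/XXX",  # hypothetical webhook
        json={"text": f"Task {ti.task_id} failed in DAG {ti.dag_id}"},
        timeout=10,
    )


with DAG(
    dag_id="platform_health_example",
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
    schedule="@hourly",
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_failure,
    },
) as dag:
    BashOperator(task_id="ingest", bash_command="echo ingest step")
```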
teams (Data Scientists, Quality, Manufacturing, IT) to define data requirements and deliver reliable data solutions. Develop and maintain ETL/ELT workflows using modern orchestration and integration tools (e.g., Airflow, Fivetran, or similar). Implement data governance, lineage, and documentation best practices. Support business intelligence tools (e.g., Tableau, Power BI) by ensuring data accessibility and consistency. Continuously evaluate emerging … hands-on experience with: Snowflake (architecture, performance tuning, cost optimisation) DBT (model development, testing, documentation, deployment) SQL (advanced query optimisation and debugging) Experience with data integration and orchestration tools (Airflow, Dagster, Prefect, or similar). Familiarity with cloud data platforms (AWS, Azure, or GCP). Contract: Initial 6-month contract Hybrid - 1 day per week on-site in London …
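One common pattern for the orchestrated ELT workflow described above is invoking dbt from Airflow with the stock BashOperator; a brief sketch, with hypothetical project and profile paths:

```python
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    dbt_run >> dbt_test  # build the models, then run schema/data tests
```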
troubleshoot data workflows and performance issues Essential Skills & Experience: Proficiency in SQL, Python, or Scala Experience with cloud platforms such as AWS, Azure, or GCP Familiarity with tools like Apache Spark, Kafka, and Airflow Strong understanding of data modelling and architecture Knowledge of CI/CD pipelines and version control systems Additional Information: This role requires active SC clearance …
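As a sketch of how the listed tools can fit together — Spark Structured Streaming consuming from Kafka and landing to object storage; the broker, topic, and paths are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-events").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Kafka delivers key/value as bytes; cast the value to string for parsing.
parsed = events.select(F.col("value").cast("string").alias("payload"))

query = (
    parsed.writeStream.format("parquet")
    .option("path", "s3a://datalake/events/")
    .option("checkpointLocation", "s3a://datalake/_checkpoints/events/")
    .start()
)
query.awaitTermination()
```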