data-driven decision-making is embedded across the business. What We're Looking For: Strong proficiency in SQL (essential). Experience with data migration, ETL, and pipeline development (e.g., Airflow, Fabric, Azure). Familiarity with APIs (REST API, JSON), data mapping, validation, and quality control. Ability to design, build, and maintain dashboards (e.g., Power BI, Looker Studio, Excel or More ❯
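Requirements like "data mapping, validation, and quality control" for REST/JSON APIs typically reduce to schema checks on payloads before load. A minimal sketch in plain Python; the field names and rules are hypothetical:

```python
import json

# Hypothetical required schema for an incoming REST/JSON record.
REQUIRED_FIELDS = {"id": int, "email": str, "amount": float}

def validate_record(raw: str) -> dict:
    """Parse a JSON payload and check required fields and types."""
    record = json.loads(raw)
    errors = []
    for field, expected in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    if errors:
        # Reject the record rather than loading dirty data downstream.
        raise ValueError("; ".join(errors))
    return record

record = validate_record('{"id": 1, "email": "a@example.com", "amount": 9.99}')
```

In a real pipeline the same check would run inside the ETL tool (Airflow task, Fabric dataflow) before the load step.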
Mathematics, or similar field of study or equivalent work experience. Proven experience with object-oriented programming languages, preferably Python. Experience with infrastructure automation or orchestration frameworks, e.g. Ansible, Airflow, Terraform, Chef, Salt. A bachelor's or master's degree in Computer Science, Engineering, Mathematics, a similar field of study, or equivalent work experience. Desirable Experience with Telemetry: Splunk More ❯
using the below technologies: Python as our main programming language Databricks as our data lake platform Kubernetes for data services and task orchestration Terraform for infrastructure Streamlit for data applications Airflow purely for job scheduling and tracking CircleCI for continuous deployment Parquet and Delta file formats on S3 for data lake storage Spark for data processing dbt for data More ❯
and Python programming languages. Strong understanding of graph databases (e.g., RDF, Neo4j, GraphDB). Experience with data modeling and schema design. Knowledge of data pipeline tools and frameworks (e.g., Apache Airflow, Luigi). Excellent problem-solving and analytical skills. Ability to work independently and as part of a team. Clinical knowledge More ❯
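Graph databases like those named above model data as subject-predicate-object triples; the core pattern-matching idea can be sketched in plain Python (the clinical-flavoured identifiers are made up for illustration):

```python
# Minimal in-memory triple store illustrating the RDF (subject, predicate, object)
# model; identifiers and data are hypothetical.
triples = {
    ("patient:1", "hasDiagnosis", "dx:E11"),
    ("patient:1", "takesDrug", "drug:metformin"),
    ("dx:E11", "label", "Type 2 diabetes"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard,
    like an unbound variable in a SPARQL triple pattern."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]
```

Engines such as Neo4j or GraphDB add indexing, query languages, and persistence on top of this same basic pattern-matching operation.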
in AWS. Strong expertise with AWS services, including Glue, Redshift, Data Catalog, and large-scale data storage solutions such as data lakes. Proficiency in ETL/ELT tools (e.g. Apache Spark, Airflow, dbt). Skilled in data processing languages such as Python, Java, and SQL. Strong knowledge of data warehousing, data lakes, and data lakehouse architectures. Excellent analytical More ❯
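The ELT pattern referenced here (land raw data first, then transform inside the warehouse with SQL) can be sketched end to end with the stdlib's sqlite3 standing in for a warehouse like Redshift; the table and column names are hypothetical:

```python
import sqlite3

# In-memory database standing in for the warehouse; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")

# Load: land the raw records untouched (the "L" before the "T" in ELT).
rows = [(1, 30.0, "paid"), (2, 12.5, "refunded"), (3, 7.5, "paid")]
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)

# Transform: derive a cleaned, aggregated table inside the warehouse with SQL,
# which is the step tools like dbt manage at scale.
conn.execute("""
    CREATE TABLE paid_revenue AS
    SELECT COUNT(*) AS n_orders, SUM(amount) AS revenue
    FROM raw_orders
    WHERE status = 'paid'
""")

n_orders, revenue = conn.execute(
    "SELECT n_orders, revenue FROM paid_revenue").fetchone()
```

An orchestrator such as Airflow would run the load and transform steps as separate scheduled tasks.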
data quality, or other areas directly relevant to data engineering responsibilities and tasks Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake) Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Xact Placements Limited
of building performant, maintainable, and testable systems Solid background in microservices architecture Proficiency with Postgres & MongoDB (relational + non-relational) Experience with event-driven architectures and asynchronous workflows (Kafka, Airflow, etc.) Solid coding practices (clean, testable, automated) The mindset of a builder: thrives in fast-paced startup environments, takes ownership, solves complex challenges Bonus points if you’ve worked More ❯
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools, e.g. Airflow. Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. All profiles will be reviewed against the required skills and experience. Due More ❯
A/B testing Experiment design and hypothesis testing MLOps & Engineering Scalable ML systems (batch and real-time) ML pipelines, CI/CD, monitoring, deployment Familiarity with tools like MLflow, Kubeflow, Airflow, Docker, Kubernetes Strategic skills Align ML initiatives with business goals Prioritize projects based on ROI, feasibility, and risk Understand market trends and competitive ML strategies Communicate ML impact to More ❯
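The "A/B testing, experiment design and hypothesis testing" items above reduce to comparisons like a two-proportion z-test, which needs nothing beyond the stdlib; the conversion numbers below are invented:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test: returns the z statistic and p-value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Standard normal CDF via erf; p-value is the two-tailed area.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant A: 120/1000 conversions; variant B: 150/1000 (hypothetical data).
z, p = two_proportion_z(120, 1000, 150, 1000)
```

In practice an experimentation platform or a library like scipy would wrap this, but the underlying statistic is the same.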
City Of London, England, United Kingdom Hybrid / WFH Options
iO Associates
2 x Contract Data Engineers - Snowflake/AWS/Python/Airflow/Iceberg Location: London (Hybrid - 3 days per week onsite) Duration: 6 months Day Rate: £550 - £600 (Inside IR35) A highly reputable consultancy is seeking 2 x Contract Data Engineers to join their data team on a 6-month engagement. You will play a key role in building … and reporting capabilities. Key Skills & Experience: Strong experience with Snowflake data warehousing Solid AWS cloud engineering experience Proficient in Python for data engineering workflows Skilled in building and maintaining Airflow DAGs Familiarity with Apache Iceberg for table format and data lake optimisation If this could be of interest, please get in touch with Alex Lang at iO Associates More ❯
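The "Airflow DAGs" above are directed acyclic graphs of tasks; the dependency-ordering idea behind them can be sketched with the stdlib's graphlib (task names are hypothetical, and real Airflow would use its own DAG and operator classes rather than this):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, mirroring how an
# Airflow DAG wires operators together; task names are made up.
pipeline = {
    "extract_s3": set(),
    "load_snowflake": {"extract_s3"},
    "transform_dbt": {"load_snowflake"},
    "refresh_dashboard": {"transform_dbt"},
}

# A scheduler executes tasks in a dependency-respecting order.
order = list(TopologicalSorter(pipeline).static_order())
```

Airflow adds scheduling, retries, and monitoring on top; the topological ordering shown here is the core contract a DAG expresses.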
City of London, London, United Kingdom Hybrid / WFH Options
twentyAI
critical client outcomes. Analytics Engineering Leaders: Architect and optimise data pipelines, ELT workflows, and cloud warehouse platforms (Snowflake, BigQuery, Redshift). Lead teams working with dbt, SQL, Python, and Airflow to drive data transformation at scale. Ensure data governance, quality, and modelling standards are upheld across solutions. Work closely with data scientists and stakeholders to turn clean data into … engineering. Have led engineering teams and mentored technical talent in high-performance environments. Are proficient in either modern software stacks (Python, React, cloud-native) or analytics tooling (SQL, dbt, Airflow, cloud warehouses). Bring a strategic mindset, with the ability to connect technical execution to business value. Are committed to innovation, collaboration, and data-driven transformation. Meet eligibility requirements More ❯
and consumption layers Design and implement secure, performant Snowflake environments including RBAC, data masking, and access policies/entitlements Build and optimise ELT pipelines (using tools such as dbt, Airflow, Fivetran, or native Snowflake tasks) to support batch and real-time use cases Collaborate with Kubrick and client stakeholders to inform delivery planning, data strategy, and architecture decisions Promote … language (Python preferred) for automation. Experience with cloud platforms (AWS, Azure, or GCP), including security, IAM, and storage services Experience deploying and maintaining production pipelines using tools such as Airflow or Dagster Understanding of CI/CD principles, version control (Git) and software development lifecycle. Strong communication and stakeholder-management skills with the ability to influence technical and business More ❯
including technical design, coding standards, code review, source control, build, test, deploy, and operations Awesome If You: Are experienced in Rust/Java/Kotlin Have experience with AWS, Apache Kafka, Kafka Streams, Apache Beam/Flink/Spark - especially deployment, monitoring & debugging Have experience with productisation of Machine Learning research projects Are familiar with Airflow or More ❯
in delta one, store of value, and/or FICC options trading Experience with Linux-based, concurrent, high-throughput, low-latency software systems Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster) Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases Have a Bachelor or advanced degree in Computer Science, Mathematics More ❯
fine-tuning, or deployment. Technical Skills: Proficiency in Python and frameworks such as PyTorch, TensorFlow, or JAX. Strong familiarity with distributed computing and data engineering tools (e.g., SLURM, Apache Spark, Airflow). Hands-on experience with LLM training, fine-tuning, and deployment (e.g., Hugging Face, LLaMA-Factory, NVIDIA NeMo). Preferred Qualifications Advanced degree (MS/PhD) in More ❯