designing and building robust, scalable, distributed data systems and pipelines, using open source and public cloud technologies. Strong experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL …). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments, e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices, e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, git/version control, containers, etc. …
to some SAS development on legacy projects when required. Python or PySpark and SQL will be your bread and butter, with any experience of Airflow being a great bonus. The core skillset: Python/PySpark for building scalable & robust data pipelines; experience with Airflow or other orchestration tools …
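Several of these listings ask for Python/PySpark for "scalable & robust data pipelines". As a minimal, hypothetical sketch of what that means in practice, here is the extract-transform-load shape in plain Python; real pipelines at these roles would typically swap the plain lists for PySpark DataFrames and schedule the stages with an Airflow DAG. All function and column names are illustrative.

```python
# A minimal, illustrative extract-transform-load pipeline in plain Python.
# The three-stage structure is the point; production versions would use
# PySpark DataFrames and an orchestrator such as Airflow.

import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Keep completed orders and cast amounts to float."""
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"])}
        for r in rows
        if r["status"] == "completed"
    ]

def load(rows: list[dict]) -> float:
    """Stand-in for a warehouse write: return the total amount loaded."""
    return sum(r["amount"] for r in rows)

raw = "order_id,status,amount\n1,completed,10.50\n2,cancelled,4.00\n3,completed,2.25\n"
total = load(transform(extract(raw)))
print(total)  # 12.75
```

Keeping each stage a pure function of its input is what makes such a pipeline straightforward to test and to re-run on failure, which is what the "robust" in these job descriptions usually refers to.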
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. Understanding of data modelling principles and best …
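The "complex queries for data transformation" these listings mention typically look like conditional aggregations over event data. A hedged illustration, run here against an in-memory SQLite database purely so it is self-contained; on the Athena/Trino, Redshift or Snowflake engines named above the SQL dialect details would differ. The table and column names are made up.

```python
# Illustrative transformation query: net spend per user, where refunds
# subtract from purchases. SQLite stands in for a warehouse engine.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, event_type TEXT, amount REAL);
    INSERT INTO events VALUES
        (1, 'purchase', 20.0),
        (1, 'purchase', 5.0),
        (1, 'refund',   5.0),
        (2, 'purchase', 12.0);
""")

rows = conn.execute("""
    SELECT user_id,
           SUM(CASE WHEN event_type = 'purchase' THEN amount
                    WHEN event_type = 'refund'   THEN -amount
                    ELSE 0 END) AS net_spend
    FROM events
    GROUP BY user_id
    ORDER BY user_id
""").fetchall()

print(rows)  # [(1, 20.0), (2, 12.0)]
```

The `SUM(CASE … END)` pattern is the standard way to fold several event types into one aggregate column without multiple scans of the table.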
Oxfordshire, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar Orchestration Tools): Experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar Reporting Tools): Knowledge of Qlik Sense Cloud or similar reporting tools such …
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday …
methodologies such as CI/CD, Application Resiliency, and Security Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational DBs · Experience in the Financial Services industry is a bonus.
projects and new innovations to support company growth and profitability. Our Tech Stack: Python, Scala, Kotlin, Spark, Google PubSub, Elasticsearch, BigQuery, PostgreSQL, Kubernetes, Docker, Airflow. Key Responsibilities: Designing and implementing scalable data pipelines using tools such as Apache Spark, Google PubSub etc. Optimising data storage and retrieval … Data Infrastructure projects, as well as designing and building data-intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark. Expert knowledge in one or more of the following languages - Python, Scala, Java, Kotlin. Deep knowledge of data modelling, data access, and data …
Maven, Jenkins, GitHub, etc. Experience with Amazon Web Services a strong plus - CloudFormation, EMR, S3, EC2, Athena etc. Experience with scheduling services such as Airflow, Oozie. Experience with Data ETL and data modeling. Experience with building large-scale systems with extensive knowledge in data warehousing solutions. Developing prototypes and …
has multiple years of experience using Snowflake as a data tool and can hit the ground running. Experience: Snowflake; financial services; cloud (ideally Azure); Airflow would be an advantage.
London, England, United Kingdom Hybrid / WFH Options
Harnham
Machine Learning Engineer, up to £65,000, London. NEW Machine Learning Engineer opportunity available with a leading organisation! The Company: Are you an expert in Machine Learning? We are on the lookout for a Machine Learning Engineer to join an …
quality and identify areas for improvement to implement practical solutions. Key Requirements: Background in Python development from an engineering or development environment; experience with Airflow, Cloud (AWS) and Pandas …
have a valid visa as we are not able to sponsor. Technical Stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow. Skills: … years of experience in Python scripting; experience developing applications in Python; exposure to Python-oriented algorithm libraries such as NumPy …
effective data visualisation solutions. Key Skills: Strong Python and SQL coding experience essential. Experience with a scripting language. Proficiency in data orchestration tools (Dagster/Airflow). Proficiency in visualisation tools such as Tableau and Power BI. Proficiency with Git and CI/CD (ideally using Azure DevOps). Salary …
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. Understanding of data modelling principles and best …
for an exciting opportunity with one of our multinational retail clients. Skills & Experience: - Experience working within a global organisation - Google Cloud Platform (GCP) - Background using Airflow/Cloud Composer with Python - Cloud-based data platforms, Snowflake or BigQuery - Advanced SQL - Data transformation tools, DBT - CI/CD - TDD If you …
of frameworks like DropWizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/Fargate. Collaborative Approach: Proven ability to work effectively with …
pipelines Know your way around Unix-based operating systems Experience working with any major cloud provider (AWS, GCP, Azure) Fluency in English Experience using Apache Airflow Experience using Docker Experience using Apache Spark Benefits: Salary £40-50K per annum dependent on skills and experience 25 days …
analysis, and software design Travel up to 10% What Will Help You On The Job Familiarity with running software services at scale AWS Infrastructure, Airflow, Kafka and data streaming using Spark/Scala Understanding of networking fundamentals (OSI layers 2-7) Technical and software engineering background in the areas …
building APIs & SQLAlchemy for database interactions. Strong experience in cloud-based development (AWS). Proficiency with both Docker & Kubernetes for containerisation & orchestration. Understanding of Airflow & DAGs. Experience in building applications using Kafka. Solid OOP principles & design patterns. Permanent/Full-Time Employment. Hybrid working environment (2/3 days …
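The "DAGs" requested in listings like this one are directed acyclic graphs of tasks: an orchestrator such as Airflow runs each task only after all of its upstream dependencies have succeeded. A minimal sketch of that ordering idea, using Python's standard-library `graphlib` rather than Airflow itself; the task names are hypothetical.

```python
# Sketch of the scheduling idea behind Airflow DAGs: topologically sort
# tasks so every task runs after its upstream dependencies.

from graphlib import TopologicalSorter

# task: {its upstream dependencies}
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load", "validate"},
}

# static_order() yields a valid execution order (dependencies first).
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In real Airflow the same dependencies would be declared with operators and the `>>` syntax, and the scheduler would also handle retries and parallelism; the acyclic-ordering guarantee shown here is the core of it.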
Manchester Area, United Kingdom Hybrid / WFH Options
Forsyth Barnes
data warehousing. Multiple years of experience with GCP, especially with core processing and orchestration products like BigQuery, DataFlow, DataFusion, DataStream, Cloud Functions, DataProc, and Airflow/Composer. Strong problem-solving skills and a meticulous approach to code reviews. Proven leadership qualities with the ability to uphold high standards within …
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional …
MLOps) is a plus. Tools currently being used: - Python3, Numpy, Scipy, Xgboost - CI/CD: GitHub Actions, Jenkins, Docker - MLOps: DVC, MLflow - BI: Terraform, Airflow, BigQuery - LLMs: GPT, Claude And this is what you’ll get in return: Salary up to £120,000 depending on experience Share Program Flexible …
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive and a lot more. You’ll need to come from a strong academic background with some commercial experience in a data-heavy …
London, England, United Kingdom Hybrid / WFH Options
Jobleads-UK
Continuous Delivery Continuous integration pipelines Strong Python AWS or Azure with large-scale streaming data (Pulsar, Kafka, Kinesis, etc) ETL management; structured or custom (Airflow, Luigi, etc) Bonus Robust experience managing and developing an engineering team Delta lake or Iceberg Trino or Presto GraphQL Good salary, bonus, stock options …
of SQL, with extensive experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch). Demonstrated experience delivering data products in a modern BI technology (ideally Looker) or open source data frameworks. Experience using programming languages (e.g. …