Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You've had experience deploying data science workflows with tools like Airflow or Databricks. Proven experience of leading a data science team. Proven experience of working with customer data. You have a strong commitment to accuracy more »
South East London, England, United Kingdom Hybrid / WFH Options
NatPower Marine
tools: Hadoop, Spark, Kafka, etc. · Experience with relational SQL and NoSQL databases, particularly Postgres. · Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. · Experience with AWS cloud services: EC2, EMR, RDS, Redshift. · Experience with multiple data architecture paradigms (relational, non-structured, streaming) · Knowledge of various data more »
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of more »
have Terraform experience. SQL & NoSQL experience. Have built out Data Warehouses & Data Pipelines. Strong Databricks & Snowflake experience. Docker, ECS, Kubernetes & Orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to £450p/day (Outside IR35) and will more »
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com more »
pipelines, such as Jenkins, Azure Pipelines. Good understanding of a cloud-based microservice architecture. Experience with specialized Python data management libraries, including SQLAlchemy, DBT, Airflow/Luigi, Pandera. Practical experience with SQL database development and data domain modelling. Good technical writing and documentation skills. Fluent English. If you have more »
tooling that will empower the wider function. The core skillset: Object Oriented Python for building software & APIs (they also use Trino & Kong) Experience with Airflow or other orchestration tools (Hadoop, CloudComposer, Dagster etc) Demonstrable cloud experience developing in AWS (preferable), GCP, or Azure Experience of developing real-time streaming more »
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional more »
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
ensuring best practices and quality in data transformation and modelling. Essential experience with tech including: GCP, SQL and DBT. Preferably working experience with: Kafka, Dataform, Airflow, Tableau, PowerBI, Redshift, Snowflake, Terraform and BigQuery. This position does not offer Visa Sponsorship; please refrain from applying if you require sponsorship at any more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the more »
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar Orchestration Tools): Experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar Reporting Tools): Knowledge of Qlik Sense Cloud or similar reporting tools such more »
South East London, England, United Kingdom Hybrid / WFH Options
Hunter Bond
degree in Computer Science, or similar, and have 4-5 years minimum exposure to back end development in Python, and Kubernetes automation. Skills in Airflow are of extra interest. In this role, you will: Execute elite software solutions; help with design, development, and technical troubleshooting; create secure and high-quality software code more »
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
TMS (Tealium IQ, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data pipelining and modelling using SQL, DBT, Airflow, ETL, Data Warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development more »
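The "data pipelining and modelling" work described in listings like the one above boils down to an extract-transform-load step feeding a warehouse table. A minimal sketch in plain Python with SQLite standing in for the warehouse; the table and column names (`raw_orders`, `user_spend`) are invented for illustration, not taken from any employer's stack:

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform in memory, load a
# modelled table for BI tools. Names here are illustrative only.

def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: raw order events (in practice these might come from an
    # API, Kafka topic, or a staging table in Redshift/BigQuery)
    rows = cur.execute("SELECT user_id, amount FROM raw_orders").fetchall()
    # Transform: aggregate spend per user
    totals: dict[str, float] = {}
    for user_id, amount in rows:
        totals[user_id] = totals.get(user_id, 0.0) + amount
    # Load: write the modelled table consumed downstream
    cur.execute(
        "CREATE TABLE IF NOT EXISTS user_spend (user_id TEXT PRIMARY KEY, total REAL)"
    )
    cur.executemany(
        "INSERT OR REPLACE INTO user_spend VALUES (?, ?)", totals.items()
    )
    conn.commit()
    return len(totals)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (user_id TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?)",
        [("a", 10.0), ("a", 5.0), ("b", 3.0)],
    )
    print(run_etl(conn))  # number of users loaded
```

In a production stack the same three phases would typically be split across DBT models and Airflow tasks rather than one function.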
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
diverse environments, leveraging Azure and other modern technologies. Proven ability to orchestrate complex data workflows and manage Kubernetes clusters on AKS, utilizing tools like Airflow, Kubeflow, Argo, and Dagster. Familiarity with data ingestion tools such as Airbyte and Fivetran, accommodating a wide array of data sources. Mastery of large more »
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies, AWS, GCP, Azure. Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset more »
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
in Python or PySpark, we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka. Experience with Big Data solutions or Relational DB. Demonstrated knowledge of software applications and technical processes within a cloud more »
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
Langchain, Semantic Kernel, and tools like MS tooling, Co-Pilot Studio, ML Studio, Prompt flow, Kedro, etc. Proficiency in pipeline orchestration with tools like Airflow, Kubeflow, and Argo. Exceptional communication skills, with the ability to articulate complex statistical concepts clearly. Personal Competencies: A results-oriented professional with a more »
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
for advanced content analysis and indexing & developing RAG services. Experience in managing data workflows and Kubernetes clusters on AKS, utilizing tools such as Airflow, Kubeflow, Argo, and Dagster. Familiarity using scripting languages and tools such as Bash, PowerShell, Azure CLI, Terraform, and Helm Charts. Additional Information Location: This more »
Experienced creating data pipelines on a cloud (preferably AWS) environment. CI/CD experience. Containerization experience (Docker, Kubernetes, etc.). Experience with SQS/SNS, Apache Kafka, RabbitMQ. Other interesting/bonus skills – Airflow, Trino, Apache Iceberg, Postgres, MongoDB. You *must* be eligible to work in your chosen more »
Key technical experience: Ability to operate in a fast changing environment. Fluent in English Previous cloud based infrastructure experience, particularly with AWS. Experience using Airflow and dbt Expert SQL knowledge Solid understanding of Dimensional Data Modelling. Experience with at least one or more of these programming languages: Python, Scala …/Java Experience with distributed data and computing tools, mainly Apache Spark & Kafka Understanding of critical path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with ambiguity and change. A self-starter who's able to work independently where necessary more »
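Dimensional data modelling, called out in the listing above, structures a warehouse as fact tables joined to dimension tables (a star schema). A minimal sketch using SQLite; the schema and names (`fact_sales`, `dim_customer`, `dim_date`) are generic textbook examples, not any employer's actual model:

```python
import sqlite3

# Star-schema sketch: one fact table keyed to dimension tables.
# All table names and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales VALUES
    (1, 20240101, 100.0), (1, 20240101, 50.0), (2, 20240101, 25.0);
""")

# The typical dimensional query: aggregate measures from the fact table,
# labelled with attributes pulled in from a dimension.
rows = conn.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_key)
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 150.0), ('Globex', 25.0)]
```

The same pattern scales to warehouse engines like Snowflake or Redshift, where dimensions are kept small and denormalised so fact-table scans stay cheap.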
South East London, England, United Kingdom Hybrid / WFH Options
Durlston Partners
hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services This role would focus on various areas of Data Engineering including: End to End ETL pipeline development … Implementing Data Curation, metadata management and data quality tooling. Requirements: Strong Python/Java Software Engineering skills Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake Previous experience with Dremio, dbt, EMR or Dagster Good Computer Science fundamentals knowledge with strong knowledge of software and data more »
schemas to support business requirements Develop and maintain data ingestion and processing systems using various tools and technologies, such as SQL, NoSQL, ETL, Luigi, Airflow, Argo, etc. Implement data storage solutions using different types of databases, such as relational, non-relational, or cloud-based. Working collaboratively with the client …/Azure SQL, PostgreSQL) You have framework experience within either Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving more »
financial services or energy trading industry. Expertise in Python and its ecosystem of libraries and frameworks for data processing, data analysis and data visualisation. Airflow: detailed understanding of architecture including schedulers, executors, operators. Cloud Environments: understanding of principles, technologies and services for AWS/Azure. Kubernetes EKS/AKS … including high availability. Desired experience: Worked with Python 3.9+. Familiar with Python test automation. Experience with SQL and Timeseries databases. Familiar with Parquet, Arrow, Airflow, Databricks. Experience with cloud AWS services, such as S3, EC2, RDS etc. Quality engineering best practice and tooling including TDD, BDD. This is an more »
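The Airflow architecture knowledge asked for above (schedulers, executors, operators) centres on one idea: the scheduler only queues a task once all of its upstream dependencies have succeeded, i.e. it walks the DAG in topological order. A toy sketch of that dependency resolution in plain Python (Kahn's algorithm); this is not the Airflow API, and the task names are invented:

```python
from collections import deque

# Toy model of what an orchestrator's scheduler does: release tasks only
# when every upstream dependency has completed (Kahn's topological sort).

def schedule(tasks: dict[str, list[str]]) -> list[str]:
    """tasks maps each task name to the list of its upstream task names."""
    indegree = {t: len(ups) for t, ups in tasks.items()}
    downstream: dict[str, list[str]] = {t: [] for t in tasks}
    for t, ups in tasks.items():
        for u in ups:
            downstream[u].append(t)
    # Tasks with no upstreams are runnable immediately (sorted for determinism)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)  # "run" the task
        for d in sorted(downstream[t]):
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)  # all upstreams done, release downstream task
    if len(order) != len(tasks):
        raise ValueError("cycle detected: not a DAG")
    return order

if __name__ == "__main__":
    dag = {"extract": [], "transform": ["extract"], "load": ["transform"],
           "report": ["load"], "quality_check": ["transform"]}
    print(schedule(dag))
```

In real Airflow the same dependencies are declared with operators and the `>>` operator (e.g. `extract >> transform >> load`), and an executor runs the released tasks in parallel rather than one at a time as here.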