with cloud-based data warehouses (e.g., Databricks, Snowflake, Redshift). Ability to optimize queries and pipelines for efficiency and reliability. Bonus: Experience with dbt, Airflow, or visualization tools like Tableau. Excellent communication skills and ability to document technical solutions effectively. For this position you need to be eligible to …
SQL expertise; designing and maintaining ETL/data pipelines; ideally proficiency in multiple cloud infrastructures, databases and data warehousing solutions, AWS and GCP being … Airflow 👍 Bonus points for: experience working in a small, fast-growing start-up, comfortable navigating unstructured/fuzzy environments; experience with RudderStack, Expo and/ …
modelling, and extracting value from large, disconnected datasets; familiarity with data warehouse design and modelling; familiarity with cloud data services (preferably Azure); experience with Airflow. 🛂 Please note: unfortunately, this role does not offer visa sponsorship.
experience in Snowflake architecture, including data loading, transformation, and performance tuning. Proficient in ETL processes using tools such as Informatica PowerCenter and BDM, AutoSys, Airflow, and SQL Server Agent. Experience with cloud platforms, preferably AWS. Strong knowledge of AWS cloud services, including EMR, RDS Postgres, Redshift, Athena, S3, and …
London, South East England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
So they offer very flexible working arrangements through both WFH options and flexi working hours. Experience required: expert in Snowflake; strong dbt experience; strong Airflow experience; expert knowledge and understanding of data warehousing; strong AWS experience.
Runcorn, Cheshire, North West, United Kingdom Hybrid / WFH Options
Forward Role
more. Key Responsibilities: Develop and maintain ETL/ELT data pipelines using Python and SQL. Work with enterprise-level cloud platforms, ideally GCP (BigQuery, Airflow, Cloud Functions). Integrate APIs and process data from multiple sources. Design and optimise data warehouses and reporting systems. Build reports and dashboards using …
Python, dbt, and data modelling, as well as experience or a strong interest in blockchain technology. Twinstake utilises a modern data stack including Airbyte, Airflow, Snowflake, dbt and Quicksight. What you will contribute: Data Modelling: Building scalable data models to transform complex datasets into actionable insights, using advanced SQL …
scikit-learn, and strong SQL skills. Deep understanding of software engineering best practices. Experience with tools like Azure, Azure ML, GitHub Actions, Terraform, Packer, Airflow, Docker, and Kubernetes. Expertise in Linux/Windows VM administration. Reference Number: BBBH248582. To apply for this role or to be considered for further …
Master's or PhD in a relevant field (e.g., Computer Science, Data Science, Engineering, Applied Mathematics, Statistics, etc.). Proficiency in Python, SQL, AWS, Airflow, PySpark, PyTorch, NumPy, and related data technologies. Experience with cloud infrastructure, data pipelines, and machine learning model deployment. Proven experience leading diverse teams of …
cloud provider; frontend development with React or another modern web framework; DevOps; Kubernetes; infrastructure engineering with Terraform, Pulumi or similar; data workload orchestration with Airflow or similar; containerisation with Docker; experience with SQL, as well as relational database design and administration. Experience in other tools not listed is also …
Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. Experience with AWS cloud services: EC2, EMR, RDS, Redshift. Experience with stream-processing systems: Storm, Spark Streaming, etc. Experience with object-oriented/ …
management processes, and data profiling. Experience in developing APIs and working with WebSockets. Knowledge of React, Django, FastAPI, or equivalent technologies. Previous experience with Airflow, Linux shell commands, and setting up websites on IIS. What we would like from you: Bachelor's or Master's degree in a relevant field.
methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt. Preferred qualifications: Experience in technology, financial services and/or a high-growth environment. Experience with Excel and finance systems (e.g. …
/or financial instruments including equities, futures, and options. Strong Unix skills along with programming skills in Python and experience with scheduling tools like Airflow. Good understanding of networking and system concepts. Experience with SQL and the use of databases is a plus. The ability to manage multiple tasks …
Data Expertise – Experience with AWS, Azure, GCP, ETL pipelines, and scalable data architectures. DevOps & CI/CD – Strong knowledge of GitOps, Kubernetes, Terraform, Jenkins, Airflow, and related tools. Strategic Thinking – Ability to drive long-term engineering strategy while delivering incremental value. Technical Debt Management – Experience identifying and remediating inefficient …
operational businesses. Experience using LLMs or AI tools to structure and extract meaning from unstructured data. Experience automating workflows and deploying model pipelines (e.g. Airflow, dbt, MLflow, or similar). Exposure to business planning, pricing, or commercial decision-making. Familiarity with geospatial data. Experience in fast-scaling startups or operational …
commerce, retail, or the travel industry. Experience designing and analysing large-scale A/B test experiments. Mastery of workflow orchestration technologies such as Airflow, Dagster or Prefect. Expert knowledge of technologies such as: Google Cloud Platform, particularly Vertex AI; Docker and Kubernetes; Infrastructure as Code. Experience establishing data …
Technical excellence across the Data Engineering ecosystem: We primarily work in Python, Go and SQL. Our code (tracked via GitHub) deploys to GCP (BigQuery, Airflow/Composer, GKE, Cloud Run), dbt Cloud and Azure via Terraform. We recognise this is a broad list; if you're not deeply familiar with …
React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for …