solid understanding of microservice architectures and the CAP theorem. A good understanding of functional paradigms and type theory. Confident JVM knowledge. Modern Java, Ruby, or Clojure knowledge. Experience with Airflow or other Python-based workflow orchestration tools. Exposure to Kubernetes, Docker, Linux, Kafka, RabbitMQ, or Git. Knowledge of financial concepts, exchange trading, or physical energy trading. …
data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Interested in the role? Complete the online application. We look forward to getting to know you. Discover more about LGT Wealth Management. A message from our CEO Ben …
London, England, United Kingdom Hybrid / WFH Options
LHV Bank
with the ability to work cross-functionally in an Agile environment Desirable: Exposure to data product management principles (SLAs, contracts, ownership models) Familiarity with orchestration tools and observability platforms (Airflow, dbt, Monte Carlo, etc.) Exposure to real-time/streaming pipelines Understanding of information security best practices Familiarity with BI tools (QuickSight, Power BI, Tableau, Looker, etc.) Interest or …
experience with DBT, building and maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance technical excellence with business needs …
and analytical databases along with frameworks such as Spark/PySpark. Familiarity with Power Query (M) language and advanced transformations. Strong programming skills in Python. Exposure to orchestration tools (Airflow, dbt, Dataform) and integration with Power BI. Experience with Azure-based data services (Azure Data Lake, Synapse, Data Factory, Fabric) and their integration with Power BI. Knowledge of data …
data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Does this sound like you? Then simply complete the online application. We look forward to getting to know you. Transparency is important to us. That is why you …
and relevant documentation What You'll Need Expert-level (5+ years' experience) Python, including data manipulation packages Expert-level (5+ years' experience) SQL Object-Oriented Programming (OOP) Familiar with Airflow Familiar with the Software Development Lifecycle Creative problem-solving Meticulous attention to detail Comfortable with working independently and taking ownership Willingness to work outside of area of expertise What …
London, England, United Kingdom Hybrid / WFH Options
CreateFuture
love to talk to you if: You've led technical delivery of data engineering projects in a consultancy or client-facing environment You're experienced with Python, SQL, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure) You have strong knowledge of data architecture patterns - including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks) You know …
Manchester, England, United Kingdom Hybrid / WFH Options
PHMG
environment using agile methodologies with operational targets crossing multiple departments Nice-to-have experience: Infrastructure as code (Terraform, CloudFormation, etc.) Cloud platforms (AWS, Azure, GCP, etc.) Data orchestration (Airflow, Dagster, etc.) The Team The Data & Analytics department serves as the central hub for data engineering and insights across our organization, comprising four essential teams: Data Engineering, Reporting, Commercial …
full scientific stack: numpy, scipy, pandas, scikit-learn to name a few of the libraries used extensively. For storage, they rely heavily on MongoDB. They use Docker, Kubernetes and Airflow to streamline deployments and leverage OpenFin and React for front-end development. Because of the small team size and the dynamic nature of the business, technology choices are not …
Manchester, England, United Kingdom Hybrid / WFH Options
Dept Agency
architectures, data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to …
you’ve got experience doing this in Looker/Superset. You are proficient with SQL and know your way around data pipeline management tools such as Snowflake, dbt, Git, Airflow and Python (all analysts at Wise are expected to be full-stack and should be comfortable owning analytics from ingestion to insights). Nice To Have But Not Essential …
such as Tableau, with a focus on optimizing underlying data structures for dashboard performance. Ingestion and orchestration tools: Skilled in using pipeline orchestration and data ingestion tools such as Airflow and Stitch, along with Python scripting for integrating diverse data sources. Large-scale data processing: Proficient with distributed query engines like AWS Athena or SparkSQL for working with datasets …
data modelling. You are highly proficient in SQL and deeply understand Snowflake architecture. Tooling Mastery: Hands-on experience with DBT, Looker (or similar BI tools), and orchestration frameworks like Airflow or AWS Step Functions. Mindset: You're proactive, customer-obsessed, and see data as a product. You think scalability, self-service, and platform-first. Collaborative Leader: You enjoy partnering …
London, England, United Kingdom Hybrid / WFH Options
Menlo Ventures
growth potential: Data Infrastructure: We build and maintain the data systems powering Anthropic's AI research and products. You'll design and optimize data pipelines using tools like Spark, Airflow, and dbt across GCP and AWS. Your work will ensure reliable, scalable data infrastructure while implementing governance best practices and driving continuous improvement. Core Infrastructure: The systems team is …
KBC Technologies Group
The ideal candidate will have at least 5 years of experience, with strong expertise in Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines. Proficiency in Snowflake data …
Software Engineer - Data Reporting, London Client: Selby Jennings Location: London, United Kingdom Job Category: Other EU work permit required: Yes Posted: 23.05.2025 Expiry Date: 07.07.2025 Job Description: Software …
Cambridge, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
Position: Data Engineer Location: Cambridge/Luton, UK (Hybrid, 2-3 days onsite per week) Duration: Long-Term B2B Contract Job Description: The ideal candidate will have at least 5 years of experience working with Snowflake, DBT, Python …
Software Engineer - Data Reporting, Slough Client: Selby Jennings Location: Slough, United Kingdom Job Category: Other EU work permit required: Yes Posted: 31.05.2025 …
We're looking for a Senior Data Engineer to join Pleo's Business Analytics team. This team is responsible for delivering and enhancing high-quality, robust data solutions that drive commercial performance, revenue …
SKILLS AND EXPERIENCE: A successful Data Engineer will have the following skills and experience: Ability and experience interacting with key stakeholders Strong experience in SQL/Python Good understanding of Airflow/DBT Experience with GCP/AWS Background in CI/CD THE BENEFITS: You will receive a salary of up to £60,000, dependent on experience. On …
data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL and experience with schema design and query optimization for large …