MongoDB, PostgreSQL Proven expertise in Terraform and infrastructure-as-code practices Strong SQL and data modeling skills, and experience with both SQL and NoSQL data stores Strong understanding of dbt (or equivalent) and Tableau Hands-on experience with Python for data processing and automation tasks A background working in environments with high-throughput data (millions of events per hour) Understanding …
London, England, United Kingdom Hybrid / WFH Options
Harnham
designing and analysing A/B tests Strong storytelling and stakeholder-management skills Full UK work authorization Desirable Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow) Experience with loyalty-programme analytics or CRM platforms Knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring Technical Toolbox Data & modeling: SQL, Python/R, pandas, scikit-learn Dashboarding: Tableau or Power BI ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery Experimentation: A/B testing platforms (Optimizely, VWO) Desired Skills and Experience 8+ years in retail/FMCG customer insights and analytics Built customer segmentation, CLV, and propensity models in Python/R Designed and analysed A/B and multivariate tests for pricing and …
Terraform, GitHub, CI/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally in an Agile environment Exposure to data product management principles (SLAs, contracts, ownership models) Familiarity with orchestration tools and observability platforms (Airflow, dbt, Monte Carlo, etc.) Exposure to real-time/streaming pipelines Understanding of information security best practices Familiarity with BI tools (QuickSight, Power BI, Tableau, Looker, etc.) Interest or experience in building internal data communities or enablement programs Working with diverse data sources (APIs, CRMs, SFTP, databases …
Good working knowledge of SQL Comfortable using Git for version control Exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster) Experience with cloud data warehouses (Azure SQL, Snowflake) or dbt Basic familiarity with Docker and BI tools (Power BI, Tableau) Interest in shipping, financial markets, or commodities Package: £35-40,000 basic salary + bonus Excellent career progression opportunities …
to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks Join Us! This role can be based in either of our amazing offices in Havant …
working knowledge of SQL Comfortable using Git for version control Desirables: Exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster) Experience with cloud data warehouses (Azure SQL, Snowflake) or dbt Basic familiarity with Docker and BI tools (Power BI, Tableau) Interest in shipping, financial markets, or commodities Package: £35-40,000 basic salary + bonus Excellent career progression opportunities …
environments, managing multiple priorities and meeting deadlines. Proficiency in SQL (BigQuery), Python, Git/GitHub, and preferably Looker (Tableau or Power BI are acceptable as well) Above-average knowledge of dbt, Docker, GCP, and Airflow Experience in the cryptocurrency industry, fintech sector, or platform-type businesses is preferred but not required. Personal Attributes Analytical mindset with a passion for data-driven … principles thinking to drive efficient solutions Highly ambitious with a results-oriented attitude and continuous improvement mindset Technologies you will work with Python SQL (BigQuery) GCP EPPO for experimentation dbt, Docker, Cloud Run/Kubernetes, and Airflow for data orchestration and data pipelines Looker data visualization Git and GitHub for code collaboration Ability to leverage AI tools such as Cursor …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
high-quality data assets Strong architectural acumen and software engineering fundamentals Experience driving adoption of data governance and improving data platform usage across internal teams Stack including: Snowflake, AWS, dbt, Airflow, Python, Kinesis, Terraform, CI/CD tools BENEFITS The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000 Hybrid working: 2 days per week …
a fast-paced startup or agile environment. Strong background in schema design and dimensional data modeling. Able to communicate data architecture clearly with internal stakeholders. Experience with Azure, Airflow, dbt, Kubernetes, GitHub. Bonus points for: open-source contributions, an active GitHub profile, and curiosity for the latest in tech. A natural problem-solver who loves making things work. Focused on …
Hands-on experience building and integrating with RESTful APIs using FastAPI, Django REST Framework, or similar. -Data Workflows: Experience designing and maintaining real-time and batch data pipelines, including dbt Core and stream processing tools. -Infrastructure Know-How: Confident working with Terraform and CI/CD pipelines in a cloud-native environment. -Database Familiarity: Skilled in both SQL and NoSQL …
perhaps through coursework, Kaggle competitions, or personal data projects You've shown initiative in teaching yourself new technical tools or concepts beyond what was required - such as exploring BigQuery, dbt, Airflow, Docker, or other data engineering technologies on your own time Progression This is an initial six-month engagement. If you perform well, the expectation is that you'll move …
role Desirable Experience Hands-on experience with APIs from major SaaS platforms (e.g., Office 365, Salesforce, Workday, Oracle, SAP) Familiarity with our core data stack: DuckDB, Dagster, Postgres, Kafka, dbt, EKS, and Databricks Understanding of Identity and Access Management (IAM) concepts and APIs from providers like Okta, Entra ID, Ping Exposure to AI-enhanced low-code tools like Microsoft Copilot …
technical teams, with excellent people development skills. Strong project management skills, with experience running complex data initiatives. Strong knowledge of modern data engineering, including SQL, Python, Airflow, Dataform/dbt, Terraform, or similar tools. Understanding of data architecture patterns (e.g., lakehouse, event-driven pipelines, star/snowflake schemas). Excellent communication and stakeholder management skills. Experience working in agile environments …
Modelling Be deploying applications to the Cloud (AWS) We'd love to hear from you if you Have strong experience with Python & SQL Have experience developing data pipelines using dbt, Spark and Airflow Have experience with data modelling (building optimised and efficient data marts and warehouses in the cloud) Work with Infrastructure as Code (Terraform) and containerising applications (Docker) Work with …
GenAI tools (e.g., Cursor, Gemini, Claude) into development workflows. Curate and manage datasets across structured and unstructured formats and diverse domains. Contribute to metadata enrichment, lineage, and discoverability using dbt, Airflow, and internal tooling. Skills & Experience We value both traditional and non-traditional career paths. You'll ideally bring: Technical Skills 3-5 years of experience in data or analytics …
Due to the nature of some of the company's clients, you must have a minimum of 5 years' continuous UK residency. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, dbt, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics Are you passionate about building scalable data solutions that drive real business impact … will work on a variety of projects including the implementation of medallion structures for clients in different industries, data migrations, and designing and implementing dashboards using technologies such as Python and dbt on Azure and GCP platforms. Key Skills the Senior Data Engineer will have: 3+ years Data Engineering experience Good experience with both Azure and GCP Excellent experience of dbt, SQL … data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins @ circlerecruitment.com Circle Recruitment is acting as an Employment Agency in relation to this vacancy.
Experience Strong SQL and Python skills for building and optimising data pipelines Experience working with cloud platforms (e.g., AWS, GCP, or Azure) Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery) Understanding of data modelling and warehousing principles Experience working with large datasets and distributed systems What's in it for you? Up to £70k Hybrid …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
week in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … with a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake Extensive experience with dbt, Airflow, AWS and Terraform Excellent scripting skills in SQL Experience developing solutions entirely from scratch Great communication skills, with the ability to understand and translate complex requirements into technical solutions …
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
Professional Data Engineer certification Exposure to Agentic AI systems or intelligent/autonomous data workflows Experience with BI tools such as Looker Exposure to Databricks, Snowflake, AWS, Azure or dbt Academic background in Computer Science, Mathematics or a related field This is an opportunity to work in a forward-thinking environment with access to cutting-edge projects, ongoing learning, and …