… help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using dbt, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines, alongside onboarding new, disparate data sets sourced from …
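As a hedged illustration of the dbt/Snowflake/S3 pipeline work this listing describes (not taken from the posting itself; the account, stage and table names below are hypothetical placeholders), a minimal Python sketch for loading S3-staged files into Snowflake might look like:

```python
# Illustrative sketch only: loads S3-staged files into a Snowflake table.
import snowflake.connector

# Hypothetical connection details; real values would come from config/secrets.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # COPY INTO pulls files from an external S3 stage (assumed to already exist).
    cur.execute("""
        COPY INTO raw.orders
        FROM @s3_ingress_stage/orders/
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """)
finally:
    cur.close()
    conn.close()
```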
… variety of databases. Working knowledge of one or more of the cloud platforms (AWS, Azure, GCP). Experience building ETL/ELT pipelines, specifically using dbt, for structured and semi-structured datasets. Experience with orchestration tooling such as Airflow, Dagster, Azure Data Factory, Fivetran, etc. Nice to have: Software …
… Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning. A background in …
… pipelines and infrastructure to ensure efficient data flow across the business. Key Responsibilities: Develop, support and optimise robust data solutions using tools like Snowflake, dbt, Fivetran, and Azure cloud services. Collaborate with cross-functional teams to translate business needs into actionable data architecture. Design and manage data pipelines and integration …
… languages commonly used for data work (e.g., Python, Java, Scala). Deep understanding of ETL/ELT tools and workflow orchestration platforms (e.g., Airflow, Fivetran, dbt). Proficiency with SQL and solid grounding in data modeling concepts. Familiarity with cloud services and architectures (AWS, GCP, or Azure). Proven experience managing or mentoring …
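Several of these listings pair dbt with an orchestrator such as Airflow. As a hedged sketch, not taken from any posting (the DAG id, schedule and dbt project path are hypothetical), a minimal Airflow DAG that runs a daily dbt build could look like:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG: builds dbt models daily, then runs dbt tests.
with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test  # tests only run after the models build successfully
```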
… data best practices across teams. Champion data quality, governance, and documentation. Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt). Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake). Familiarity with streaming technologies (Kafka, Kinesis, etc.). Passion for …
… efficient code and comfortable undertaking system optimisation and performance tuning tasks. Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL. Has exposure to dbt and data quality test frameworks. Has awareness of Infrastructure as Code such as Terraform and Ansible. BENEFITS: Competitive Salary. Company Laptop supplied. Bonus Scheme. …
Cambridge, Cambridgeshire, UK Hybrid / WFH Options
Intellect Group
Nice to Have: Experience working within a consultancy or client-facing environment. Familiarity with tools and frameworks such as Databricks, PySpark, Pandas, Airflow or dbt. Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda). What’s On Offer: Fully remote working with the flexibility to work …
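As a hedged illustration of the PySpark-style transformation work these tools imply (the bucket paths, column names and aggregation are hypothetical, not from the posting), a short sketch:

```python
# Illustrative PySpark transformation; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_cleanup").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # assumed input

# Keep completed orders and aggregate revenue per day.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy(F.to_date("created_at").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet(
    "s3://example-bucket/marts/daily_revenue/"
)
```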
… the Data space. This role will also allow the successful individual to cross-train into modern Data Engineering tools and technologies such as Airflow, dbt and Snowflake, as well as further develop their skills in Python, SQL and Market Data platforms. The firm work on a hybrid working schedule (three days per week …
… practices. Nice-to-Have Skills: Exposure to AWS Redshift, Glue, or Snowflake. Familiarity with BigQuery and Google Analytics APIs. Proficiency in Python, PySpark, or dbt for data transformations. Background in insurance, especially in pricing analytics or actuarial data. …
… a leader within the business over time. The tech stack is varied, currently made up of TypeScript, Python, PostgreSQL, Redis and dbt on AWS. You’ll be encouraged to take ownership of products, and you’ll be given this autonomy from the co-founders. The products handle …
… a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT …
… timelines and quality standards are met. Required Skills & Experience: 5+ years' experience as a Data Analyst. Strong skills in Python, SQL and tools like dbt, Snowflake, AWS S3 and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs, with deep knowledge in at …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown Asset Management Limited
… low-latency data pipeline with the following skills. Data Engineering Skills: modelling; orchestration using Apache Airflow; cloud-native streaming pipelines using Flink, Beam, etc.; dbt; Snowflake. Infrastructure Skills: Terraform. DevOps Skills: experienced in developing CI/CD pipelines. Integration Skills: REST and Graph APIs. (Desirable) Serverless API development (e.g. Lambda …
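As a hedged illustration of the cloud-native streaming pipelines this listing names (the Pub/Sub topic, window size and counting logic are hypothetical, not from the posting), a minimal Apache Beam streaming sketch in Python:

```python
# Illustrative Apache Beam streaming sketch; topic and window size are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events"
        )
        | "Decode" >> beam.Map(lambda b: b.decode("utf-8"))
        # Bucket the unbounded stream into fixed 1-minute windows.
        | "Window" >> beam.WindowInto(beam.window.FixedWindows(60))
        | "CountPerWindow" >> beam.combiners.Count.Globally().without_defaults()
        | "Print" >> beam.Map(print)
    )
```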
… Redshift. Experience in open-source technologies like Spark, Kafka, Beam. Understanding of cloud tools such as AWS, Microsoft Azure or Google Cloud. Familiarity with dbt, Delta Lake, Databricks. Experience working in an agile environment. About Us: We are Citation. We are far from your average service provider. Our colleagues bring …
Excellent communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master’s or PhD in Data Science, Statistics, Computer Science, or related field. Benefits: Competitive salary and performance bonuses. Flexible working options …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
… databases (structured, non-relational, graph, etc.). CI/CD experience. Python or Java experience is preferred. GIS experience is desirable. ETL experience (ideally with dbt, but not a hard requirement). Benefits: Hybrid working – 2 days per week on site. Up to £65,000. Private Medical. 4x DIS. 5% employer pension …
… data/business catalogs). Knowledge of at least one of the following technologies/methodologies will be an additional advantage: Python, Streamlit, Matillion, dbt, Atlan, Terraform, Kubernetes, Data Vault, Data Mesh. Ability to engage with principal data architects of client stakeholders. Excellent presentation and communication skills. This role will …
… essential for this role. Experience in cloud platforms (GCP BigQuery, Azure Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. (This is not an advanced technical data science role, so advanced experience in these will not be …
… communicator able to interface confidently with both technical and non-technical audiences. Bonus Experience: • Familiarity with IaC frameworks (CloudFormation, Terraform, SAM) • Exposure to Snowflake, dbt, Airflow, or cost analytics/data pipeline tools • Knowledge of FinOps practices or cost intelligence platforms • Experience contributing to open-source platforms or cloud-native …
… tech stack includes: Infrastructure: Google Cloud. Backend: Python for the internal API codebase, running on a Postgres database hosted with Google Cloud SQL. Data: BigQuery, dbt, and Tableau. Frontend: TypeScript and React. Mobile/Apps: Flutter. What’s on Offer: Generous time off. Equity package. Comprehensive health coverage. Pension contributions. Hybrid …
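As a hedged sketch of how a Python backend in a stack like this might query BigQuery (the project, dataset, table and column names are hypothetical, not from the posting):

```python
# Illustrative BigQuery query from Python; all names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT DATE(created_at) AS day, COUNT(*) AS signups
    FROM `example-project.analytics.users`
    GROUP BY day
    ORDER BY day DESC
    LIMIT 7
"""

# Run the query and iterate over the result rows.
for row in client.query(query).result():
    print(f"{row.day}: {row.signups} signups")
```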
… AWS-based capacity management, performance logging and monitoring solutions. AWS cost management. Your Profile: Essential skills/knowledge/experience: Experience in Snowflake, dbt, Glue, Airflow. Proficiency in architecting and building solutions to support highly available and scalable websites, batch processes, services and APIs. Understanding of IAM …
… Data Engineering teams, with the ability to translate technical and business needs. Knowledge of Snowflake and AWS platforms. Familiarity with ETL tools such as dbt and Matillion. Understanding of SQL and Python, enough to contribute to technical discussions and guide decisions. Excellent communication, planning, and stakeholder management skills. If this …
… field. 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS. Proficiency in SQL, Python, and ETL tools (StreamSets, dbt, etc.). Hands-on experience with Oracle RDBMS. Data migration experience to Snowflake. Experience with AWS services such as S3, Lambda, Redshift, and Glue. Strong understanding …
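As a hedged illustration of scripting the AWS services this listing names (the bucket, prefix and Glue job name are hypothetical placeholders), a short boto3 sketch:

```python
# Illustrative boto3 sketch; bucket, prefix and job name are hypothetical.
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Check that today's extract landed in S3 before kicking off the Glue job.
response = s3.list_objects_v2(
    Bucket="example-data-lake",
    Prefix="raw/orders/2024-01-01/",
)
if response.get("KeyCount", 0) > 0:
    run = glue.start_job_run(JobName="orders_to_snowflake")
    print("Started Glue job run:", run["JobRunId"])
else:
    print("No new files to process")
```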