Stroud, South East England, United Kingdom Hybrid / WFH Options
Data Engineer
continual improvement opportunities. Knowledge and skills: Experience as a Data Engineer or Analyst; Databricks/Apache Spark; SQL/Python; BitBucket/GitHub. Advantageous: dbt, AWS, Azure DevOps, Terraform, Atlassian (Jira, Confluence). About Us. What's in it for you... Healthcare plan, life assurance and generous pension contribution; Volunteering Day More ❯
What we need from you: Strong SQL development skills; experience with Snowflake is a big plus, and bonus points if you've worked with dbt or SQLMesh. A track record of designing and delivering complex, high-impact data solutions. Solid understanding of ELT processes and the technologies that enable them. More ❯
with both technical and business teams to elicit requirements and understand the business. Your Profile: Essential skills/knowledge/experience: Experience in Snowflake, DBT, Glue, Airflow. Experience with AWS Services. Good communication and stakeholder management skills. Ability to communicate with both business and technical colleagues at all levels. Knowledge More ❯
Experience with data modelling, data warehousing, and building ETL pipelines. Experience with AWS (S3, EKS, EC2, RDS) or similar cloud services, Snowflake, Fivetran, Airbyte, dbt, Docker, Argo. Experience in SQL, Python, and Terraform. Experience with building data pipelines and applications to stream and process datasets. Robust understanding of DevOps principles More ❯
non-routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential, and working knowledge of DBT, Airflow and AWS is highly desirable. Strong background in developing, constructing, testing, and maintaining practical data architectures, and in driving improvements in data reliability, efficiency, and quality. More ❯
businesses. Experience using LLMs or AI tools to structure and extract meaning from unstructured data. Experience automating workflows and deploying model pipelines (e.g. Airflow, dbt, MLflow, or similar). Exposure to business planning, pricing, or commercial decision-making. Familiarity with geospatial data. Experience in fast-scaling startups or operational teams. We More ❯
month initial contract. 1x per month in London office. INSIDE IR35. Technical info: Tableau/Power BI; Palantir Foundry (not essential); SQL, Snowflake & DBT; Python; AWS. If you're available for a new contract, please apply online More ❯
ideally Delta Lake on Databricks or similar). Deep expertise in workflow orchestration using Airflow, with production-grade DAGs and solid dependency management; knowledge of dbt is preferable. Strong background in building reliable, scalable batch and streaming pipelines using Spark (ideally Scala), Python and SQL. Hands-on experience implementing data quality More ❯
environment. Drive continuous improvements to architecture, infrastructure, and workflow automation. Core Tech Stack: Must-have: Google Cloud Platform (GCP), Apache Airflow. Nice-to-have: dbt, Terraform, Kubernetes. Bonus: Familiarity or curiosity about generative AI tools (e.g. ChatGPT). Ideal Candidate: 4+ years' experience in a Data/Platform engineering role. Proven More ❯
London, South East England, United Kingdom Hybrid / WFH Options
Trust In SODA
requirement for a Principal Data Engineer to spearhead an exciting new project for them, ideally with good knowledge of: Data Warehousing (Snowflake); Data Pipelines (DBT, Kafka); Programming (Python); Relational Databases (SQL, PostgreSQL, LookML); Data Visualisation (Looker); DevOps (Docker, K8s, Terraform). But most importantly, they are looking for individuals with an More ❯
Objectives: Develop and design reports and dashboards in Tableau. Update existing dashboards as part of Data Warehouse migration. Data cleansing, preparation and modelling using DBT or AWS Athena. Requirement gathering and translating to business needs. Cloud Infrastructure knowledge will be a big advantage for this role. Experience of managing IT More ❯
ability to stay motivated, navigate challenges, and drive forward our analytics offering will be crucial for your success. Preferred but not required: Familiarity with dbt and other data analysis tools would be a bonus. Bonus points: Experience using cloud notebooks and AWS containerisation; Experience with Streamlit; Experience working in fast More ❯
to users throughout Trustpilot. Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms. Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational More ❯
platform standards for smooth deployments, ensuring platform stability and scalability through upgrades, and managing the data analytics platform. Responsibilities: Utilize modern data platform tools (dbt Core, Airflow, Airbyte, Snowflake, etc.). Collaborate with DevOps, Data Engineering, Infrastructure, and InfoSec for seamless application integration. Design, implement, and maintain scalable cloud data More ❯