how products are performing and tracking metrics. Key responsibilities: Design and implement high-performance, scalable, and reusable data models for their data warehouse using dbt and Snowflake. Design and implement Looker structures to enable users across the organisation to self-serve analytics. Collaborate with data analysts and business teams to … scale data environments. Experience in the SDLC in analytics, including version control, testing and CI/CD. Professional experience with SQL and data transformation, ideally with dbt or similar. Experience with at least one of these cloud technologies: AWS, Microsoft Azure, Snowflake, GCP.
Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as dbt, Fivetran, etc. Understanding of Agile delivery best practice. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong …
firm grounding in SQL and Python, enabling you to lead and develop our most sophisticated work. Experience of building, developing and managing data using dbt within a data architecture such as Data Vault. Strong interpersonal skills with the ability to work with customers to establish requirements, designs and deliver the …
Description: Senior Data Engineer (SAS) – They want to rewrite existing pipelines (SAS, Info, Perl, shell scripts) to a target state based on Python and dbt transformations, then loaded to Snowflake on Azure. They need SAS (primary) and Perl/shell script (secondary) engineers who have experience converting to Python and dbt … and loading data to Snowflake. Converting SAS-based modules to Python. Candidates should have data management experience in Snowflake with expertise in Python, dbt, Airflow or similar technologies. Must have hands-on experience in dbt and SQL; Snowflake is an added advantage …
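As a rough illustration of the kind of conversion this role describes, a SAS BY-group aggregation (e.g. PROC MEANS summing a measure per key) can be rewritten in plain Python before being folded into a dbt/Snowflake pipeline. This is a minimal sketch; the dataset and column names (`account_id`, `txn_amount`) are invented for illustration, not taken from the role.

```python
import csv
import io
from collections import defaultdict

# Hypothetical input resembling a SAS dataset exported to CSV;
# the columns are invented for illustration.
RAW = """account_id,txn_amount
A1,100.0
A1,50.0
B2,75.0
"""

def summarise_by_account(csv_text):
    """Python equivalent of a SAS BY-group sum:
    total txn_amount per account_id."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["account_id"]] += float(row["txn_amount"])
    return dict(totals)

print(summarise_by_account(RAW))  # {'A1': 150.0, 'B2': 75.0}
```

In a real migration this logic would more likely become a SQL model in dbt, with Python reserved for ingestion and orchestration.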
london, south east england, United Kingdom Hybrid / WFH Options
ByteHire
Required Skills and Experience: Extensive experience in: Data Warehousing, Data Engineering, overall Data Analytics, Data Visualisation. Proficiency in: Google Cloud (GCP), GCP BigQuery, Python, dbt or similar, FastAPI or similar, Airflow or similar. Desirable: Google Apigee (as an application developer), exposure to Machine Learning projects, exposure to DataOps, exposure to …
london, south east england, United Kingdom Hybrid / WFH Options
PURVIEW
you will: Automate and scale datasets aligned with specific use cases, facilitating seamless integration into operational workflows. Design, develop, and maintain data models using dbt, ensuring the efficient transformation of raw data into actionable insights. Build and optimize query pipelines in dbt and SQL to extract, transform, and load data … EXPERIENCE To qualify for this Analytics Engineer role, you will require: Proficiency in advanced SQL for data querying and manipulation. Experience managing dbt (Data Build Tool) for building and maintaining data transformation pipelines. Ability to build Tableau dashboards for data visualization and reporting. Familiarity with GCP (Google Cloud Platform) for …
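The dbt workflow this ad describes — each model a SELECT that builds on upstream models, layering raw data into staging views and marts — can be sketched with sqlite3 standing in for the warehouse. The table, view, and column names here are invented for illustration; in dbt each CREATE VIEW body would live in its own `.sql` model file.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Raw layer: landed data, untyped and unclean (hypothetical schema).
conn.execute("CREATE TABLE raw_orders (id TEXT, amount TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [("1", "10.50", "complete"), ("2", "4.00", "cancelled"), ("3", "7.25", "complete")],
)

# Staging model: in dbt this would be stg_orders.sql — one SELECT
# that casts and cleans the raw source.
conn.execute("""
    CREATE VIEW stg_orders AS
    SELECT CAST(id AS INTEGER) AS order_id,
           CAST(amount AS REAL) AS amount,
           status
    FROM raw_orders
""")

# Mart model: fct_order_totals.sql — aggregates the staging model.
conn.execute("""
    CREATE VIEW fct_order_totals AS
    SELECT status, SUM(amount) AS total_amount
    FROM stg_orders
    GROUP BY status
""")

rows = conn.execute(
    "SELECT status, total_amount FROM fct_order_totals ORDER BY status"
).fetchall()
print(rows)  # [('cancelled', 4.0), ('complete', 17.75)]
```

The layering is the point: downstream models never read raw tables directly, so cleaning rules live in exactly one place.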
london (city of london), south east england, United Kingdom
iO Associates - UK/EU
tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems and pipelines. Evaluate …
Engineer, you'll be instrumental in maintaining their data platform in Snowflake, developing company-wide reports using Tableau, and utilizing tools like dbt, Fivetran, and Stitch. Your day to day will involve: Analyse category data to gain insights into aspects such as promotional performance compared to the … overall category. Compile these insights into coherent narratives or stories that highlight trends, patterns, and performance metrics. Utilize dbt, Fivetran, and Stitch to streamline data processes. Collaborate with cross-functional teams to understand data needs and deliver actionable insights. Drive data-driven decision-making across the organization. WHAT YOU'LL … order to qualify for this Data Analytics Engineer role, you will require: Proven experience in building and maintaining data platforms. Expertise in Snowflake, Tableau, dbt, Fivetran, and Stitch. Proficiency in Tableau, SQL and/or Python. Strong analytical skills and the ability to translate data into actionable insights. Familiarity with …
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and dbt. Must haves: A team player, happy to work with several teams; this is key, as you will be reporting directly to the CTO. 2+ …
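The ETL pattern mentioned above can be sketched as three plain functions — the same shape an orchestrator such as Airflow would schedule as separate tasks. The data and field names (`user`, `clicks`) are invented for illustration.

```python
import json

def extract():
    # Stand-in for pulling records from an API or source database.
    return json.loads('[{"user": "a", "clicks": 3}, {"user": "b", "clicks": 5}]')

def transform(records):
    # Filter and reshape — the "T" that a tool like dbt would express as SQL.
    return [r for r in records if r["clicks"] > 3]

def load(records, sink):
    # Stand-in for writing to a warehouse table; returns the row count loaded.
    sink.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse)  # 1 [{'user': 'b', 'clicks': 5}]
```

Keeping each stage a pure function of its inputs makes the pipeline easy to retry and test task-by-task, which is what schedulers like Airflow rely on.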
years of demonstrated commercial experience as a Data Engineer or similar role within large-scale environments dealing with large data sets. Expertise in SQL and dbt, and ideally Kafka. Significant Python coding skills. Containerisation experience (Docker, Kubernetes). Cloud computing experience (GCP/AWS/Azure). Strong preference for a Snowflake background …
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, dbt and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning …
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, dbt, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the link …
are looking for a dynamic Analytics Engineer to join their Finance Team. Requirements: To qualify for this role, you will require: · Strong experience with dbt and SQL · Experience working within cloud environments (Redshift, BigQuery, Snowflake). Salary: A successful candidate will receive: · A salary of up to £75,000 · Excellent progression …
City Of London, England, United Kingdom Hybrid / WFH Options
Harnham
Gaming or Entertainment experience. Experience managing, mentoring, or coaching a small team of analytics engineers. Advanced knowledge and commercial experience with tools such as dbt, Redshift, and other AWS tools. A good educational background is preferred. Strong communication skills. A passion for gaming! THE BENEFITS: A salary of up to …
Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000. Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing. *You must be based in London and have full permanent right to work in the UK to apply for this role.* I'm currently working with a leading media agency, specialised … above ASAP and I will be in touch. Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000. Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing
West London, London, United Kingdom Hybrid / WFH Options
Recruitment Revolution
strategy, solving complex problems and building robust solutions for a dynamic, growing business. Intrigued? Read on. Role info: eCom BI Specialist. Shopify, BigQuery, dbt. London Based/Remote Working. £50,000 - £60,000 Plus Bonus (OTE £75,000+). Product/Service: Performance Marketing Consultancy for new and scaling brands … considered Top 150 independent agencies in Europe by Google. We are premier partners with all key platforms. Your Skills: Deep Supermetrics + GitHub + dbt + Google BigQuery + Looker Studio + Shopify. About us: We are a multi award winning online eCommerce boutique agency. Google ranks us in … reporting and visualisation in Looker Studio + Highly proficient in GCP/GBQ, SQL, data modelling and ETL processes + Experience with Supermetrics, GitHub, dbt, Google BigQuery and Shopify + Strong communication and presentation skills to effectively convey insights to both technical and non-technical stakeholders + Ability to …
a crucial role in assessing, analysing, and enhancing our data ingestion, transformation, and storage layers. Your primary focus will be on Go programming and dbt, with secondary skills in Google Cloud Services. The successful candidate will bring 5 to 10 years of experience in developing robust data pipelines, conducting unit … to 10 years of experience as a Golang Developer. Proven expertise in data warehousing and ETL processes. Hands-on experience with Go programming and dbt. Familiarity with Google Cloud Services. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. If you feel you would be a great match …
Join a team at the heart of the global economy! The Department for Business and Trade (DBT) and Inspire People are partnering together to bring you an exciting opportunity for a Technical Architect to oversee the technical architecture, services and their design from inception to completion at portfolio level, working … pension contribution. Flexible, hybrid working from London, Cardiff, Darlington, Edinburgh, Belfast, Birmingham or Salford. You will work within a collaborative, supportive team, and as DBT looks to grow and deliver more services, they are looking for an experienced technical architect with a strategic mindset to lead the design of services … the opportunity to step up to a portfolio level. The primary tech stack is Python/Django and Node.js/React, running on AWS, and DBT optimises the architecture for change. DBT's Digital, Data and Technology team develops and operates tools, services, and platforms that enable the UK government to …
on their current Centre of Excellence. Main duties include: Managing and developing the Looker platform/LookML. Maintaining and managing the BI platform (Looker, dbt, AWS, Redshift, etc.). Ultimately owning the long-term BI roadmap. Creating a self-serve data culture across the business. Skills & Experience: If this sounds like …