professionals. Acting as a leader and mentor to team members, fostering their professional development. Working closely with technology partners such as Google Cloud (GCP), dbt Labs, and Looker. Playing a crucial role in shaping the architecture team, driving innovation and maintaining high standards of performance. Requirements of the Cloud Architect …
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack - Python, AWS, Airflow and dbt. Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the CTO. 2+ …
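The listings above repeatedly pair Python with orchestration (Airflow) and transformation (dbt) tools for pipeline build-out. As a rough, hypothetical sketch of the extract-transform-load staging such a pipeline automates (the data and function names are illustrative, not taken from any listing):

```python
import csv
import io

# Illustrative raw extract, standing in for data pulled from S3, RDS, etc.
RAW_CSV = """order_id,amount,currency
1001,19.99,GBP
1002,,GBP
1003,42.50,USD
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV into rows (the 'E' in ETL)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop rows with missing amounts and cast types (the 'T')."""
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "currency": r["currency"]}
        for r in rows
        if r["amount"]  # skip records with no amount
    ]

def load(rows: list[dict], target: list) -> int:
    """Append cleaned rows to a target store (the 'L'); here just a list."""
    target.extend(rows)
    return len(rows)

warehouse: list[dict] = []
loaded = load(transform(extract(RAW_CSV)), warehouse)
```

In a production stack each function would typically become a task in an Airflow DAG, with the transform step expressed as a dbt model over the warehouse rather than in Python, but the extract-transform-load staging is the same.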
Airflow, Snowflake, etc. Expertise in designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as dbt, Fivetran, etc. Understanding of Agile delivery best practices. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong …
demonstrate extensive experience having designed and scaled a Data Platform - Has strong Python skills - Has great SQL, preferably Snowflake - Has previous experience working with dbt & Airflow - Is passionate about solving complex data problems & is interested in working with rich & diverse climate datasets - Cares deeply about the climate and ecosystems of …
who has hands-on experience in AWS to conduct end-to-end data analysis and data pipeline build-out using Python, Glue, S3, Airflow, dbt, Redshift, RDS, etc. Very solution-driven and highly collaborative in providing thought leadership and soliciting diverse opinions. Accountable for results. Experienced in leading a team of data engineers …
Finance, Accounting, Economics or a related field, or equivalent work experience (3+ years). Experience in: Some knowledge of database orchestration technologies + ETL (Airflow, dbt, Databricks). Working understanding of financial concepts and systems. Ability to recognize and diagnose potential errors or data inconsistencies between multiple reports. Working knowledge of how …
and non-routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential, and working knowledge of dbt, Airflow and AWS is highly desirable. Strong background in developing, constructing, testing, and maintaining practical data architectures, and in driving improvements in data reliability, efficiency, and quality. A proven …
tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems and pipelines. Evaluate …
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP …
firm grounding in SQL and Python, enabling you to lead & develop our most sophisticated work. Experience of building/developing/managing data using dbt within a data architecture such as Data Vault. Strong interpersonal skills with the ability to work with customers to establish requirements, designs and deliver the …
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, dbt, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: Legal …
Required Skills and Experience: Extensive experience in: Data Warehousing, Data Engineering, overall Data Analytics, Data Visualisation. Proficiency in: Google Cloud (GCP), GCP BigQuery, Python, dbt or similar, FastAPI or similar, Airflow or similar. Desirable: Google Apigee (as an application developer), exposure to Machine Learning projects, exposure to DataOps, exposure to …
Lead Data Engineer - Snowflake/dbt/Python - eCommerce - £95k. Leading eCommerce client is now searching for a Lead Data Engineer to play a pivotal role in delivering their innovative data strategy. This role will shape our client’s data function, delivering end-to-end data requirements and designing data …
thinking strategically and proactively identifying opportunities. You have experience collaborating with senior business stakeholders and finance teams. You have working knowledge of Python, Airflow, dbt, BigQuery, and Looker. Additional desirables: Experience in one or more finance domains, such as Financial Reporting, Treasury, Regulatory Reporting, Financial Planning & Analysis, Financial Risk, and …
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, dbt and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning …
production via machine learning engineering. Hands-on knowledge of NoSQL and relational databases, alongside relevant data modelling techniques. ETL/ELT tooling: knowledge of dbt, Luigi or similar orchestration tooling. Experience of managing and developing a team in an agile environment. Preferred: Software development of a product/service provided as …
modelling. Bonus: Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, Fivetran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering …
greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, dbt. What you need to bring: 1-3 years' experience in building data pipelines. SQL experience in data warehousing. Python experience would be desirable. Experience …
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, dbt, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the link and …
the ability to inspire and mentor engineers, teaching them best practices and fostering a culture of continuous improvement within an agile framework. Experience with dbt and Snowflake or other cloud-based data warehouses. Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams. Detail-oriented …
on their current Centre of Excellence. Main duties include: Managing and developing the Looker platform/LookML. Maintaining and managing the BI platform - Looker, dbt, AWS, Redshift, etc. Ultimately own the long-term BI roadmap. Create a self-serve data culture across the business. Skills & Experience: If this sounds like …
that there should be learning in every role you do. However, some experience in the following is important for this position: Advanced SQL and dbt skills to clean, transform and validate data from data warehouses or data lakes. Experience and knowledge in data warehousing and data modelling best practices. Experience …
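Several of these listings ask for dbt skills to clean, transform and validate warehouse data. In dbt those checks are expressed as SQL models plus YAML-configured tests, but the underlying logic resembles this plain-Python sketch (the table and column names are invented for illustration):

```python
# Hypothetical cleaned warehouse table, as rows of dicts.
ORDERS = [
    {"order_id": 1, "customer_id": "C1", "amount": 10.0},
    {"order_id": 2, "customer_id": "C2", "amount": 15.5},
    {"order_id": 3, "customer_id": None, "amount": 7.25},
]

def check_not_null(rows, column):
    """Return rows failing a not-null test on `column`
    (dbt's built-in not_null test expresses the same check in SQL)."""
    return [r for r in rows if r[column] is None]

def check_unique(rows, column):
    """Return values failing a uniqueness test on `column`
    (the counterpart of dbt's built-in unique test)."""
    seen, dupes = set(), []
    for r in rows:
        value = r[column]
        if value in seen:
            dupes.append(value)
        seen.add(value)
    return dupes

null_failures = check_not_null(ORDERS, "customer_id")
dupe_failures = check_unique(ORDERS, "order_id")
```

A dbt test passes when the failing-rows query returns nothing; here the empty `dupe_failures` list plays that role, while the row with a missing `customer_id` would surface as a test failure.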
use Tableau). Experience in data mining and using databases in a business environment with large-scale, complex datasets. Experience building data warehouse models using dbt (data build tool) or similar solutions, and experience with Python. Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to …
are looking for a dynamic Analytics Engineer to join their Finance Team. Requirements: To qualify for this role, you will require: · Strong experience with dbt and SQL · Experience working within cloud environments (Redshift, BigQuery, Snowflake). Salary: A successful candidate will receive: · A salary of up to £75,000 · Excellent progression …
be delivering key projects for clients across the UK, Northern Europe, and North America. Expertise in Google Cloud and modern data stack technologies (Fivetran, dbt, Airflow, BigQuery, etc.). Lead and sometimes work within project teams. Manage multiple concurrent projects and meet client deadlines. What you need: Have hands-on …