Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must haves: a team player, happy to work with several teams; this is key as you will be reporting directly to the CTO. 2+ …
Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as DBT, FiveTran, etc. Understanding of Agile delivery best practice. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong …
Python/JavaScript/C#. Familiarity with statistical/machine learning/AI concepts and techniques. Understanding of data pipeline/orchestration tools, e.g. dbt, Dataform. Appreciation of GCP’s serverless technologies, e.g. Cloud Run/Workflows. Understanding of Google’s marketing stack: Google Analytics, Google Tag Manager, Google Ads …
M50, Trafford Park, Trafford, Greater Manchester, United Kingdom
Hoist Finance
SQL Server, Oracle, MySQL, PostgreSQL). Knowledge of BI stack design and implementation. Any knowledge in some of the following areas is an advantage: Snowflake, DBT, Azure technologies including Azure Data Factory, Azure Data Lake, Azure DevOps, PowerShell, Git, Python. Excellent communication and interpersonal skills. As a Data Engineer you will …
Employment Type: Permanent
Salary: £40,000 - £45,000/annum + Car Allowance + Bonus
tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems and pipelines. Evaluate …
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, Dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP …
years of demonstrated commercial experience as a Data Engineer or similar role within large-scale environments dealing with large data sets. Expertise in SQL & dbt, and ideally Kafka. Significant Python coding skills. Containerisation experience (Docker, Kubernetes). Cloud computing experience (GCP/AWS/Azure). Strong preference for a Snowflake background …
business requirements spanning a number of systems. At least 10 years of relevant experience. Hands-on, in-depth experience in the following: Snowflake/DBT/Airflow. Background/working experience in the following: Azure, Power BI/DAX, traditional SQL (SQL Server, MySQL, Postgres), JIRA, Confluence (GitHub/BitBucket), Azure …
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, DBT, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: Legal …
hands-on software engineering using a broad range of technologies including the following: Java or Python; microservices; data pipelines and database programming such as DBT, SQL, BigQuery, Cloud Composer etc.; CI/CD/DevOps tooling experience, e.g. Git, Jenkins etc. What you'll get to learn (any previous experience …
working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualization …
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, DBT and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning …
production via machine learning engineering. Hands-on knowledge of NoSQL and relational databases, alongside relevant data modelling techniques. ETL/ELT tooling. Knowledge of DBT, Luigi or similar orchestration tooling. Experience of managing and developing a team in an agile environment. Preferred: software development of a product/service provided as …
modelling. Bonus: Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, FiveTran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering …
smooth operations and reliability. YOUR EXPERIENCE: From a technical perspective, you will require: proficiency in Tableau, SQL and cloud data warehouses. Experience with BigQuery, DBT, or Python/R is beneficial. Your background should include: demonstrated commercial acumen and adaptability in dynamic environments. Proven ability to translate data insights into …
smooth operations and reliability. YOUR EXPERIENCE: From a technical perspective, you will require: proficiency in Tableau, SQL and cloud data warehouses. Experience with BigQuery, DBT, or Python/R is beneficial. Your background should include: experience leading BI capabilities as an individual contributor. Demonstrated commercial acumen and adaptability in dynamic …
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the link and …
the ability to inspire and mentor engineers, teaching them best practices and fostering a culture of continuous improvement within an agile framework. Experience with dbt and Snowflake or other cloud-based data warehouses. Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams. Detail-oriented …
on their current Centre of Excellence. Main duties include: managing and developing the Looker platform/LookML; maintaining and managing the BI platform (Looker, Dbt, AWS, Redshift etc.); ultimately owning the long-term BI roadmap; creating a self-serve data culture across the business. Skills & Experience: If this sounds like …
that there should be learning in every role you do. However, some experience in the following is important for this position: advanced SQL and dbt skills to clean, transform and validate data from data warehouses or data lakes; experience and knowledge in data warehousing and data modelling best practices; experience …
are looking for a dynamic Analytics Engineer to join their Finance Team. Requirements: To qualify for this role, you will require: · strong experience with dbt and SQL · experience working within cloud environments (Redshift, BigQuery, Snowflake). Salary: A successful candidate will receive: · a salary of up to £75,000 · excellent progression …
be delivering key projects for clients across the UK, Northern Europe, and North America. Expertise in Google Cloud and modern data stack technologies (Fivetran, DBT, Airflow, BigQuery, etc.). Lead and sometimes work within project teams. Manage multiple concurrent projects and meet client deadlines. What you need: Have hands-on …
team. Requirements: Experience in building and leading analytics teams as well as maintaining hands-on involvement in day-to-day tasks. Strong SQL & Python, Dbt. Expertise in data warehouses such as Redshift, Snowflake, AWS. If you are interested in this or other Analytics opportunities please contact Liam Wilson on liam.wilson …
Python. SKILLS AND EXPERIENCE: Commercial experience of implementing best coding practices using CI/CD. Strong knowledge of working within Azure. Strong understanding of DBT and Airflow. Experience building and maintaining data pipelines using Python. THE BENEFITS: Flexible working, private healthcare, generous holiday package. HOW TO APPLY: Please register your …