…working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, dbt, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualization…
…with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, dbt and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning…
…production via machine learning engineering. Hands-on knowledge of NoSQL and relational databases, alongside relevant data modelling techniques. ETL/ELT tooling. Knowledge of dbt, Luigi or similar orchestration tooling. Experience of managing and developing a team in an agile environment. Preferred: software development of a product/service provided as…
…modelling. Bonus: experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, Fivetran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering…
…greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, dbt. What you need to bring: 1-3 years' experience in building data pipelines; SQL experience in data warehousing; Python experience would be desirable; experience…
…smooth operations and reliability. YOUR EXPERIENCE From a technical perspective, you will require: proficiency in Tableau, SQL and cloud data warehouses; experience with BigQuery, dbt, or Python/R is beneficial. Your background should include: demonstrated commercial acumen and adaptability in dynamic environments; proven ability to translate data insights into…
…smooth operations and reliability. YOUR EXPERIENCE From a technical perspective, you will require: proficiency in Tableau, SQL and cloud data warehouses; experience with BigQuery, dbt, or Python/R is beneficial. Your background should include: experience leading BI capabilities as an individual contributor; demonstrated commercial acumen and adaptability in dynamic…
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
…stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, dbt, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the link and…
…on their current Centre of Excellence. Main duties include: managing and developing the Looker platform/LookML; maintaining and managing the BI platform (Looker, dbt, AWS, Redshift, etc.); ultimately owning the long-term BI roadmap; creating a self-serve data culture across the business. Skills & Experience: if this sounds like…
…that there should be learning in every role you do. However, some experience in the following is important for this position: advanced SQL and dbt skills to clean, transform and validate data from data warehouses or data lakes; experience and knowledge of data warehousing and data modelling best practices; experience…
…are looking for a dynamic Analytics Engineer to join their Finance Team. Requirements To qualify for this role, you will require: · Strong experience with dbt and SQL · Experience working within cloud environments (Redshift, BigQuery, Snowflake) Salary A successful candidate will receive: · A salary of up to £75,000 · Excellent progression…
…team. Requirements: experience in building and leading analytics teams while maintaining hands-on involvement in day-to-day tasks; strong SQL and Python; dbt; expertise in data warehouses such as Redshift, Snowflake, AWS. If you are interested in this or other Analytics opportunities please contact Liam Wilson on liam.wilson…
…Python. SKILLS AND EXPERIENCE Commercial experience of implementing best coding practices using CI/CD. Strong knowledge of working within Azure. Strong understanding of dbt and Airflow. Experience building and maintaining data pipelines using Python. THE BENEFITS Flexible working. Private healthcare. Generous holiday package. HOW TO APPLY Please register your…
…to support decision-making efforts. Communicate technical information with both technical and non-technical team members and collaborators. Data transformation: employ ETL tools like dbt to transform data, making it more accessible to the broader business. Use time-series graphing services such as Grafana to build visualisations, monitor trends, and…
…to stakeholders at all levels of the organization. Person we are looking for: proficiency in programming languages and tools such as Python, SQL, Snowflake, dbt; ability to contribute to team efforts and to work independently; strong data science understanding with exposure to data engineering/ML; strong understanding of MLOps tools…
…you will need: experience with owning the data visualisation platform within Tableau; experience in data mining, statistical analysis and predictive modelling; experience using SQL with dbt, BigQuery or other cloud data warehouses; to have worked cross-functionally with a wide range of data teams; a commercial outlook and experience working with subscription…
Manchester Area, United Kingdom Hybrid / WFH Options
Oscar
…alongside a team of engineers to create, maintain, and extend the analytics platform. Promote best practice and coding standards. Must-have skills: GCP, Python, dbt, Google Analytics 314. Terraform is a bonus! Apply now: if you are an experienced Data Engineer skilled in GCP, Python and Google Analytics and you…
…aiding Hush to develop, maintain and draw insights from business intelligence solutions consisting of Google BigQuery (data warehouse), Domo (visualisations), SnapLogic (ELT), and dbt (modelling). Reporting into our Head of Technology and playing a critical role in our small data engineering team of 2, you'll have the opportunity…
…stay hands-on to support your team. We're just looking for someone who has come from an AWS-focused background, who has experience with dbt and has led a team before. This is a fantastic opportunity for someone with leadership experience looking to take that next step in their career.
…data modelling. The primary focus of this role will be the development of a Snowflake data warehouse environment and will include using ETL tools, dbt and Power BI for reporting and data visualisations. As the ideal candidate, you will be adept at working with large data sets to develop advanced…
…to stakeholders at all levels of the organization. Person we are looking for: proficiency in programming languages and tools such as Python, R, Snowflake, dbt; experienced in mentoring and developing teams; strong understanding of statistical concepts, MLOps tools, algorithms, and techniques, with practical experience in applying them to real-world…
…products and use your expertise to work collaboratively with the existing Insight team. Expected: help to build new, scalable ELT pipelines using SQL and dbt; collaborate with other analysts and contribute to product development in Python and Plotly Dash; contribute innovative ideas to improve the product and current processes; contribute…
Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Jefferson Frank
…products and use your expertise to work collaboratively with the existing Insight team. Expected: help to build new, scalable ELT pipelines using SQL and dbt; collaborate with other analysts and contribute to product development in Python and Plotly Dash; contribute innovative ideas to improve the product and current processes; contribute…
City of London, London, United Kingdom Hybrid / WFH Options
Jefferson Frank
…products and use your expertise to work collaboratively with the existing Insight team. Expected: help to build new, scalable ELT pipelines using SQL and dbt; collaborate with other analysts and contribute to product development in Python and Plotly Dash; contribute innovative ideas to improve the product and current processes; contribute…
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Jefferson Frank
…products and use your expertise to work collaboratively with the existing Insight team. Expected: help to build new, scalable ELT pipelines using SQL and dbt; collaborate with other analysts and contribute to product development in Python and Plotly Dash; contribute innovative ideas to improve the product and current processes; contribute…