Manchester, Greater Manchester, United Kingdom Hybrid / WFH Options
AutoTrader UK
and scalable web hosting and data platforms. Our platform is a layer on top of core Open Source technologies such as Kubernetes, Istio, Airflow, dbt, running in Public Cloud. It is the glue that allows our teams to deploy into production environments 100s of times per day with the least more »
demonstrate extensive experience having designed and scaled a Data Platform - Has strong Python skills - Has great SQL, preferably Snowflake - Has previous experience working with dbt & Airflow - Is passionate about solving complex data problems & is interested in working with rich & diverse climate datasets - Cares deeply about the climate and ecosystems of more »
firm grounding in SQL and Python, enabling you to lead & develop our most sophisticated work Experience of building/developing/managing data using dbt within a data architecture such as Data Vault Strong interpersonal skills with the ability to work with customers to establish requirements, designs and deliver the more »
to stakeholders at all levels of the organization Person we are looking for: Proficiency in programming languages and tools such as Python, R, Snowflake, dbt Experienced in mentoring & developing teams Strong understanding of statistical concepts, ML Ops tools, algorithms, and techniques, with practical experience in applying them to real-world more »
you’ll build the solution. They work in a mix of SQL and Python, but also operate with modern tech such as Snowflake, Airflow, DBT, Kubernetes/Docker and Cube.js, whilst also using Tableau CRM and Jupyter notebooks for gaining data insights. If you think there’s a better or more »
Join a team at the heart of the global economy! The Department for Business and Trade (DBT) and Inspire People are partnering together to bring you an exciting opportunity for a Technical Architect to oversee the technical architecture, services and their design from inception to completion at portfolio level, working … pension contribution. Flexible, hybrid working from London, Cardiff, Darlington, Edinburgh, Belfast, Birmingham or Salford. You will work within a collaborative, supportive team and as DBT look to grow and deliver more services, they are looking for an experienced technical architect with a strategic mindset to lead the design of services … design of these services through both Alpha and Beta. The primary tech stack is Python/Django, Nodejs/React, running on AWS and DBT optimises the architecture for change. DBT's Digital, Data and Technology team develops and operates tools, services and platforms that enable the UK government to more »
an Analytics Engineer within the Finance team, you'll play a pivotal role in shaping their financial analytics infrastructure. Your responsibilities will include: Utilizing DBT for data modeling and SQL for analysis Managing financial metrics, ensuring control and visibility over large volumes of transactions. Collaborating with various stakeholders to analyze … pricing data. Generating insightful reports to inform strategic decision-making. YOUR EXPERIENCE To qualify for this Analytics Engineer role, you'll require: Proficiency in DBT, SQL, and Redshift. Multiple years of working in this type of role, preferably within the fintech sector. Strong communication skills. If you're passionate about more »
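The control-and-visibility work over large transaction volumes described above usually boils down to aggregation checks that a dbt model or test would express in SQL. As a minimal sketch (all account names and fields here are illustrative, not from the advert), the same control-total logic looks like this in Python:

```python
from collections import defaultdict

# Hypothetical transaction records standing in for rows of a
# Snowflake/Redshift source table (names are illustrative only).
transactions = [
    {"account": "A", "amount": 120.0},
    {"account": "A", "amount": -20.0},
    {"account": "B", "amount": 300.0},
]

def control_totals(rows):
    """Sum amounts per account - the kind of reconciliation a dbt
    test would encode as a comparison against the source table."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["account"]] += row["amount"]
    return dict(totals)

print(control_totals(transactions))  # {'A': 100.0, 'B': 300.0}
```

In a dbt project the equivalent would be a staging model plus a singular test asserting the grouped sums match the raw layer.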
aiding Hush to develop, maintain and draw insights from business intelligence solutions consisting of Google Big Query (Data warehouse), Domo (Visualisations), Snaplogic (ELT), and DBT (Modelling). Reporting into our Head of Technology, playing a critical role in our small Data Engineering team of 2, you’ll have the opportunity more »
Senior Data Engineer. Java/Python/AWS. Investment Management. £135,000 - £145,000 + Discretionary Bonus and Benefits. Cannot sponsor Visa. The Data Office team at my client is playing a key role in helping build the future of more »
you to deliver the more strategic business solutions and projects. As a business they have invested in tools such as Google Analytics 4, BigQuery, dbt and Google Cloud but are open to other toolings you see necessary. Strong skills in SQL is essential as you may be required to more »
Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
Lead Data Engineer (Java/Python/AWS/Glue/Dremio). Investment Management. £160,000 - £170,000 + 25% Bonus Target and Benefits. Hybrid 2 Days a week in St Pauls Office. CANNOT SPONSOR CANNOT SPONSOR My client more »
Senior Data Engineer. Java or Python and AWS. Investment Management. 12 month rolling Fixed Term Contract. £135,000 - £145,000 + 15% Guaranteed Bonus paid monthly + Benefits including health, holidays and 13% pension. CANNOT SPONSOR. The Data more »
s largest clients Develop solutions to parse and process tabular data from PDF and HTML documents Maintain, support and expand existing data pipelines using DBT, Snowflake and S3 Implement standardised data ingress/egress pipelines Onboard new, disparate data sets, sourced from many and varied data vendors, covering all asset more »
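Parsing tabular data out of HTML documents, as this role describes, is usually delegated to a proper parsing library, but the core idea can be sketched with Python's standard-library `html.parser` (the example markup and column names are hypothetical):

```python
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Minimal extractor for rows of the first <table> in an HTML
    document. A sketch only - a production pipeline would also handle
    nested tables, colspans and malformed markup."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._cell = [], None, None
    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._cell = []
    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th") and self._cell is not None:
            self._row.append("".join(self._cell).strip())
            self._cell = None
    def handle_data(self, data):
        if self._cell is not None:
            self._cell.append(data)

html = ("<table><tr><th>isin</th><th>price</th></tr>"
        "<tr><td>X1</td><td>9.5</td></tr></table>")
parser = TableExtractor()
parser.feed(html)
print(parser.rows)  # [['isin', 'price'], ['X1', '9.5']]
```

The extracted rows would then be staged (e.g. to S3) and picked up by the dbt/Snowflake transformations the advert mentions.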
are looking for a dynamic Analytics Engineer to join their Finance Team. Requirements To qualify for this role, you will require: · Strong experience with dbt and SQL · Experience working within Cloud Environments (Redshift, Bigquery, Snowflake) Salary A successful candidate will receive: · A Salary of up to £75,000 · Excellent progression more »
modelling. Bonus: Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, FiveTran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering more »
data pipelines. The successful candidate will play a crucial role in rewriting existing pipelines from SAS, Info, Perl, and shell scripts to Python and DBT-based solutions, ultimately loading data into Snowflake and Salesforce on Azure. Key Responsibilities: Lead the conversion of existing SAS-based modules to Python-based solutions. … Design, develop, and implement Python and DBT-based pipelines for efficient data transformation and loading. Collaborate with cross-functional teams to understand requirements and ensure successful pipeline migration. Utilize expertise in SAS, Perl, and shell scripts to seamlessly transition to Python and DBT technologies. Implement best practices for data management … of successfully converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross more »
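The SAS-to-Python migration pattern this role centres on typically re-expresses a legacy SAS data step as a plain Python transformation whose output a dbt model then picks up from a Snowflake staging table. A hedged sketch (the field names and banding rule are illustrative, not from the advert):

```python
import csv
import io

def derive_risk_band(record):
    """Equivalent of a SAS if/else data step: bucket a balance field
    into a derived column before staging for the dbt layer."""
    balance = float(record["balance"])
    record["risk_band"] = "high" if balance > 10_000 else "standard"
    return record

# Stand-in for a legacy extract; in the real pipeline this would be
# read from the source system and the result loaded to Snowflake.
raw = io.StringIO("id,balance\n1,25000\n2,900\n")
rows = [derive_risk_band(r) for r in csv.DictReader(raw)]
print([r["risk_band"] for r in rows])  # ['high', 'standard']
```

Keeping the derivation in plain Python (or pushing it down into a dbt SQL model) makes the converted logic unit-testable in a way the original SAS and shell scripts were not.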
good grasp of frameworks like DropWizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/Fargate. Collaborative Approach: Proven ability to work effectively more »
non-routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential and working knowledge of dbt, Airflow and AWS is highly desirable. Strong background developing, constructing, testing, and maintaining practical data architectures and drive improvements in data reliability, efficiency, and quality. more »
production via machine learning engineering Hands-on knowledge of NoSQL and relational Databases, alongside relevant data modelling techniques ETL/ELT tooling Knowledge of DBT, Luigi or similar orchestration tooling Experience of managing and developing a team in an agile environment Preferred Software development of a product/service provided as more »
knowledge of ML Ops and machine learning model serving . Writing and maintaining data quality tests. Experience of industry-standard data transformation tools (e.g. DBT). Building data solutions with LLMs and related technologies (e.g. semantic search, RAG). PowerBI Dashboard design and implementation. What should you do next? This more »
to stakeholders at all levels of the organization Person we are looking for: Proficiency in programming languages and tools such as Python, SQL, Snowflake, dbt Ability to contribute to team efforts & work independently Strong Data Science understanding with exposure to Data Engineering/ML Strong understanding of ML Ops tools more »
Required Skills and Experience: Extensive experience in: Data Warehousing Data Engineering, overall Data Analytics Data Visualisation Proficiency in: Google Cloud (GCP) GCP BigQuery Python DBT or similar FastAPI or similar Airflow or similar Desirable: Google Apigee (as an application developer) Exposure to Machine Learning projects Exposure to DataOps Exposure to more »
West London, London, United Kingdom Hybrid / WFH Options
Recruitment Revolution
strategy, solving complex problems and building robust solutions for a dynamic, growing business. Intrigued? Read on. Role info: eCom BI Specialist. Shopify, Big Query, DBT London Based/Remote Working £50,000 - £60,000 Plus Bonus (OTE £75,000+) Product/Service: Performance Marketing Consultancy for new and scaling brands … considered Top 150 independent agencies in Europe by Google. We are premier partners with all key platforms. Your Skills: Deep Supermetrics + GitHub + DBT + Google Big Query + Looker Studio + Shopify About us: We are a multi award-winning online eCommerce boutique agency. Google ranks us in … reporting and visualisation in Looker Studio + Highly proficient in GCP/GBQ, SQL, data modelling and ETL processes + Experience with Supermetrics, GitHub, DBT, Google Big Query and Shopify + Strong communication and presentation skills to effectively convey insights to both technical and non-technical stakeholders + Ability to more »
help them further grow their already exciting business. Within this role, you will be responsible for Maintaining, supporting and expanding existing data pipelines using DBT, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines coupled with onboarding new, disparate data sets, sourced from more »
Experience with building and optimizing data pipelines for large-scale datasets. Solid understanding of data modeling concepts and ETL processes. Experience with DBT (Data Build Tool). Desirable Skills: Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an more »