There are opportunities for professional development, such as training programs, certifications, and career advancement paths.
KEY RESPONSIBILITIES
- Design, develop, and maintain scalable data pipelines (SQL, Azure ADF, Azure Functions, dbt)
- Collaborate with analysts and stakeholders to understand their data needs, scoping and implementing solutions
- Optimise and clean the data warehouse, tidy the existing codebase, and create documentation
- Monitor and troubleshoot data …
working knowledge of SQL; comfortable using Git for version control.
Desirables:
- Exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster)
- Experience with cloud data warehouses (Azure SQL, Snowflake) or dbt
- Basic familiarity with Docker and BI tools (Power BI, Tableau)
- Interest in shipping, financial markets, or commodities
Package: £35,000-£40,000 basic salary + bonus. Excellent career progression opportunities …
Hands-on experience building and integrating with RESTful APIs using FastAPI, Django REST Framework, or similar.
- Data Workflows: Experience designing and maintaining real-time and batch data pipelines, including dbt Core and stream processing tools.
- Infrastructure Know-How: Confident working with Terraform and CI/CD pipelines in a cloud-native environment.
- Database Familiarity: Skilled in both SQL and NoSQL …
- Professional Data Engineer certification
- Exposure to Agentic AI systems or intelligent/autonomous data workflows
- Experience with BI tools such as Looker
- Exposure to Databricks, Snowflake, AWS, Azure or dbt
- Academic background in Computer Science, Mathematics or a related field
This is an opportunity to work in a forward-thinking environment with access to cutting-edge projects, ongoing learning, and …
coaching, leading and management skills, able to upskill a small team and help them transition to more technical work. Strong technical skills in Python, SQL and tools such as dbt, Snowflake, AWS S3, KDB and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs, with deep knowledge in at least one asset class. …
/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong …
a modern, cloud-based BI ecosystem, including:
- Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4
- Data Lake & Storage: Databricks Delta Lake, Amazon S3
- Data Transformation: dbt Cloud
- Data Warehouse: Snowflake
- Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API
- Advanced Analytics: Databricks (AI & Machine Learning)
- Governance & Infrastructure: Centralised Data Catalogue & Access Control (Okta)
- Job Scheduling & Monitoring …
help clients make smarter, data-driven decisions. The Role: As an Analytics Engineer, you’ll design and build scalable, production-ready data pipelines and analytics infrastructure using tools like dbt, SQL, Python, and cloud data warehouses. You’ll work end-to-end, from scoping to delivery, on projects that directly impact clients’ strategic decisions. The role is highly technical and … Key attributes of the suitable Analytics Engineer include: a few years of experience in analytics or data engineering, with strong SQL and Python skills, and hands-on experience with dbt, Airflow, and cloud platforms (AWS, GCP, or Azure). You should be confident designing ELT/ETL pipelines and working across varied technical stacks. Strong communication skills and experience working …
City of London, London, United Kingdom Hybrid / WFH Options
MRK Associates
step/complex analytics is essential for this role. Experience in cloud platforms (GCP BigQuery, Azure Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. (This is not an advanced data science role, so advanced experience in these will not be relevant.) Experience in the …
join their growing data function. They're looking for a Power BI SME who has a real passion for driving analytics and insights. They have a stack of Snowflake, dbt, Python & Power BI. They're looking for someone who has end-to-end analytics experience and a genuine drive for data analytics and the impact it has on business performance. … coding ability using either normalisation, SQL, or Python). Desirable: experience working in data warehouse or lake environments (e.g. Snowflake, Redshift, Databricks) and with ELT and data pipelines (e.g. dbt); familiarity with predictive analytics techniques. Please apply if this sounds like you.
As part of the technical team, this role offers hands-on experience across a range of leading technologies and platforms. Full training will be provided in SQL, Looker, and dbt for reporting, dashboarding, and data pipeline development. You’ll also get exposure to Python, Databricks, and Azure as you grow into the role. There is also future potential to gain … machine learning projects as the team evolves.
Day-to-Day Responsibilities:
- Build and maintain SQL queries and Looker dashboards for reporting and visualisation
- Develop and maintain data pipelines using dbt
- Ingest and process data from various sources across the business
- Collaborate with business stakeholders to understand reporting needs
- Contribute to large-scale European data integration projects
- Support internal projects such … technology
A proactive, entrepreneurial mindset with a desire to learn and solve problems. Excellent communication skills and confidence working with business stakeholders.
Tech Stack You’ll Learn: SQL, Looker, dbt, Python, Databricks, Azure
performance data pipelines and APIs
- Developing ETL processes to support analytics, reporting, and business operations
- Assembling complex datasets from a wide variety of sources using tools like SQL, Python, dbt, and Azure
- Supporting and improving data quality, data infrastructure, and performance
- Defining and documenting cloud-based data architecture and technical solutions
- Collaborating across teams and contributing to architectural decisions
- Troubleshooting …
What we’re looking for:
- Strong experience with Azure cloud technologies, particularly around data services
- Proficient in SQL and experienced with Python for data transformation
- Hands-on experience with dbt, ETL development, and data warehousing best practices
- Comfortable with deploying infrastructure as code and building CI/CD pipelines (e.g. using GitHub, Azure DevOps)
- Ability to manage large, unstructured datasets …
City of London, London, United Kingdom Hybrid / WFH Options
Pulse Recruit
or analytics. Experience leading cross-functional teams and setting product vision. A commercially minded approach, always focused on business impact. Solid understanding of tools like Python, SQL, Power BI, dbt, and Google BigQuery. A desire to work in a collaborative, high-ownership environment. Curiosity about customer behaviour and a drive to uncover insights through data. 📩 Sound like you? Lauren.stuart@pulserecruit.co.uk
area. As Data Analyst, you will be responsible for delivering data-driven insights across a wide range of datasets. Using tools such as Databricks, Tableau, Looker Studio, Amplitude, and dbt, you will extract, transform, and analyse data to support key business functions. You will collaborate closely with stakeholders to understand their data needs, generate reports, and provide actionable insights that …
a Lead Analyst to the team. THE ROLE AND RESPONSIBILITIES
- Analyse and enhance key metrics across customer acquisition, retention and revenue generation
- Develop and maintain advanced data models using dbt
- Ensure KPIs are aligned with overall business objectives, identifying areas for improvement and efficiency
- Ensure data insights are clear and actionable for driving performance improvements
- Use data to provide strategic … Strong technical experience in SQL (specifically window functions), Snowflake, data visualisation and Python/R. Strong background in advanced analytics, predictive modelling and exploratory analysis. Proven expertise in dbt and reporting. Strong stakeholder management skills.
THE BENEFITS: Up to £85,000 + bonuses + equity; hybrid, London.
HOW TO APPLY: If interested in the role please send your CV …
Lead and manage all engineering activities across internal and external teams, ensuring high productivity and quality of execution. AWS Expertise: Strong expertise across AWS services and adjacent tooling, including S3, Glue, Spark, dbt, Terraform, and Redshift. Roadmap Prioritisation: Prioritise and manage engineering activities and personnel to deliver on a roadmap, ensuring alignment with our strategic goals as defined by the Head of Customer … at least 5 years’ experience in data. Proven experience in managing engineering teams in a fast-paced environment. Knowledge of AWS services and tools, including S3, Step Functions, Spark, dbt, Terraform, and Redshift. Strong leadership and communication skills, with the ability to inspire and motivate a diverse team. Communication with stakeholders at all levels is essential. A passion for innovation …
City of London, London, United Kingdom Hybrid / WFH Options
Rec3 Global Ltd
service analytics across the business by building robust, well-documented, and high-quality data models. You’ll be at the forefront of leveraging cutting-edge technology, including GCP BigQuery and dbt, to deliver world-class solutions that empower teams and drive smarter decision-making.
What You’ll Do: Design, develop, and maintain comprehensive business data layers and models that are clean … full potential of our data assets.
What We’re Looking For:
- Proven experience in analytics engineering, data modelling, or a related field
- Strong technical skills with GCP BigQuery and dbt (or similar technologies)
- Ability to translate complex business requirements into scalable data solutions
- Excellent communication and collaboration skills
- A passion for continuous improvement and staying ahead of industry trends
If …