data pipelines. The successful candidate will play a crucial role in rewriting existing pipelines from SAS, Info, Perl, and shell scripts to Python and DBT-based solutions, ultimately loading data into Snowflake and Salesforce on Azure. Key Responsibilities: Lead the conversion of existing SAS-based modules to Python-based solutions. … Design, develop, and implement Python and DBT-based pipelines for efficient data transformation and loading. Collaborate with cross-functional teams to understand requirements and ensure successful pipeline migration. Utilize expertise in SAS, Perl, and shell scripts to seamlessly transition to Python and DBT technologies. Implement best practices for data management … of successfully converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross more »
London, England, United Kingdom Hybrid / WFH Options
RVU
looking for a product analyst to join our rapidly developing London office. You’ll be using tools and databases including BigQuery, Tableau, DBT and Python, to tackle important problems that we’re only just starting to understand, in order to: Measure, quantify and optimise complex marketing funnels and … analytics Strong SQL skills, particularly with Google BigQuery, and experience using analytical and data tooling such as Python, Tableau, Git, dbt etc. Working experience in setting up A/B tests, including analysing results and drawing conclusions. Strong business acumen and experience building complex operational & financial more »
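The A/B testing experience described above usually comes down to comparing conversion rates between two variants. As a minimal sketch (the function name, traffic numbers, and thresholds are illustrative, not from the listing), a two-proportion z-test can be computed with the standard library alone:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    conv_*: number of conversions; n_*: number of visitors per variant.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 5.5% vs 5.0% on 10,000 visitors each.
z, p = ab_test_z(500, 10_000, 550, 10_000)
```

In practice an analyst would also pre-register the sample size and minimum detectable effect before reading the p-value, rather than peeking as data arrives.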
SAS (primary), Perl, shell scripts (secondary) engineers who have experience converting to Python and DBT-based pipelines and loading data to Snowflake. Converting SAS-based modules to Python-based ones. Candidates who have data management experience in Snowflake with expertise in Python, DBT, Airflow or similar technologies. DBT, Snowflake (added advantage more »
you’ll build the solution. They work in a mix of SQL and Python but also operate with modern tech such as Snowflake, Airflow, DBT, Kubernetes/Docker and Cube.js, whilst also using Tableau CRM and Jupyter notebooks for gaining data insights. If you think there’s a better or more »
for a SAS Senior Data Engineer, open to both contract and permanent: SAS, Perl, shell scripts engineers who have experience converting to Python and DBT-based pipelines and loading data to Snowflake. Converting SAS-based modules to Python-based ones. Candidates who have data management experience in Snowflake with expertise in … Python, DBT, Airflow or similar technologies. DBT, Snowflake more »
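To make the SAS-to-Python conversion work concrete: a typical SAS DATA step filters rows and derives columns, and the equivalent logic ports directly to plain Python. The dataset, column names, and business rule below are invented purely for illustration:

```python
# Hypothetical SAS DATA step being rewritten:
#
#   data work.active_customers;
#       set raw.customers;
#       where status = 'ACTIVE';
#       annual_spend = monthly_spend * 12;
#   run;

def to_active_customers(rows):
    """Keep ACTIVE customers and derive annual_spend from monthly_spend."""
    out = []
    for row in rows:
        if row["status"] == "ACTIVE":          # WHERE clause
            derived = dict(row)                 # don't mutate the input row
            derived["annual_spend"] = row["monthly_spend"] * 12  # derived column
            out.append(derived)
    return out

raw = [
    {"id": 1, "status": "ACTIVE", "monthly_spend": 40.0},
    {"id": 2, "status": "CLOSED", "monthly_spend": 15.0},
]
active = to_active_customers(raw)
```

In a real migration the row-level logic would more likely land in a DBT SQL model or a pandas/Snowpark step, but the translation exercise is the same: isolate the DATA step's filter and derivations, then re-express them declaratively.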
We’re a leading online reviews platform, free and open to all. Our mission is to be a universal symbol of trust. We are well on our way — but there’s still an exciting journey ahead of us. Do you more »
will involve: Collaborating with different teams to enhance data infrastructure, analyse business performance, and identify areas for improvement. Using SQL and other tools (Looker, DBT, Python, R, BigQuery) to extract and analyse data for business insights. Supporting the data engineering team in enhancing data infrastructure and platform. Analysing business metrics … of new initiatives and product features. YOUR EXPERIENCE To qualify for this Data Analyst role, you will need: Proficiency in SQL; experience with Looker, DBT, Python, R, BigQuery is advantageous. Bachelor's degree in a STEM subject from a top university. Exposure to working in a fast-paced business environment with stakeholder more »
tools such as Looker is highly advantageous Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake Experience with dbt is highly advantageous Responsibilities Analyze, organize, and prepare raw data for modeling and data analytics Architect and assist in building data systems and pipelines Evaluate more »
help them further grow their already exciting business. Within this role, you will be responsible for Maintaining, supporting and expanding existing data pipelines using DBT, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines coupled with onboarding new, disparate data sets, sourced from more »
Experience with building and optimizing data pipelines for large-scale datasets. Solid understanding of data modeling concepts and ETL processes. Experience with DBT (Data Build Tool). Desirable Skills: Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an more »
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders Experience of using tools including Snowflake, DBT, ADF and Azure Synapse Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information Legal more »
Python. SKILLS AND EXPERIENCE Commercial experience of implementing best coding practices using CI/CD. Strong knowledge of working within Azure. Strong understanding of DBT and Airflow. Experience building and maintaining Data Pipelines using Python. THE BENEFITS Flexible Working Private Healthcare Generous Holiday package. HOW TO APPLY Please register your more »
you to deliver the more strategic business solutions and projects. As a business they have invested in tools such as Google Analytics 4, BigQuery, dbt and Google Cloud but are open to other tooling you see as necessary. Strong skills in SQL are essential as you may be required to more »
London, England, United Kingdom Hybrid / WFH Options
Harnham
will have: Worked in a Software Engineering setting. Strong Python coding skills including testing principles (unit testing, TDD/BDD) Experience working with Airflow, DBT, and Terraform An understanding of cloud architecture RECRUITMENT PROCESS: A Technical Test 1 Hour online chat. HOW TO APPLY: Please register your interest by sending more »
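The TDD/BDD testing principles the role above asks for can be sketched in a few lines: write the test for a small pipeline helper first, then implement it. The helper's name and behaviour here are illustrative, not taken from the job spec:

```python
def dedupe_keep_latest(records, key="id", ts="updated_at"):
    """Keep only the most recent record per key — a common pipeline step."""
    latest = {}
    for rec in records:
        current = latest.get(rec[key])
        if current is None or rec[ts] > current[ts]:
            latest[rec[key]] = rec
    return list(latest.values())

def test_dedupe_keep_latest():
    # Test written first (TDD): duplicate "a" must resolve to the newer value.
    records = [
        {"id": "a", "updated_at": 1, "value": "old"},
        {"id": "a", "updated_at": 2, "value": "new"},
        {"id": "b", "updated_at": 1, "value": "only"},
    ]
    result = {r["id"]: r["value"] for r in dedupe_keep_latest(records)}
    assert result == {"a": "new", "b": "only"}

test_dedupe_keep_latest()
```

Under pytest the `test_` function would be discovered and run automatically; the explicit call at the end just makes the sketch self-contained.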
Greater London, England, United Kingdom Hybrid / WFH Options
Xcede
Senior Data Engineer (Snowflake) London - x2 days in office a month as a rough expectation. Xcede is delighted to be recruiting a Senior Data Engineer on behalf of a market leader in the retail and e-commerce space. Our continues more »
Lead Data Engineer Salary: Up to £50,000 National/Up to £60,000 London Do you like working with the latest technology and are interested in enhancing your tech abilities? We have an exciting opportunity for a highly skilled more »
Database Developer Contract (Inside) - London GCP/DBT/Kimball methodology. We have partnered with one of London's leading asset managers who are looking for a contractor who has used DBT extensively in the last 3 years as a developer to build a warehouse in GCP. EXPERIENCE Must have experience with building more »
Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as DBT, FiveTran, etc. Understanding of Agile Delivery best practice Good knowledge of the relevant technologies e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong more »
Required Skills and Experience: Extensive experience in: Data Warehousing Data Engineering, overall Data Analytics Data Visualisation Proficiency in: Google Cloud (GCP) GCP BigQuery Python DBT or similar FastAPI or similar Airflow or similar Desirable: Google Apigee (as an application developer) Exposure to Machine Learning projects Exposure to DataOps Exposure to more »
you will: Automate and scale datasets aligned with specific use cases, facilitating seamless integration into operational workflows. Design, develop, and maintain data models using dbt, ensuring the efficient transformation of raw data into actionable insights. Build and optimize query pipelines in dbt and SQL to extract, transform, and load data … EXPERIENCE To qualify for this Analytics Engineer role, you will require: Proficiency in Advanced SQL for data querying and manipulation. Experience managing dbt (Data Build Tool) for building and maintaining data transformation pipelines. Ability to build Tableau dashboards for data visualization and reporting. Familiarity with GCP (Google Cloud Platform) for more »
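For readers less familiar with dbt: a dbt model is just a SELECT statement that dbt materialises as a table or view, and staging models like the hypothetical one below typically rename columns and cast types. The model, source, and column names are invented; for illustration the same transformation is mirrored in plain Python:

```python
# Hypothetical dbt staging model this function mirrors:
#
#   -- models/staging/stg_orders.sql
#   select
#       order_id,
#       cast(amount_pence as float) / 100 as amount_gbp,
#       lower(status) as status
#   from {{ source('shop', 'raw_orders') }}

def stg_orders(raw_orders):
    """Mirror of the staging model: rename, cast, and normalise columns."""
    return [
        {
            "order_id": r["ORDER_ID"],
            "amount_gbp": float(r["AMOUNT_PENCE"]) / 100,  # pence -> pounds
            "status": r["STATUS"].lower(),                  # normalise case
        }
        for r in raw_orders
    ]

rows = stg_orders([{"ORDER_ID": 7, "AMOUNT_PENCE": "1250", "STATUS": "SHIPPED"}])
```

In dbt itself the transformation stays in SQL and downstream models reference it with `ref('stg_orders')`, which is what makes the pipeline testable and lineage-aware.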
Description: Senior Data Engineer (SAS) – They want to rewrite existing pipelines (SAS, Info, Perl, shell scripts) to a target state based on Python and DBT transformations, then loaded to SF on Azure. SAS (primary), Perl, shell scripts (secondary) engineers who have experience converting to Python and DBT … and loading data to Snowflake. Converting SAS-based modules to Python-based ones. Candidates who have data management experience in Snowflake with expertise in Python, DBT, Airflow or similar technologies. Must have hands-on experience in DBT & SQL. Snowflake (added advantage more »
how products are performing and tracking metrics. Key responsibilities🔑 Design and implement high-performance, scalable, and reusable data models for their data warehouse using dbt and Snowflake. Design and implement Looker structures to enable users across the organisation to self-serve analytics. Collaborate with data analysts and business teams to … scale data environments. experience in SDLC in analytics including version control, testing & CI/CD. professional experience with SQL and data transformation, ideally with dbt or similar. with at least one of these Cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the Role Roles like these are snapped up more »
Engineer, you'll be instrumental in maintaining their data platform in Snowflake, developing company-wide reports using Tableau, and utilizing cutting-edge tools like DBT, FiveTran, and Stitch. Your day to day will involve: Analyse Category data to gain insights into various aspects such as promotional performance compared to the … overall category. Compile these insights into coherent narratives or stories that highlight trends, patterns, and performance metrics Utilize DBT, FiveTran, and Stitch to streamline data processes. Collaborate with cross-functional teams to understand data needs and deliver actionable insights. Drive data-driven decision-making across the organization. WHAT YOU'LL … order to qualify for this Data Analytics Engineer role, you will require: Proven experience in building and maintaining data platforms. Expertise in Snowflake, Tableau, DBT, FiveTran, and Stitch. Proficiency in Tableau, SQL and/or Python. Strong analytical skills and the ability to translate data into actionable insights. Familiarity with more »
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack - Python, AWS, Airflow and DBT. Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the CTO. 2 + more »