demonstrate extensive experience having designed and scaled a Data Platform
- Has strong Python skills
- Has great SQL, preferably Snowflake
- Has previous experience working with dbt & Airflow
- Is passionate about solving complex data problems & is interested in working with rich & diverse climate datasets
- Cares deeply about the climate and ecosystems of …
Greater London, England, United Kingdom Hybrid / WFH Options
Xcede
Senior Data Engineer (Snowflake) London - x2 days in office a month as a rough expectation. Xcede is delighted to be recruiting a Senior Data Engineer on behalf of a market leader in the retail and e-commerce space. Our client continues …
We’re looking for a product analyst to join our ever-developing London office. You’ll be using tools and databases including BigQuery, Tableau, dbt and Python, to tackle important problems that we’re only just starting to understand, in order to: Measure, quantify and optimise complex marketing funnels and … analytics. Strong SQL skills, particularly in reference to Google BigQuery, with experience in using analytical and data tooling such as Python, Tableau, Git, dbt etc. Working experience in setting up A/B tests, including analysing results and drawing conclusions. Strong business acumen and experience building complex operational & financial …
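A two-proportion z-test is one common way to analyse the kind of A/B test results this role describes. A minimal stdlib-only sketch; the function name and the example numbers are illustrative, not taken from the listing:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    Returns (z, two_sided_p). Assumes samples are large enough
    for the normal approximation to hold.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, computed via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant A converts 500/10,000 (5.0%),
# variant B converts 550/10,000 (5.5%).
z, p = ab_test_z(500, 10000, 550, 10000)
print(f"z={z:.2f}, p={p:.3f}")  # z ≈ -1.59, p ≈ 0.11: not significant at 5%
```

In practice a library such as statsmodels would usually be used instead of hand-rolling the arithmetic, but the pooled-standard-error logic above is what "analysing results and drawing conclusions" typically reduces to for conversion-rate experiments.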
Senior Digital Marketing Analyst 🚀 Start-up Beauty and Self-Care 🚀 Salary: £60,000 - £70,000 Location: Hybrid/London ***Client cannot offer visa sponsorship*** 🌟 Must be a SQL superstar 📍 Join a disruptive challenger beauty brand that is in a phase …
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, dbt, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: Legal …
Database Developer Contract (Inside) - London GCP/dbt/Kimball methodology. We have partnered with one of London's leading asset managers who are looking for a contractor who has used dbt extensively in the last 3 years as a developer to build a warehouse in GCP. EXPERIENCE: Must have experience with building …
Experience with building and optimizing data pipelines for large-scale datasets. Solid understanding of data modeling concepts and ETL processes. Experience with dbt (Data Build Tool). Desirable Skills: Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an …
help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using dbt, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines, coupled with onboarding new, disparate data sets sourced from …
insights. Experience needed:
- Snowflake expertise, demonstrated by performance optimisation, cost management, and utilisation of advanced features, validated by Snowflake certification (e.g., SnowPro Core).
- dbt expertise coupled with advanced Python skills, specifically in data pipeline development and data task automation.
- Mastery of SQL (PostgreSQL, MySQL) for intricate queries and performance …
Product Manager (Product Owner Business Analysis Data Lake Data Mesh Datamesh SQL Architecture Big Data AWS Python Dremio Apache Iceberg Arrow dbt Tableau PowerBI Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund Buy Side) required by my asset management … management) …
for the entire organisation and proper data governance. Utilise and improve our current AWS-based data platform. Work with our tech stack, which includes dbt/DuckDB for transformation, Kafka/RabbitMQ as a streaming platform, Delta Lake as a data format, Dagster for managing data assets, and Terraform, Kubernetes, and …
Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as dbt, Fivetran, etc. Understanding of Agile delivery best practices. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong …
who has hands-on experience in AWS to conduct end-to-end data analysis and data pipeline build-out using Python, Glue, S3, Airflow, dbt, Redshift, RDS, etc. Very solution-driven, and highly collaborative at providing thought leadership and soliciting diverse opinions. Accountable for results. Experienced in leading a team of data engineers …
tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous.
Responsibilities
- Analyze, organize, and prepare raw data for modeling and data analytics
- Architect and assist in building data systems and pipelines
- Evaluate …
Lead Data Engineer - Snowflake/dbt/Python - ecommerce - £95k. Leading eCommerce client are now searching for a Lead Data Engineer to play a pivotal role in delivering their innovative data strategy. This role will shape our client’s data function, delivering end-to-end data requirements and designing data …
thinking strategically and proactively identifying opportunities. You have experience collaborating with senior business stakeholders and finance teams. You have working knowledge of Python, Airflow, dbt, BigQuery, and Looker. Additional desirables: Experience in one or more finance domains, such as Financial Reporting, Treasury, Regulatory Reporting, Financial Planning & Analysis, Financial Risk, and …
how operations are carried out on the front end!
YOUR EXPERIENCE
- Python
- Cloud experience - AWS/GCP/Azure
- CI/CD
- Data modeling experience will be useful
- Airflow & dbt experience will be useful
THE BENEFITS
- An education budget is available to learn and develop with the company
- Matched pension
- Travel budget in place
- Work from anywhere
- Option to visit offices …
Location: Belfast, Birmingham, Cardiff, Darlington, Edinburgh, London, Salford
About the job
Job summary
About us: The Department for Business and Trade (DBT) is the department for economic growth. The Digital, Data and Technology (DDaT) directorate develops and operates tools and services to support businesses to invest, grow … We code in the open (see our code here: ) and our primary tech stack is Python/Django, Node.js/React, running on AWS. DBT optimises the architecture for change. We are aware that we don’t know all the future needs of our users and this needs to be … asked to complete a short, pre-recorded video screening interview (alternatively, provide written answers to questions). These applications will then be sifted by DBT hiring managers. If you are successful, you will be invited to interview. The DBT sift will be from Thursday 6th June; interviews will be from week …
Your new role at NewDay. This position needs someone with energy and passion to complement our existing development team, to contribute to our existing projects as well as work on strategic new projects. In this role you must have a …
Description: Senior Data Engineer (SAS) – They want to rewrite existing pipelines (SAS, Info, Perl, shell scripts) to a target state based on Python and dbt transformations, then loaded to Snowflake on Azure. SAS (primary), Perl and shell script (secondary) engineers who have experience converting to Python, dbt … and loading data to Snowflake. Converting SAS-based modules to Python-based. Candidates who have data management experience in Snowflake with expertise in Python, dbt, Airflow or similar technologies. Must have hands-on experience in dbt & SQL; Snowflake (added advantage …
how products are performing and tracking metrics. Key responsibilities 🔑
- Design and implement high-performance, scalable, and reusable data models for their data warehouse using dbt and Snowflake.
- Design and implement Looker structures to enable users across the organisation to self-serve analytics.
- Collaborate with data analysts and business teams to … scale data environments.
- Experience in SDLC in analytics, including version control, testing & CI/CD.
- Professional experience with SQL and data transformation, ideally with dbt or similar.
- Experience with at least one of these Cloud technologies: AWS, Microsoft Azure, Snowflake, GCP.
Apply to the Role: Roles like these are snapped up …
documented, and verified by our team and stakeholders.
- Serve as the Directly Responsible Individual for major sections of the Enterprise Dimensional Model
- Design, develop, and extend dbt code that meets our internal standards for style, maintainability, and best practices to extend the Enterprise Dimensional Model
- Provide data modeling expertise through code reviews, pairing … meet new business requirements
- 2+ years managing an analytics engineering team using a scrum/agile methodology
- Experience working with commercial data warehouses (Redshift), ETL tools (dbt), data visualization (Python notebook, Thoughtspot, Looker, Tableau, Hex), and Data Dictionary tools (Atlan)
- Demonstrated experience leading 2 or more multi-department analytics projects from inception …