Data Engineer – DBT & GCP Location: London – Hybrid Contract: 6 Months Inside IR35 (potential for additional 6 months) Skills: Data Engineer, DBT, DataBuildTool, GCP, Google Cloud Platform, ETL, ELT, SQL, PowerBI, Data Warehouse, Snowflake We are looking for a skilled Data Engineer to work on a dynamic and innovative … robust solutions to meet business needs. - Optimise data architecture and performance to ensure scalability, reliability, and efficiency. Skills and Qualifications: - Proficiency in DBT (DataBuildTool) for modelling and transforming data in a cloud environment. - Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Pub/Sub. … from you! Don't miss this exciting opportunity to join our team and shape the future of our data ecosystem.
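The core skill this listing asks for, DBT, works by compiling SQL SELECT statements ("models") and materialising them as tables or views inside the warehouse. A minimal sketch of that transform-in-warehouse pattern, using sqlite3 as a stand-in for BigQuery; the table and model names are hypothetical, not from any listing:

```python
import sqlite3

# Stand-in "warehouse" (a real DBT project would target BigQuery/Snowflake).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 10.0, "complete"), (2, 5.5, "cancelled"), (3, 7.5, "complete")],
)

# A dbt model is essentially a SELECT; dbt handles materialising it.
model_sql = """
    SELECT status, COUNT(*) AS n_orders, SUM(amount) AS revenue
    FROM raw_orders
    WHERE status = 'complete'
    GROUP BY status
"""
conn.execute(f"CREATE TABLE fct_completed_orders AS {model_sql}")

print(conn.execute("SELECT * FROM fct_completed_orders").fetchall())
# → [('complete', 2, 17.5)]
```

In a real project the SELECT would live in its own `.sql` file and dbt would manage dependencies, tests and materialisation strategy.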
Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by … led small teams on the delivery of projects AWS Glue Dremio Agile The following is DESIRABLE, not essential: Snowflake Spark, Airflow, Apache Iceberg, Arrow, DBT Trading, Front Office finance Some appreciation of asset classes such as Fixed Income, equities, FX or commodities Role: Python Software Engineer Team Lead (Architecture Programmer …
experience with shell-scripting languages; working knowledge of orchestration tools, e.g. Apache Airflow; experience of ETL/ELT tooling – for example Pentaho, AWS Glue, DBT, Airflow, etc.; Git and experience in building CI/CD pipelines; DBA experience in an AWS cloud environment managing AWS Aurora and Amazon RDS (MySQL, Postgres …
tools such as Fivetran or Stitch Experience using Business Intelligence tools such as PowerBI, Looker or Tableau Experience with data pipeline tools such as DBT, Airflow or Luigi is a plus! Experience using cloud environments, e.g. Azure or AWS Understanding of the Agile delivery method Working Conditions: · Permanent, London Chiswick …
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
Cloud products (Cloud SQL, BigQuery, Redshift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, DBT, Kafka, etc. Awareness of data visualisation tools such as PowerBI, Tableau and/or Looker Knowledge of Teradata, Mainframe and/or Google Analytics is …
London, England, United Kingdom Hybrid / WFH Options
Honu
to meet in the UK or Europe for a 3-day team working session every 8-10 weeks Nice to haves: Airflow, Airbyte, Python, DBT, Postgres, and BigQuery on GCP Knowledge of OAuth/API user credential handling and feed acquisition from sources such as Shopify, QuickBooks, Google Analytics, Facebook …
Python/JavaScript/C# Familiarity with statistical/machine learning/AI concepts and techniques Understanding of data pipeline/orchestration tools e.g. dbt, Dataform Appreciation of GCP’s serverless technologies e.g. Cloud Run/Workflows Understanding of Google’s marketing stack: Google Analytics, Google Tag Manager, Google Ads …
development in new & emerging programming languages and technologies. You'll be conversant in Quality Engineering (QE) and DevOps processes and technologies. As well as DBT and code execution, you'll also be responsible for crafting and maintaining appropriate user documentation. You'll collaborate closely with internal partners across the Group …
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the CTO. 2+ …
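The extract → transform → load flow this listing describes can be sketched in pure Python; in the listed stack, Airflow would schedule these steps and DBT would own the transform. All data and function names here are hypothetical, for illustration only:

```python
import csv
import io
import json

# Hypothetical raw extract (in a real pipeline this might come from S3 or an API).
raw_csv = "user_id,spend\n1,10\n2,abc\n1,5\n"

def extract(text):
    # "E": parse the raw source into rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # "T": drop malformed rows and aggregate spend per user.
    totals = {}
    for row in rows:
        try:
            totals[row["user_id"]] = totals.get(row["user_id"], 0) + float(row["spend"])
        except ValueError:
            continue  # skip rows with non-numeric spend
    return totals

def load(totals):
    # "L": stand-in for a warehouse write.
    return json.dumps(totals, sort_keys=True)

result = load(transform(extract(raw_csv)))
print(result)
# → {"1": 15.0}
```

An orchestrator like Airflow turns each step into a task in a DAG so failures can be retried per step rather than rerunning the whole flow.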
LookML Experience working with numerous AWS services A detailed understanding of CI/CD practices & tooling A research or mathematical background Experience working with dbt & dbt Cloud Experience working with Data Orchestration tooling (e.g. Dagster, Prefect, etc.) Experience working with Data Ingestion tooling (e.g. Fivetran, Keboola, etc.) Experience working with …
professionals. Acting as a leader and mentor to team members, fostering their professional development. Working closely with technology partners such as Google Cloud (GCP), dbt Labs, and Looker. Playing a crucial role in shaping the architecture team, driving innovation and maintaining high standards of performance. Requirements of the Cloud Architect …
London, England, United Kingdom Hybrid / WFH Options
Legal & General
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, DBT, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: The …
on the front end! YOUR EXPERIENCE Python Cloud experience - AWS/GCP/Azure CI/CD Data modelling experience will be useful Airflow & DBT experience will be useful THE BENEFITS An education budget is available to learn and develop with the company Matched pension Travel budget in place Work …
A world-leading hedge fund seeks a Senior Data Engineer with exceptional Python, SQL, and DBT skills for its Algorithmic Trading team. We seek an exceptional Financial Markets Data Engineer with front-office trading experience and experience building and managing large (TB+) quantitative research data pipelines. Ideally, you will …
value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call.
experience working with relational databases. Programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and tools. Understanding of DataOps, data mining, and data visualisation …
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, DBT and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning …
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
leading a Data Engineering team, ensuring best practices, quality in data transformation and modelling. Vast knowledge of tech such as: SQL, Kafka, Dataform, Airflow, DBT, Tableau, PowerBI, Redshift, Snowflake and BigQuery. This position does not offer Visa Sponsorship; please refrain from applying if you require sponsorship at any stage. If …
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP …
value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com, and let …
knowledge of ML Ops and machine learning model serving. Writing and maintaining data quality tests. Experience of industry-standard data transformation tools (e.g. DBT). Building data solutions with LLMs and related technologies (e.g. semantic search, RAG). PowerBI dashboard design and implementation. What should you do next? This …