Data Engineer – DBT & GCP
Location: London – Hybrid
Contract: 6 Months Inside IR35 (potential for an additional 6 months)
Skills: Data Engineer, DBT, DataBuildTool, GCP, Google Cloud Platform, ETL, ELT, SQL, PowerBI, Data Warehouse, Snowflake
We are looking for a skilled Data Engineer to work on a dynamic and innovative … robust solutions to meet business needs.
- Optimise data architecture and performance to ensure scalability, reliability, and efficiency.
Skills and Qualifications:
- Proficiency in DBT (DataBuildTool) for modelling and transforming data in a cloud environment.
- Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Pub/Sub.
… from you! Don't miss this exciting opportunity to join our team and shape the future of our data ecosystem.
Software Engineer (Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by … led small teams on the delivery of projects. AWS Glue, Dremio, Agile. The following is DESIRABLE, not essential: Snowflake; Spark, Airflow, Apache Iceberg, Arrow, DBT; Trading, Front Office finance; some appreciation of asset classes such as Fixed Income, Equities, FX or Commodities. Role: Python Software Engineer Team Lead (Architecture Programmer …
Engineering, Management Information Systems, Mathematics, a related field, or equivalent work experience (3+ years). Experience in: database orchestration technologies, specifically Airflow and/or DBT. Experience with streaming data architectures, specifically Kafka. Knowledge of semi-structured data: Parquet, Avro, JSON. A deep understanding of AWS cloud data technologies (RDS, DynamoDB, Aurora …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
Delta Lake, and other enterprise-scale data stores. Data Orchestration - enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools such as Octopus, Terraform infrastructure as code, and other DevOps practices.
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
Cloud products (Cloud SQL, BigQuery, Redshift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, DBT, Kafka, etc. Awareness of data visualisation tools such as PowerBI, Tableau and/or Looker. Knowledge of Teradata, Mainframe and/or Google Analytics is …
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must-haves: a team player, happy to work with several teams; this is key, as you will be reporting directly to the CTO. 2+ …
professionals. Acting as a leader and mentor to team members, fostering their professional development. Working closely with technology partners such as Google Cloud (GCP), dbt Labs, and Looker. Playing a crucial role in shaping the architecture team, driving innovation and maintaining high standards of performance. Requirements of the Cloud Architect more »
London, England, United Kingdom Hybrid / WFH Options
Legal & General
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, DBT, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: The …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Legal & General
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, DBT, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: The …
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, DBT and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Starling Bank
and use data to propose solutions for effective decision-making. Translate data requirements from across the organisation into robust and reusable data models via dbt. Develop data models in Looker, ensuring consistent and clear documentation. Perform data analysis to create models or test ideas. Collaborate with the wider data team …
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry standards like CRISP …
analysis. Communication Skills: Excellent English communication and storytelling abilities. Technical Skills: Proficient in SQL with hands-on experience. Familiarity with tools such as GitHub, dbt, Python, and modern BI platforms (e.g., Apache Superset, Looker, Tableau) is highly desirable. Educational Background: Bachelor's or Master's degree in Business Analytics, Statistics …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
code. Implement TMS (Tealium iQ, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data pipelines and modelling using SQL, DBT, Airflow, ETL, data warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development …
DBT Developer Contract. £600-£700 per day. London, working in the office 3 days per week and 2 from home. You'll be working for an Asset Manager as they build a new database for their investment portfolios, performance, market trends, asset allocation and risk data. Once the database is … you'll work on the BI visualisation phase. Key skills required: financial services experience - Asset Management is highly preferred; 5+ years of DBT (DataBuildTool) development; strong skills in cloud data warehouse building - Google BigQuery and Snowflake; SQL database or relational database skills; OLAP cubes. Please apply immediately for …
Greater London, England, United Kingdom Hybrid / WFH Options
Saragossa
members, and you’ll be given ownership of the direction of data engineering across the business. They’re currently using Python (Pandas), PostgreSQL and DBT on AWS as the core technologies within data engineering, but you’ll be able to come in and make a decision as to whether or …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Legal & General
positions are to focus on the retirements side of the Retail division and will build out new data pipelines utilising tools such as Synapse, DBT, Azure DevOps and Snowflake. This role will see you responsible for designing, building, and implementing a variety of data solutions using modern ETL techniques and …
accurate, up-to-date data. We have a modern data stack already in place, comprising a Snowplow data pipeline, a Snowflake data warehouse, dbt as the data transformation tool, and our BI tool, Looker. We don’t expect you to be fluent with all these technologies, but we do …