help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using dbt, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines, coupled with onboarding new, disparate data sets sourced from …
variety of databases. Working knowledge of one or more of the cloud platforms (AWS, Azure, GCP). Experience building ETL/ELT pipelines, specifically using dbt, for structured and semi-structured datasets. Any orchestration tooling such as Airflow, Dagster, Azure Data Factory, Fivetran, etc. It would also be nice to have: software …
Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning. A background in …
languages commonly used for data work (e.g., Python, Java, Scala). Deep understanding of ETL/ELT tools and workflow orchestration platforms (e.g., Airflow, Fivetran, dbt). Proficiency with SQL and a solid grounding in data modeling concepts. Familiarity with cloud services and architectures (AWS, GCP, or Azure). Proven experience managing or mentoring …
data best practices across teams. Champion data quality, governance, and documentation. Key Requirements: strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt); solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake); familiarity with streaming technologies (Kafka, Kinesis, etc.); passion for …
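Several of these listings pair Airflow with dbt for scheduled ELT. As a minimal illustrative sketch of what that day-to-day work looks like (the DAG id, schedule and project path below are hypothetical, not taken from any listing):

```python
# Minimal illustrative Airflow DAG: a daily job that stages raw data and
# then runs dbt transformations. Assumes Airflow 2.x; all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt_example",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Placeholder for the extract/load step, e.g. copying files from S3
    # into the warehouse via whatever loader the team uses.
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="echo 'load raw data here'",
    )

    # Run the dbt project against the warehouse; the path is a placeholder.
    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    load_raw >> run_dbt
```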
the Data space. This role will also allow the successful individual to cross-train into modern Data Engineering tools and technologies such as Airflow, dbt and Snowflake, as well as further develop their skills in Python, SQL and Market Data platforms. The firm works on a hybrid working schedule (three days per week …
a leader within the business over time. The tech stack is varied, currently made up of TypeScript, Python, PostgreSQL, Redis and dbt on AWS. You’ll be encouraged to take ownership of products, and you’ll be given this autonomy from the co-founders. The products handle …
timelines and quality standards are met. Required Skills & Experience: 5+ years' experience as a Data Analyst. Strong skills in Python, SQL and tools like dbt, Snowflake, AWS S3 and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs, with deep knowledge in at …
essential for this role. Experience in cloud platforms (GCP – BigQuery, Azure – Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub and GCP/AWS would be advantageous. (This is not an advanced technical data science role, so advanced experience in these will not be …
tech stack includes: Infrastructure: Google Cloud. Backend: Python for the internal API codebase, running on a Postgres database hosted with Google Cloud SQL. Data: BigQuery, dbt, and Tableau. Frontend: TypeScript and React. Mobile/Apps: Flutter. What’s on Offer: generous time off, equity package, comprehensive health coverage, pension contributions, hybrid …
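For a concrete sense of the BigQuery side of a stack like this, here is a hedged sketch of a query from Python using the official client library (the project, dataset and table names are hypothetical; assumes google-cloud-bigquery is installed and GCP credentials are available in the environment):

```python
# Illustrative sketch: querying BigQuery with the official Python client.
# Assumes `pip install google-cloud-bigquery` and ambient GCP credentials;
# the project/dataset/table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment

query = """
    SELECT user_id, COUNT(*) AS orders
    FROM `example_project.analytics.orders`  -- hypothetical table
    GROUP BY user_id
    ORDER BY orders DESC
    LIMIT 10
"""

# client.query() submits the job; .result() blocks until rows are ready.
for row in client.query(query).result():
    print(row.user_id, row.orders)
```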
. AWS-based capacity management, performance logging and monitoring solutions. AWS cost management. Your Profile: Essential skills/knowledge/experience: Experience in Snowflake, dbt, Glue, Airflow. Proficiency in architecture and building solutions to support highly available and scalable websites, batch processes, services and APIs. Understanding of IAM …
Data Engineering teams, with the ability to translate technical and business needs. Knowledge of Snowflake and AWS platforms. Familiarity with ETL tools such as dbt and Matillion. Understanding of SQL and Python, enough to contribute to technical discussions and guide decisions. Excellent communication, planning, and stakeholder management skills. If this …
in Python for data engineering tasks. Solid understanding of ELT pipelines, data architecture, and data modeling. Experience with modern data transformation tools such as dbt. Familiarity with AWS-based data pipeline services and cloud architecture. Comfortable working in agile teams and cross-functional environments. Knowledge of statistical concepts and …
e.g., MongoDB). Experience with data workflow orchestration tools (e.g., Dagster). Hands-on experience with ETL/ELT frameworks and transformation tools (e.g., dbt). Proficiency in programming languages such as Python. Experience with cloud platforms such as Azure. Strong understanding of data modelling, data warehousing, and data architecture …
test and scale innovations that drive measurable value. The Data Engineer will be responsible for: building and managing ELT pipelines and analytics infrastructure (BigQuery, dbt, Fivetran, Sigma, etc.); supporting web analytics, conversion funnels, and traffic segmentation; collaborating with business stakeholders to model meaningful metrics and enable self-service reporting; working …
with both technical and business teams to elicit requirements and understand the business. Your Profile: Essential skills/knowledge/experience: Experience in Snowflake, dbt, Glue, Airflow. Experience with AWS services. Good communication and stakeholder management skills. Ability to communicate with both business and technical colleagues at all levels. Knowledge …
environment. Drive continuous improvements to architecture, infrastructure, and workflow automation. Core Tech Stack: Must-have: Google Cloud Platform (GCP), Apache Airflow. Nice-to-have: dbt, Terraform, Kubernetes. Bonus: familiarity with or curiosity about generative AI tools (e.g. ChatGPT). Ideal Candidate: 4+ years’ experience in a Data/Platform engineering role. Proven …
requirement for a Principal Data Engineer to spearhead an exciting new project for them, ideally with good knowledge of: data warehousing (Snowflake); data pipelines (dbt, Kafka); programming (Python); relational databases (SQL, PostgreSQL); data visualisation (Looker, LookML); DevOps (Docker, K8s, Terraform). But most importantly, they are looking for individuals with an …
Objectives: develop and design reports and dashboards in Tableau; update existing dashboards as part of a Data Warehouse migration; data cleansing, preparation and modelling using dbt or AWS Athena; requirement gathering and translation into business needs. Cloud infrastructure knowledge will be a big advantage for this role. Experience of managing IT …
platform standards for smooth deployments, ensuring platform stability and scalability through upgrades, and managing the data analytics platform. Responsibilities: Utilize modern data platform tools (dbt Core, Airflow, Airbyte, Snowflake, etc.). Collaborate with DevOps, Data Engineering, Infrastructure, and InfoSec for seamless application integration. Design, implement, and maintain scalable cloud data …
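A platform role like this one lives close to the warehouse itself, so as a hedged illustration (connection parameters are placeholders read from the environment; assumes the snowflake-connector-python package), a minimal Snowflake connectivity check might look like:

```python
# Illustrative sketch: a simple Snowflake health check using the official
# Python connector (`pip install snowflake-connector-python`).
# Account, user and warehouse values below are placeholders, not real credentials.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],  # e.g. "xy12345.eu-west-1" (placeholder)
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",                 # hypothetical warehouse name
)

try:
    cur = conn.cursor()
    # Confirm the session is live and report the server version.
    cur.execute("SELECT CURRENT_VERSION()")
    print("Snowflake version:", cur.fetchone()[0])
finally:
    conn.close()
```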
Monitoring tools (Grafana, Prometheus, etc.). Mentoring and knowledge sharing within the team. Senior Engineer – Desirable Skills: experience in energy supply or trading; familiarity with dbt or modular analytics tooling; exposure to forecasting or optimisation workflows; knowledge of React or frontend tools for internal apps. What they offer: a high-autonomy …
also building reports in Tableau and enabling data-driven decision-making for optimised performance. Utilising your modern stack experience, you will model data using dbt, ensuring it’s set up for scalability and reporting needs. You will also advise on and recommend ways to improve data architecture and processes, increasing reliability, data …