help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using dbt, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines and onboarding new, disparate data sets sourced from …
source and public cloud technologies. Strong experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams …
engineering experience. Proficiency in Python and Java 11+. Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. Hands-on experience with AWS. Ability to work effectively with both business and technical stakeholders, owning end-to-end solution delivery. …
for the entire organisation and proper data governance. Utilise and improve our current AWS-based data platform. Work with our tech stack, which includes dbt/DuckDB for transformation, Kafka/RabbitMQ as a streaming platform, Delta Lake as a data format, Dagster for managing data assets, and Terraform, Kubernetes, and …
good grasp of frameworks like Dropwizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/Fargate. Collaborative Approach: Proven ability to work effectively …
experience: Ability to operate in a fast-changing environment. Fluent in English. Previous cloud-based infrastructure experience, particularly with AWS. Experience using Airflow and dbt. Expert SQL knowledge. Solid understanding of Dimensional Data Modelling. Experience with one or more of these programming languages: Python, Scala/Java. Experience …
AWS infrastructure, including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, RDS, Athena, Kinesis and QuickSight. We also make wide use of other tech such as Snowflake, dbt, Databricks, Informatica, Matillion, Airflow, Tableau, Power BI, etc. The Lead Data Architect will liaise with clients to define requirements, refine solutions and ultimately hand over …
s largest clients. Develop solutions to parse and process tabular data from PDF and HTML documents. Maintain, support and expand existing data pipelines using dbt, Snowflake and S3. Implement standardised data ingress/egress pipelines. Onboard new, disparate data sets, sourced from many and varied data vendors, covering all asset …
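As a flavour of the table-extraction work this listing describes, a minimal sketch of pulling tabular data out of an HTML document using only the Python standard library (the class and function names here are hypothetical, and real vendor documents would need far more robust handling):

```python
from html.parser import HTMLParser


class TableParser(HTMLParser):
    """Collect cell text from <table> rows in an HTML document."""

    def __init__(self):
        super().__init__()
        self.rows = []       # completed rows of cell strings
        self._row = None     # row currently being built
        self._cell = None    # text fragments of the current cell

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._cell = []

    def handle_data(self, data):
        if self._cell is not None:
            self._cell.append(data)

    def handle_endtag(self, tag):
        if tag in ("td", "th") and self._row is not None:
            self._row.append("".join(self._cell).strip())
            self._cell = None
        elif tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None


def parse_html_table(html: str) -> list[list[str]]:
    """Return table rows as lists of cell strings."""
    parser = TableParser()
    parser.feed(html)
    return parser.rows


html = """<table>
<tr><th>isin</th><th>price</th></tr>
<tr><td>GB00B03MLX29</td><td>24.7</td></tr>
</table>"""
print(parse_html_table(html))
# → [['isin', 'price'], ['GB00B03MLX29', '24.7']]
```

In practice a production pipeline would more likely use a dedicated library (e.g. pandas or an HTML/PDF extraction tool), but the event-driven parse above shows the shape of the problem.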
professionals. Acting as a leader and mentor to team members, fostering their professional development. Working closely with technology partners such as Google Cloud (GCP), dbt Labs, and Looker. Playing a crucial role in shaping the architecture team, driving innovation and maintaining high standards of performance. Requirements of the Cloud Architect …
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, dbt, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: Legal …
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and dbt. Must-haves: a team player, happy to work with several teams; this is key as you will be reporting directly to the CTO. 2+ …
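For readers unfamiliar with the term, the extract-transform-load pattern this listing asks for can be sketched in plain Python (function and field names here are hypothetical; in the listed stack, Airflow would schedule such steps and dbt would express the transforms in SQL):

```python
import csv
import io
import json


def extract(raw_csv: str) -> list[dict]:
    """Extract: read raw records from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(records: list[dict]) -> list[dict]:
    """Transform: cast types and drop malformed rows."""
    out = []
    for r in records:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail type casting
    return out


def load(records: list[dict]) -> str:
    """Load: serialise for the destination (here, JSON lines)."""
    return "\n".join(json.dumps(r) for r in records)


raw = "id,amount\n1,9.50\n2,not-a-number\n3,4.25"
print(load(transform(extract(raw))))
# → {"id": 1, "amount": 9.5}
#   {"id": 3, "amount": 4.25}
```

This is only the shape of the pattern; a real pipeline would read from and write to external systems (S3, a warehouse) rather than in-memory strings.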
Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as dbt, Fivetran, etc. Understanding of Agile delivery best practice. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong …
Previous experience in a Data Engineering role. Strong SQL and Python development skills. Hands-on experience with cloud-based data warehousing technologies (e.g. Snowflake, dbt, Fivetran, Airflow). Effective communication skills for both technical and non-technical audiences. Analytical mindset with attention to detail. High energy, enthusiasm, and passion for learning …
Python/JavaScript/C#. Familiarity with statistical/machine learning/AI concepts and techniques. Understanding of data pipeline/orchestration tools, e.g. dbt, Dataform. Appreciation of GCP’s serverless technologies, e.g. Cloud Run/Workflows. Understanding of Google’s marketing stack: Google Analytics, Google Tag Manager, Google Ads …
Finance, Accounting, Economics or a related field, or equivalent work experience (3+ years). Experience in: some knowledge of database orchestration and ETL technologies (Airflow, dbt, Databricks). Working understanding of financial concepts and systems. Ability to recognize and diagnose potential errors or data inconsistencies between multiple reports. Working knowledge of how …
tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems and pipelines. Evaluate …
Google Cloud Platform (GCP) - Background using Airflow/Cloud Composer with Python - Cloud-based data platforms, Snowflake or BigQuery - Advanced SQL - Data transformation tools, dbt - CI/CD - TDD. If you're open to exploring this opportunity and believe your skills align with what we're looking for, I'd …