similar (for real-time data streaming). Experience using data tools in at least one cloud service - AWS, Azure, or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory, Databricks, BigQuery, Dataflow, Dataproc). Would you like to join us as we work hard, have fun and make history?
such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language, such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical …
with big data technologies (e.g., Hadoop, Spark, Kafka). Familiarity with AWS and its data services (e.g. S3, Athena, AWS Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Familiarity with data orchestration tools (e.g. Prefect, Apache Airflow). Familiarity with CI/CD pipelines …
such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in at least one programming language such as Python, Java, or Scala. Strong analytical and problem-solving skills, with the ability to work independently and …
commercially viable, and aligned with client expectations. Enterprise Solution Design: Architect and lead the delivery of large-scale data platforms (including lakes, lakehouses, and warehouses) using GCP, Cloud Storage, BigQuery, Databricks, and Snowflake. Cloud Data Strategy: Own cloud migration and modernisation strategy, leveraging GCP and tools such as Terraform, Azure DevOps, GitHub, and CI/CD pipelines. Data Modelling: Apply …
similar scripting languages for data science. Experience with data processes and building ETL pipelines. Experience with cloud data warehouses such as Snowflake, Azure Data Warehouse, Amazon Redshift, or Google BigQuery. Proficiency in creating visualizations using Power BI or Tableau. Experience designing ETL/ELT solutions with tools like SSIS, Alteryx, AWS Glue, Databricks, or IBM DataStage. Strong analytical and technical …
experience with cloud platforms (GCP, AWS, Azure) and their data-specific services. Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt). Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.). Strong understanding of data modeling, data governance, and data quality principles. Excellent communication skills, with the ability to translate complex technical concepts for business stakeholders. Strategic …
skills (NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms. Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift). Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines. Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices. Familiarity with …
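Requirements like the above (dbt, Airflow, cloud warehouses) centre on one pattern: maintainable ELT, where raw data is loaded first and transformed inside the warehouse. A minimal stdlib-only sketch of that pattern, using SQLite as a stand-in warehouse; all table names and the sample records are hypothetical:

```python
import sqlite3

# Stand-in "warehouse": an in-memory SQLite database.
conn = sqlite3.connect(":memory:")

# Extract + Load: land raw records untouched (the "EL" of ELT).
raw_orders = [
    ("o1", "alice", 120.0),
    ("o2", "bob", 80.0),
    ("o3", "alice", 50.0),
]
conn.execute("CREATE TABLE raw_orders (order_id TEXT, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

# Transform: a dbt-style model is essentially a SELECT materialised
# as a table or view on top of the raw layer.
conn.execute("""
    CREATE TABLE customer_revenue AS
    SELECT customer, SUM(amount) AS total_revenue, COUNT(*) AS order_count
    FROM raw_orders
    GROUP BY customer
""")

for row in conn.execute("SELECT * FROM customer_revenue ORDER BY total_revenue DESC"):
    print(row)
```

Keeping the raw layer immutable and pushing all business logic into versioned SQL transforms is what makes pipelines like this auditable and easy to rebuild.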
and non-technical stakeholders alike. The Head of Data Engineering & Insight will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4. Data Lake & Storage: Databricks Delta Lake, Amazon S3. Data Transformation: dbt Cloud. Data Warehouse: Snowflake. Analytics & Reporting: Power BI, Excel, Snowflake SQL, REST API. Advanced Analytics …
Strong SQL Server skills, including query optimisation and performance tuning. Familiarity with scheduling and orchestration tools (e.g. Control-M). Hands-on experience with Google Cloud tools such as BigQuery, Composer, and Vertex AI. Proficient in Python for data manipulation and modelling. Strong skills in data visualisation using Tableau. Desirable: background in insurance or financial services, with exposure to industry …
years' experience in a similar role. Ability to lead and mentor architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure; big data; Apache Spark/Beam on BigQuery/Redshift/Synapse; Pub/Sub/Kinesis/MQ/Event Hubs; Kafka; Dataflow/Airflow/ADF. Desirable skills: designing Databricks-based solutions for Azure/AWS, Jenkins …
statistics (hypothesis testing, regression, significance, p-value pitfalls). Ability to translate data into plain-English insights and present to C-level audiences. Experience working in cloud data warehouses (Snowflake, BigQuery, Redshift) and version control (Git). What we can offer: bonus; hybrid working, meaning you'll be in our Farringdon office Tuesdays to Thursdays; 25 days annual leave, plus the …
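The statistics requirement above (hypothesis testing, significance, p-value pitfalls) can be made concrete with a two-proportion z-test, the workhorse of A/B conversion analysis. A stdlib-only sketch; the conversion counts are made up for illustration, and the normal approximation assumes reasonably large samples:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value).

    Uses the pooled-proportion standard error and the standard
    normal CDF, so it assumes large-ish sample sizes.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 200/2000 conversions (control) vs 250/2000 (variant).
z, p = two_proportion_z_test(200, 2000, 250, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

One of the "p-value pitfalls" the listing alludes to: a small p-value says the difference is unlikely under the null, not that the effect is large or commercially meaningful, and running many such tests inflates the false-positive rate unless you correct for multiple comparisons.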
either leverage third-party tools such as Fivetran, Airbyte, or Stitch, or build custom pipelines. We use the main data warehouses for dbt modelling and have extensive experience with Redshift, BigQuery, and Snowflake. Recently we've been rolling out a serverless implementation of dbt and progressing work on an internal product to build modular data platforms. When initially working with clients …
of at least one database technology (relational, columnar, or NoSQL) and familiarity with others (e.g., MySQL, Oracle, MSSQL, Vertica, MongoDB). Knowledge of cloud data warehouses like Snowflake, Databricks, and BigQuery. Proven ability to manage technical project delivery, including scoping, planning, and risk assessment. Strong organisational skills to oversee multiple concurrent customer projects. Adaptability to shifting customer requirements and priorities …
Engineering/Data Engineering/BI Engineering experience. Understanding of data warehousing, data modelling concepts, and structuring new data tables. Knowledge of cloud-based MPP data warehousing (e.g. Snowflake, BigQuery, Redshift). Nice to have: experience developing in a BI tool (Looker or similar); good practical understanding of version control; SQL ETL/ELT knowledge; experience with DAGs to manage …
1st Formations - City of London, London, United Kingdom (Hybrid / WFH options)
data analytics, analytics engineering, or business intelligence, preferably in a technology or SaaS environment. Expertise in SQL and data modelling, with hands-on experience working with cloud data warehouses (BigQuery a strong plus). Proficient with BI tools (Looker Studio, Superset, or similar), able to build robust dashboards and compelling visualizations. Practical experience automating data workflows (e.g., with Python …
Streams)? Which statement best describes your hands-on responsibility for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)? What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines (Docker, GitHub …
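The Airflow question above hinges on one idea: a workflow is a directed acyclic graph of tasks, executed so that every dependency finishes before its dependents start. A framework-free sketch of that scheduling idea using Python's standard library; the task names and dependencies are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL dependencies: each task maps to the set of tasks
# it depends on, mirroring Airflow-style `extract >> transform >> load` chaining.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform"},
    "refresh_dashboard": {"load_warehouse"},
}

# static_order() yields tasks in an order where every dependency
# appears before the tasks that need it; cycles raise CycleError.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Schedulers like Airflow add retries, backfills, and parallel execution of independent branches (both extracts above could run concurrently), but the dependency-ordering core is exactly this topological sort.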
expertise to grow top-line revenues and guide commercial initiatives from our data. You'll own the analysis of the end-to-end customer journey, using our data stack (BigQuery, dbt, Hex) to create data models, data products, and metrics, and to find insights that fuel our growth. You'll work closely with other engineers, marketers, product teams, and commercial teams …
used backend programming languages (Python, Node.js, Java, PHP, Go, C#, C++). Familiar with version control tools and proper branching techniques (GitLab preferred). Experience working with data warehouses (Google Cloud BigQuery), data governance, payments, and treasury or capital markets systems. So, what's in it for you? Our people are constantly striving to be the best through operational excellence. The …
management practices, system development life cycle management, IT services management, agile and lean methodologies, infrastructure and operations, and EA and ITIL frameworks. Proficiency with data warehousing solutions (e.g., Google BigQuery, Snowflake). Expertise in data modeling tools and techniques (e.g., SAP PowerDesigner, Sparx EA). Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud …