modeling experience to address scale and read/write performance. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc. Working knowledge of data platform related services on at least one cloud platform, covering IAM and data security. Hands-on skills building DevOps pipelines for data solutions More ❯
years’ experience in a similar role. Ability to lead and mentor architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure; Big Data; Apache Spark, Beam on BigQuery/Redshift/Synapse; Pub/Sub/Kinesis/MQ/Event Hubs; Kafka; Dataflow/Airflow/ADF; designing Databricks-based solutions for Azure/AWS; Jenkins, Terraform, Stackdriver More ❯
need a little data support. You’ll be an internal data consultant, with loads of amazing data in our data warehouse at your fingertips, working with analytical tools including BigQuery and Looker. Commercial/Marketing: As a B2B company, understanding our customers’ journey from prospect to paid customer is key. In this role, you get to join a growing More ❯
Engineering/BI Engineering experience. Excellent SQL skills. Understanding of data warehousing, data modelling concepts and structuring new data tables. Knowledge of cloud-based MPP data warehousing (e.g. Snowflake, BigQuery, Redshift). Nice to have: experience developing in a BI tool (Looker or similar); good practical understanding of version control; SQL ETL/ELT knowledge, experience with DAGs to manage More ❯
record turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but Redshift/BigQuery/ClickHouse etc. also welcome). Experience with cloud platforms (AWS, GCP, Azure) and MLOps practices. Familiarity with data visualization tools (e.g., Tableau, Power BI). COMPENSATION & BENEFITS: Competitive salary + bonus. Comprehensive benefits package including More ❯
quality, lineage, and governance through testing and monitoring. Work with cost observability tools to ensure efficient platform usage. Essential: proven experience working with cloud-based data platforms (Snowflake, Redshift, BigQuery, etc.); proficiency in Python and SQL for automation and analytics; experience with CI/CD, preferably in dbt and/or data platforms; hands-on with IaC tools like More ❯
proficiency in SQL/DDLs, databases, data lakes, and query engines. Some knowledge of modern data engineering tools and practices such as Airflow, Redshift, Hive, Trino, Spark, Glue, Kubernetes, BigQuery, and Kafka would be an advantage. We love hearing from anyone inspired to build a better future with us. If you're excited about the role or working at More ❯
secure, scalable, and reliable. You’ll be the ideal candidate for this position if you have: Advanced SQL skills with strong experience using relational databases. Hands-on experience with BigQuery or a similar large-scale analytics platform. Proficiency in Python or R, including libraries like Pandas. Experience working with BI tools such as Looker, Tableau, Power BI, or Mode. More ❯
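As an illustration of the BigQuery-plus-Pandas analysis work this kind of role involves, here is a minimal Python sketch; the project, dataset, table and column names are invented for the example and are not taken from the listing.

from google.cloud import bigquery

# Hypothetical example: pull a small aggregate out of BigQuery into a Pandas
# DataFrame for further analysis. Assumes default application credentials.
client = bigquery.Client(project="example-project")

sql = """
    SELECT signup_date, COUNT(*) AS signups
    FROM `example-project.analytics.users`
    GROUP BY signup_date
    ORDER BY signup_date
"""

# QueryJob.to_dataframe() returns a Pandas DataFrame (needs the db-dtypes package).
df = client.query(sql).to_dataframe()
print(df.tail())

In practice a query like this would often sit behind a Looker explore or a dbt model rather than run ad hoc, but the SQL-to-DataFrame round trip reflects the SQL, BigQuery and Pandas skills the listing mentions.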
London, England, United Kingdom Hybrid / WFH Options
Nansen
world impact—powering tools used daily by thousands of crypto investors, builders, and institutions. What You'll Do: Design, build, and scale performant data pipelines and infrastructure using ClickHouse, BigQuery, Python, and dbt. Handle large-scale data challenges, processing terabytes of streaming and batch data daily. Collaborate closely with crypto researchers, backend engineers, and product managers to shape data More ❯
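For context on the pipeline work described above, the sketch below shows one common batch-ingestion step in Python: loading newline-delimited JSON from Cloud Storage into BigQuery. The bucket, dataset and table names are assumptions for illustration only, not the company's actual setup.

from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical batch load: append a day's worth of exported JSON events
# from Cloud Storage into a raw BigQuery table, inferring the schema.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    autodetect=True,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/events/2024-01-01/*.json",  # assumed source path
    "example-project.raw.events",                    # assumed destination table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

Downstream, dbt models would typically transform the raw table into curated marts, which is where the dbt and BigQuery/ClickHouse modelling mentioned in the listing would come in.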
City of London, London, United Kingdom Hybrid / WFH Options
Hartree Partners
record turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but Redshift/BigQuery/ClickHouse etc. also welcome). PREFERRED QUALIFICATIONS: Meteorological understanding/experience with weather modelling. Prior knowledge or experience in the power markets or energy sector. Experience with cloud More ❯
in designing cloud data platform solutions • In-depth knowledge of the Snowflake platform and capabilities • Relevant experience of working with other cloud data platform solutions such as Databricks, GCP BigQuery, Microsoft Azure or AWS offerings would also be advantageous • Practical knowledge of GenAI and LLM offerings in the market • Skilled in working on large-scale agile transformation projects • You More ❯
automating models and advancing our engineering practices. You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery). Experience with a modern data modeling technology (dbt). You document and communicate clearly. Some experience with technical content writing would be a plus. You are excited to work More ❯
and storage solutions in cloud environments (AWS, Azure, GCP) - Writing complex queries against relational and non-relational databases - Leading or contributing to key projects involving technologies like Databricks, Snowflake, BigQuery and Fabric - Applying software engineering best practices to data engineering, including CI/CD, monitoring and alerting - Collaborating with cross-functional teams including Data Scientists, Architects and Analysts - Mentoring More ❯
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
aPriori Technologies
collaborate with aPriori product and domain experts to build business intelligence platform tools and interfaces, leveraging your expertise to help customers extract meaningful insights from their data. Utilising dbt, BigQuery, Airflow, Cloud Composer and Looker within a multi-cloud (AWS + GCP) architecture, you will be responsible for driving cross-domain collaboration through a rapidly scalable data platform. The More ❯
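To make the dbt-on-Airflow pattern mentioned here concrete, below is a minimal Python sketch of a Cloud Composer (managed Airflow) DAG that runs and then tests a dbt project. The DAG id, schedule and project path are assumptions for illustration, not the company's actual configuration.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily dbt build: run the models, then run the tests.
with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /home/airflow/gcs/data/dbt",  # assumed path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /home/airflow/gcs/data/dbt",
    )

    dbt_run >> dbt_test

Looker would then read from the tables dbt materialises in BigQuery, which is roughly how the pieces named in the listing fit together.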
London, England, United Kingdom Hybrid / WFH Options
CreateFuture
Python, SQL, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure). You have strong knowledge of data architecture patterns, including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks). You know how to build with performance, security and maintainability in mind. You can comfortably influence and guide both technical and non-technical stakeholders. You're passionate about More ❯
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
SQL, .NET, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure). You have strong knowledge of data architecture patterns, including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks). You know how to build with performance, security and maintainability in mind. You can comfortably influence and guide both technical and non-technical stakeholders. You're passionate about More ❯
London, England, United Kingdom Hybrid / WFH Options
Algolia
code quality, automated testing, and other engineering best practices. Experience using one of the major cloud providers (GCP, AWS or Azure). Experience using data engineering tools (e.g. Airflow or BigQuery). Excellent spoken and written English skills. NICE TO HAVE: Experience operating AI models in production environments. Experience in Go or Python. Sensitivity to data-driven decision making, and exploring More ❯