global retail or similar data-intensive industry Expert-level proficiency in various data modeling techniques and tools Strong hands-on experience with cloud data platforms, specifically Google Cloud Platform (BigQuery) Experience with leading ETL/ELT tools and data pipeline orchestration (e.g., Dataflow, Apache Airflow, Talend, Informatica) Advanced SQL skills and deep knowledge of various database technologies (relational, columnar …)
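As an illustration of the kind of BigQuery SQL work this listing describes, here is a minimal sketch using the google-cloud-bigquery client; the project, dataset and column names are hypothetical, not taken from the posting.

```python
# Minimal BigQuery query sketch; project, dataset and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="retail-analytics-demo")  # hypothetical project id

sql = """
    SELECT store_id,
           DATE_TRUNC(order_date, MONTH) AS order_month,
           SUM(net_revenue)              AS monthly_revenue
    FROM `retail-analytics-demo.sales.orders`
    GROUP BY store_id, order_month
    ORDER BY order_month, store_id
"""

# Run the query and iterate over the result rows.
for row in client.query(sql).result():
    print(row.store_id, row.order_month, row.monthly_revenue)
```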
Engineering/BI Engineering experience Excellent SQL skills Understanding of data warehousing, data modelling concepts and structuring new data tables Knowledge of cloud-based MPP data warehousing (e.g. Snowflake, BigQuery, Redshift) Nice to have Experience developing in a BI tool (Looker or similar) Good practical understanding of version control SQL ETL/ELT knowledge, experience with DAGs to manage …
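For context on "DAGs to manage" ELT steps, a minimal Airflow DAG sketch follows, assuming Airflow 2.4+; the DAG id, schedule and callables are placeholders rather than anything from the posting.

```python
# Illustrative two-step ELT DAG; ids, schedule and callables are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source system")


def load():
    print("load transformed data into the warehouse")


with DAG(
    dag_id="example_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract must finish before load starts
```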
systems (Kafka, Spark Streaming, Kinesis) Familiarity with schema design and semi-structured data formats Exposure to containerisation, graph databases, or machine learning concepts Proficiency with cloud-native data tools (BigQuery, Redshift, Snowflake) Enthusiasm for learning and experimenting with new technologies Why Join Capco Deliver high-impact technology solutions for Tier 1 financial institutions Work in a collaborative, flat, and …
to hierarchically structure information residing in the data under analysis. Robust knowledge of statistical concepts, accompanied by expertise with a set of analytical tools ranging from databases (e.g. SQL, BigQuery, NoSQL) to programming languages (e.g. R, Python), and from data visualisation (e.g. Tableau, Power BI) to machine learning. Understanding data engineering solutions is a plus. Strong experience in MLOps, including …
City of London, London, United Kingdom (Hybrid/WFH options)
Medialab Group
version control best practices in a collaborative engineering environment. Strong communicator - able to engage with both technical and non-technical colleagues. Nice to Have Skills Experience working with GCP (BigQuery) or other modern cloud-native data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation …
Central London, London, United Kingdom (Hybrid/WFH options)
Singular Recruitment
ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery , Cloud Storage , Pub/Sub , and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability to translate diverse business requirements into scalable data models and architect a …
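To make the Dataflow (Apache Beam) requirement concrete, here is a sketch of a small Beam pipeline of the kind that would run on Dataflow; the bucket paths, field names and counting logic are invented for illustration.

```python
# Sketch of a Beam pipeline; bucket paths and parsing logic are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Pass --runner=DataflowRunner plus project/region options to run on Dataflow.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "ParseJson" >> beam.Map(json.loads)
        | "KeyByUser" >> beam.Map(lambda e: (e["user_id"], 1))
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/user_counts")
    )
```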
similar area 5+ years of experience working as a professional software or data engineer in industry Expertise in Python and its type system Expertise in writing SQL (GQL, PostgreSQL, and BigQuery are a plus) Experience with building both batch and streaming ETL pipelines using data processing engines Deep understanding of building Knowledge Graphs entailing biological ontologies, and leveraging graph DBs …
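A small example of the typed-Python style this listing implies: a batch transform step with explicit types. The record fields are invented for illustration and only loosely echo the biology theme of the posting.

```python
# Typed batch transform sketch; record fields are invented for illustration.
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass(frozen=True)
class GeneRecord:
    gene_id: str
    symbol: str
    species: str


def normalise_symbols(records: Iterable[GeneRecord]) -> Iterator[GeneRecord]:
    """Upper-case gene symbols and drop records without an identifier."""
    for record in records:
        if record.gene_id:
            yield GeneRecord(record.gene_id, record.symbol.upper(), record.species)


if __name__ == "__main__":
    rows = [GeneRecord("ENSG000001", "tp53", "human"), GeneRecord("", "brca1", "human")]
    print(list(normalise_symbols(rows)))
```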
minimal disruption to business operations. Optimize data storage and retrieval mechanisms across cloud-based databases, preferably Azure Cosmos DB, Neo4j graph DB (or similar technology, e.g. Amazon RDS, Azure SQL, BigQuery, NoSQL stores). Application Architecture & Security Architect microservices-based applications, ensuring modular, scalable, and API-first designs Design and implement application security controls, including IAM, Zero Trust models, encryption, and …
experience of people management or desire to manage individuals on the team Nice to Have Experience with some of these: Kafka, Kinesis, Kinesis Analytics, Glue, Lambda, Kubernetes, ETL pipelines, BigQuery, Dataflow, Bigtable, and SQL Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and support our data solutions/products. What's in …
share documentation, and support technical best practices Integrate infrastructure and data services across cloud platforms Skills and Experience Expertise in cloud platforms: Azure : Data Factory, Synapse, Monitor, ARM GCP : BigQuery, Dataflow, Cloud Composer, Pub/Sub Strong knowledge of data warehousing, storage, and modelling Experience with ETL/ELT processes and performance tuning Hands-on scripting (Python, PowerShell) for …
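As a hedged sketch of the Python scripting this GCP stack involves, the snippet below publishes a message to Pub/Sub with the google-cloud-pubsub client; the project and topic ids and the payload are placeholders.

```python
# Minimal Pub/Sub publish sketch; project and topic ids are placeholders.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "ingest-events")  # hypothetical ids

message = json.dumps({"source": "pos", "amount": 12.50}).encode("utf-8")
future = publisher.publish(topic_path, data=message)
print("published message id:", future.result())  # blocks until the broker acknowledges
```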
City of London, London, United Kingdom (Hybrid/WFH options)
MRK Associates
communications both internally and with external clients. Strong hands-on experience using SQL for multi-step/complex analytics is essential for this role. Experience in cloud platforms (GCP BigQuery, Azure Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. (This is not a technical advanced …
years' experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers] GCP, AWS, Azure, Big data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Designing Databricks based solutions for Azure/AWS, Jenkins, Terraform, Stackdriver …
and storage solutions in cloud environments (AWS, Azure, GCP) - Writing complex queries against relational and non-relational databases - Leading or contributing to key projects involving technologies like Databricks, Snowflake, BigQuery and Fabric - Applying software engineering best practices to data engineering, including CI/CD, monitoring and alerting - Collaborating with cross-functional teams including Data Scientists, Architects and Analysts - Mentoring …
a challenging start-up environment. You have a curious mind and love problem-solving. We are looking for candidates who have Demonstrable, in-depth experience with SQL, particularly in BigQuery and PostgreSQL. Proven experience with dbt, focusing on building and maintaining efficient, scalable, and reliable DAGs. Proficiency in Python for data manipulation, scripting, or automation tasks. Are proficient with …
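One place Python automation and dbt meet in practice is a small wrapper that builds a slice of the dbt DAG and fails loudly; a sketch follows, with the selector chosen purely for illustration.

```python
# Thin wrapper around the dbt CLI; the selector value is a hypothetical example.
import subprocess
import sys


def run_dbt(selector: str) -> None:
    """Build part of the dbt DAG and surface a non-zero exit code."""
    result = subprocess.run(["dbt", "build", "--select", selector])
    if result.returncode != 0:
        sys.exit(f"dbt build failed for selector '{selector}'")


if __name__ == "__main__":
    run_dbt("staging+")  # hypothetical selector: staging models plus downstream nodes
```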
libraries (e.g. scikit-learn, pandas, NumPy, SciPy, etc.) Experience with ML frameworks such as TensorFlow, PyTorch, XGBoost, LightGBM, or similar Strong SQL skills and experience with data warehousing solutions (Snowflake, BigQuery, Redshift) Experience with cloud platforms (AWS, Azure, GCP) and their ML and AI services (SageMaker, Azure ML, Vertex AI) Knowledge of MLOps tools including Docker, MLflow, Kubeflow, or similar …
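An illustrative training snippet touching the libraries named above (NumPy, scikit-learn); the synthetic data stands in for a real feature table and the model choice is arbitrary.

```python
# Train and evaluate a simple classifier on synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1_000, 5))                 # synthetic features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic binary target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```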
Demonstrable experience deploying machine learning solutions in a production environment, and familiarity with version control systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow Pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to …
Studio), preferably within a marketing or media environment Understanding of digital media metrics (e.g., impressions, clicks, conversions, reach, frequency, CTR, CPM, ROAS) Proficient in SQL, with working knowledge of BigQuery or other cloud-based databases Familiarity with Python or R for data manipulation or statistical analysis Knowledge of advertising technology, campaign tracking, and data integration tools is a plus …
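The campaign metrics listed above reduce to simple ratios; a short pandas sketch with made-up numbers makes the definitions concrete (CTR = clicks/impressions, CPM = spend per thousand impressions, ROAS = revenue/spend).

```python
# Compute CTR, CPM and ROAS for two invented campaigns.
import pandas as pd

campaigns = pd.DataFrame(
    {
        "campaign": ["search", "display"],
        "impressions": [120_000, 450_000],
        "clicks": [3_600, 2_250],
        "spend": [1_800.0, 2_700.0],
        "revenue": [5_400.0, 4_050.0],
    }
)

campaigns["ctr"] = campaigns["clicks"] / campaigns["impressions"]          # click-through rate
campaigns["cpm"] = campaigns["spend"] / campaigns["impressions"] * 1_000   # cost per mille
campaigns["roas"] = campaigns["revenue"] / campaigns["spend"]              # return on ad spend
print(campaigns)
```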
London, South East, England, United Kingdom (Hybrid/WFH options)
Tenth Revolution Group
data solutions Migrating from Azure to GCP Automate data lifecycle processes and optimise cloud resource usage Support analytics and self-service reporting initiatives Skills & Experience Strong experience with GCP, BigQuery, Dataflow Proficiency in Python and Terraform Skilled in ETL/ELT, data modelling, and metadata management Familiarity with CI/CD, DevOps, and Infrastructure-as-Code Knowledge of insurance …
London, Victoria, United Kingdom (Hybrid/WFH options)
Boston Hale
relational/non-relational databases. Proficiency in Python and/or Java for data services and API integration. Proven experience with cloud platforms (GCP preferred) and data warehouses like BigQuery or Snowflake. Strong analytical skills and experience working with large, complex datasets. Excellent communication and project management abilities. 5+ years in data engineering, ideally within fast-paced, content-driven …
Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery) Expertise in building data architectures that support batch and streaming paradigms Experience with standards such as JSON, XML, YAML, Avro, Parquet Strong communication skills Open to learning new technologies …
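As a quick illustration of moving the same records between two of the formats named above (JSON and Parquet), a pandas sketch follows; the file names and columns are arbitrary, and the Parquet step assumes pyarrow is installed.

```python
# Round-trip a small table through newline-delimited JSON and Parquet.
import pandas as pd

records = pd.DataFrame(
    {"event_id": [1, 2, 3], "type": ["click", "view", "click"], "value": [0.5, 1.2, 0.7]}
)

records.to_json("events.json", orient="records", lines=True)   # newline-delimited JSON
records.to_parquet("events.parquet", index=False)              # columnar Parquet (needs pyarrow)

round_trip = pd.read_parquet("events.parquet")
print(round_trip.equals(records))  # True: schema and values survive the round trip
```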