engineering or a similar role. Strong expertise in SQL with the ability to write efficient, complex queries. Proficiency in dbt for data modeling and transformation. Hands-on experience with BigQuery and other GCP data services. Solid understanding of data warehousing principles and best practices. Basic to intermediate skills in Python for scripting and automation. Familiarity with version control systems …
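As a minimal illustration of the SQL-plus-Python work this kind of role involves, here is a hedged sketch of running a parameterized BigQuery query from Python with the google-cloud-bigquery client; the project, dataset, and column names are invented for illustration:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials
job = client.query(
    """
    SELECT country, COUNT(*) AS orders
    FROM `my-project.shop.orders`
    WHERE order_date >= @since
    GROUP BY country
    """,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("since", "DATE", "2024-01-01")]
    ),
)
for row in job.result():  # blocks until the query finishes
    print(row.country, row.orders)
```

Query parameters, rather than string interpolation, keep the SQL safe and let BigQuery cache query plans.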
with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity with Snowflake. Knowledge of object-oriented programming (Kotlin, Java, or C#). Retail data experience (sales, inventory, or related datasets).
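For context on what orchestrating dbt with Airflow can look like, a minimal sketch (assumes Airflow 2.4+; the DAG id, task names, and dbt project path are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # run once per day
    catchup=False,      # do not backfill historical runs
) as dag:
    # Run dbt models, then their tests, as two sequenced shell tasks.
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")
    dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test --project-dir /opt/dbt")

    dbt_run >> dbt_test
```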
Nuneaton, Warwickshire, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
to deliver end-to-end data products. Mentor mid-level analysts and help strengthen capability across the Commercial Ops analytics team. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This is a hybrid role based in Nuneaton, with the requirement to be in the office at least two days per week. The Person What …
ability to translate ambiguous business problems into scalable, production-ready data science solutions. Hands-on experience with cloud platforms (AWS, GCP, Azure) and tools such as Databricks, Snowflake, and BigQuery. Exceptional communication and stakeholder management skills, with the ability to influence decision-making across technical and non-technical teams. Desirable skills: Familiarity with real-time model serving, monitoring, and …
Stevenage, Hertfordshire, United Kingdom Hybrid / WFH Options
GlaxoSmithKline
data solutions on cloud platforms. Key Placement Learnings: Design and build data pipelines using Python, R, Nextflow; work on GCP to automate data flows. Interact with databases (e.g., PostgreSQL, BigQuery) to manage data and create views. Create analysis tools and dashboards to communicate results for decision-making. Work in multidisciplinary teams and develop presentation and communication skills. Role …
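A hedged sketch of the kind of BigQuery interaction described above - creating a view from Python with the google-cloud-bigquery client; the project, dataset, and table names are invented:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # uses application-default credentials

# Define a view that exposes only active samples from an underlying table.
view = bigquery.Table("my-project.analytics.active_samples_v")
view.view_query = """
    SELECT sample_id, assay, measured_at
    FROM `my-project.lab.samples`
    WHERE status = 'active'
"""
client.create_table(view, exists_ok=True)  # no error if the view already exists
```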
experience in cybersecurity engineering, cloud security, or IT security. Hands-on experience with: Google Admin Console, Security Center, Google Vault, Cloud Identity. GCP services: Compute Engine, VPC, IAM, GKE, BigQuery. Proficiency with infrastructure-as-code tools: Terraform, Ansible, Deployment Manager. Familiarity with CI/CD tools: GitLab CI, Jenkins, Cloud Build. Strong background in Linux systems, networking, and containerization …
in technology delivery, with 3+ years leading cloud data projects (GCP preferred). Must have experience delivering data platforms in banking or financial services. Strong knowledge of GCP services (BigQuery, Dataflow, Pub/Sub). Familiarity with ETL/ELT, data lakehouse architectures, and cloud integration. Excellent leadership, communication, and stakeholder management skills. Understanding of data governance frameworks (GDPR …
and existing diseases, and a pattern of continuous learning and development is mandatory. Key Responsibilities: Build data pipelines using modern data engineering tools on Google Cloud: Python, Spark, SQL, BigQuery, Cloud Storage. Ensure data pipelines meet the specific scientific needs of data-consuming applications. Responsible for high-quality software implementations according to best practices, including automated test suites and …
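As an illustrative sketch of such a pipeline - reading Parquet from Cloud Storage with Spark and writing to BigQuery - assuming the spark-bigquery connector is on the classpath (as on Dataproc); the bucket, table, and column names are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("assay-results-load").getOrCreate()

# Batch-read raw Parquet files from a Cloud Storage landing zone.
raw = spark.read.parquet("gs://example-bucket/raw/assay_results/")

# Keep only completed runs and normalise the measurement column name.
cleaned = (
    raw.filter(F.col("status") == "complete")
       .withColumnRenamed("val", "measurement")
)

(cleaned.write.format("bigquery")
    .option("table", "my-project.lab.assay_results")
    .option("temporaryGcsBucket", "example-staging-bucket")  # required for indirect writes
    .mode("append")
    .save())
```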
will too. What you'll need: Proven experience as a Data Solution Architect with hands-on expertise in GCP and/or Azure. Strong proficiency in GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Looker. Experience migrating data from legacy platforms to cloud-native environments. Expertise in data modelling, pipeline development, and real-time data streaming …
with a multi-person engineering team, leveraging modern practices like code reviews, CI/CD, and trunk-based development. Experience in cloud ecosystems, particularly Google Cloud Platform services like BigQuery, Pub/Sub, or Cloud Functions. Required Skills/Abilities: Proficiency in software development using modern programming languages like Python, Java, Go, or similar. Strong understanding of microservices …
applications. Hands-on experience with cloud data platforms (AWS/GCP/Azure). Proficiency with big data technologies (Spark, Kafka, or similar streaming platforms). Experience with data warehouses (Snowflake, BigQuery, Redshift) and data lakes. Knowledge of containerization (Docker/Kubernetes) and infrastructure as code. Preferred Experience: Experience building web applications with modern frameworks (React, Vue, or Angular). API development …
where relevant. Tech you'll likely use. LLM frameworks: LangChain, LlamaIndex (or similar). Cloud & Dev: Azure/AWS/GCP, Docker, REST APIs, GitHub Actions/CI. Data & MLOps: BigQuery/Snowflake, MLflow/DVC, dbt/Airflow (nice to have). Front ends (for internal tools): Streamlit/Gradio/basic React. Must-have experience: 7+ years in Data …
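A minimal, hedged sketch of the MLOps side of that stack - logging an experiment run with MLflow; the experiment name, parameter, and metric are invented:

```python
import mlflow

mlflow.set_experiment("doc-chat-eval")  # created on first use if it doesn't exist

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("retriever_k", 5)    # record a hyperparameter
    mlflow.log_metric("answer_f1", 0.87)  # record an evaluation score
```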
with the ability to effectively present findings to both technical and non-technical stakeholders. Experience of A/B testing and of running retail-based tool analytics. Experience of BigQuery is desirable. Benefits: Pension company contribution of 3%. Incentive scheme of up to 10% of annual salary, based on company performance. Your wellbeing is paramount, so you can get away and …
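For illustration, a hedged sketch of the statistics behind a simple A/B test - a two-proportion z-test using only the standard library; the visitor and conversion counts are made up:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 5% level
```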
Elk Grove Village, Illinois, United States Hybrid / WFH Options
Bel Brands USA
preferred. Minimum 7 years of experience in data management and analytics (or related fields). Passionate about the data world. Proficiency in data analysis and reporting tools such as SQL, BigQuery, and Power BI. Experience in using or implementing a data catalog tool. Professional rigor, method, and organization. Pragmatism and a sense of commitment. Autonomy and the ability to work in a team.
and Build: Design and implement a robust, cloud-native data analytics platform spanning AWS, GCP, and other emerging cloud environments. You'll leverage services like S3/GCS, Glue, BigQuery, Pub/Sub, SQS/SNS, MWAA/Composer, and more to create a seamless data experience. (Required) Data Lake, Data Zone, Data Governance: Design, build, and manage data …
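A minimal sketch of one building block named above - publishing an event to Pub/Sub from Python; the project, topic, and payload are hypothetical:

```python
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "ingest-events")

# Data must be a bytestring; extra keyword arguments become message attributes.
future = publisher.publish(topic_path, b'{"table": "orders", "op": "upsert"}', source="crm")
print(future.result())  # blocks until the server acks and returns the message ID
```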
who are willing to relocate) GCP Data Engineer - GCP Dataflow and Apache Beam (key skills). Primary skills: PySpark, Spark, Python, Big Data, GCP, Apache Beam, Dataflow, Airflow, Kafka, and BigQuery; GFO, Google Analytics; JavaScript is a must. Strong experience with Dataflow and BigQuery. A person should have experience leading a team, or being responsible for a team as a lead or senior … Cloud platforms (preferably GCP) provided Big Data technologies. Hands-on experience with real-time streaming processing as well as high-volume batch processing, and skilled in advanced SQL, GCP BigQuery, Apache Kafka, data lakes, etc. Hands-on experience in Big Data technologies - Hadoop, Hive, and Spark - and an enterprise-scale Customer Data Platform (CDP). Experience in at least one … programming language (Python strongly preferred), cloud computing platforms (e.g., GCP), big data tools such as Spark/PySpark, columnar datastores (BigQuery preferred), DevOps processes/tooling (CI/CD, GitHub Actions), infrastructure-as-code frameworks (Terraform), BI tools (e.g., DOMO, Tableau, Looker), pipeline orchestration (e.g., Airflow). Fluency in data science/machine learning basics (model types, data prep, training …
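As a hedged illustration of Beam/Dataflow work, a minimal batch pipeline in the Python SDK; it defaults to the local DirectRunner (Dataflow would be selected via pipeline options, and reading gs:// paths requires apache-beam[gcp]), and all paths and field names are invented:

```python
import json

import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "KeepPurchases" >> beam.Filter(lambda e: e.get("type") == "purchase")
        | "KeyByUser" >> beam.Map(lambda e: (e["user_id"], 1))
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/out/purchases")
    )
```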
future-state data strategies. As part of this role, you will be responsible for: Leading technical solution design and delivery for large-scale data platform projects using GCP and BigQuery. Supporting pre-sales engagements by translating client requirements into technical solutions. Taking ownership of cloud migration and data platform modernization strategies. Undertaking data modelling tasks with a focus on …
Technical Lead for cloud data projects (GCP preferred). Strong understanding of Agile/Scrum, DevOps, CI/CD, and cloud-native delivery models. Familiarity with GCP services such as BigQuery, Dataflow, Cloud Composer, and Pub/Sub. Knowledge of modern data architectures, including lakehouse, ELT, and schema design. Awareness of compliance frameworks such as GDPR and HIPAA, and of RBAC. Excellent …
understanding of the software development lifecycle, from conception to deployment. Capable of conceptualizing and implementing software architectures spanning multiple technologies and platforms. Technology stack: Python, Flask, Java, Spring, JavaScript, BigQuery, Redis, Elasticsearch, Airflow, Google Cloud Platform, Kubernetes, Docker. Voted "Best Places to Work," our culture is driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California …
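For a flavour of the web side of that stack, a minimal hedged Flask sketch (assumes Flask 2.0+; the route and port are invented):

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.get("/health")
def health():
    # A simple liveness endpoint of the kind most services expose.
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=8080)
```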
London, South East, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
solutions. Experience with front-end development (React/JavaScript) and DevOps/maintenance tasks. Proficiency in building agentic systems and multi-agent architectures. ETL and data management experience (Postgres, BigQuery, Azure, Snowflake). Ability to design solutions, validate requirements, and anticipate risks independently. Experience coordinating with multiple engineering and product teams. Preferred Qualifications: Experience in graph databases and advanced …
for 5+ years across Data/Software Engineering and Data Analysis. SQL & Python: schema design, transformations, query optimisation, automation, testing. Track record of building ETL/ELT pipelines into modern warehouses (BigQuery, Snowflake, Redshift). Familiar with tools like Dagster, Airflow, Prefect, dbt, Dataform, SQLMesh. Cloud experience (we're on GCP) + containerisation (Docker, Kubernetes). Strong sense of ownership over …
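A hedged sketch of a small ELT flow in Prefect (2.x assumed); the extract and load steps are stubs standing in for warehouse-specific clients:

```python
from prefect import flow, task

@task(retries=2)  # transient source failures are retried automatically
def extract() -> list[dict]:
    return [{"id": 1, "amount": "12.50"}, {"id": 2, "amount": "8.00"}]

@task
def transform(rows: list[dict]) -> list[dict]:
    # Coerce string amounts to floats before loading.
    return [{**r, "amount": float(r["amount"])} for r in rows]

@task
def load(rows: list[dict]) -> None:
    print(f"loading {len(rows)} rows")  # a real task would write to BigQuery, etc.

@flow
def daily_elt():
    load(transform(extract()))

if __name__ == "__main__":
    daily_elt()
```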