DevOps, integrating automated testing into a CI/CD pipeline using Jenkins or GitLab Runner. Experience with AWS, Azure, or GCP. Good knowledge of GCP Storage buckets, BigQuery, Dataflow, and Cloud Functions (any cloud). Hands-on experience with one or more programming languages (Java/JavaScript). Experience with non-functional testing using JMeter or similar.
London, South East, England, United Kingdom Hybrid/Remote Options
Searchability
to define and implement data models and storage strategies. DATA ENGINEER - ESSENTIAL SKILLS: Proven experience as a Data Engineer, Data Developer, or similar role. Strong experience with GCP (e.g. BigQuery, Cloud Storage, Dataflow) and other cloud-based data solutions. Proficiency in Python for scripting and data transformation. Hands-on experience with PostgreSQL and MongoDB. Solid understanding of ETL/
technical architects. Strong understanding of Excel/VBA constructs, macros, and common automation patterns. Experience designing governance models for Workspace automation at scale. Familiarity with GCP components such as BigQuery, Cloud Logging, Pub/Sub, and Artifact Registry (preferred but not mandatory). Strong ability to collaborate with business teams and translate requirements into practical technical frameworks. Excellent communication
testing, including JUnit. Mindset: Excellent communication skills and a commitment to continuous learning in a collaborative Agile environment. Good to Have: Experience with GCP Data Services such as BigQuery, Spanner, Cloud Composer, or Dataflow. Exposure to Generative AI tools like LangChain, LangGraph, or similar frameworks; an interest in agentic AI solutions is a plus.
Python, SQL, and pipeline tools such as dbt or Airflow. Proven background in data modelling, warehousing, and performance optimisation. Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.). A consultancy mindset – adaptable, collaborative, and delivery-focused. The details: Location: Edinburgh – 2 days onsite per week. Duration: 3 months initially. Day Rate: c.£500/day, IR35
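The pipeline tools this listing names (dbt, Airflow) both come down to running transformations in dependency order. A minimal, illustrative sketch of that idea in plain Python — the task names are invented and no Airflow or dbt APIs are used:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it
# depends on, mirroring how Airflow/dbt resolve an execution order.
deps = {
    "extract_orders": set(),
    "clean_orders": {"extract_orders"},
    "build_warehouse_model": {"clean_orders"},
    "refresh_dashboard": {"build_warehouse_model"},
}

# static_order() yields tasks so every dependency runs before its dependents.
run_order = list(TopologicalSorter(deps).static_order())
print(run_order)
```

In a real orchestrator each task would also carry scheduling, retries, and state; this sketch only shows the dependency-resolution core.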
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
and analysis. Strong stakeholder management and communication skills, with the ability to translate technical outputs into business-friendly insights. Experience with Snowflake (or alternative cloud data warehouses such as BigQuery or Redshift). Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications
Contract Data Analyst (BigQuery) – Inside IR35
Location: Manchester – 2 days a week in the office
Contract: 3 months (Inside IR35)
Day Rate: Competitive (Inside IR35)
Start Date: ASAP
About the Role: I'm seeking an experienced Data Analyst with deep expertise in Google BigQuery to join my client's award-winning analytics team on a 3-month contract. You will play a key role in transforming raw data into actionable insights, building scalable data … and security.
Essential Skills & Experience: Proven experience as a Data Analyst or BI Analyst in mid-to-large data environments. Advanced SQL, with significant hands-on experience using Google BigQuery for analytics at scale. Strong data modelling skills (star/snowflake schema, optimisation, partitioning, clustering). Experience building dashboards using Looker Studio, Looker, Power BI, or Tableau. Proficiency
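The optimisation techniques this listing names — partitioning and clustering in BigQuery — are expressed as table DDL. A hedged sketch of what such DDL looks like; the table and column names below are hypothetical, not from any real project:

```python
def bigquery_ddl(table: str, columns: dict, partition_col: str, cluster_cols: list) -> str:
    """Render a BigQuery CREATE TABLE statement with date partitioning
    and clustering. All identifiers here are illustrative only."""
    cols = ",\n  ".join(f"{name} {typ}" for name, typ in columns.items())
    return (
        f"CREATE TABLE `{table}` (\n  {cols}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

# Hypothetical events table: partition on the event timestamp and cluster
# on the column most often filtered, so BigQuery can prune scanned bytes.
ddl = bigquery_ddl(
    "analytics.events",
    {"event_ts": "TIMESTAMP", "customer_id": "STRING", "amount": "NUMERIC"},
    partition_col="event_ts",
    cluster_cols=["customer_id"],
)
print(ddl)
```

Partitioning limits which date ranges a query scans; clustering sorts data within each partition so filters on the cluster columns read fewer blocks.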
autonomous agents, multi-agent orchestration, tool use, memory, planning) Strong hands-on skills in Python, LLMs, RAG, and LangChain or similar frameworks Deep understanding of GCP services: Vertex AI, BigQuery, Cloud Functions, Pub/Sub, etc. Experience with batch and real-time data pipelines Familiarity with AI assurance, governance, or ethical AI is a bonus Ability to work independently
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
consistency of quality and approach. YOUR SKILLS AND EXPERIENCE: 5+ years in data analytics or business insight, ideally within marketplace, tech, or eCommerce. Expert-level SQL and experience with BigQuery or Snowflake. Proven stakeholder management and presentation experience. Confident working with large, complex data sets. Desirable: Experience with Python or R for modelling and forecasting. Background in partner, marketplace
Job Title: Contract Data Engineer (GCP/BigQuery) – 6-Month Remote (UK) Are you a Data Engineer who lives and breathes the Google Cloud stack? We are searching for a highly skilled Contract Data Engineer for a 6-month, fully remote (UK-based) engagement. This is a hands-on role for a specialist with deep technical expertise in GCP and BigQuery. Your mission will be to design, build, and maintain robust data pipelines, leveraging your strong ETL/ELT skills to manage complex data ecosystems. Your Core Skills: 3-5 years of dedicated Data Engineering experience. Expert-level knowledge of Google Cloud Platform (GCP) – this is essential. Deep commercial experience with BigQuery (query optimization, data modeling
a) Migration of portfolio from our UW legacy CRM system to Gentrack. b) Build extract scripts to pull data from the Gentrack Junifer database and load the data into the UW BigQuery data warehouse to assist with management, regulatory, operational and other reports. This role will be particularly focussed on (b) above and will include: interpreting Junifer data models, building ETL scripts, and populating energy data models in our UW BigQuery environment. We deliver progress. What you'll do and how you will make an impact. This role would need: Experience in data migration projects (ideally energy platform migration). Experience in the Gentrack Junifer application/data models. Experience in building data warehouse data models. Proficiency in SQL, as both the data extraction and data loading scripts will be SQL-based. Experience working with Google BigQuery and Dataform environments. Experience working collaboratively with multiple teams to meet timelines and deliverables. What you'll do: Analyse Gentrack Junifer data models to understand
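The extract-and-load work described above typically includes a row-mapping step between the source schema and the warehouse model. A hedged sketch of that shape — the field names below are invented for illustration and are not real Junifer columns:

```python
def to_warehouse_row(src: dict) -> dict:
    """Rename and normalise one extracted source row before loading it
    into a warehouse table. All field names here are hypothetical."""
    return {
        "account_id": src["accountNumber"],
        # Meter point references often arrive with stray whitespace.
        "meter_point": src["mpan"].strip(),
        # Normalise consumption to a fixed precision for reporting.
        "consumption_kwh": round(float(src["consumption"]), 3),
    }

# One illustrative extracted row, as it might come off an extract script.
extracted = [{"accountNumber": "A-001", "mpan": " 1200023305100 ", "consumption": "245.6789"}]
rows = [to_warehouse_row(r) for r in extracted]
print(rows[0])
```

In practice the extraction and loading would be SQL scripts, as the listing says; this sketch only illustrates the mapping logic that sits between the two schemas.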
and troubleshooting issues in live production codebases (not just isolated development). Cloud Experience: Solid experience with any major public cloud provider (GCP, AWS, or Azure); experience with BigQuery is desirable. Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting. Direct experience with Google Cloud Platform, BigQuery, and associated tooling. Experience
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
deployment. Cloud Deployment & Operations: Deploy and troubleshoot applications on Google Cloud Platform (GCP). Candidates must be able to diagnose production issues within cloud environments (e.g., Cloud Run, GCS, BigQuery). Infrastructure as Code: Develop and manage cloud infrastructure using Terraform. Monitoring: Implement and monitor production metrics to ensure system health and identify bottlenecks. (Optional) Data & AI Integration … proficiency in SQL. Nice-to-Have Skills: Direct production experience with FastAPI. Experience with Google Cloud Run and Kubernetes (GKE). Familiarity with GCP data tools like BigQuery and Google Cloud Storage (GCS). Experience building complex agentic AI applications (beyond basic RAG or LangChain implementations). How We Evaluate Candidates: Our interview process is designed to