London, South East, England, United Kingdom Hybrid / WFH Options
Interquest
rail industry's revenue allocation system through a suite of technology upgrades and methodological improvements. Key enhancements include migrating the platform to Google Cloud Platform (GCP), implementing a modern BigQuery-based Data Warehouse, and replacing the legacy allocation factor calculation engine with an innovative, graph database-driven solution. Duties: Lead the design of data architectures and the development …
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
two days of work per week. Key Requirements Proficiency with Python and SQL for data transformation, pipeline development, and model integration Experience with modern cloud data warehouse technologies such as BigQuery or Snowflake Experience working with large, complex, and high-volume data sets The ability to unify, collate, and interpret data to support data-driven decisions Proficient in using analytics …
ability to deliver interlocking benefits across teams and platforms. Strong statistical grounding, including expertise in forecasting, clustering, optimization, and predictive modeling. Proficiency in Python, SQL, and cloud platforms (especially BigQuery). Commercial acumen in leveraging data to shape strategy and unlock business value. Exceptional data storytelling skills, translating complex models into engaging narratives. Experience in scaling proof-of-concept …
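The statistical skills this listing asks for (forecasting, clustering, predictive modeling) can be illustrated with a minimal sketch: a naive moving-average forecast in pure Python. The function name and the demand figures are invented for illustration, not taken from any listing.

```python
def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("series is shorter than the forecasting window")
    recent = series[-window:]
    return sum(recent) / window

# Illustrative monthly demand figures (made up)
demand = [100, 110, 105, 115, 120, 118]
next_month = moving_average_forecast(demand, window=3)
```

In practice a role like this would reach for richer models (ARIMA, gradient boosting), but the shape of the task, turning a history into a forward estimate, is the same.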
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Pontoon
ability to deliver interlocking benefits across teams and platforms. Strong statistical grounding, including expertise in forecasting, clustering, optimization, and predictive modeling. Proficiency in Python, SQL, and cloud platforms (especially BigQuery). Commercial acumen in leveraging data to shape strategy and unlock business value. Exceptional data storytelling skills, translating complex models into engaging narratives. Experience in scaling proof-of-concept …
in technology delivery, with 3+ years leading cloud data projects (GCP preferred). Must have experience delivering data platforms in banking or financial services. Strong knowledge of GCP services (BigQuery, Dataflow, Pub/Sub). Familiarity with ETL/ELT, data lakehouse architectures, and cloud integration. Excellent leadership, communication, and stakeholder management skills. Understanding of data governance frameworks (GDPR …)
Technical Lead for cloud data projects (GCP preferred) Strong understanding of Agile/Scrum, DevOps, CI/CD, and cloud-native delivery models Familiarity with GCP services such as BigQuery, Dataflow, Cloud Composer, and Pub/Sub Knowledge of modern data architectures including lakehouse, ELT, and schema design Awareness of compliance and access-control frameworks such as GDPR, HIPAA, and RBAC Excellent …
London, South East, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
solutions. Experience with front-end development (React/JavaScript) and DevOps/maintenance tasks. Proficiency in building Agentic systems and multi-agent architectures. ETL and data management experience (Postgres, BigQuery, Azure, Snowflake). Ability to design solutions, validate requirements and anticipate risks independently. Experience coordinating with multiple engineering and product teams. Preferred Qualifications: Experience in graph databases and advanced …
security-sensitive environments. Essential Skills: Strong hands-on experience as a DevOps engineer with Google Cloud Platform. Proven expertise with AI/ML services on GCP, particularly Vertex AI, BigQuery ML, and TensorFlow. Solid understanding of MLOps principles, including building CI/CD pipelines for machine learning. Proficiency in Infrastructure as Code (IaC) using Terraform. Experience deploying and managing …
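The MLOps CI/CD principle this listing mentions, gating a model promotion on quality metrics before deployment, can be sketched in plain Python. The metric names and thresholds are hypothetical; in a real GCP pipeline this check would run as a CI step ahead of a Vertex AI deployment.

```python
def should_promote(metrics: dict, min_accuracy: float = 0.9, max_latency_ms: float = 50.0) -> bool:
    """Return True only if the candidate model clears every quality gate.

    Missing metrics fail closed: absent accuracy counts as 0.0 and absent
    latency counts as infinite, so an incomplete report never promotes.
    """
    return (
        metrics.get("accuracy", 0.0) >= min_accuracy
        and metrics.get("latency_ms", float("inf")) <= max_latency_ms
    )

candidate = {"accuracy": 0.93, "latency_ms": 42.0}
```

A fail-closed gate like this is the usual design choice: a broken metrics report should block a release rather than wave it through.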
Innovation & mentoring - Champion modern approaches, introduce emerging tech, and guide data teams in best practices. About You Strong track record in cloud-native data architecture with deep GCP expertise (BigQuery, Dataform, GCS, Vertex AI). Experience designing and implementing complex data ecosystems across pipelines, storage, and integration. Skilled in simplifying and modernising architectures while balancing innovation, cost, and governance.
South West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
What you'll do Engineer and analyse large datasets to deliver insights that shape client strategy. Build scalable solutions with Python (Pandas/NumPy), SQL, and cloud tools like BigQuery on GCP. Support senior business experts by turning raw financial data into decisions that matter. Design and run data mapping projects that automate workflows and cut through complexity. … What you bring A solid grounding in investment/financial data (securities, assets, operations experience). Python. SQL. Cloud: GCP/BigQuery. Familiarity with dbt or Databricks. Experience with reinsurance or life insurance projects. To hear more about the Data Engineer contract opportunity, get in touch with Connor Smyth at Anson McCade on 020 7780 6706. Reference: AMC/…
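The transformation work this listing describes, unifying raw financial records into decision-ready aggregates, can be sketched in pure Python. The record shape and field names are invented for illustration; in the role itself this would typically be Pandas or SQL against BigQuery.

```python
from collections import defaultdict

def total_exposure_by_asset(positions):
    """Sum market value (quantity * price) per asset class from position records."""
    totals = defaultdict(float)
    for p in positions:
        totals[p["asset_class"]] += p["quantity"] * p["price"]
    return dict(totals)

# Illustrative positions (made up)
positions = [
    {"asset_class": "equity", "quantity": 10, "price": 101.5},
    {"asset_class": "bond",   "quantity": 5,  "price": 99.0},
    {"asset_class": "equity", "quantity": 2,  "price": 250.0},
]
# equity: 10 * 101.5 + 2 * 250.0 = 1515.0; bond: 5 * 99.0 = 495.0
```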
Data Engineer - Financial Services (Contract) Rate: £425 per day (outside IR35) Type: Contract, 6-month Location: UK (remote, with occasional visits to the client site in Central London, Greater London) …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
GCP ENGINEER 6-MONTH CONTRACT LONDON (FULLY REMOTE) £500-£600 PER DAY This role as a GCP BigQuery Engineer offers the chance to work with a leading energy company focused on leveraging data to improve efficiency and decision-making. You'll be working on complex data projects within Google Cloud, optimising pipelines, and ensuring reliable data processing. The role also involves mentoring others and implementing best practices in BigQuery and dbt. THE COMPANY This energy company is a leader in sustainable and data-driven energy solutions, leveraging cutting-edge technology to optimise operations, forecast demand, and enhance efficiency. With a strong focus on digital transformation, they are investing in cloud-based data platforms to drive smarter decision-making … models. Creating clear and comprehensive documentation to support knowledge sharing and adoption. KEY SKILLS AND REQUIREMENTS To succeed in this role, you should have: Strong commercial experience with Google BigQuery and the GCP data ecosystem. Expertise in dbt, including building and optimising data models. Proficiency in SQL for large-scale data transformations. Experience in ELT processes and best practices …
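The ELT pattern this role centres on, loading raw rows first and then transforming with SQL (as dbt does against BigQuery), can be illustrated locally with Python's standard-library sqlite3 standing in for the warehouse. Table and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Load step: land the raw rows untransformed.
conn.execute("CREATE TABLE raw_readings (site TEXT, kwh REAL)")
conn.executemany(
    "INSERT INTO raw_readings VALUES (?, ?)",
    [("plant_a", 120.0), ("plant_a", 80.0), ("plant_b", 50.0)],
)

# Transform step: a dbt-style model materialised as a table inside the warehouse.
conn.execute(
    "CREATE TABLE site_totals AS "
    "SELECT site, SUM(kwh) AS total_kwh FROM raw_readings GROUP BY site"
)
rows = dict(conn.execute("SELECT site, total_kwh FROM site_totals"))
```

The point of ELT is that the transform runs where the data already lives; dbt's job is to version, test, and order SQL models like `site_totals` rather than to move data itself.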