knowledge (including understanding the trade-offs of using ORMs). Experience working with and integrating complex external REST APIs. Experience working with one or more of the leading cloud platforms (GCP, AWS, Azure). Experience with serverless architectures (Lambda, Fargate, Cloud Run, et al.) and a clear understanding of when not to use them. Experience with … back-end web application frameworks (Django, FastAPI, etc.). Experience with language models and agentic workflows (e.g., LangChain, AutoGPT). Knowledge of model evaluation metrics and techniques. Google Cloud (Cloud Functions, App Engine, Pub/Sub, Stackdriver, etc.). Docker, Kubernetes. TypeScript, Node.js, React, Webpack, etc. Java, C#. Experience within the advertising/media agency space. Terraform.
London, England, United Kingdom Hybrid / WFH Options
DataAnalystJobs.io
Delivery: You'll be responsible for the delivery of complex data solutions, including the ingest of data from a wide variety of data sources into our analytics platforms (typically cloud-based, but some work on our on-premise data analytics platforms), transformation and cleansing of data, and modelling of data into our enterprise data warehouse for consumption by both … upskilling supported for the right individual where necessary): SQL (mandatory): A strong understanding of SQL and comfort reading and writing complex SQL queries, ideally across multiple platforms. Cloud Platforms (highly desirable): Experience working with key services on either GCP (preferred), AWS or Azure. Key services include cloud storage, containerisation, event-driven services, orchestration, cloud functions and basic security/user management. Data Warehousing (highly desirable): Experience working on a data warehouse solution irrespective of underlying technology. Experience using cloud data warehouse technology would also be beneficial - Snowflake (preferred), Google BigQuery, AWS Redshift or Azure Synapse. Data Pipeline (highly desirable): Demonstrable experience working with data from a wide variety …
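The transformation and cleansing work this listing describes can be illustrated with a minimal Python sketch; the field names, source date format, and rules below are assumptions for illustration only, not taken from any actual platform:

```python
from datetime import datetime

def cleanse_record(raw: dict) -> dict:
    """Cleanse and normalise one raw source record before a warehouse load.

    Hypothetical field names (customer_id, signup_date, email) chosen for
    illustration; a real ingest layer maps whatever the source provides.
    """
    return {
        # Source systems often mix types and pad values; coerce and trim.
        "customer_id": str(raw["customer_id"]).strip(),
        # Normalise an assumed DD/MM/YYYY source format to ISO 8601.
        "signup_date": datetime.strptime(
            raw["signup_date"].strip(), "%d/%m/%Y"
        ).date().isoformat(),
        # Lower-case emails so joins and deduplication behave consistently.
        "email": raw["email"].strip().lower(),
    }

raw_rows = [
    {"customer_id": " 42", "signup_date": "01/03/2024 ", "email": "Ann@Example.COM"},
]
clean_rows = [cleanse_record(r) for r in raw_rows]
print(clean_rows[0])  # normalised record ready for the warehouse
```

In practice this logic would live in the transformation layer (dbt models or pipeline code) rather than in ad-hoc scripts; the point is that normalisation happens before the data lands in the enterprise warehouse.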
past and you know why you want to avoid them. Experience working with and integrating complex external REST APIs. Experience working with one or more of the leading cloud platforms (GCP, AWS, Azure). Experience with serverless architectures (Lambda, Fargate, Cloud Run, et al.) and a clear understanding of when not to use them. Experience with message … and when to use them. You are familiar with trunk-based development in git. Experience with back-end web application frameworks (Django, FastAPI, etc.). NICE TO HAVE: Google Cloud (Cloud Functions, App Engine, Pub/Sub, Stackdriver, etc.). Docker, Kubernetes. TypeScript, Node.js, React, Webpack, etc. Java, C#. Experience within the advertising/media agency space. Terraform. If …
success. Role Description: We are seeking an experienced Senior SRE/DevOps Engineer to play a key role in our cloud migration initiative from AWS to Google Cloud Platform (GCP) for our high-traffic social media application. You will be responsible for driving the design, implementation, and operational excellence of our new GCP environment, working collaboratively … address issues and fine-tune system performance. Security & Compliance: Ensure the infrastructure complies with relevant security standards and policies, implementing practices like VPC configuration, IAM best practices, and Cloud Armor for protection. Required Qualifications: Experience: 5+ years of experience in DevOps, SRE, or cloud operations. Deep expertise in AWS services and hands-on experience with Google Cloud Platform services (Cloud SQL, GKE, Cloud Functions, Cloud Storage, Cloud CDN, etc.). Containerization & Orchestration: Proficiency with Docker …
London, England, United Kingdom Hybrid / WFH Options
Substance Global
data pipelines, ensuring data integrity, and enabling data-driven decision-making. The ideal candidate will have a strong background in data engineering and analytics, with expertise in Google Cloud Platform (GCP) services. This role requires a blend of technical proficiency, analytical thinking, effective project & time management, and communication skills. You will have the opportunity to engage in diverse … documentation, coding best practices. Take initiative to improve and optimise analytics engineering workflows and platforms. Key Responsibilities: Design, develop, and maintain scalable data pipelines on GCP (or other cloud services, e.g. Snowflake) using services such as BigQuery and Cloud Functions. Designing and building data models for analytics, reporting, and data science applications in Looker/Metabase … data warehousing skills demonstrated in data environments. Excellent Python & SQL and data transformation skills (ideally proficient in dbt or similar). Familiarity with at least one of these cloud technologies: Snowflake, AWS, Google Cloud, Microsoft Azure. Good attention to detail to highlight and address data quality issues. Excellent time management and proactive problem-solving skills. What …
London, England, United Kingdom Hybrid / WFH Options
Elwood Technologies Services Limited
You’ll play a critical role in maintaining uptime, resolving incidents, and automating infrastructure for Elwood’s EMS and PMS platforms, which are built on AWS and GCP cloud environments. This is a highly visible role that blends deep technical ownership with cross-functional collaboration. In addition to core SRE responsibilities, you will support our Technical Account Managers … GCP). Experience with automation tools like Terraform. Proficient in at least one scripting language (Python, Bash, Go, etc.). Solid understanding of Linux systems, networking, and cloud-based architectures. Experience working with container orchestration platforms like Kubernetes. Proficient with CI/CD pipelines, preferably with cloud-native tools (e.g., GitHub). Ability to … with client-impact triage, working cross-functionally with account managers or product teams. Proficiency with Datadog or similar observability platforms. Knowledge of serverless architectures (e.g., AWS Lambda, GCP Cloud Functions). Familiarity with RDBMS and NoSQL databases, such as RDS, Cloud SQL, DynamoDB. Prior experience in fintech, trading platforms, or 24/7 financial infrastructure. Strong understanding …
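One staple of the incident-handling and automation work this role describes is retrying transient failures with exponential backoff. The sketch below is a generic, dependency-free illustration; the flaky call and delay values are invented for the demo:

```python
import time

def retry_with_backoff(fn, max_attempts=4, base_delay=0.1, sleep=time.sleep):
    """Call fn(), retrying on exception with exponential backoff.

    Transient failures (a flaky API, a brief network blip) are retried;
    a failure on the final attempt is re-raised to the caller.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s, ...

# Demo: a call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

# Inject a no-op sleep so the demo runs instantly.
print(retry_with_backoff(flaky, sleep=lambda s: None))  # prints "ok"
```

Real SRE tooling usually adds jitter to the delay and retries only on error classes known to be transient, but the control flow is the same.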
London, England, United Kingdom Hybrid / WFH Options
Noir
Engineer, you’ll be responsible for building and maintaining scalable, efficient, and reliable data pipelines. You’ll work across a modern tech stack with a strong focus on Google Cloud Platform (GCP) and collaborate with various teams to ensure data flows securely and accurately throughout the organisation. Key Responsibilities: Design, build, and maintain robust data pipelines. Work with … Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Dataflow, Pub/Sub, Cloud Functions and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools like Terraform. Ensure data quality and compliance with governance standards. …
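The "ensure data quality" responsibility named here typically starts with simple pre-load checks. A stdlib-only sketch follows; the column names are invented for illustration, and a real pipeline might instead use a framework such as dbt tests or Great Expectations:

```python
def quality_report(rows, required, non_null):
    """Run simple data-quality checks before a pipeline loads to the warehouse.

    Returns a list of human-readable violations (empty list = checks pass).
    """
    violations = []
    for i, row in enumerate(rows):
        # Schema check: every required column must be present.
        missing = required - row.keys()
        if missing:
            violations.append(f"row {i}: missing columns {sorted(missing)}")
        # Null check: key columns must carry a value.
        for col in non_null:
            if row.get(col) in (None, ""):
                violations.append(f"row {i}: null value in {col}")
    return violations

rows = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": None, "amount": 4.50},
    {"amount": 2.00},
]
report = quality_report(rows, required={"order_id", "amount"}, non_null=["order_id"])
for line in report:
    print(line)
```

Gating the load on an empty report (and routing violations to monitoring) is what turns these checks into a governance control rather than a log message.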
design and implementation of scalable, secure, and high-performance data solutions. You’ll play a key role in architecting modern data platforms using Snowflake, SQL, Python, and leading cloud technologies to support advanced analytics, reporting, and machine learning initiatives across the business. Key Responsibilities: Design and maintain end-to-end data architectures, data models, and pipelines in Snowflake … and cloud platforms (AWS, Azure, or GCP). Develop and optimize scalable ELT/ETL processes using SQL and Python. Define data governance, metadata management, and security best practices. Collaborate with data engineers, analysts, product managers, and stakeholders to understand data needs and translate them into robust architectural solutions. Oversee data quality, lineage, and observability initiatives. Recommend and … for large-scale data sets. Ensure platform scalability, cost-efficiency, and system reliability. Required Skills & Experience: Proven experience as a Data Architect or Senior Data Engineer working on cloud-native data platforms. Strong hands-on experience with Snowflake: data modeling, performance tuning, security configuration, and data sharing. Proficiency in SQL for complex querying, optimization, and stored procedures. Strong …
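The ELT load step mentioned above usually relies on an idempotent merge (Snowflake exposes this as the SQL MERGE statement). The plain-Python sketch below shows only the semantics, under an assumed surrogate-key column named `id`:

```python
def merge_upsert(target, updates, key="id"):
    """Idempotent merge of staged rows into a target table.

    Rows whose key exists in the target are updated; new keys are
    inserted. Re-applying the same staged batch changes nothing,
    which is what makes reruns of a failed load safe.
    """
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        # Insert or update: staged values win over existing ones.
        by_key[row[key]] = {**by_key.get(row[key], {}), **row}
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]
staged = [{"id": 2, "name": "beta-v2"}, {"id": 3, "name": "gamma"}]
merged = merge_upsert(target, staged)
print(merged)
```

In a warehouse the same effect comes from `MERGE INTO target USING staged ON target.id = staged.id ...`; pipelines lean on it so that retries never double-insert.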
Development: Design, develop, and maintain robust and scalable backend systems and APIs. Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load it into Google Cloud environments. Data Transformation: Implement data transformation processes including cleansing, normalization, and aggregation to ensure data quality and consistency. Data Modeling: Develop and maintain data models and schemas to support efficient data storage and retrieval on Google Cloud platforms. Data Integration: Integrate data from multiple sources (on-prem and cloud-based) using Cloud Composer or other tools. Data Lakes: Build data lakes using Google Cloud services such as BigQuery. Performance Optimization: Optimize data pipelines and queries for improved performance and scalability … Experience with data analytics and Big Data technologies. Knowledge of cloud security best practices and compliance standards. Experience with agile development methodologies. GCP certifications (e.g., Google Cloud Certified Professional Cloud Developer). Seniority level: Mid-Senior level. Employment type: Contract. Job function: Information Technology. Industries: Software Development.
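The aggregation step named under "Data Transformation" can be sketched in a few lines; the event schema and the daily-revenue rollup below are illustrative assumptions, standing in for whatever a real BigQuery pipeline would materialise:

```python
from collections import defaultdict

def aggregate_daily_revenue(events):
    """Roll raw events up into a small analytics model:
    total revenue per (date, product) pair, sorted for stable output.
    Field names are illustrative, not from any real schema.
    """
    totals = defaultdict(float)
    for e in events:
        totals[(e["date"], e["product"])] += e["amount"]
    return [
        {"date": d, "product": p, "revenue": round(v, 2)}
        for (d, p), v in sorted(totals.items())
    ]

events = [
    {"date": "2024-03-01", "product": "a", "amount": 5.0},
    {"date": "2024-03-01", "product": "a", "amount": 2.5},
    {"date": "2024-03-01", "product": "b", "amount": 1.0},
]
print(aggregate_daily_revenue(events))
```

At warehouse scale the equivalent is a `GROUP BY date, product` query scheduled by the pipeline; the Python version just makes the shape of the rollup concrete.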
range Direct message the job poster from 83zero: Building Technical Support Engineering teams in EMEA & US. Our client is seeking a skilled Platform Engineer to join their innovative Cloud Engineering team. This role is crucial in utilizing AWS services to monitor, maintain, and improve the infrastructure supporting their SaaS platform. The ideal candidate will be a self-starter … CD pipelines, as well as broader monitoring and automation efforts to support high-quality code development. KEY RESPONSIBILITIES: Monitoring existing cross-platform environments in AWS and Azure. Optimizing cloud resource usage and managing associated costs. Designing, building, and maintaining scalable and secure AWS cloud environments. Automating repetitive, manual tasks to enhance efficiency and reduce errors, freeing … technologies and best practices, and sharing insights with the wider engineering department. CORE COMPETENCIES AND SKILLS: Experience with serverless computing services (e.g., AWS Lambda, Azure Functions, Google Cloud Functions). Strong knowledge of Infrastructure as Code (IaC) tools, preferably Terraform or CloudFormation. Ability to analyze issues using CloudWatch logs. Deep understanding of cloud …
City of London, England, United Kingdom Hybrid / WFH Options
Anson McCade
of intelligent applications, supporting workflows across LLMs, autonomous agents, semantic search, RAG pipelines, and memory-augmented reasoning systems. Key Responsibilities: Design and build scalable, secure data pipelines using Google Cloud Platform (GCP) services including BigQuery, Dataflow, Cloud Functions, Pub/Sub, and Vertex AI. Support AI engineers by managing structured and unstructured data ingestion … and observability. Optimize data workflows for performance, cost-efficiency, and latency. Maintain strong data governance, access control, and compliance practices. Tech Stack: Languages: Python, SQL. Cloud: Google Cloud Platform (BigQuery, Dataflow, Vertex AI, Cloud Run, Pub/Sub). Databases: PostgreSQL, BigQuery, Pinecone, FAISS, Chroma. Tools: dbt, Airflow, Terraform, Docker, GitHub Actions. AI Frameworks: LangChain …
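The RAG pipelines this listing mentions hinge on a retrieval step: embed the query, rank documents by similarity, and pass the top hits to the model as context. The sketch below substitutes a toy bag-of-words similarity for a real embedding model (such as one served from Vertex AI, or vectors stored in Pinecone/FAISS/Chroma), purely to keep it self-contained:

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'. A real pipeline would call an
    embedding model; a Counter of tokens keeps this sketch runnable
    while preserving the retrieve-then-generate shape of RAG.
    """
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse Counter vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=1):
    """Return the k documents most similar to the query, i.e. the
    context that would be stuffed into the LLM prompt."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Dataflow runs batch and streaming pipelines",
    "BigQuery is a serverless data warehouse",
    "Pub/Sub delivers asynchronous messages",
]
print(retrieve("serverless warehouse for analytics", docs))
```

The data-engineering work in this role sits around this loop: keeping the document store fresh, chunked, and governed so retrieval returns the right context.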
to clients' first-party data, we create innovative solutions and unlock the power of AI, delivering tangible ROIs for our clients. Our expert consultants orchestrate Google Marketing and Google Cloud technologies to unlock ultimate AI-powered performance within our client’s organisation. About the Role: As one of our Data Engineers, you will be part of a team … automation solutions and data AI Agents. You will be designing and proposing effective combinations of Google Marketing Platform tools (GA4, Campaign Manager 360, Search Ads 360, etc.) and Google Cloud solutions (BigQuery, BQ Sharing (Analytics Hub), Cloud Storage, APIs, Compute Engine, etc.) to address specific client needs. You will be designing, maintaining, and optimising data infrastructure … and clients to uncover their unique marketing challenges, business objectives, and current architectures. You will translate their needs into actionable product roadmaps that leverage Google Marketing Platform and Google Cloud solutions. Responsibilities: Lead the design, development, and optimization of scalable data pipelines using Python and GCP. Configure and manage complex integrations between Google Marketing Platform tools and Google Cloud …
HCLTech is a global technology company, home to 219,000+ people across 54 countries, delivering industry-leading capabilities centered on digital, engineering and cloud, powered by a broad portfolio of technology services and products. We work with clients across all major verticals, providing industry solutions for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and … modelling, demonstrable experience interpreting requirements and producing design artefacts for implementation by development teams. Java 8 and higher; Spring Boot (preferable); RDBMS (Oracle, Postgres). Desired: Microservices; Kubernetes; Google Cloud Platform (GCP) or any other cloud; GKE; Cloud SQL; Cloud Functions; GCS and labelling; Pub/Sub and Dataflow; BigQuery …/2003/2000/XP, Windows 98, UNIX, RedHat, Linux, DOS. IAM Tools: ForgeRock CIAM, IDM, AM, DS. Cloud deployment: Docker/GKE/Terraform, Google Cloud environment (GKE). App server: WebSphere, JBoss, Tomcat, WebLogic. Languages: C, C++, Java/J2EE. Web Development: HTML, CSS, JavaScript, JSP, Servlets. Database: MS SQL Server 2000/2005/…