Data Integration Tools: Knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: Experience with data warehousing tools (e.g., Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: Familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience: 2+ years of …
…data platforms on Google Cloud Platform, with a focus on data quality at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage …
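As an illustration of the "data quality at scale" work this listing describes, here is a minimal PySpark sketch that counts nulls per column in a staged table export. The bucket path, table layout, and column choices are hypothetical placeholders, not details from the listing.

```python
# Minimal PySpark data-quality sketch: count nulls per column in a table export.
# All paths and names below are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-null-check").getOrCreate()

# Hypothetical Parquet export staged in Cloud Storage.
df = spark.read.parquet("gs://example-bucket/exports/orders/")

# Produce one row of null counts, one output column per source column.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
)
null_counts.show(truncate=False)
```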
…and distributed data processing frameworks. Proven experience with cloud data platforms and services (e.g., Azure Data Factory, Azure Databricks, AWS Glue, Google Cloud Dataflow, BigQuery). What's On Offer: An exceptional opportunity to contribute to high-impact national initiatives within a leading government entity. Contact: Manpreet Kaur. Quote …
Extensive experience as a BI Developer/Analytics Engineer, specialising in Looker (dashboarding, advanced LookML). Strong SQL proficiency with cloud data warehouses (e.g., BigQuery). Solid understanding of data warehousing and modelling. Proven background in marketing data and analytics (e.g., campaign performance, web analytics). Experience translating business …
Google Cloud Platform (GCP): This includes a broad knowledge of core GCP services like Compute Engine, App Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, BigQuery, and networking components (VPC, Cloud DNS, etc.). A trainer should be able to go beyond the basics and explain the nuances of each …
…supportive environment that encourages growth and innovation. What we’re looking for: 5+ years of experience in data analytics with advanced SQL skills (Snowflake, BigQuery, etc.); strong experience with BI tools, ideally Looker (LookML a definite plus); excellent communication skills and the confidence to engage with cross-functional teams …
…solutions • In-depth knowledge of the Snowflake platform and capabilities • Relevant experience of working with other cloud data platform solutions such as Databricks, GCP BigQuery, Microsoft Azure, or AWS offerings would also be advantageous • Practical knowledge of GenAI and LLM offerings in the market • Skilled in working on large …
…in advanced SQL, SSIS packages, Python, and shell scripting. Knowledge of ER Studio. Working knowledge of Google Cloud Platform components such as GCS buckets, BigQuery, Kafka, etc.
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Booksy
…of experience in backend and data engineering, with strong system design skills. Practical proficiency in cloud technologies (ideally GCP), with expertise in tools like BigQuery, Dataflow, Pub/Sub, or similar. Hands-on experience with CI/CD tools (e.g., GitLab CI) and infrastructure as code. A strong focus …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
…on experience or strong theoretical knowledge of AI and GenAI techniques and methodologies. Familiarity with cloud AI platforms such as GCP Vertex AI and BigQuery, Azure AI, or similar enterprise-level AI deployment environments. Experience or knowledge of regulatory requirements and frameworks relevant to AI, such as the EU …
…de-normalised views and star schemas. Key Skills, Knowledge, and Experience: Proficiency in ER Studio. Knowledge of Google Cloud Platform components like GCS buckets, BigQuery, Kafka, etc. Experience with advanced SQL, SSIS packages, Python, and Unix shell scripting.
…of these. Their ideal candidate would have 10+ years' experience in Data Engineering/Architecture and good knowledge across: Data Warehousing (Snowflake, Redshift, BigQuery); ETL (Data Fabric, Data Mesh); DevOps (IaC, CI/CD, Containers); Leadership/Line Management; Consulting/Client-Facing Experience. In return they would …
…throughput backend systems; Experience with BI/reporting engines or OLAP stores; Deep Ruby/Rails & ActiveRecord expertise; Exposure to ClickHouse/Redshift/BigQuery; Event-driven or stream processing (Kafka, Kinesis); Familiarity with data-viz pipelines (we use Highcharts.js); AWS production experience (EC2, RDS, IAM, VPC); Contributions to …
…data infrastructure, ensuring seamless data flow and insightful analytics. Key Responsibilities: Design, develop, and maintain data pipelines using Google Cloud Platform (GCP) services like BigQuery, Cloud Storage, and Cloud Dataflow. Develop and maintain reports, dashboards, and visualizations using Power BI, ensuring data-driven insights for business stakeholders. Write efficient … scalability. Requirements: 5+ years of experience in data engineering, data analysis, or a related field. Strong experience with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, and Cloud Dataflow. Proficient in Power BI for data visualization and reporting. Strong programming skills in Python, with experience in data processing …
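To make the pipeline responsibilities above concrete, here is a minimal sketch of one such step using the google-cloud-bigquery Python client: loading a CSV from Cloud Storage into a BigQuery table, then verifying the load with a row-count query. The project, bucket, dataset, and table names are hypothetical placeholders, not details from the listing.

```python
# Minimal GCP pipeline step: load a CSV from Cloud Storage into BigQuery,
# then confirm the load with a simple row-count query.
# All resource names below are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses default project credentials

table_id = "example-project.analytics.daily_sales"          # hypothetical table
source_uri = "gs://example-bucket/exports/daily_sales.csv"  # hypothetical file

load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,                 # skip the header row
    autodetect=True,                     # infer the schema from the file
    write_disposition="WRITE_TRUNCATE",  # replace any existing table contents
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=load_config)
load_job.result()  # block until the load completes

rows = client.query(f"SELECT COUNT(*) AS n FROM `{table_id}`").result()
print(f"Loaded {next(iter(rows)).n} rows into {table_id}")
```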
…problems. And any experience of these would be really useful: Familiarity with workforce planning, HR analytics, or organisational design. Experience working with GCP and BigQuery. Knowledge of skills taxonomies or talent intelligence platforms. Prompt engineering skills, with the ability to design and refine prompts for generative AI tools to …