Wavicle is seeking an experienced BigQuery Platform Architect/DBA to provide architecture and implementation direction for projects leveraging BigQuery. The role will be the senior-most expert for BigQuery, setting standards and direction for BigQuery usage, and will act as the strategist and trusted advisor on … BigQuery and related GCP services at clients. The role will also support pre-sales/sales as needed. What you will do: Set direction and standards for BigQuery environments, ensuring high availability, performance, and scalability. Provide direction and expertise on design, implementation, and maintenance for applications leveraging BigQuery. … GCP-native tools (e.g., Cloud Functions, Dataflow). Document database designs, procedures, and best practices for the team. General Qualifications: Proven experience with Google BigQuery (prefer 3+ years) and Google Cloud Platform services (5+ years). Strong proficiency in SQL and BigQuery-specific SQL functions and optimization. Familiarity with …
critical for structuring data for analysis. Proficiency in cloud platforms, such as AWS and GCP, with hands-on experience in services like Databricks, Redshift, BigQuery, and Snowflake, is highly valued. Advanced Python skills for data manipulation, automation, and scripting, using libraries like Pandas and NumPy, are necessary for effective …
needed. Requirements: Strong proficiency in Python for data engineering tasks. Experience with cloud platforms (e.g., AWS, Azure, or GCP), including services like S3, Lambda, BigQuery, or Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such …
on experience with cloud-based data solutions (AWS, Azure, or Google Cloud). Strong understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery). Experience working with structured and unstructured data. Knowledge of data governance, security, and compliance best practices. Education and Experience: Bachelor's degree in …
relational databases (e.g., MySQL, PostgreSQL, Oracle). Experience with cloud-based data platforms (AWS, GCP, Azure) and data warehouse solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Experience in scripting languages such as Python, Shell, or Bash. Strong understanding of data integration, data transformation, and data quality practices. Familiarity …
and automation. Familiarity with cloud environments (AWS, Azure, or Google Cloud) for data infrastructure management. Hands-on experience with data warehousing solutions (e.g., Snowflake, BigQuery, Redshift). Excellent analytical skills and a problem-solving mindset, with attention to detail. Experience in procurement, sustainability, or finance is highly valued. English …
pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services, including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools …
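As a minimal illustration of the ETL pattern these roles describe (extract, transform, load, then serve analytical queries), here is a hedged Python sketch. It uses only the standard library, with sqlite3 standing in locally for a warehouse such as BigQuery; the table, field names, and sample records are hypothetical, not taken from any listing.

```python
import sqlite3

# Hypothetical source records, standing in for rows extracted from an API or bucket.
RAW_ORDERS = [
    {"order_id": 1, "amount": "19.75", "country": "gb"},
    {"order_id": 2, "amount": "5.00", "country": "us"},
    {"order_id": 3, "amount": "42.25", "country": "gb"},
]

def transform(rows):
    """Cast amounts to float and normalise country codes to upper case."""
    return [
        {"order_id": r["order_id"],
         "amount": float(r["amount"]),
         "country": r["country"].upper()}
        for r in rows
    ]

def load(rows, conn):
    """Load transformed rows into a warehouse table (sqlite3 as a local stand-in)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, country TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :country)", rows
    )

def revenue_by_country(conn):
    """The kind of aggregate query a BigQuery-style warehouse would serve."""
    cur = conn.execute(
        "SELECT country, SUM(amount) FROM orders GROUP BY country ORDER BY country"
    )
    return dict(cur.fetchall())

conn = sqlite3.connect(":memory:")
load(transform(RAW_ORDERS), conn)
print(revenue_by_country(conn))  # {'GB': 62.0, 'US': 5.0}
```

In the roles above, the same shape would typically be orchestrated by Cloud Composer (Airflow), with the load step writing to BigQuery via its client library rather than sqlite3.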
Washington, Washington DC, United States Hybrid / WFH Options
SMX
and NoSQL databases. Proficiency in writing complex queries and applying database optimization techniques. Data Warehousing: Experience with data warehousing solutions like Amazon Redshift, Google BigQuery, or Microsoft Azure SQL Data Warehouse. Soft Skills: Strong communication and collaboration skills. Excellent problem-solving skills. US Citizenship is required to obtain a …
techniques, including star and snowflake schemas, for efficient data analysis. Familiarity with cloud platforms such as AWS or GCP, including services like Databricks, Redshift, BigQuery, and Snowflake. Strong Python skills for data manipulation, scripting, and automation using libraries like Pandas and NumPy. Experience managing data architecture within data warehouses …
in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra). • In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). • Experience with big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud …
Expertise in data architecture, data modelling, data analysis, and data migration. Strong knowledge of SQL, NoSQL, and cloud-based databases (e.g., AWS Redshift, Snowflake, Google BigQuery, Azure Synapse). Experience in ETL development, data pipeline automation, and data integration strategies. Familiarity with AI-driven analytics. Strong understanding of data governance …
attention to detail and commitment to data accuracy. Preferred Skills: • Experience with cloud-based Data Warehouse solutions (e.g., Oracle FDIP, Snowflake, Amazon Redshift, Google BigQuery). • Relevant certifications in data management or ETL tools are a plus. Min Citizenship Status Required: Must be a U.S. Citizen or Work Visa …
and Drive. What will I be doing? Design, build, and maintain scalable and reliable data pipelines. Manage Zeelo's serverless centralized data architecture (Fivetran, BigQuery, dbt, and other tools) that supports analytical functions across the business. Design, build, and maintain ETL, ELT and other data pipelines for purposes to …
designing, building, and maintaining scalable data pipelines and ETL processes. Proficiency in SQL and experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one …
for data analysis, machine learning, and data visualization. In-depth knowledge of cloud platforms (AWS, Azure, Google Cloud) and related data services (e.g., S3, BigQuery, Redshift, Data Lakes). Expertise in SQL for querying large datasets and optimizing performance. Experience working with big data technologies such as Hadoop, Apache …
a production environment. • 3+ years of experience in programming with Python • 3+ years of hands-on experience utilizing Google Cloud Platform (GCP) services, including BigQuery and Google Cloud Storage to efficiently manage and process large datasets, as well as Cloud Composer and/or Cloud Run. • Experience with version …
Washington, Washington DC, United States Hybrid / WFH Options
Digital Management, Inc
in the Task Order • Excellent communication skills. Preferred Skills: • Experience with cloud-based Data Warehouse platforms and solutions (e.g., Oracle FDIP, Snowflake, Amazon Redshift, Google BigQuery). • Relevant certifications in data management or ETL tools are a plus. Experience with one or more of the following systems: Maximo, PeopleSoft FSCM …