Wavicle is seeking an experienced Google BigQuery Platform Architect/DBA to provide architecture and implementation direction for projects leveraging BigQuery. The role will be the senior-most expert for BigQuery, setting standards and direction for BigQuery usage and serving as the strategist and trusted advisor on … BigQuery and related GCP services at clients. The role will also support pre-sales/sales as needed. What you will do: Set direction and standards for BigQuery environments, ensuring high availability, performance, and scalability. Provide direction and expertise on design, implementation, and maintenance for applications leveraging BigQuery. … GCP-native tools (e.g., Cloud Functions, Dataflow). Document database designs, procedures, and best practices for the team. General Qualifications: Proven experience with Google BigQuery (3+ years preferred) and Google Cloud Platform services (5+ years). Strong proficiency in SQL and BigQuery-specific SQL functions and optimization. Familiarity with More ❯
critical for structuring data for analysis. Proficiency in cloud platforms, such as AWS and GCP, with hands-on experience in services like Databricks, Redshift, BigQuery, and Snowflake, is highly valued. Advanced Python skills for data manipulation, automation, and scripting, using libraries like Pandas and NumPy, are necessary for effective More ❯
needed. Requirements: Strong proficiency in Python for data engineering tasks. Experience with cloud platforms (e.g., AWS, Azure, or GCP), including services like S3, Lambda, BigQuery, or Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such More ❯
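As a hedged illustration of the ETL process these listings refer to, the sketch below extracts rows from CSV text, transforms them, and loads them into SQLite using only the Python standard library. The table, columns, and sample data are invented for the example.

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV input (an in-memory string standing in for a source file).
raw = """order_id,amount,currency
1,19.99,usd
2,5.00,USD
3,42.50,usd
"""
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalize currency codes and convert amounts to integer cents.
cleaned = [
    (int(r["order_id"]), int(round(float(r["amount"]) * 100)), r["currency"].upper())
    for r in rows
]

# Load: write the transformed rows into a warehouse-style table and query it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount_cents INTEGER, currency TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)

total = conn.execute("SELECT SUM(amount_cents) FROM orders").fetchone()[0]
print(total)  # 6749
```

The same extract/transform/load shape applies whether the target is SQLite or a cloud warehouse such as BigQuery or Redshift; only the client library changes.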
on experience with cloud-based data solutions (AWS, Azure, or Google Cloud). Strong understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery). Experience working with structured and unstructured data. Knowledge of data governance, security, and compliance best practices. Education and Experience: Bachelor's degree in More ❯
relational databases (e.g., MySQL, PostgreSQL, Oracle). Experience with cloud-based data platforms (AWS, GCP, Azure) and data warehouse solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Experience in scripting languages such as Python, Shell, or Bash. Strong understanding of data integration, data transformation, and data quality practices. Familiarity More ❯
pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services, including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools More ❯
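Cloud Composer orchestrates pipeline tasks as a dependency graph (an Airflow DAG). As a hedged, library-free sketch of that idea, the snippet below resolves invented task names into execution order using only the standard library; a real Composer pipeline would declare these as Airflow operators instead.

```python
from graphlib import TopologicalSorter

# Invented task names modeling a typical extract -> load -> transform -> publish
# pipeline. Each key depends on the tasks in its value set, as in an Airflow DAG.
dag = {
    "extract_gcs": set(),
    "load_bigquery": {"extract_gcs"},
    "transform_sql": {"load_bigquery"},
    "publish_pubsub": {"transform_sql"},
}

# static_order() yields tasks so that every task runs after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract_gcs', 'load_bigquery', 'transform_sql', 'publish_pubsub']
```

The scheduler's job, whether this toy sorter or Composer itself, is the same: never start a task before everything it depends on has finished.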
Washington, Washington DC, United States Hybrid / WFH Options
SMX
and NoSQL databases. Proficiency in writing complex queries and applying database optimization techniques. Data Warehousing: Experience with data warehousing solutions like Amazon Redshift, Google BigQuery, or Microsoft Azure SQL Data Warehouse. Soft Skills: Strong communication and collaboration skills. Excellent problem-solving skills. US Citizenship is required to obtain a More ❯
techniques, including star and snowflake schemas, for efficient data analysis. Familiarity with cloud platforms such as AWS or GCP, including services like Databricks, Redshift, BigQuery, and Snowflake. Strong Python skills for data manipulation, scripting, and automation using libraries like Pandas and NumPy. Experience managing data architecture within data warehouses More ❯
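A star schema joins a central fact table of measures to denormalized dimension tables of attributes. The sketch below shows the shape of such a rollup query using SQLite and invented table names; it illustrates the modeling technique, not any particular warehouse's schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes (invented example data).
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
cur.execute("CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER)")
# The fact table holds measures plus foreign keys into each dimension.
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, date_id INTEGER, revenue REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "books"), (2, "games")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)", [(10, 2023), (11, 2024)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 100.0), (1, 11, 50.0), (2, 11, 75.0)])

# A typical star-schema rollup: join the fact to its dimensions, group by attributes.
result = cur.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON f.product_id = p.product_id
    JOIN dim_date d ON f.date_id = d.date_id
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(result)  # [(2023, 'books', 100.0), (2024, 'books', 50.0), (2024, 'games', 75.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables, trading some join cost for less redundancy.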
technology approach combining talent with software and service expertise. Tasks & Responsibilities: Design, develop, and maintain scalable data pipelines on GCP using services such as BigQuery and Cloud Functions. Collaborate with internal consulting and client teams to understand data requirements and deliver data solutions that meet business needs. Implement data More ❯
in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra). • In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). • Experience with big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud More ❯
Expertise in data architecture, data modelling, data analysis, and data migration. Strong knowledge of SQL, NoSQL, and cloud-based databases (e.g., AWS Redshift, Snowflake, Google BigQuery, Azure Synapse). Experience in ETL development, data pipeline automation, and data integration strategies. Familiarity with AI-driven analytics. Strong understanding of data governance More ❯
attention to detail and commitment to data accuracy. Preferred Skills: • Experience with cloud-based Data Warehouse solutions (e.g., Oracle FDIP, Snowflake, Amazon Redshift, Google BigQuery). • Relevant certifications in data management or ETL tools are a plus. Min Citizenship Status Required: Must be a U.S. Citizen or Work Visa More ❯
and Drive. What will I be doing? Design, build, and maintain scalable and reliable data pipelines. Manage Zeelo's serverless centralized data architecture (Fivetran, BigQuery, dbt, and other tools) that supports analytical functions across the business. Design, build, and maintain ETL, ELT and other data pipelines for purposes to More ❯
designing, building, and maintaining scalable data pipelines and ETL processes. Proficiency in SQL and experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one More ❯
for data analysis, machine learning, and data visualization. In-depth knowledge of cloud platforms (AWS, Azure, Google Cloud) and related data services (e.g., S3, BigQuery, Redshift, Data Lakes). Expertise in SQL for querying large datasets and optimizing performance. Experience working with big data technologies such as Hadoop, Apache More ❯
a production environment. • 3+ years of experience in programming with Python • 3+ years of hands-on experience utilizing Google Cloud Platform (GCP) services, including BigQuery and Google Cloud Storage to efficiently manage and process large datasets, as well as Cloud Composer and/or Cloud Run. • Experience with version More ❯
Washington, Washington DC, United States Hybrid / WFH Options
Digital Management, Inc
in the Task Order. • Excellent communication skills. Preferred Skills: • Experience with cloud-based Data Warehouse platforms and solutions (e.g., Oracle FDIP, Snowflake, Amazon Redshift, Google BigQuery). • Relevant certifications in data management or ETL tools are a plus. Experience with one or more of the following systems: Maximo, PeopleSoft FSCM More ❯
/ELT pipelines and database technologies like PostgreSQL and MongoDB Familiar with major cloud platforms and tools, ideally Amazon Web Services and Snowflake or BigQuery Solid understanding of data transformation, advanced analytics, and API-based data delivery Ability to work across departments with a collaborative, problem-solving mindset and More ❯
data from diverse sources. Strong knowledge of SQL/NoSQL databases and cloud data warehouse technology such as Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform More ❯
tools such as Tableau • Experience working with APIs • Experience working with large-scale spatial datasets (billions of rows) and performing geospatial analysis at scale using BigQuery GIS or similar tools • Experience with advanced analytical modelling techniques, including statistical analysis and predictive modelling, particularly applying these to large-scale datasets to More ❯
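BigQuery GIS exposes functions such as ST_DISTANCE for geodesic measurements over geography columns. As a hedged, standard-library illustration of the underlying computation, the snippet below implements the haversine great-circle distance on a spherical Earth model; the coordinates are example values, and results differ slightly from BigQuery's ellipsoidal (WGS84) calculation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres on a spherical Earth (R = 6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

# Example: central London (51.5074, -0.1278) to central Paris (48.8566, 2.3522).
d = haversine_km(51.5074, -0.1278, 48.8566, 2.3522)
print(round(d, 1))  # roughly 343.6 km
```

At billions of rows, the point of BigQuery GIS is that such per-row distance computations run inside the warehouse rather than in client-side loops like this one.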
stakeholders alike. The Head of BI will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4 Data Lake & Storage: Databricks Delta Lake, Amazon S3 Data Transformation: dbt Cloud Data Warehouse: Snowflake Analytics & Reporting: Power BI, Excel, Snowflake More ❯
ELT workflows. Strong analytic skills related to working with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g., Snowflake, Google BigQuery, Amazon Redshift). Experience with data tools: Hadoop, Spark, Kafka, etc. Code versioning (GitHub integration and automation). Experience with scripting languages such as More ❯
databases. Hands-on experience working with visualization tools including ThoughtSpot, Power BI, or Tableau. Familiarity with leading cloud-based data warehouses such as Azure, BigQuery, AWS Redshift, or Snowflake. Strong analytical and problem-solving abilities to address complex data challenges. Detail-oriented mindset with a focus on data accuracy More ❯