Wavicle is seeking an experienced BigQuery Platform Architect/DBA to provide architecture and implementation direction for projects leveraging BigQuery. The role will be the senior-most expert for BigQuery, setting standards and direction for BigQuery usage, and will act as the strategist and trusted advisor on BigQuery and related GCP services for clients. The role will also support pre-sales/sales as needed. What you will do: Set direction and standards for BigQuery environments, ensuring high availability, performance, and scalability. Provide direction and expertise on the design, implementation, and maintenance of applications leveraging BigQuery. … GCP-native tools (e.g., Cloud Functions, Dataflow). Document database designs, procedures, and best practices for the team. General Qualifications: Proven experience with Google BigQuery (3+ years preferred) and Google Cloud Platform services (5+ years). Strong proficiency in SQL and BigQuery-specific SQL functions and optimization. Familiarity with …
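As an illustration of the BigQuery-specific optimization skills the listing asks for, partitioning and clustering are the standard levers for controlling bytes scanned. This is a minimal sketch; the dataset, table, and column names are invented for illustration, not taken from the listing.

```sql
-- Hypothetical events table (names are illustrative).
CREATE TABLE analytics.events (
  event_ts  TIMESTAMP,
  user_id   STRING,
  action    STRING
)
PARTITION BY DATE(event_ts)   -- prune whole partitions by date
CLUSTER BY user_id;           -- co-locate rows for selective user filters

-- A query filtering on the partition column scans only matching partitions:
SELECT user_id, COUNT(*) AS actions
FROM analytics.events
WHERE DATE(event_ts) = '2024-01-01'
GROUP BY user_id;
```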
… Airflow). Deep knowledge of relational and non-relational databases (e.g., SQL Server, PostgreSQL, MongoDB, Cassandra). Experience with data warehouse solutions (Snowflake, Redshift, BigQuery, etc.). Cloud experience with AWS, Azure, or Google Cloud Platform. Familiarity with CI/CD processes and tools (e.g., Git, Jenkins). Strong …
… MySQL, PostgreSQL, MS SQL Server). • Experience with NoSQL databases (e.g., MongoDB, Cassandra, HBase). • Familiarity with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). • Hands-on experience with ETL frameworks and tools (e.g., Apache NiFi, Talend, Informatica, Airflow). • Knowledge of big data technologies (e.g., Hadoop …
… Tableau, MongoDB (or similar). Strong understanding of ETL processes, data modeling, and data warehousing concepts. Familiarity with cloud-based data platforms such as Google BigQuery, Azure Synapse, or AWS Redshift. Experience working with Python or other programming languages for data processing and automation. Knowledge of APIs, JSON, and integration …
… critical for structuring data for analysis. Proficiency in cloud platforms such as AWS and GCP, with hands-on experience in services like Databricks, Redshift, BigQuery, and Snowflake, is highly valued. Advanced Python skills for data manipulation, automation, and scripting, using libraries like Pandas and NumPy, are necessary for effective …
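The Pandas-based data manipulation mentioned above might look like the following minimal sketch, assuming a small tabular dataset; the column names and values are invented for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical order data; column names are invented for illustration.
orders = pd.DataFrame({
    "region": ["EMEA", "EMEA", "AMER", "AMER"],
    "amount": [120.0, 80.0, 200.0, np.nan],
})

# Typical cleanup + aggregation: fill missing values, then group by region.
orders["amount"] = orders["amount"].fillna(0.0)
summary = orders.groupby("region", as_index=False)["amount"].sum()
print(summary)
```

The same fill-then-aggregate shape scales from a toy frame like this to warehouse extracts pulled from Redshift or BigQuery.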
… needed. Requirements: Strong proficiency in Python for data engineering tasks. Experience with cloud platforms (e.g., AWS, Azure, or GCP), including services like S3, Lambda, BigQuery, or Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies such …
… R, SAS, or other similar technologies. Hands-on experience with big data technologies (e.g., Hadoop, Spark, Databricks) and cloud platforms (e.g., AWS, Azure, Google BigQuery, Snowflake). Strong skills in selecting and managing the appropriate database management technologies, tools, and interfaces to meet business needs. Experience optimizing data systems …
…on experience with cloud-based data solutions (AWS, Azure, or Google Cloud). Strong understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery). Experience working with structured and unstructured data. Knowledge of data governance, security, and compliance best practices. Education and Experience: Bachelor's degree in …
… relational databases (e.g., MySQL, PostgreSQL, Oracle). Experience with cloud-based data platforms (AWS, GCP, Azure) and data warehouse solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Experience in scripting languages such as Python, Shell, or Bash. Strong understanding of data integration, data transformation, and data quality practices. Familiarity …
… and automation. Familiarity with cloud environments (AWS, Azure, or Google Cloud) for data infrastructure management. Hands-on experience with data warehousing solutions (e.g., Snowflake, BigQuery, Redshift). Excellent analytical skills and a problem-solving mindset, with attention to detail. Experience in procurement, sustainability, or finance is highly valued. English …
… pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services, including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools …
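The extract-transform-load shape that services like Cloud Composer and Dataflow orchestrate can be sketched in plain Python. This is a minimal sketch, assuming an in-memory SQLite sink as a stand-in for BigQuery; every function and table name here is illustrative, not from the listing.

```python
import sqlite3

def extract():
    # Stand-in for reading from Cloud Storage or a Pub/Sub subscription.
    return [("2024-01-01", "signup", 3), ("2024-01-01", "login", 10)]

def transform(rows):
    # Keep only events with a positive count (illustrative business rule).
    return [r for r in rows if r[2] > 0]

def load(rows, conn):
    # In the real pipeline this would be a BigQuery load job, not SQLite.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (day TEXT, action TEXT, n INTEGER)"
    )
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(n) FROM events").fetchone()[0]
print(total)
```

In production, a scheduler such as Cloud Composer would run each stage as a task and handle retries; the function boundaries above are where those tasks would split.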
London, South East England, United Kingdom Hybrid / WFH Options
Noir
Washington, Washington DC, United States Hybrid / WFH Options
SMX
… and NoSQL databases. Proficiency in writing complex queries and applying database optimization techniques. Data Warehousing: Experience with data warehousing solutions like Amazon Redshift, Google BigQuery, or Microsoft Azure SQL Data Warehouse. Soft Skills: Strong communication and collaboration skills. Excellent problem-solving skills. US citizenship is required to obtain a …
… techniques, including star and snowflake schemas, for efficient data analysis. Familiarity with cloud platforms such as AWS or GCP, including services like Databricks, Redshift, BigQuery, and Snowflake. Strong Python skills for data manipulation, scripting, and automation using libraries like Pandas and NumPy. Experience managing data architecture within data warehouses …
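A star schema of the sort referenced above joins one fact table to dimension tables on surrogate keys. A minimal runnable sketch using SQLite (the tables and columns are invented for illustration; in a warehouse this would live in Redshift, BigQuery, or Snowflake):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per product, keyed by a surrogate id.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
# Fact table: one row per sale, referencing the dimension by key.
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 5), (1, 2), (2, 4)])

# The canonical star-schema query: aggregate facts, label via the dimension.
rows = cur.execute("""
    SELECT p.name, SUM(s.qty)
    FROM fact_sales s
    JOIN dim_product p ON p.product_id = s.product_id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables, adding joins in exchange for less redundancy.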
… in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra). • In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). • Experience with big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud …
Atlanta, Georgia, United States Hybrid / WFH Options
Matlen Silver
… committed to data quality. Familiarity with Agile development methodologies. Preferred Qualifications: Snowflake certifications. Experience with cloud-based data warehousing platforms (e.g., AWS Redshift, Google BigQuery). Knowledge of data integration and ETL/ELT processes. Understanding of banking industry data and regulatory requirements.
… Expertise in data architecture, data modelling, data analysis, and data migration. Strong knowledge of SQL, NoSQL, and cloud-based databases (e.g., AWS Redshift, Snowflake, Google BigQuery, Azure Synapse). Experience in ETL development, data pipeline automation, and data integration strategies. Familiarity with AI-driven analytics. Strong understanding of data governance …
… attention to detail and commitment to data accuracy. Preferred Skills: • Experience with cloud-based Data Warehouse solutions (e.g., Oracle FDIP, Snowflake, Amazon Redshift, Google BigQuery). • Relevant certifications in data management or ETL tools are a plus. Min Citizenship Status Required: Must be a U.S. Citizen or Work Visa …
… and load data from diverse sources. Leverage cloud platforms (AWS, Azure, GCP) and their relevant services for data engineering tasks (e.g., S3, Redshift, Databricks, BigQuery, Azure Data Factory, Airflow). Use Pandas for complex data transformations and analysis. Set up and manage CI/CD pipelines for automated testing and …
… solutions. Strong programming skills in Python, Scala, or Java. Understanding of data governance, data security, and best practices. Preferred Qualifications: Experience with Snowflake, Redshift, BigQuery, or similar data warehouse solutions. Familiarity with containerization (Docker, Kubernetes). Hands-on experience with real-time data processing (Kafka, Flink). Knowledge of …
… and Drive. What will I be doing? Design, build, and maintain scalable and reliable data pipelines. Manage Zeelo's serverless centralized data architecture (Fivetran, BigQuery, dbt, and other tools) that supports analytical functions across the business. Design, build, and maintain ETL, ELT, and other data pipelines for purposes to …
… designing, building, and maintaining scalable data pipelines and ETL processes. Proficiency in SQL and experience working with relational and non-relational databases (e.g., Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one …