GCP Data Engineer

Job Summary:

We are seeking a highly motivated and experienced GCP Data Engineer to join our growing team. The ideal candidate will be responsible for designing, developing, and maintaining scalable and robust data pipelines and architectures on Google Cloud Platform. You will work closely with data scientists, analysts, and other stakeholders to ensure data availability, quality, and usability, contributing to our data-driven initiatives.

Required Qualifications:

  • Bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field.
  • Proven experience (typically 5+ years) as a Data Engineer, with a strong focus on Google Cloud Platform.
  • In-depth knowledge and hands-on experience with core GCP data services, including:
      • BigQuery: data warehousing, querying, and analytics.
      • Cloud Dataflow: batch and stream data processing.
      • Cloud Pub/Sub: real-time messaging and event ingestion.
      • Cloud Storage: scalable and durable object storage.
      • Cloud Composer (Apache Airflow): workflow orchestration and pipeline management.
      • Cloud Dataproc: managed Hadoop and Spark clusters (good to have).
      • Cloud SQL / Cloud Spanner / Cloud Bigtable: relational and NoSQL databases.
  • Strong proficiency in SQL for data manipulation, querying, and optimization.
  • Expertise in at least one programming language commonly used in data engineering, preferably Python; Scala or Java is a plus.
  • Solid understanding of data warehousing concepts, data modeling techniques (dimensional modeling, Kimball, Inmon), and ETL/ELT processes.
  • Experience with version control systems (e.g., Git).
  • Strong analytical, problem-solving, and debugging skills.
  • Excellent communication and collaboration skills, with the ability to work effectively in a team environment.

Preferred Qualifications:

  • Google Cloud Professional Data Engineer certification.
  • Experience with data visualization tools (e.g., Looker, Tableau, Power BI).
  • Familiarity with CI/CD pipelines for data engineering workflows.
  • Knowledge of data governance frameworks and tools.
  • Experience with real-time data processing and streaming architectures.
  • Basic understanding of machine learning concepts and how to operationalize ML models.

Job Details

Company
HCLTech
Location
London Area, United Kingdom