Senior Data Scientist
This is a senior leadership position where you'll drive business transformation through data, from developing strategic roadmaps to delivering complex, large-scale architectures. You are a leader in data and IT consulting with a proven track record of delivering business value. You are comfortable in high-stakes, business-critical environments and have a natural ability to build trust at board level. Your passion for innovative data solutions is matched by your technical depth and your ability to simplify complex concepts for diverse audiences.
Responsibilities
Success at GlobalLogic is built on strong core competencies. We seek a professional who excels in:
- Driving outcomes through influencing, decision-making, and negotiation.
- Proactively engaging clients and stakeholders to deliver high-value solutions, nurturing relationships, and providing/receiving candid feedback.
- Managing stakeholders and building relationships, with active listening and client empathy to understand needs.
- Confidently conveying complex information with exceptional communication and presentation skills, using emotional intelligence to resolve conflict constructively.
- Applying strong business acumen to link technical solutions to strategic value.
- Demonstrating high resilience and adaptability, effectively managing competing priorities, and championing continual learning.
Additionally, you will:
- Define data architecture vision, strategy, and roadmaps with C-level and board-level stakeholders, leading strategic IT/business transformations.
- Architect scalable data processing and analytics solutions, including enterprise data lakes and data warehouses, and lead business and solution architecture across cloud and associated technologies.
- Develop and implement data solutions hands-on, design robust data pipelines for machine learning and business intelligence, and integrate varied data sources (an illustrative pipeline sketch follows this list).
- Interface with clients to understand their needs, communicate data-driven insights, foster trust, and provide subject matter expertise for proposals and RFPs.
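To give a concrete flavour of the pipeline work above, the sketch below shows a minimal PySpark batch job that turns raw order events into a curated daily aggregate. The storage paths, column names, and aggregation logic are illustrative assumptions, not a prescribed design.

```python
# Minimal PySpark batch pipeline: read raw events, derive a daily
# aggregate, and write a partitioned table for downstream BI/ML use.
# Paths, columns, and the output location are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue_pipeline").getOrCreate()

# Ingest raw order events (hypothetical data-lake location and schema).
orders = spark.read.parquet("s3://example-lake/raw/orders/")

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "country")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("customer_id").alias("customers"),
    )
)

# Write a partitioned curated table that BI tools or feature pipelines can query.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-lake/curated/daily_revenue/")
)

spark.stop()
```

In practice a job like this would be parameterised, tested, and scheduled by an orchestrator rather than run ad hoc.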
Requirements
- Cloud Data Architecture: Proven hands-on experience designing and implementing cloud-native data platforms, with deep expertise in AWS, GCP, or Azure. Must be proficient in modern services such as Amazon EMR, Redshift, Athena, Glue, and Lambda on AWS, or BigQuery and Dataflow on Google Cloud.
- Experience with cloud data warehouse and lakehouse platforms (e.g., Snowflake, Databricks) is highly desirable.
- Modern Data Engineering Frameworks: Advanced proficiency in big data ecosystems (Hadoop, Spark including PySpark, Flink) and in contemporary MLOps/DataOps tooling (Airflow, dbt, Prefect); see the orchestration sketch after this list.
- Database Expertise: Strong command of both traditional relational databases (SQL Server, Oracle, PostgreSQL, MySQL) and modern NoSQL platforms (MongoDB, Cassandra, DynamoDB, Cosmos DB).
- Data Visualisation & Analytics: Familiarity with leading BI platforms (Power BI, Tableau, Qlik). Experience with embedded analytics and self-service data models is a plus.
- Programming & Automation: Strong coding skills in Python, with hands-on experience in R and/or Scala. Proficiency in infrastructure-as-code using Terraform, CloudFormation, or Pulumi (see the infrastructure-as-code sketch after this list). Experience integrating CI/CD pipelines and deploying containerised solutions (Docker, Kubernetes).
- Agile & Architecture Standards: Experience working within Agile and DevOps environments. Skilled in applying architectural frameworks such as TOGAF, Zachman, or the Data Management Body of Knowledge (DMBOK).
- Data Modelling & Pipeline Design: Expertise in conceptual, logical, and physical data modelling, metadata management, building robust ETL/ELT pipelines, event streaming (Kafka, Kinesis, Pub/Sub), and supporting cloud and hybrid data migrations (see the event-streaming sketch after this list).
- Data Governance & Compliance: Deep understanding of data protection, privacy, and governance frameworks (GDPR, CCPA, ISO 27001, DAMA). Experience implementing data quality, auditing, cataloguing (DataHub, Collibra, Alation, OpenMetadata), and master data management capabilities.
- Emerging Technology Awareness: Familiarity with, or hands-on exposure to, AI/ML integration, knowledge graphs, data mesh/data fabric architectures, and observability tooling (e.g., Monte Carlo, Datadog, OpenLineage).
- Strong Communication & Leadership: Excellent stakeholder engagement, requirements gathering, and technical leadership skills. Proven ability to bridge business needs and technical execution.
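For the MLOps/DataOps tooling listed under Modern Data Engineering Frameworks, the sketch below shows a minimal Airflow DAG with two dependent tasks, assuming Airflow 2.4 or later. The DAG id, schedule, and task bodies are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch: one extract task feeding one transform task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull data from a source system into staging.
    print("extracting raw data")


def transform(**context):
    # Placeholder: run dbt models or Spark jobs against staged data.
    print("transforming staged data")


with DAG(
    dag_id="example_daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Transform only runs once extraction has succeeded.
    extract_task >> transform_task
```

In a real deployment the Python callables would trigger dbt runs, Spark jobs, or data-quality checks rather than print statements.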
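For the infrastructure-as-code requirement, the sketch below uses Pulumi's Python SDK to declare an S3 bucket that could serve as the raw zone of a data lake. It assumes the pulumi and pulumi_aws packages and an initialised Pulumi project and stack; the resource name and tags are hypothetical.

```python
# Minimal Pulumi (Python) infrastructure-as-code sketch.
import pulumi
import pulumi_aws as aws

# An S3 bucket that could back the raw zone of a data lake.
raw_zone = aws.s3.Bucket(
    "raw-zone",
    tags={"environment": "dev", "purpose": "data-lake-raw"},
)

# Export the bucket id so pipelines and other stacks can reference it.
pulumi.export("raw_zone_bucket", raw_zone.id)
```

An equivalent definition could be written in Terraform or CloudFormation; the point is that the storage, compute, and networking behind a data platform are declared and versioned as code.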
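For the event-streaming requirement, the sketch below publishes a JSON order event to a Kafka topic using the kafka-python client, the kind of feed a streaming ETL job in Spark Structured Streaming or Flink would consume. The broker address, topic name, and payload are hypothetical.

```python
# Minimal Kafka producer sketch using the kafka-python client.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    # Serialise Python dicts to UTF-8 JSON bytes before sending.
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

event = {"order_id": 42, "status": "COMPLETED", "amount": 19.99}
producer.send("orders", value=event)

# Block until buffered messages are delivered before exiting.
producer.flush()
producer.close()
```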