Azure Data Engineer - Hybrid
We are looking for a hands-on Azure Data Engineer to lead the final phase of our Client's cloud migration and design an enterprise-grade data platform from the ground up. This is a hybrid role with a strong technical focus, blending architecture, automation, and data engineering, to power the Company's next generation of AI and BI capabilities.
About The Company
The Company is a dynamic, global procurement consultancy operating across Europe, the US, and APAC. As they scale globally and accelerate their AI capabilities, they are completing their transition to the cloud and building a company-wide data platform to power insight-driven transformation for their consultants and clients.
Required Skills & Experience
Must-Haves:
- 3+ years of hands-on Azure engineering experience (IaaS/PaaS), including Infrastructure as Code.
- Strong SQL skills and proficiency in Python or PySpark.
- Experience building or maintaining data lakes/warehouses using Synapse, Fabric, Databricks, Snowflake, or Redshift.
- Experience hardening cloud environments (NSGs, identity, Defender).
- Demonstrated automation of backups, CI/CD deployments, or DR workflows.
Nice-to-Haves:
- Experience with Azure OpenAI, vector databases, or LLM integrations.
- Power BI data modeling, DAX, and RLS.
- Certifications: AZ-104, AZ-305, DP-203, or AI-102.
- Knowledge of ISO 27001, Cyber Essentials+, or SOC 2 frameworks.
- Exposure to consulting or professional services environments.
- Familiarity with the Power Platform.
- Awareness of data privacy regulations (e.g., GDPR, CCPA).
- Consultative mindset: can turn business questions into technical outcomes.
- Comfortable switching hats: architect, hands-on builder, and mentor.
- Clear communicator, able to work effectively across time zones and teams.
- Thrives in a small, high-trust, high-autonomy team culture.
Day-to-Day Responsibilities
- Infrastructure & Automation: Deploy and manage infrastructure using Bicep/Terraform, GitHub Actions, and PowerShell/DSC.
- Data Engineering: Architect and implement scalable ETL/ELT solutions; model schemas, optimize performance, and apply lakehouse best practices.
- Security & Resilience: Implement best-practice cloud security (NSGs, Defender, Conditional Access), automate DR/backups, and run quarterly restore drills.
- Collaboration: Partner with AI Product Owners, Business Performance, and Data Analysts to translate business needs into robust data solutions.
- Mentorship & Knowledge Sharing: Act as a data SME, guiding system administrators and upskilling junior technical team members.
Months 3-12:
- Design and build their Azure data lake using Synapse, Fabric, or an alternative strategy.
- Ingest data from core platforms: NetSuite, HubSpot, and client RFP datasets.
- Automate data pipelines using ADF, Fabric Dataflows, PySpark, or SQL.
- Publish governed datasets with Power BI, enabling row-level security (RLS).
By Year-End:
- Deliver a production-ready lakehouse powering BI and ready for AI/Gen-AI initiatives.
- Position the business to rapidly scale data products across regions and services.
- Greenfield opportunity: Shape and deliver the company's first enterprise data platform.
- Career growth: Scale with the company into Lead Data, Cloud, or Solution Architect roles.
- Hybrid flexibility: Remote-first with 2-3 days/week onsite in the Cardiff office.
- Development: Funded certifications, dedicated R&D time, and access to Company networks and resources.
Company: Octad Recruitment Ltd
Location: Cardiff, South Glamorgan, Wales, United Kingdom (Hybrid / WFH Options)
Employment Type: Full-Time
Salary: £60,000 - £90,000 per annum
Posted: