Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
… including Azure DevOps or GitHub
• Considerable experience designing and building operationally efficient pipelines, utilising core cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark
• Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use
• Strong understanding and/or use of …
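For context on the medallion architecture mentioned in the listing above, here is a minimal PySpark sketch of a bronze/silver/gold flow. It is an illustration only, not the employer's actual pipeline: the paths, table names and columns (orders_raw, order_id, amount) are assumptions, and it presumes a Delta-enabled Spark session (for example Databricks or the delta-spark package).

```python
# Minimal medallion-style flow in PySpark; all names and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land the raw source data as-is.
bronze = spark.read.json("/landing/orders_raw/")          # hypothetical landing path
bronze.write.format("delta").mode("append").save("/lake/bronze/orders")

# Silver: cleanse and conform types.
silver = (
    spark.read.format("delta").load("/lake/bronze/orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/orders")

# Gold: a curated aggregate shaped for analytical use.
gold = (
    silver.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("daily_spend"))
)
gold.write.format("delta").mode("overwrite").save("/lake/gold/fact_daily_spend")
```

The gold step corresponds to the "curated dimensional models" the listing refers to: only cleansed, conformed data is aggregated into tables shaped for analysts.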
… Learning, Deep Learning or LLM Frameworks)
Desirable:
• Minimum 2 years' experience in a data-related field
• Minimum 2 years in a business or management consulting field
• Experience of Docker, Hadoop, PySpark, Apache or MS Azure
• Minimum 2 years' NHS/healthcare experience
Disclosure and Barring Service Check
This post is subject to the Rehabilitation of Offenders Act (Exceptions Order …
… and machine learning use cases. Support the migration of legacy reporting tools into Databricks and modern BI solutions.
Key Skills & Experience
Essential:
• Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake).
• Solid knowledge of BI and data visualisation tools (e.g., Power BI, Tableau, Qlik).
• Strong SQL and data modelling skills.
• Experience working with large, complex financial …
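As an illustration of the Databricks (SQL, PySpark, Delta Lake) skill set this listing asks for, the sketch below publishes a curated Delta table that BI tools such as Power BI or Tableau could query. The table and column names (finance.transactions, rpt.monthly_pnl, booking_date, cost_centre) are invented for the example and assume a Databricks-style metastore.

```python
# Illustrative only: exposing a curated Delta table to BI tools.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical Delta table registered in the metastore.
txns = spark.read.table("finance.transactions")

monthly = (
    txns.withColumn("month", F.date_trunc("month", "booking_date"))
        .groupBy("month", "cost_centre")
        .agg(F.sum("amount").alias("pnl"))
)

# Persist as a managed Delta table that Power BI / Tableau can query directly.
monthly.write.format("delta").mode("overwrite").saveAsTable("rpt.monthly_pnl")
```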
… and development plan beyond generic certifications. Provide a Rough Order of Magnitude (ROM) cost for implementing the proposed roadmap.
Essential:
• Deep expertise in the Databricks Lakehouse Platform, including Python, PySpark, and advanced SQL.
• Strong practical knowledge of Microsoft Fabric.
• Proven experience in senior, client-facing roles with a consultancy mindset.
• Background in technical coaching, mentorship, or skills assessment.
• Excellent …
… also identifying transformational opportunities for sales productivity.
🔑 What You’ll Bring
Must-Have Skills:
• 5+ years in Sales Ops/Rev Ops roles
• Proficient in SQL and Python (Pandas, PySpark)
• Strong experience with cloud data platforms (Databricks, Snowflake, GCP)
• Background in building ETL/ELT solutions and data modelling
• Advanced Excel/Power BI/VBA skills
• 2+ years …
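To make the ETL/ELT plus Pandas requirement concrete, here is a small extract-transform-load sketch. The file, table and column names (crm_export.csv, stg_opportunities, close_date, opportunity_id) are invented for illustration, and SQLite stands in for whichever warehouse (Databricks, Snowflake, GCP) the role actually uses.

```python
# Hedged sketch of a small ELT step with Pandas; names are hypothetical.
import sqlite3
import pandas as pd

# Extract: a raw CRM export.
raw = pd.read_csv("crm_export.csv", parse_dates=["close_date"])

# Transform: light cleansing plus a derived field for sales-ops reporting.
clean = (
    raw.dropna(subset=["opportunity_id"])
       .assign(quarter=lambda df: df["close_date"].dt.to_period("Q").astype(str))
)

# Load: stage into a SQL database for downstream modelling.
with sqlite3.connect("revops.db") as conn:
    clean.to_sql("stg_opportunities", conn, if_exists="replace", index=False)
```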
Slough, South East England, United Kingdom Hybrid / WFH Options
83zero
… controls.
AI & Technology Enablement
• Build tools and processes for metadata management, data quality, and data sharing.
• Leverage AI and automation tools to improve data governance capabilities.
• Use Python, SQL, PySpark, Power BI, and related tools for data processing and visualization.
Strategy & Stakeholder Engagement
• Provide subject matter expertise in data governance and AI governance.
• Collaborate with business, data, and tech …
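As a rough illustration of the data-quality tooling this role describes, the sketch below runs a few rule-based checks in PySpark and collects the results as a small metrics table for governance reporting. The table name (governed.customers), the columns and the rules are assumptions, not part of the listing.

```python
# Assumption-laden example of rule-based data-quality checks in PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("governed.customers")  # hypothetical governed table

checks = {
    "null_customer_id": df.filter(F.col("customer_id").isNull()).count(),
    "duplicate_customer_id": df.count() - df.dropDuplicates(["customer_id"]).count(),
    "invalid_email": df.filter(~F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")).count(),
}

# Surface the results as a metrics table that governance dashboards can consume.
metrics = spark.createDataFrame(list(checks.items()), ["check", "failed_rows"])
metrics.show()
```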
… estate and venture capital domain.
• Lead the design and implementation of robust data architectures to support business needs and data strategy.
• Utilize extensive experience in Azure Synapse, Python, PySpark, and ADF to architect scalable and efficient data solutions.
• Oversee and optimize SSIS, SSRS, and SQL Server environments, ensuring high performance and reliability.
• Write/review complex SQL queries …