Data Engineer (Fabric-Platforms)
Methods Analytics (MA) is recruiting for a Data Engineer to join our team within the Public Sector Business unit on a permanent basis. This role will be mainly remote but will require flexibility to travel to client sites and to our offices in London, Sheffield, and Bristol.

Salary: £50k - £65k

What You'll Be Doing as a Data Engineer:
- Work closely with cross-functional teams, translating complex technical concepts into clear, accessible language for non-technical audiences and aligning data solutions with business needs.
- Collaborate with a dynamic delivery team on innovative projects, transforming raw data into powerful insights that shape strategic decisions and drive business transformation.
- Utilise platforms and tools such as Microsoft Fabric, Azure Data Factory, Azure Synapse, Databricks, and Power BI to build robust, scalable, and future-proof end-to-end data solutions.
- Design and implement efficient ETL and ELT pipelines, ensuring seamless integration and transformation of data from various sources to deliver clean, reliable data (a minimal pipeline sketch follows this list).
- Develop and maintain sophisticated data models, employing dimensional modelling techniques to support comprehensive data analysis and reporting.
- Implement and uphold best practices in data governance, security, and compliance, using tools like Azure Purview, Unity Catalog, and Apache Atlas to maintain data integrity and trust.
- Ensure data quality and integrity through meticulous attention to detail and rigorous QA processes, continually refining and optimising data queries for performance and cost-efficiency.
- Develop intuitive and visually compelling Power BI dashboards that provide actionable insights to stakeholders across the organisation.
- Monitor and tune solution performance, identifying opportunities for optimisation to enhance the reliability, speed, and functionality of data systems.
- Stay ahead of industry trends and advancements, continuously enhancing your skills and incorporating the latest Data Engineering tools, languages, and methodologies into your work.
- Enable business leaders to make informed decisions with confidence by providing them with timely, accurate, and actionable data insights.
- Be at the forefront of data innovation, driving the adoption and understanding of modern tooling, architectures, and platforms.
- Deliver seamless and intuitive data solutions that enhance the user experience, from real-time streaming data services to interactive dashboards.
- Play a key role in cultivating a data-driven culture within the organisation, mentoring team members, and contributing to the continuous improvement of the Engineering Practice.
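To give a flavour of the pipeline work described above, here is a minimal sketch of an ELT step in PySpark. It assumes a Spark environment with Delta Lake available (such as Databricks, Synapse, or Fabric); the paths and column names are hypothetical illustrations, not a prescribed implementation.

```python
# Minimal ELT sketch: raw CSV in a landing zone -> cleaned Delta table.
# Assumes a Spark runtime with Delta Lake support (e.g. Databricks,
# Synapse, or Fabric). Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_elt").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .csv("/landing/orders/")  # hypothetical landing-zone path
)

clean = (
    raw
    .dropDuplicates(["order_id"])                       # de-duplicate on the business key
    .withColumn("order_date", F.to_date("order_date"))  # enforce types early
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())              # reject rows without a key
)

(
    clean.write
    .format("delta")
    .mode("overwrite")
    .save("/curated/orders/")  # hypothetical curated-zone path
)
```

In practice a step like this would typically be orchestrated by Azure Data Factory or a Fabric pipeline, with data-quality checks added before the write.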
What You'll Bring as a Data Engineer:

- Proficiency in SQL and Python: You are highly proficient in SQL and Python, enabling you to handle complex data problems with ease.
- Understanding of Data Lakehouse Architecture: You have a strong grasp of the principles and implementation of Data Lakehouse architecture.
- Hands-On Experience with Spark-Based Solutions: You possess experience with Spark-based platforms like Azure Synapse, Databricks, Microsoft Fabric, or even on-premises Spark clusters, using PySpark or Spark SQL to manage and process large datasets.
- Expertise in Building ETL and ELT Pipelines: You are skilled in building robust ETL and ELT pipelines, mostly in Azure, utilising Azure Data Factory and Spark-based solutions to ensure efficient data flow and transformation.
- Efficiency in Query Writing: You can craft and optimise queries to be both cost-effective and high-performing, ensuring fast and reliable data retrieval.
- Experience in Power BI Dashboard Development: You possess experience in creating insightful and interactive Power BI dashboards that drive business decisions.
- Proficiency in Dimensional Modelling: You are adept at applying dimensional modelling techniques, creating efficient and effective data models tailored to business needs (see the sketch after this list).
- CI/CD Mindset: You naturally work within Continuous Integration and Continuous Deployment (CI/CD) environments, ensuring automated builds, deployments, and unit testing are integral parts of your development workflow.
- Business Requirements Translation: You have a knack for understanding business requirements and translating them into precise technical specifications that guide data solutions.
- Strong Communication Skills: Ability to effectively translate complex technical topics into clear, accessible language for non-technical audiences.
- Continuous Learning and Development: Commitment to continuous learning and professional development, staying up to date with the latest industry trends, tools, and technologies.
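As a hedged illustration of the dimensional modelling mentioned above, the sketch below loads a fact table by resolving a business key to a dimension's surrogate key. The table and column names (curated.orders, warehouse.dim_customer, customer_sk) are hypothetical, and it assumes the tables are already registered in a metastore.

```python
# Illustrative star-schema load: resolve a business key to a surrogate
# key in a dimension, then append to the fact table. All table and
# column names are hypothetical; assumes a metastore is configured.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fact_load").getOrCreate()

orders = spark.table("curated.orders")
dim_customer = spark.table("warehouse.dim_customer")

fact_sales = (
    orders.alias("o")
    # Left join so facts are kept even when the dimension member is missing.
    .join(
        dim_customer.alias("c"),
        F.col("o.customer_id") == F.col("c.customer_id"),
        "left",
    )
    .select(
        F.col("c.customer_sk").alias("customer_sk"),  # surrogate key into dim_customer
        F.col("o.order_date"),
        F.col("o.amount"),
    )
    # Unmatched rows fall back to the conventional 'unknown member' key.
    .fillna({"customer_sk": -1})
)

fact_sales.write.mode("append").saveAsTable("warehouse.fact_sales")
```

Keeping surrogate-key resolution in one place like this is part of what makes the downstream Power BI models described above simple and fast to query.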
Desirable Skills and Experience:

- Exposure to Microsoft Fabric: Familiarity with Microsoft Fabric and its capabilities would be a significant advantage.
- Experience with High-Performance Data Systems: Handling large-scale data systems with high performance and low latency, such as managing 1 billion+ records or terabyte-sized databases.
- Knowledge of Delta Tables or Apache Iceberg: Understanding and experience with Delta Tables or Apache Iceberg for managing large-scale data lakes efficiently.
- Knowledge of Data Governance Tools: Experience with data governance tools like Azure Purview, Unity Catalog, or Apache Atlas to ensure data integrity and compliance.
- Exposure to Streaming/Event-Based Technologies: Experience with technologies such as Kafka, Azure Event Hub, and Spark Streaming for real-time data processing and event-driven architectures (a short sketch follows this list).
- Understanding of SOLID Principles: Familiarity with the SOLID principles of object-oriented programming.
- Understanding of Agile Development Methodologies: Familiarity with iterative and agile development methodologies such as Scrum, contributing to a flexible and responsive development environment.
- Familiarity with Recent Innovations: Knowledge of recent innovations such as GenAI, RAG, and Microsoft Copilot, as well as certifications with leading cloud providers and in areas of data science, AI, and ML.
- Experience with Data for Data Science/AI/ML: Experience working with data tailored for data science, AI, and ML applications.
- Experience with Public Sector Clients: Experience working with public sector clients and understanding their specific needs and requirements.
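For the streaming work mentioned above, here is a hedged sketch of Spark Structured Streaming reading from Kafka into a Delta table. The broker address, topic name, and paths are placeholders, and it assumes the Spark Kafka connector and Delta Lake are available on the cluster.

```python
# Hedged sketch: Spark Structured Streaming from Kafka into Delta.
# Assumes the spark-sql-kafka connector and Delta Lake are installed;
# broker, topic, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    # Kafka delivers the payload as binary; cast it to text for parsing.
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/checkpoints/events/")  # enables recovery after restart
    .outputMode("append")
    .start("/curated/events/")
)

query.awaitTermination()
```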
What We Offer:

- Autonomy to develop and grow your skills and experience
- Be part of exciting project work that is making a difference in society
- Strong, inspiring, and thought-provoking leadership
- A supportive and collaborative environment
- Development - access to LinkedIn Learning, a management development programme, and training
- Wellness - 24/7 confidential employee assistance programme
- Social - office parties, pizza Friday and commitment to charitable causes
- Time off - 25 days of annual leave a year, plus bank holidays, with the option to buy 5 extra days each year
- Volunteering - 2 paid days per year to volunteer in our local communities or within a charity organisation
- Pension Salary Exchange Scheme with 4% employer contribution and 5% employee contribution
- Life Assurance of 4 times base salary
- Private Medical Insurance which is non-contributory (spouse and dependants included)
- Worldwide Travel Insurance which is non-contributory (spouse and dependants included)