The Role The Company is seeking a skilled and detail-oriented Business Analyst to join our Data Management team. This role plays a key part in delivering high-impact data solutions to our clients, supporting business intelligence, data migration, and reporting initiatives across core insurance domains. The ideal candidate will have a strong analytical mindset, a solid understanding of insurance operations, and hands-on experience working with data throughout its lifecycle, from analysis and mapping through to validation and user acceptance testing. You will work closely with cross-functional teams including project managers, actuaries, data engineers, and business stakeholders to ensure that data solutions meet business needs and quality standards. This is a great opportunity to contribute to transformational projects within the insurance industry, leveraging your skills in data analysis, stakeholder engagement, and documentation, while growing within a dynamic and collaborative environment. Key Responsibilities Analyze and understand business requirements related to core insurance data (e.g., policies, claims, customers, billing) to support modeling, data …
Harvey Nash is now inviting candidates to apply for the role of Solution Data Architect, an initial 6-month contract working inside IR35, fully remote. This is a critical role that will lead the ongoing development of our Business Data logical model and support key strategic initiatives. Key Responsibilities Lead the ongoing development and evolution of the Business Data logical model. Collaborate closely with teams responsible for physical model development. Support ongoing design work and delivery of inflight activities using Microsoft information architecture. Contribute to ERP implementation projects and data migration activities. Ensure data governance and compliance standards are maintained throughout all initiatives. Core Requirements Data Modelling … integration workflows. Business Intelligence: Knowledge of reporting and analytics platforms (Power BI, SSRS, or similar). Data Warehousing: Experience with dimensional modelling and data warehouse architecture patterns. API Integration: Understanding of REST/SOAP APIs and data service architectures. Data Security: Knowledge of data privacy regulations (GDPR) and …
through an umbrella company. Requirements ('must have'): Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. 3+ years of proficiency working with Snowflake or similar cloud-based data warehousing solutions. 3+ years of experience in data development and solutions in highly complex data environments with large data volumes. Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices. Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment. Experience with code versioning tools (e.g., Git). Knowledge of Linux operating systems. Familiarity with REST APIs and integration techniques. Familiarity with data visualization tools and libraries (e.g., Power BI). Background in …
Glasgow, Scotland, United Kingdom Hybrid/Remote Options
NLB Services
Role: Data Engineer
Location: Glasgow (Hybrid, 3 days onsite)
Contract: 6-12 months with possible extensions (no sponsorship available)
Skills/Qualifications:
· 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
· 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
· 3+ years of proficiency working with Snowflake or similar cloud-based data warehousing solutions
· 3+ years of experience in data development and solutions in highly complex data environments with large data volumes
· Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices
· Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment
· Experience with code versioning tools (e.g., Git)
· Knowledge of Linux operating systems
· Familiarity with REST APIs and integration techniques
· Familiarity with data visualization tools and libraries (e.g., Power BI)
· Background in …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
markets or alternative asset classes. About the Department Join Aberdeen's Investments department, where we help clients plan and invest with confidence for a brighter future. Our dynamic Data Tribe, led by the Chief Data Officer, is revolutionising data accessibility and innovation through a cutting-edge Data Mesh architecture. Collaborate with passionate teams to unlock powerful data insights that drive sustainable investment and fuel scalable growth. Be part of a forward-thinking environment making a real impact. About the Role As a Data Platform Engineer at Aberdeen, you will be at the heart of building and evolving a modern, cloud-based data platform using Azure, Snowflake, DBT, and Microsoft Fabric. Working within a Data Mesh architecture, you will enable decentralised data ownership and empower domain teams through self-service data capabilities. Your expertise will drive the design, implementation, and automation of scalable, secure data solutions, ensuring high performance, governance, and compliance. This is a pivotal role where you …
About Synechron: Synechron is a leading digital transformation consulting firm, delivering innovative solutions across Financial Services, Insurance, and more. We empower organisations to harness the power of data, technology, and strategy to drive growth and efficiency. Role Overview: We are seeking a skilled Data Engineer to join our UK team. In this role, you will be responsible for designing, developing, and maintaining scalable ETL data pipelines leveraging Python and Databricks. You will collaborate with cross-functional teams to understand data requirements and ensure the delivery of reliable, high-quality data solutions that support our business objectives. Key Responsibilities: Collaborate with stakeholders and cross-functional teams to gather data requirements and translate them into efficient ETL processes. Develop, test, and deploy end-to-end ETL pipelines, extracting data from various sources, transforming it to meet business needs, and loading it into target systems. Take ownership of the full engineering lifecycle, including data extraction, cleansing, transformation, and loading, maintaining data accuracy and …
Lead Data Architect | Fabric | Azure | Kimball | 2 days per week in Edinburgh | £100,000-£110,000 plus a brilliant benefits package One of our long-standing clients in the private investment space is building out their data capability and is hiring a Principal Data Architect on a permanent basis. The benefits are genuinely strong … other perks. This would suit someone who still enjoys being hands-on with the tech but also wants a role with influence and ownership. You will design the data architecture that supports everything from reporting and analytics through to operational systems. You will work with engineering and senior stakeholders to define how data is modelled, integrated … position to move towards a Head of role and take on team leadership alongside the Head of Engineering. Experience my client is looking for:
* A solid background in Data Architecture
* Hands-on experience across data and software design
* Experience with cloud data platforms, with Microsoft Fabric as the preferred option
* Experience with modern data …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
scale to capitalise on the key themes shaping the market, through either public markets or alternative asset classes. About the Department The Client Domain is responsible for shaping data capabilities that drive client-focused outcomes. The department works closely with architects, technical leads, and business stakeholders to design, evolve, and communicate data models that underpin our digital and analytical solutions. About the Role We are looking for a technically capable and motivated Data Engineer to join our Data & Analytics team. You will contribute to the development of clean, efficient, and scalable data solutions across cloud-native platforms, supporting the delivery of high-quality data products and services. Key … business users and cross-functional teams to deliver user-centric solutions aligned with strategic goals. About the Candidate The ideal candidate will possess the following: Experience as a Data/Integration Engineer or similar role. Understanding of data warehousing, ELT/ETL processes, and data modelling. Knowledge of cloud-native development (Azure, Snowflake, dbt, …)
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
scale to capitalise on the key themes shaping the market, through either public markets or alternative asset classes. About the Department The Client Domain is responsible for shaping data capabilities that drive client-focused outcomes. The department works closely with architects, technical leads, and business stakeholders to design, evolve, and communicate data models that underpin our digital and analytical solutions. About the Role We are seeking a detail-oriented and technically proficient Senior Engineer to join our Data & Analytics team. This role is instrumental in delivering clean, modern, and efficient data solutions across cloud-native platforms. Key Responsibilities Develop solutions across Snowflake, Azure, and DBT platforms. Lead migration and optimisation of applications … and junior team members through technical guidance. Collaborate with stakeholders to deliver user-centric solutions. About the Candidate The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and DBT. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code.