Glasgow, Scotland, United Kingdom Hybrid/Remote Options
NLB Services
Snowflake or similar cloud-based data warehousing solutions · 3+ years of experience in data development and solutions in highly complex data environments with large data volumes. · Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices. · Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment. · Experience with code …
database platforms Cloud Architecture: Experience with Azure data services (Azure Data Factory, Azure Synapse, Azure Data Lake) Data Governance: Understanding of data quality, data lineage, and metadata management principles. ETL/ELT Processes: Experience designing and implementing data integration workflows. Business Intelligence: Knowledge of reporting and analytics platforms (Power BI, SSRS, or similar) Data Warehousing: Experience with dimensional modelling and …
data modelling techniques. Hands-on expertise in dbt for data transformation and pipeline orchestration. Solid background in data engineering with proficiency in SQL and Python. Experience with ELT/ETL frameworks and modern data stack tools. Knowledge of data governance, access control, and privacy best practices. Proficiency in applying AI foundations to ensure data is aligned for consumption and content …
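As a hedged illustration of the dbt-plus-Python skill set this listing describes (nothing here is taken from the posting itself — the upstream model name stg_orders, the columns, and the Spark-based adapter are all assumptions), a minimal dbt Python model might look like:

```python
# models/marts/fct_daily_orders.py — minimal dbt Python model sketch.
# Assumes a Spark-based adapter (e.g. Databricks), where dbt.ref() returns a
# PySpark DataFrame; model and column names are illustrative only.
from pyspark.sql import functions as F


def model(dbt, session):
    dbt.config(materialized="table")

    # Resolve the upstream staging model managed by dbt
    orders = dbt.ref("stg_orders")

    # Transform: enforce a numeric type and aggregate to one row per day
    return (
        orders.withColumn("amount", F.col("amount").cast("double"))
        .groupBy("order_date")
        .agg(F.sum("amount").alias("total_amount"))
    )
```

In practice the equivalent logic is often written as a SQL model; the Python form is shown only because the listing pairs dbt with Python proficiency.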
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
with strategic goals About the Candidate The ideal candidate will possess the following: Experience as a Data/Integration Engineer or similar role Understanding of data warehousing, ELT/ETL processes, and data modelling Knowledge of Cloud-native development (Azure, Snowflake, dbt) Agile methodologies and collaboration to drive Innovation and continuous improvement Excellent communication skills and ability to work with …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
members through technical guidance. Collaborate with stakeholders to deliver user-centric solutions. About the Candidate The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and DBT. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code. CI/CD pipeline implementation and test …
Glasgow, Scotland, United Kingdom Hybrid/Remote Options
Vanloq - Workforce Solutions
days per week on-site), contributing to key data initiatives that support risk and analytics functions across the business. Key Responsibilities: Design, build, and optimise scalable data pipelines and ETL solutions. Work with complex datasets to enable data-driven insights and reporting. Collaborate closely with data scientists, analysts, and business stakeholders to deliver robust data solutions. Support the migration and …
gap between data engineering and strategic decision making. Required skills: Experienced in Data Engineering with strong knowledge of Data Architecture Advanced SQL for data manipulation and querying Experience with ETL tools in Azure Knowledge of BI tools such as Power BI, Tableau, or Fabric Strong communication skills and the ability to explain technical concepts to non-technical users If this …
Edinburgh, Roxburgh's Court, City of Edinburgh, United Kingdom
Bright Purple
experience in a cloud environment (AWS, Azure, or GCP). • Strong understanding of ML libraries such as scikit-learn, TensorFlow, or MLflow. • Solid background in data modelling, ELT/ETL processes, and analytics best practices. If you’re ready to make an impact in a growing tech company and bring your MLOps expertise to the table — GET IN TOUCH today …
Edinburgh, Scotland, United Kingdom Hybrid/Remote Options
Luxoft
GCP, Azure) and AI/ML services. Knowledge of MLOps tools and practices (e.g., MLflow, Kubeflow, Vertex AI, Azure ML). Strong understanding of data engineering, data pipelines, and ETL workflows. Excellent problem-solving, communication, and stakeholder engagement skills. Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or related field. Luxoft is committed to …
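As a hedged illustration of the MLOps tooling named above, a minimal MLflow experiment-tracking run could look like this; the dataset, parameters, and metric are invented for the example, and MLflow is shown only because it runs locally without extra infrastructure:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic data purely for illustration
X, y = make_regression(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestRegressor(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    mse = mean_squared_error(y_test, model.predict(X_test))

    # Record what was trained, how, and how well it performed
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("test_mse", mse)
    mlflow.sklearn.log_model(model, artifact_path="model")
```

Kubeflow, Vertex AI, and Azure ML cover similar ground with their own SDKs; the tracking-run pattern above is the common core.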
complex systems talk to each other. You'll thrive in a collaborative environment, enjoy working on diverse projects, and take pride in building solutions that make a difference. Experience with ETL/ELT tools, data lake or data mesh architectures, or automation for data integration would be a real plus. Additional Information: We will be reviewing and speaking to candidates on …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
as Logic Apps, Function Apps, Service Bus, Event Grid, Event Hub, and API Management. Experience with RESTful APIs, JSON, and integration patterns (e.g., pub/sub, request/response, ETL). Understanding of DevOps practices and tools (Azure DevOps, GitHub, CI/CD). Knowledge of security and identity management in Azure (e.g., OAuth2, Managed Identities, RBAC). Understanding of …
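To make the pub/sub integration pattern above concrete, here is a small, hedged sketch using the azure-servicebus Python SDK; the connection string, queue name, and payload are placeholders rather than details from the role, and in practice credentials would come from configuration or Key Vault:

```python
import json

from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Placeholder connection details for illustration only
CONNECTION_STR = "Endpoint=sb://example.servicebus.windows.net/;..."
QUEUE_NAME = "orders"

payload = {"order_id": 42, "status": "created"}

# Publish a JSON message to the queue
with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    with client.get_queue_sender(queue_name=QUEUE_NAME) as sender:
        sender.send_messages(ServiceBusMessage(json.dumps(payload)))

# Consume messages from the same queue and settle them
with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME, max_wait_time=5) as receiver:
        for message in receiver:
            print("received:", json.loads(str(message)))
            receiver.complete_message(message)
```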
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen Group
the Candidate: Achieved Architecture Certification e.g. TOGAF or BCS (Desirable). Understand modern application, data and security architectures with a focus on Microsoft technologies and Corporate systems integration, including ETL, APIs, Azure Data Pipelines, Data Product Architectures, Identity & Access Management, Zero Trust, AI and Cloud-based technologies. Work with autonomy. Demonstrate self-leadership and ability to own and drive results …
data and validate by profiling in a data environment. Understand data structures and data model (dimensional & relational) concepts like Star schema or Fact & Dimension tables, to design and develop ETL patterns/mechanisms to ingest, analyse, validate, normalize and cleanse data. Understand and produce ‘Source to Target mapping’ (STTM) documents, containing data structures, business & data transformation logic. Liaise with data …
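As a minimal, self-contained illustration of the fact-and-dimension concepts mentioned above (table and column names are invented, and SQLite stands in for a real warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension: one row per customer, holding descriptive attributes
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name TEXT,
        region TEXT
    )
""")

# Fact: one row per sale, referencing the dimension by surrogate key
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        sale_date TEXT,
        amount REAL
    )
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme Ltd", "Scotland"), (2, "Globex", "England")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(10, 1, "2024-01-05", 120.0), (11, 2, "2024-01-06", 75.5)])

# Typical star-schema query: join the fact to its dimension and aggregate
for region, total in cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON f.customer_key = d.customer_key
    GROUP BY d.region
"""):
    print(region, total)

conn.close()
```

A Source to Target mapping document would then record, column by column, which source fields feed customer_name, region, amount and so on, along with any transformation rules applied on the way.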
to ensure best practices and continuous improvement • Maintain transparency and communication through reporting and leadership updates Required Skills: 20+ years’ experience in Change Delivery Strong background in Software Delivery, ETL & Data Migration programs Familiar with SEF Azure DevOps environment & tooling; VDIs, AKS Cluster etc. Strong risk management expertise – including for complex data migrations Strong understanding of Agile methodologies Experienced in …
AWS services. The ideal candidate will have a solid background in data engineering, Python development, and cloud-native architecture. YOUR PROFILE Design, develop, and maintain robust data pipelines and ETL workflows using AWS services. Implement scalable data processing solutions using PySpark and AWS Glue. Build and manage infrastructure as code using CloudFormation. Develop and deploy serverless applications using AWS Lambda …
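A hedged sketch of the kind of AWS Glue PySpark job implied above — the job arguments, S3 paths, and column names are assumptions for illustration, and the script only runs inside a Glue job environment:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Hypothetical job parameters supplied when the Glue job is started
args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_path", "target_path"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: raw CSV files landed in S3 by an upstream process
raw = spark.read.option("header", "true").csv(args["source_path"])

# Transform: deduplicate, enforce types, stamp the load date
clean = (
    raw.dropDuplicates()
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .withColumn("ingest_date", F.current_date())
)

# Load: partitioned Parquet written back to S3 for downstream consumers
clean.write.mode("overwrite").partitionBy("ingest_date").parquet(args["target_path"])

job.commit()
```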
shaping the organisation's BI landscape, ensuring data integrity, clarity, and accessibility while supporting innovative data solutions. We are seeking a candidate with a solid understanding of data manipulation and ETL processes, along with expertise in SQL, to deliver insightful business intelligence solutions that … align with organisational goals. Responsibilities Design, develop, and optimise BI solutions using Power BI, Tableau, or Fabric to meet business requirements. Develop and maintain ETL workflows to extract, transform, and load data from diverse sources. Utilise SQL for data querying, data modelling, and ensuring data accuracy and consistency. Collaborate with stakeholders to gather requirements and translate them into effective data … reporting. Qualifications Proven experience working in BI and data manipulation roles, with a strong focus on Power BI and SQL. Excellent knowledge of data extraction, transformation, and loading processes (ETL). Strong understanding of data visualisation tools such as Power BI, Tableau, or Fabric. Experience designing scalable, high-performance BI solutions. Ability to communicate technical concepts clearly to non-technical …
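A small, hedged sketch of the extract-transform-load flow described above, with pandas feeding a SQL reporting table that a tool such as Power BI could then query; the source data and table names are invented, and SQLite stands in for the warehouse:

```python
import sqlite3

import pandas as pd

# Extract: a tiny in-memory stand-in for a source-system export
src = pd.DataFrame({
    "order_id": [1, 2, 3],
    "order_date": ["2024-03-01", "2024-03-01", "2024-03-02"],
    "amount": ["10.50", "20.00", "5.25"],
})

# Transform: enforce types and aggregate to the grain the report needs
src["order_date"] = pd.to_datetime(src["order_date"])
src["amount"] = src["amount"].astype(float)
daily = src.groupby("order_date", as_index=False)["amount"].sum()

# Load: write the reporting table the BI layer will read
conn = sqlite3.connect("reporting.db")
daily.to_sql("daily_sales", conn, if_exists="replace", index=False)

# Validate with SQL, checking accuracy and consistency of the loaded data
print(pd.read_sql("SELECT order_date, amount FROM daily_sales ORDER BY order_date", conn))
conn.close()
```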
with Microsoft Fabric as the preferred option * Experience with modern data warehousing * Strong data modelling skills across relational and NoSQL * Experience with governance, metadata, master data and integration including ETL, ELT and API-based approaches. NO SPONSORSHIP can be offered for this role. If that sounds of interest to you then please apply, or reach out to for more information.
and efficiency. Role Overview: We are seeking a skilled Data Engineer to join our UK team. In this role, you will be responsible for designing, developing, and maintaining scalable ETL data pipelines leveraging Python and Databricks. You will collaborate with cross-functional teams to understand data requirements and ensure the delivery of reliable, high-quality data solutions that support our … business objectives. Key Responsibilities: Collaborate with stakeholders and cross-functional teams to gather data requirements and translate them into efficient ETL processes. Develop, test, and deploy end-to-end ETL pipelines extracting data from various sources, transforming it to meet business needs, and loading it into target systems. Take ownership of the full engineering lifecycle, including data extraction, cleansing, transformation … Hands-on experience working with Databricks and cloud services for building scalable data pipelines. Strong knowledge of cloud data warehousing solutions such as Snowflake or similar. Solid understanding of ETL workflows, data modeling, data warehousing, and data integration best practices. Experience working within agile teams, demonstrating collaboration and adaptability in fast-paced environments. Familiarity with version control tools such as …
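As a hedged sketch of the end-to-end lifecycle this listing describes — extraction, cleansing, a simple quality gate, then load — written in PySpark as it might run on Databricks; the paths, table, and column names are assumptions:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed from a source system
raw = spark.read.option("header", "true").csv("/mnt/landing/orders/")

# Cleanse and transform: enforce types, drop duplicate business keys
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_date"))
    .dropDuplicates(["order_id"])
)

# Simple data-quality gate: fail fast rather than load bad rows downstream
bad_rows = orders.filter(F.col("order_id").isNull() | F.col("amount").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed validation; aborting load")

# Load: curated Delta table for analytics consumers
orders.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```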
data analysis to identify data structures, relationships, quality issues, and anomalies across various source systems. Translate business rules and requirements into clear, actionable specifications for data development teams (e.g., ETL, reporting). Work closely with developers and data engineers to ensure that implemented data solutions meet business needs and functional expectations. Develop and execute data validation strategies and test cases … clear functional and technical documentation. Nice-to-Haves Experience with core insurance systems (e.g., Guidewire, Duck Creek, Insis, SICS, Insurity, or other legacy/modern platforms). Familiarity with ETL tools and processes used in data migration or data integration projects. Understanding of data governance, metadata management, and data quality best practices. Experience in Agile, Scrum, or hybrid delivery methodologies. Knowledge …
process Strong specialist experience across several of the following areas: Cloud services (SaaS, PaaS) (AWS preferred) Enterprise integration patterns and tooling (MuleSoft preferred) Enterprise data, analytics and information management, ETL knowledge High volume transactional online systems Security and identity management Service and micro-service architecture Knowledge of continuous integration tools and techniques Design and/or development of applications using …
will collaborate closely with internal stakeholders and external partners to optimise existing systems, with a particular focus on enhancing fan engagement through digital platforms. Key Responsibilities Design and develop ETL/ELT pipelines in Azure and Databricks, ensuring reliability and performance. Construct Kimball-style dimensional models to support analytics and reporting. Implement automated testing for data quality assurance and validation. … wider Data team to optimise pipelines and enhance platform capabilities. Essential Skills & Experience Hands-on expertise with Databricks, PySpark, and Delta Lake. Proven ability to build production-grade ETL/ELT pipelines, including integration with SFTP and REST APIs. Strong knowledge of Kimball methodology within Lakehouse frameworks. Advanced proficiency in Azure data services (ADF, ADLS Gen2, Event Hubs) and …
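To illustrate the Kimball-style modelling on Delta Lake mentioned above, here is a hedged sketch of a type 1 upsert into a customer dimension; the target table gold.dim_customer and the incoming records are invented for the example:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` already exists

# Incoming customer records, e.g. parsed from a REST API or SFTP extract
updates = spark.createDataFrame(
    [(1, "Acme Ltd", "Glasgow"), (3, "Initech", "Edinburgh")],
    ["customer_id", "customer_name", "city"],
)

dim = DeltaTable.forName(spark, "gold.dim_customer")

# Type 1 (overwrite-in-place) upsert: update changed attributes, insert new members
(
    dim.alias("d")
    .merge(updates.alias("u"), "d.customer_id = u.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

A slowly changing type 2 dimension needs more bookkeeping (effective dates and a current-row flag) but builds on the same MERGE primitive.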
environment that values expertise, innovation, and collaboration. Responsibilities: Design, develop, and maintain scalable data pipelines and transformation processes utilizing modern tools and frameworks Implement and optimise data workflows and ETL procedures within Snowflake Create robust data models to support advanced analytics and machine learning initiatives Collaborate with cross-functional stakeholders to understand business data requirements and deliver effective solutions Establish … of Data Engineers Candidate Profile: strong background in building enterprise data solutions Extensive hands-on experience with Python and data transformation techniques Expertise in Snowflake cloud data platform and ETL process optimisation Familiarity with machine learning tools such as TensorFlow or scikit-learn Strong communication skills, capable of translating complex technical concepts to non-technical stakeholders Experience managing end-to…
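As a hedged sketch of Python-driven transformation inside Snowflake of the kind this listing suggests — the connection details, database, and table names are placeholders, and in practice credentials would come from a secrets manager rather than being hard-coded:

```python
import snowflake.connector

# Placeholder connection details for illustration only
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # ELT-style transformation pushed down into Snowflake: build a curated table
    # from a raw staging table in one set-based statement
    cur.execute("""
        CREATE OR REPLACE TABLE ANALYTICS.CURATED.ORDERS AS
        SELECT order_id,
               TRY_TO_TIMESTAMP(order_ts)    AS order_ts,
               TRY_TO_NUMBER(amount, 12, 2)  AS amount
        FROM ANALYTICS.STAGING.RAW_ORDERS
        WHERE order_id IS NOT NULL
    """)
finally:
    cur.close()
    conn.close()
```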