Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
Snowflake or similar cloud-based data warehousing solutions · 3+ years of experience in data development and solutions in highly complex data environments with large data volumes · Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices · Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment · Experience with code …
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
Harvey Nash
database platforms Cloud Architecture: Experience with Azure data services (Azure Data Factory, Azure Synapse, Azure Data Lake). Data Governance: Understanding of data quality, data lineage, and metadata management principles. ETL/ELT Processes: Experience designing and implementing data integration workflows. Business Intelligence: Knowledge of reporting and analytics platforms (Power BI, SSRS, or similar). Data Warehousing: Experience with dimensional modelling and …
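To ground the data quality and governance skills this listing asks for, here is a minimal, illustrative sketch of the kind of automated quality gate an ETL/ELT workflow might run before a load step. It is written in Python/pandas, and all column names and rules are invented for the example.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []

    # Completeness: key columns must not contain nulls (hypothetical columns).
    for col in ("customer_id", "order_date"):
        nulls = int(df[col].isna().sum())
        if nulls:
            failures.append(f"{col}: {nulls} null values")

    # Uniqueness: the business key should not be duplicated.
    dupes = int(df.duplicated(subset=["order_id"]).sum())
    if dupes:
        failures.append(f"order_id: {dupes} duplicate rows")

    # Validity: amounts should be non-negative.
    negatives = int((df["amount"] < 0).sum())
    if negatives:
        failures.append(f"amount: {negatives} negative values")

    return failures

if __name__ == "__main__":
    batch = pd.DataFrame({
        "order_id": [1, 2, 2],
        "customer_id": ["a", None, "c"],
        "order_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "amount": [10.0, -5.0, 7.5],
    })
    for failure in run_quality_checks(batch):
        print("FAILED:", failure)
```

In an Azure Data Factory or Synapse pipeline, checks like these would usually sit in a validation step before the load; the sketch only shows the rule logic itself.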
data modelling techniques. Hands-on expertise in dbt for data transformation and pipeline orchestration. Solid background in data engineering with proficiency in SQL and Python. Experience with ELT/ETL frameworks and modern data stack tools. Knowledge of data governance, access control, and privacy best practices. Proficiency in applying AI foundations to ensure data is aligned for consumption and content …
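dbt expresses staging transformations in SQL; purely to illustrate the kind of transformation work this listing describes, the same rename/cast/de-duplicate pattern is sketched below in Python/pandas. The source schema and column names are assumptions made for the example.

```python
import pandas as pd

def stg_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Staging-style model: standardise names and types, then de-duplicate."""
    return (
        raw.rename(columns={"ORDER_ID": "order_id", "ORDER_TS": "ordered_at", "AMT": "amount"})
           .assign(
               ordered_at=lambda d: pd.to_datetime(d["ordered_at"], utc=True),
               amount=lambda d: d["amount"].astype("float64"),
           )
           .drop_duplicates(subset=["order_id"], keep="last")
           .reset_index(drop=True)
    )
```

A dbt model would express the same logic declaratively and let the warehouse execute it; the Python version is only a readable stand-in.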
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
with strategic goals About the Candidate The ideal candidate will possess the following: Experience as a Data/Integration Engineer or similar role · Understanding of data warehousing, ELT/ETL processes, and data modelling · Knowledge of Cloud-native development (Azure, Snowflake, dbt) · Agile methodologies and collaboration to drive innovation and continuous improvement · Excellent communication skills and ability to work with …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen Group
data processing, transformation, and analysis. Apply data architecture frameworks such as TOGAF and DAMA-DMBOK to guide enterprise data strategy and governance. Design and implement data integration pipelines using ETL/ELT methodologies and API-driven architectures. Oversee data governance initiatives including metadata management, data quality, and master data management (MDM). Evaluate and integrate big data technologies and streaming …
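As one concrete illustration of the API-driven integration pipelines mentioned here, the sketch below pulls paginated JSON from a hypothetical REST endpoint, ready for an ELT load. The URL, pagination shape, and auth scheme are assumptions for the example, not a specific product's API.

```python
import requests

def extract_records(base_url: str, token: str) -> list[dict]:
    """Walk a paginated REST endpoint returning {"results": [...], "next": url-or-null}."""
    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {token}"})

    records: list[dict] = []
    url = f"{base_url}/v1/customers"  # hypothetical resource
    while url:
        response = session.get(url, timeout=30)
        response.raise_for_status()
        payload = response.json()
        records.extend(payload["results"])
        url = payload.get("next")  # None ends the loop
    return records
```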
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
members through technical guidance. Collaborate with stakeholders to deliver user-centric solutions. About the Candidate The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and dbt. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code. CI/CD pipeline implementation and test …
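"Clean, testable, maintainable code" with CI/CD usually comes down to transformations small enough to unit-test. A minimal sketch of that style in Python with pytest follows; the VAT example and column names are invented.

```python
# test_transforms.py - run with `pytest`, locally or as a CI/CD pipeline step.
import pandas as pd
import pytest

def add_vat(df: pd.DataFrame, rate: float = 0.2) -> pd.DataFrame:
    """Transformation under test: add a gross amount column without mutating the input."""
    out = df.copy()
    out["gross_amount"] = out["net_amount"] * (1 + rate)
    return out

def test_add_vat_computes_gross_amount():
    df = pd.DataFrame({"net_amount": [100.0, 50.0]})
    result = add_vat(df)
    assert result["gross_amount"].tolist() == pytest.approx([120.0, 60.0])

def test_add_vat_does_not_mutate_input():
    df = pd.DataFrame({"net_amount": [100.0]})
    add_vat(df)
    assert "gross_amount" not in df.columns
```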
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Luxoft
GCP, Azure) and AI/ML services. Knowledge of MLOps tools and practices (e.g., MLflow, Kubeflow, Vertex AI, Azure ML). Strong understanding of data engineering, data pipelines, and ETL workflows. Excellent problem-solving, communication, and stakeholder engagement skills. Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or related field. Luxoft is committed to …
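Of the MLOps tools named above, MLflow is the most self-contained to sketch: a toy run that logs hyperparameters, a metric, and the trained model. The experiment name and model choice are arbitrary for the example, and a real setup would point MLflow at a shared tracking server.

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("demo-classifier")  # arbitrary experiment name

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    params = {"C": 0.5, "max_iter": 200}
    model = LogisticRegression(**params).fit(X_train, y_train)

    mlflow.log_params(params)  # hyperparameters for the run
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # serialised model artifact
```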
complex systems talk to each other. You'll thrive in a collaborative environment, enjoy working on diverse projects, and take pride in building solutions that make a difference. Experience with ETL/ELT tools, data lake or data mesh architectures, or automation for data integration would be a real plus. Additional Information: We will be reviewing and speaking to candidates on …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
as Logic Apps, Function Apps, Service Bus, Event Grid, Event Hub, and API Management. Experience with RESTful APIs, JSON, and integration patterns (e.g., pub/sub, request/response, ETL). Understanding of DevOps practices and tools (Azure DevOps, GitHub, CI/CD). Knowledge of security and identity management in Azure (e.g., OAuth2, Managed Identities, RBAC). Understanding of …
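As an illustration of the pub/sub and managed-identity points in this listing, here is a minimal Service Bus queue consumer using the azure-servicebus and azure-identity packages; the namespace and queue names are placeholders. DefaultAzureCredential falls back through developer credentials locally and picks up a Managed Identity when running in Azure.

```python
from azure.identity import DefaultAzureCredential
from azure.servicebus import ServiceBusClient

NAMESPACE = "example-namespace.servicebus.windows.net"  # placeholder namespace
QUEUE_NAME = "orders-queue"                             # placeholder queue

credential = DefaultAzureCredential()  # Managed Identity in Azure, dev credentials locally

with ServiceBusClient(fully_qualified_namespace=NAMESPACE, credential=credential) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
        for message in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            print("received:", str(message))    # message body rendered as text
            receiver.complete_message(message)  # settle: remove it from the queue
```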
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Genpact
and opportunities to work from home). Role Responsibilities You will be responsible for: Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and DataBricks. Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs. Taking ownership of the end-to-end engineering … to maintain a high quality. Developing and maintaining tooling and automation scripts to streamline repetitive tasks. Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes. Utilizing REST APIs and other integration techniques to connect various data sources. Maintaining documentation, including data flow diagrams, technical specifications, and processes. You have: Proficiency in Python programming, including … on experience with cloud services, especially DataBricks, for building and managing scalable data pipelines · Proficiency in working with Snowflake or similar cloud-based data warehousing solutions · Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices · Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment · Experience with code …
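A minimal sketch of the kind of Python/Databricks ETL job this role describes: read raw JSON, standardise it, and write a partitioned Delta table. The paths and column names are invented, and it assumes a Databricks (or other Delta-enabled Spark) runtime.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw JSON landed by an upstream process (hypothetical path).
raw = spark.read.json("/mnt/raw/orders/")

# Transform: standardise types, derive a partition column, drop bad records.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("order_id").isNotNull())
)

# Load: partitioned Delta table for downstream consumers.
(
    orders.write.format("delta")
          .mode("overwrite")
          .partitionBy("order_date")
          .save("/mnt/curated/orders/")
)
```

Pulling the transform block out into a function that takes and returns a DataFrame makes it unit-testable against small fixture data, which is the testing approach the listing asks for.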
to ensure best practices and continuous improvement • Maintain transparency and communication through reporting and leadership updates Required Skills: 20+ years' experience in Change Delivery · Strong background in Software Delivery, ETL & Data Migration programs · Familiar with SEF Azure DevOps environment & tooling: VDIs, AKS clusters, etc. · Strong risk management expertise, including for complex data migrations · Strong understanding of Agile methodologies · Experienced in …
data platforms (Microsoft Fabric as a preference) and modern data warehousing. Deep proficiency in data modelling (conceptual, logical, physical) across relational, dimensional, and NoSQL paradigms. Skilled in data integration, ETL/ELT design, and API-driven architectures. Experience with data governance, metadata management, and master data management. They're big on expertise, not hierarchy, so you'll be trusted with …
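To make the dimensional modelling requirement concrete, here is an illustrative (not prescriptive) sketch in Python/pandas: derive a dimension with surrogate keys from a flat extract, then build a fact table that references it. All table and column names are invented for the example; in practice this would be dimensional SQL inside the warehouse (Fabric, Snowflake, or similar).

```python
import pandas as pd

source = pd.DataFrame({
    "customer_name": ["Acme", "Acme", "Globex"],
    "region": ["Scotland", "Scotland", "England"],
    "order_id": [101, 102, 103],
    "amount": [250.0, 75.0, 120.0],
})

# Dimension: one row per distinct customer, with a generated surrogate key.
dim_customer = (
    source[["customer_name", "region"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .assign(customer_key=lambda d: d.index + 1)
)

# Fact: measures plus the foreign key that points at the dimension.
fact_orders = source.merge(dim_customer, on=["customer_name", "region"])[
    ["order_id", "customer_key", "amount"]
]

print(dim_customer)
print(fact_orders)
```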