Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
Proficiency in working with Snowflake or similar cloud-based data warehousing solutions · 3+ years of experience in data development and solutions in highly complex data environments with large data volumes · Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices · Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment · Experience with code …
database platforms · Cloud Architecture: Experience with Azure data services (Azure Data Factory, Azure Synapse, Azure Data Lake) · Data Governance: Understanding of data quality, data lineage, and metadata management principles · ETL/ELT Processes: Experience designing and implementing data integration workflows · Business Intelligence: Knowledge of reporting and analytics platforms (Power BI, SSRS, or similar) · Data Warehousing: Experience with dimensional modelling and …
data modelling techniques. Hands-on expertise in dbt for data transformation and pipeline orchestration. Solid background in data engineering with proficiency in SQL and Python. Experience with ELT/ETL frameworks and modern data stack tools. Knowledge of data governance, access control, and privacy best practices. Proficiency in applying AI foundations to ensure data is aligned for consumption and content …
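By way of illustration of the dbt skills these adverts keep citing, here is a minimal sketch of a dbt Python model of the kind supported on warehouses such as Databricks and Snowflake; the model name, upstream reference, and columns are invented for the example, not taken from any posting:

```python
# models/staging/stg_orders.py — a minimal dbt Python model (hypothetical names).
# dbt materialises the returned DataFrame as a table in the target warehouse.
import pyspark.sql.functions as F

def model(dbt, session):
    # Declare the materialisation for this model.
    dbt.config(materialized="table")

    # "raw_orders" is an assumed upstream model in the same project.
    orders = dbt.ref("raw_orders")

    # Light-touch cleansing: deduplicate and standardise a timestamp column.
    return (
        orders
        .dropDuplicates(["order_id"])
        .withColumn("order_date", F.to_date("order_ts"))
    )
```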
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
with strategic goals About the Candidate The ideal candidate will possess the following: Experience as a Data/Integration Engineer or similar role. Understanding of data warehousing, ELT/ETL processes, and data modelling. Knowledge of cloud-native development (Azure, Snowflake, dbt). Agile methodologies and collaboration to drive innovation and continuous improvement. Excellent communication skills and ability to work with …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen Group
data processing, transformation, and analysis. Apply data architecture frameworks such as TOGAF and DAMA-DMBOK to guide enterprise data strategy and governance. Design and implement data integration pipelines using ETL/ELT methodologies and API-driven architectures. Oversee data governance initiatives including metadata management, data quality, and master data management (MDM). Evaluate and integrate big data technologies and streaming …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
members through technical guidance. Collaborate with stakeholders to deliver user-centric solutions. About the Candidate The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and dbt. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code. CI/CD pipeline implementation and test …
Proficiency in working with Snowflake or similar cloud-based data warehousing solutions. 3+ years of experience in data development and solutions in highly complex data environments with large data volumes. Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices. Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment. Experience with code … 'Nice to have': Experience in financial services. Knowledge of regulatory requirements in the financial industry. Tasks: Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks. Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs. Taking ownership of the end-to-end engineering … to maintain a high quality. Developing and maintaining tooling and automation scripts to streamline repetitive tasks. Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes. Utilizing REST APIs and other integration techniques to connect various data sources. Maintaining documentation, including data flow diagrams, technical specifications, and processes. Designing and implementing tailored data solutions to …
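As a rough sketch of the kind of ETL job the responsibilities above describe — a minimal Databricks-style PySpark pipeline. The storage paths, columns, and target table are placeholders, not anything from the advert:

```python
# Minimal Databricks-style ETL sketch (hypothetical paths and table names).
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files landed in cloud storage.
raw = spark.read.option("header", True).csv("/mnt/landing/orders/*.csv")

# Transform: cleanse types, drop duplicates, derive a load date.
clean = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["order_id"])
       .withColumn("load_date", F.current_date())
)

# Load: append into a managed Delta table for downstream consumers.
clean.write.format("delta").mode("append").saveAsTable("analytics.orders")
```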
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Ubique Systems
Hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions. Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices. Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing. Familiarity with data orchestration tools, such as …
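The truncated line above points at orchestration tooling; Apache Airflow is one common example. Purely as an illustration, a minimal DAG that sequences a daily extract and load might look like this (the DAG id and callables are invented; the `schedule` argument assumes Airflow 2.4+):

```python
# A minimal Airflow DAG sketch (hypothetical task names and schedule).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")

def transform_load():
    print("transform and load into the warehouse")

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run the two steps in sequence each day.
    PythonOperator(task_id="extract", python_callable=extract) >> \
        PythonOperator(task_id="transform_load", python_callable=transform_load)
```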
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Luxoft
GCP, Azure) and AI/ML services. Knowledge of MLOps tools and practices (e.g., MLflow, Kubeflow, Vertex AI, Azure ML). Strong understanding of data engineering, data pipelines, and ETL workflows. Excellent problem-solving, communication, and stakeholder engagement skills. Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or related field. Luxoft is committed to …
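For a sense of what hands-on MLflow practice involves, a minimal experiment-tracking sketch; the experiment name, parameter, and metric values are illustrative only:

```python
# Minimal MLflow tracking sketch (hypothetical experiment and values).
import mlflow

mlflow.set_experiment("churn_model_dev")

with mlflow.start_run(run_name="baseline"):
    # Record the configuration used for this training run...
    mlflow.log_param("n_estimators", 200)
    # ...and the resulting evaluation metric, so runs are comparable later.
    mlflow.log_metric("auc", 0.87)
```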
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Infoplus Technologies UK Limited
Mandatory skills required: PySpark, Python, SQL, Snowflake. Role Responsibilities You will be responsible for: • Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks • Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs. • Taking ownership of the end-to-end engineering … to maintain a high quality. • Developing and maintaining tooling and automation scripts to streamline repetitive tasks. • Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes • Utilizing REST APIs and other integration techniques to connect various data sources • Maintaining documentation, including data flow diagrams, technical specifications, and processes. You Have: • Proficiency in Python programming, including … • Hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines • Proficiency in working with Snowflake or similar cloud-based data warehousing solutions • Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices • Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment. • Experience with code …
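The unit-testing responsibility above is easy to make concrete. A hedged sketch of how one ETL transformation might be tested with pytest against a local Spark session — the function under test is a stand-in, not anything from the advert:

```python
# Unit-testing an ETL transformation with pytest (hypothetical function under test).
import pytest
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

def add_load_date(df):
    """The transformation under test: stamp each row with a load date."""
    return df.withColumn("load_date", F.current_date())

@pytest.fixture(scope="session")
def spark():
    # A small local session is enough for transformation tests.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

def test_add_load_date_adds_column(spark):
    df = spark.createDataFrame([(1, "a")], ["id", "value"])
    result = add_load_date(df)
    assert "load_date" in result.columns
    assert result.count() == 1
```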
complex systems talk to each other. You'll thrive in a collaborative environment, enjoy working on diverse projects, and take pride in building solutions that make a difference. Experience with ETL/ELT tools, data lake or data mesh architectures, or automation for data integration would be a real plus. Additional Information: We will be reviewing and speaking to candidates on …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
as Logic Apps, Function Apps, Service Bus, Event Grid, Event Hub, and API Management. Experience with RESTful APIs, JSON, and integration patterns (e.g., pub/sub, request/response, ETL). Understanding of DevOps practices and tools (Azure DevOps, GitHub, CI/CD). Knowledge of security and identity management in Azure (e.g., OAuth2, Managed Identities, RBAC). Understanding of …
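To make the pub/sub pattern concrete, a minimal sketch against the azure-servicebus Python SDK; the connection string, topic, and subscription names are placeholders:

```python
# Pub/sub over Azure Service Bus topics (hypothetical connection/topic names).
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "<your-service-bus-connection-string>"  # placeholder

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    # Publish: send an event to a topic; any number of subscriptions can listen.
    with client.get_topic_sender(topic_name="orders") as sender:
        sender.send_messages(ServiceBusMessage('{"order_id": 42}'))

    # Subscribe: receive from one named subscription on that topic and settle.
    with client.get_subscription_receiver(
        topic_name="orders", subscription_name="billing", max_wait_time=5
    ) as receiver:
        for msg in receiver:
            print(str(msg))
            receiver.complete_message(msg)
```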
data and validate by profiling in a data environment. Understand data structures and data model (dimensional & relational) concepts, like star schema or fact & dimension tables, to design and develop ETL patterns/mechanisms to ingest, analyse, validate, normalise and cleanse data. Understand and produce ‘Source to Target Mapping’ (STTM) documents containing data structures and business & data transformation logic. Liaise with data …
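As an illustration of the fact-and-dimension concepts this role mentions, a small PySpark sketch that splits a flat sales extract into one dimension and one fact table; all table and column names are invented:

```python
# Deriving a star schema from a flat extract (hypothetical schema).
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("star_schema_demo").getOrCreate()
sales = spark.read.parquet("/mnt/staging/sales")  # flat source extract

# Dimension: one row per customer, with a surrogate key.
dim_customer = (
    sales.select("customer_id", "customer_name", "region")
         .dropDuplicates(["customer_id"])
         .withColumn("customer_key", F.monotonically_increasing_id())
)

# Fact: measures at transaction grain, keyed to the dimension.
fact_sales = (
    sales.join(dim_customer.select("customer_id", "customer_key"), "customer_id")
         .select("customer_key", "order_id", "order_date", "amount")
)
```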
to ensure best practices and continuous improvement • Maintain transparency and communication through reporting and leadership updates Required Skills: 20+ years’ experience in Change Delivery Strong background in Software Delivery, ETL & Data Migration programmes Familiar with SEF Azure DevOps environment & tooling; VDIs, AKS clusters, etc. Strong risk management expertise – including for complex data migrations Strong understanding of Agile methodologies Experienced in …
data platforms (Microsoft Fabric as a preference) and modern data warehousing. Deep proficiency in data modelling (conceptual, logical, physical) across relational, dimensional, and NoSQL paradigms. Skilled in data integration, ETL/ELT design, and API-driven architectures. Experience with data governance, metadata management, and master data management. They're big on expertise, not hierarchy, so you'll be trusted with …
AWS services. The ideal candidate will have a solid background in data engineering, Python development, and cloud-native architecture. YOUR PROFILE Design, develop, and maintain robust data pipelines and ETL workflows using AWS services. Implement scalable data processing solutions using PySpark and AWS Glue. Build and manage infrastructure as code using CloudFormation. Develop and deploy serverless applications using AWS Lambda …
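For illustration of the Glue work described above, a skeleton of a PySpark-based Glue job, hedged — the catalog database, table, and S3 path are placeholders:

```python
# Skeleton of an AWS Glue PySpark job (hypothetical catalog and S3 names).
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, transform with plain Spark, write to S3.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="orders"
)
df = dyf.toDF().dropDuplicates(["order_id"])

df.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
job.commit()
```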
with Microsoft Fabric as the preferred option * Experience with modern data warehousing * Strong data modelling skills across relational and NoSQL * Experience with governance, metadata, master data and integration including ETL/ELT and API-based approaches NO SPONSORSHIP can be offered for this role. If that sounds of interest to you then please apply, or reach out for more information.
to Have Experience in financial or banking data environments. Familiarity with AI/ML frameworks (e.g., LangChain, OpenAI API, Hugging Face). Exposure to large-scale data migration or ETL projects. What’s Offered Competitive salary and benefits package. Opportunity to work on a high-impact data modernization initiative. Collaborative and innovative environment focused on cutting-edge technology.
and efficiency. Role Overview: We are seeking a skilled Data Engineer to join our UK team. In this role, you will be responsible for designing, developing, and maintaining scalable ETL data pipelines leveraging Python and Databricks. You will collaborate with cross-functional teams to understand data requirements and ensure the delivery of reliable, high-quality data solutions that support our business objectives. Key Responsibilities: Collaborate with stakeholders and cross-functional teams to gather data requirements and translate them into efficient ETL processes. Develop, test, and deploy end-to-end ETL pipelines extracting data from various sources, transforming it to meet business needs, and loading it into target systems. Take ownership of the full engineering lifecycle, including data extraction, cleansing, transformation … Hands-on experience working with Databricks and cloud services for building scalable data pipelines. Strong knowledge of cloud data warehousing solutions such as Snowflake or similar. Solid understanding of ETL workflows, data modelling, data warehousing, and data integration best practices. Experience working within agile teams, demonstrating collaboration and adaptability in fast-paced environments. Familiarity with version control tools such as …
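As a sketch of the Snowflake loading step such a pipeline might end with, using the snowflake-connector-python package; the account, credentials, staged file, and table are placeholders:

```python
# Loading a curated file into Snowflake (hypothetical account, stage, and table).
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",  # placeholder
    user="<user>",
    password="<password>",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Stage the local file to the table stage, then bulk-load with COPY INTO.
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS")
    cur.execute("COPY INTO ORDERS FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
finally:
    conn.close()
```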
process Strong specialist experience across several of the following areas: Cloud services (SaaS, PaaS) (AWS preferred) Enterprise integration patterns and tooling (MuleSoft preferred) Enterprise data, analytics and information management, ETL knowledge High-volume transactional online systems Security and identity management Service and micro-service architecture Knowledge of continuous integration tools and techniques Design and/or development of applications using …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NEC Software Solutions
SQL expertise, particularly in MySQL, who can take ownership of migration tasks, proactively identify risks, and collaborate closely with technical and business stakeholders. Key Responsibilities Design, develop, and execute ETL processes to migrate data from legacy systems to target platforms. Write and optimise complex SQL queries (preferably MySQL) to support data extraction, transformation, and validation. Apply the full data quality … structures and workflows. Qualifications Essential: Proven experience in public sector large-scale data migration projects. Strong SQL development skills within MySQL environments. Solid understanding of data transformations and ETL workflows. Knowledge and application of data quality dimensions in migration processes. Familiarity with case management systems and associated data structures. Strong problem-solving mindset, with the ability to take …
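One small example of the validation work a migration like this involves: reconciling row counts between legacy and target schemas with mysql-connector-python. The connection details and table names are invented for the sketch:

```python
# Row-count reconciliation between legacy and target schemas (hypothetical DSNs).
import mysql.connector

def count_rows(conn_args, table):
    conn = mysql.connector.connect(**conn_args)
    try:
        cur = conn.cursor()
        # Tables come from the fixed allow-list below, so interpolation is safe here.
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()

legacy = {"host": "legacy-db", "user": "ro_user", "password": "...", "database": "old"}
target = {"host": "target-db", "user": "ro_user", "password": "...", "database": "new"}

for table in ["cases", "clients", "documents"]:
    src, dst = count_rows(legacy, table), count_rows(target, table)
    status = "OK" if src == dst else "MISMATCH"
    print(f"{table}: source={src} target={dst} {status}")
```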
environment that values expertise, innovation, and collaboration. Responsibilities: Design, develop, and maintain scalable data pipelines and transformation processes utilizing modern tools and frameworks Implement and optimise data workflows and ETL procedures within Snowflake Create robust data models to support advanced analytics and machine learning initiatives Collaborate with cross-functional stakeholders to understand business data requirements and deliver effective solutions Establish … inform business decisions Candidate Profile: Strong background in building enterprise data solutions Extensive hands-on experience with Python and data transformation techniques Expertise in the Snowflake cloud data platform and ETL process optimisation Familiarity with machine learning tools such as TensorFlow or scikit-learn Strong communication skills, capable of translating complex technical concepts to non-technical stakeholders Experience managing end-to …
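Since the profile pairs data transformation with tools like scikit-learn, a tiny, hypothetical example of handing transformed data to a model — the features and labels are synthetic, purely for illustration:

```python
# Feeding a transformed dataset to scikit-learn (synthetic data for illustration).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))            # stand-in for engineered features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # stand-in for a binary label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```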
Aberdeen, Aberdeenshire, Scotland, United Kingdom Hybrid / WFH Options
Bright Purple Resourcing
week into the office Pension scheme Career progression opportunities City centre parking What you will be doing: Developing and maintaining data integrations between internal and client systems. Building reliable ETL processes to manage and transform large data sets. Working closely with clients and internal teams to support smooth onboarding and data flows. Contributing to the ongoing development of our SaaS … clearly, and enjoys finding elegant technical solutions. You'll bring: Strong C#/.Net Core and SQL skills. Experience working with REST APIs, JSON, and data pipelines. An understanding of ETL processes and data transformation. The ability to collaborate across technical and non-technical teams. Experience with Azure. Knowledge of Power BI is a nice-to-have. APPLY NOW via the …
such as Pandas, NumPy, PySpark, etc. You will also have a number of years' hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data development and solutions in highly complex data environments with large data volumes is also required. You will be responsible for collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks. You will also develop and deploy ETL jobs that extract data from various sources, transforming it to meet business needs. Please apply ASAP if this is of interest.
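By way of a small illustration of the Pandas/NumPy skills cited here, a typical cleanse-and-derive step over a tabular extract; the file and columns are invented:

```python
# A small Pandas/NumPy cleansing step (hypothetical file and columns).
import numpy as np
import pandas as pd

df = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Cleanse: drop exact duplicates and coerce bad amounts to NaN.
df = df.drop_duplicates()
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")

# Derive: fill missing amounts with the median and flag high-value orders.
df["amount"] = df["amount"].fillna(df["amount"].median())
df["high_value"] = np.where(df["amount"] > 1000, True, False)

print(df.describe())
```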
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
KBC Technologies UK LTD
About the Role: We are looking for a Data Engineer for our Glasgow location. Mode of Work - hybrid. Databricks being (primarily) a managed Spark engine, strong Spark experience is a must-have. Databricks (Big Data/Spark) & Snowflake specialists – and general Data Engineer …