Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
Role: Data Engineer
Location: Glasgow (hybrid, 3 days onsite)
Contract: 6-12 months with possible extensions (no sponsorship available)
Skills/Qualifications:
· 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
· 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
· 3+ years of proficiency working with Snowflake or similar cloud-based data warehousing solutions
· 3+ years of experience in data development and solutions in highly complex data environments with large data volumes
· Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices
· Ability to work collaboratively in a fast-paced, dynamic environment
· Experience with code versioning tools (e.g., Git)
· Knowledge of Linux operating systems
· Familiarity with REST APIs and integration techniques
· Familiarity with data visualization tools and libraries (e.g., Power BI)
· Background in database administration or performance tuning
· Familiarity with data orchestration tools, such as Apache Airflow
· Previous exposure to big data technologies (e.g., Hadoop, Spark)
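The core task the posting describes, a pipeline that extracts raw records, cleans them, and loads them into a warehouse, can be sketched in miniature. This is an illustrative sketch only, not the employer's actual stack: it uses Python's built-in sqlite3 as a stand-in for a Snowflake connection, and the table and field names are hypothetical.

```python
import sqlite3

def extract():
    """Extract: in practice this would read from source systems
    (files, REST APIs, message queues); here it is inlined sample data."""
    return [
        {"order_id": 1, "amount": "19.99", "country": "uk"},
        {"order_id": 2, "amount": "5.00", "country": "UK"},
        {"order_id": 3, "amount": None, "country": "de"},  # incomplete record
    ]

def transform(rows):
    """Transform: drop incomplete records and normalise values."""
    cleaned = []
    for row in rows:
        if row["amount"] is None:
            continue  # a real pipeline would route this to a rejects table
        cleaned.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "country": row["country"].upper(),
        })
    return cleaned

def load(conn, rows):
    """Load: write the cleaned rows to the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, country TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders (order_id, amount, country) "
        "VALUES (:order_id, :amount, :country)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse connection
load(conn, transform(extract()))
loaded = conn.execute(
    "SELECT order_id, amount, country FROM orders ORDER BY order_id"
).fetchall()
```

Keeping extract, transform, and load as separate functions is what makes a pipeline like this testable in isolation, which is the same shape PySpark or Databricks jobs typically take at larger scale.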
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NEC Software Solutions
Please note: ideally we would seek a candidate who can be hybrid from Glasgow or London, but this role can be remote subject to location.

We are seeking an experienced Data Migration Engineer to support a large-scale migration project. The successful candidate will play a key role in designing, developing, and executing migration strategies to ensure data is transferred accurately, securely, and efficiently into the target environment. This role requires a hands-on engineer with strong SQL expertise, particularly in MySQL, who can take ownership of migration tasks, proactively identify risks, and collaborate closely with technical and business stakeholders.

Key Responsibilities:
· Design, develop, and execute ETL processes to migrate data from legacy systems to target platforms.
· Write and optimise complex SQL queries (preferably MySQL) to support data extraction, transformation, and validation.
· Apply the full data quality framework (accuracy, completeness, consistency, timeliness, validity, uniqueness, integrity) across all migration activities.
· Conduct data profiling, cleansing, and quality assurance checks to ensure accuracy and completeness of migrated data.
· Translate business and technical requirements into data transformations.
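Checks against the data quality framework named above are typically expressed as SQL that counts violations per dimension, so a clean migration yields all zeros. A minimal sketch, using Python's built-in sqlite3 in place of MySQL and a hypothetical customers table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER, email TEXT, created_at TEXT);
INSERT INTO customers VALUES
  (1, 'a@example.com', '2024-01-05'),
  (2, NULL,            '2024-02-10'),
  (2, 'b@example.com', '2024-03-01');
""")

# Each query counts rows violating one dimension of the quality framework.
checks = {
    # Completeness: mandatory fields must not be NULL.
    "completeness": "SELECT COUNT(*) FROM customers WHERE email IS NULL",
    # Uniqueness: the business key must not be duplicated.
    "uniqueness": """
        SELECT COALESCE(SUM(n - 1), 0) FROM
          (SELECT COUNT(*) AS n FROM customers GROUP BY id HAVING COUNT(*) > 1)
    """,
    # Validity: values must match the expected format.
    "validity": ("SELECT COUNT(*) FROM customers "
                 "WHERE email IS NOT NULL AND email NOT LIKE '%@%'"),
}

violations = {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}
```

The sample data deliberately contains one missing email and one duplicated id, so the check results come back as completeness = 1, uniqueness = 1, validity = 0. In a real migration the same queries (in MySQL dialect) would run against both source and target to confirm nothing degraded in transit.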
I am recruiting for a Data Engineer to be based in Glasgow 3 days a week, 2 days remote. The role falls inside IR35, so you will need to work through an umbrella company for the duration of the contract. You must have several years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will also have a number of years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data development and solutions in highly complex data environments with large data volumes is also required. You will be responsible for collaborating with cross-functional teams to understand data requirements, and for designing efficient, scalable, and reliable ETL processes using Python and Databricks. You will also develop and deploy ETL jobs that extract data from various sources, transforming them to meet business needs.
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
KBC Technologies UK LTD
About the Role: We are looking for a Data Engineer for the Glasgow location. Mode of work: hybrid. Databricks being (primarily) a managed Spark engine, strong Spark experience is a must-have. We need Databricks (big data/Spark) and Snowflake specialists with general data engineering skills, including RDBMS fundamentals, SQL, and ETL.
Role Responsibilities
You will be responsible for:
· Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks
· Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs
· Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing, transformation, and loading, ensuring accuracy and consistency
· Creating and managing data pipelines, ensuring proper error handling, monitoring, and performance optimisation
· Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives
· Conducting code reviews, providing constructive feedback, and enforcing coding standards to maintain high quality
· Developing and maintaining tooling and automation scripts to streamline repetitive tasks
· Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes
· Utilising REST APIs and other integration techniques to connect various data sources
· Maintaining documentation, including data flow diagrams, technical specifications, and processes

You Have:
· Proficiency in Python programming, including experience in writing efficient and maintainable code
· Hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
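The error-handling and unit-testing responsibilities above usually come down to isolating transformation logic in plain functions that can be tested without a live cluster. A hedged sketch; the record format and the quarantine policy are illustrative assumptions, not this team's actual design:

```python
def parse_record(raw: str):
    """Transform one raw comma-separated record into a typed dict.
    Raises ValueError for malformed input so callers can decide policy."""
    parts = raw.split(",")
    if len(parts) != 2:
        raise ValueError(f"expected 2 fields, got {len(parts)}: {raw!r}")
    name, amount = parts
    return {"name": name.strip(), "amount": float(amount)}

def run_batch(raw_records):
    """Error handling: keep good records and quarantine bad ones
    rather than failing the whole batch on the first bad row."""
    good, quarantined = [], []
    for raw in raw_records:
        try:
            good.append(parse_record(raw))
        except ValueError as exc:
            quarantined.append({"raw": raw, "error": str(exc)})
    return good, quarantined

# Unit-test style usage: pure functions, no Spark or Databricks needed.
good, bad = run_batch(["alice, 10.5", "broken-row", "bob, 3"])
```

Because parse_record and run_batch take plain Python values, they can be covered by fast unit tests in CI, with Databricks-specific I/O kept at the edges of the job.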
the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

YOUR ROLE
We are seeking a skilled and hands-on AWS Data Engineer with strong coding expertise and deep experience in building scalable data solutions using AWS services. The ideal candidate will have a solid background in data engineering, Python development, and cloud-native architecture.

YOUR PROFILE
· Design, develop, and maintain robust data pipelines and ETL workflows using AWS services.
· Implement scalable data processing solutions using PySpark and AWS Glue.
· Build and manage infrastructure as code using CloudFormation.
· Develop and deploy serverless applications using AWS Lambda, Step Functions, and S3.
· Perform data querying and analysis using Athena.
· Collaborate with data scientists to operationalize models using SageMaker.
· Ensure secure and compliant data handling using IAM, KMS, and VPC configurations.
· Containerize applications using ECS for scalable deployment.
· Write clean, testable code in Python, with a strong emphasis on unit testing.
· Use GitLab for version control, CI/CD, and collaboration.
· Strong coding background in Python.
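In the serverless pipelines this profile describes, transformation logic often lives in an AWS Lambda handler. A minimal sketch of that handler pattern; the event shape and field names are hypothetical, and the function is deliberately pure Python so it can be unit-tested locally without any AWS services:

```python
import json

def handler(event, context=None):
    """Lambda-style entry point: receives a batch of records (e.g. from
    S3 event notifications or a queue) and returns a summary.
    The 'records'/'body' shape here is an assumption for illustration."""
    processed, skipped = [], 0
    for record in event.get("records", []):
        body = json.loads(record["body"])
        if "user_id" not in body:
            skipped += 1  # in production: route to a dead-letter queue
            continue
        processed.append({
            "user_id": body["user_id"],
            "source": record.get("source", "unknown"),
        })
    return {"statusCode": 200, "processed": processed, "skipped": skipped}

# Local invocation with a fabricated test event (no AWS required):
result = handler({
    "records": [
        {"body": json.dumps({"user_id": 42}), "source": "s3"},
        {"body": json.dumps({"other": 1})},
    ]
})
```

Keeping the handler free of direct AWS SDK calls (injecting clients at the edges instead) is what makes the "clean, testable code with a strong emphasis on unit testing" requirement practical for Lambda-based pipelines.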
Role Title: Sr. Databricks Engineer
Location: Glasgow
Duration: until 31/12/2026
Days on site: 2-3
MUST BE PAYE THROUGH UMBRELLA

Role Description:
We are currently migrating our data pipelines from AWS to Databricks and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering role focused on designing, building, and optimizing scalable data solutions using the Databricks platform.

Key Responsibilities:
• Lead the migration of existing AWS-based data pipelines to Databricks.
• Design and implement scalable data engineering solutions using Apache Spark on Databricks.
• Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines.
• Optimize performance and cost-efficiency of Databricks workloads.
• Develop and maintain CI/CD workflows for Databricks using GitLab or similar tools.
• Ensure data quality and reliability through robust unit testing and validation frameworks.
• Implement best practices for data governance, security, and access control within Databricks.
• Provide technical mentorship and guidance to junior engineers.

Must-Have Skills:
• Strong hands-on experience with Databricks and Apache Spark.
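A validation pattern commonly used in platform migrations like the one described above is source/target reconciliation: compare row counts plus an order-independent content checksum per table. A hedged stdlib sketch; in practice both sides would be real AWS and Databricks tables, whereas here they are in-memory datasets with hypothetical names:

```python
import hashlib

def table_fingerprint(rows):
    """Row count plus an order-independent content checksum.
    XOR of per-row hashes ignores row order; the separate count
    guards against identical duplicate rows cancelling out."""
    digest = 0
    for row in rows:
        h = hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()
        digest ^= int(h, 16)
    return {"count": len(rows), "checksum": digest}

def reconcile(source_rows, target_rows):
    """Compare a source extract against the migrated target."""
    src = table_fingerprint(source_rows)
    tgt = table_fingerprint(target_rows)
    return {
        "counts_match": src["count"] == tgt["count"],
        "content_matches": src["checksum"] == tgt["checksum"],
    }

# Hypothetical extracts: source (AWS) vs target (Databricks).
aws_rows = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
dbx_rows = [{"id": 2, "v": "b"}, {"id": 1, "v": "a"}]  # same data, new order
report = reconcile(aws_rows, dbx_rows)
```

Running a check like this per table after each migrated pipeline gives the "validation frameworks" bullet above a concrete pass/fail signal, independent of how either platform orders its output.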