Lead Data Engineer Excellent package and bonus on offer Ideally Central Belt Scotland although not essential. This can begin as a short-term contract, although we are recruiting with a view to it becoming a permanent role. IF YOU HAVE WORKED IN THE ENERGY/CHEMICAL INDUSTRY THIS IS A HUGE PLUS Net Talent is pleased to present an exciting opportunity for a Lead Data Engineer with our esteemed client, a prominent energy company based in Edinburgh. This organisation is renowned for its innovative approach to data, analytics, and consultancy within the energy and chemicals sectors. As they embark on a significant greenfield data transformation project, they seek a talented professional to spearhead their data pipeline development, leveraging advanced technologies such as Snowflake. This role offers a unique chance to shape a new data ecosystem from scratch, working within a supportive environment that values expertise, innovation, and collaboration. Responsibilities: Design, develop, and maintain scalable data pipelines and transformation processes utilizing modern tools and frameworks Implement and optimise data workflows and ETL procedures within Snowflake
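To make the Snowflake ETL responsibilities above concrete, here is a minimal, hedged sketch of one such load-and-transform step using the official snowflake-connector-python package. The account details, stage, and table names (RAW_READINGS, ANALYTICS.READINGS) are invented placeholders, not details from the advert.

```python
# Minimal sketch of a Snowflake ETL step: stage-load raw files, then
# upsert cleaned rows into an analytics table. Placeholder credentials only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="...",            # use a secrets manager in practice
    warehouse="TRANSFORM_WH",
    database="ENERGY",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load new files from an external stage into a raw table.
    cur.execute(
        "COPY INTO RAW_READINGS FROM @landing_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Upsert non-null readings into the analytics layer.
    cur.execute("""
        MERGE INTO ENERGY.ANALYTICS.READINGS AS t
        USING (
            SELECT meter_id, reading_ts, value
            FROM RAW_READINGS
            WHERE value IS NOT NULL
        ) AS s
        ON t.meter_id = s.meter_id AND t.reading_ts = s.reading_ts
        WHEN MATCHED THEN UPDATE SET t.value = s.value
        WHEN NOT MATCHED THEN
            INSERT (meter_id, reading_ts, value)
            VALUES (s.meter_id, s.reading_ts, s.value)
    """)
finally:
    conn.close()
```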
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
ECOM
Senior Data Engineer – Hybrid (Edinburgh) | £65K–£85K Ecom Recruitment are working with a fast-growing digital consultancy delivering cutting-edge data and tech solutions for some of the UK’s biggest brands. We’re looking for a Senior Data Engineer to join their expanding data team, someone who’s passionate about building cloud-native data platforms and pipelines that actually make an impact. You’ll be hands-on with AWS, Azure, or GCP, working in a collaborative environment that values innovation, quality, and teamwork. 💡 What you’ll bring: Solid experience with Python, SQL, Spark, and Airflow Confident working across AWS, Azure, or GCP Great communication skills — able to work with both tech and non-tech stakeholders … in the Edinburgh office 35 days holiday (including flexible bank holidays) Private medical insurance Enhanced parental and adoption leave Pension matched up to 5% If you’re a Senior Data Engineer who loves solving complex problems, enjoys variety, and wants to work with modern cloud technologies — we’d love to hear from you. Drop me a message or …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Morson Edge
Senior Data Engineer – Edinburgh | £65K–£85K Salary: £65,000 – £85,000 Location: Edinburgh (Hybrid – 2 days per week in office) ECOM Recruitment are working with a growing digital consultancy that delivers data and technology solutions for some of the UK's best-known brands. We're looking for a Senior Data Engineer to join their growing data team. You'll be working with a talented group of engineers to design and build modern, cloud-based data platforms and pipelines that make a real impact. This is a great opportunity to get hands-on with the latest tools and technologies within a business that truly values collaboration, innovation, and quality. The Role As a senior member of the team, you'll be responsible for building and maintaining scalable data pipelines that drive insights and decision-making for clients in fast-moving industries, including the gambling sector, so it's important you're comfortable working in that space. You'll work closely with other engineers, analysts, and client stakeholders to deliver reliable, automated, and …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
Role: Data Engineer Location: Glasgow (Hybrid, 3 days onsite) Contract: 6–12 months with possible extensions (no sponsorship available) Skills/Qualifications: · 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. · 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines · 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions · 3+ years of experience in data development and solutions in highly complex data environments with large data volumes · Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices · … Ability to work collaboratively in a fast-paced, dynamic environment · Experience with code versioning tools (e.g., Git) · Knowledge of Linux operating systems · Familiarity with REST APIs and integration techniques · Familiarity with data visualization tools and libraries (e.g., Power BI) · Background in database administration or performance tuning · Familiarity with data orchestration tools, such as Apache Airflow · Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
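As an illustration of the Python/PySpark pipeline work this listing asks for, here is a short, hedged sketch of a typical batch transformation. The S3 paths and column names are invented for the example and do not come from the advert.

```python
# Illustrative PySpark batch job: read raw CSVs, clean, aggregate, and write
# partitioned Parquet. Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_load").getOrCreate()

orders = (
    spark.read.option("header", True).csv("s3://raw-bucket/orders/")  # placeholder path
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])           # de-duplicate on the business key
    .filter(F.col("amount") > 0)            # drop invalid rows
)

daily = orders.groupBy(F.to_date("order_ts").alias("order_date")).agg(
    F.count("*").alias("orders"),
    F.sum("amount").alias("revenue"),
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/daily_orders/"     # placeholder path
)
```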
Role: Data Engineer (Python, Databricks, Snowflake, ETL) Location: Glasgow, UK (3 days/week on-site) Job Type: Contract Skills/Qualifications: 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions 3+ years of experience in data development and solutions in highly complex data environments with large data volumes. Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices. Familiarity with … Ability to work collaboratively in a fast-paced, dynamic environment. Experience with code versioning tools (e.g., Git) Knowledge of Linux operating systems Familiarity with REST APIs and integration techniques Familiarity with data visualization tools and libraries (e.g., Power BI) Background in database administration or performance tuning Familiarity with data orchestration tools, such as Apache Airflow Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Bright Purple Resourcing
Data Engineer Remote (to +/- 2 hrs of UK Time Zone) Are you a skilled Data Engineer with a passion for financial markets? We're working with a pioneering tech-driven company building innovative financial data solutions. This is an exciting opportunity to join a collaborative and forward-thinking engineering team working on a cutting-edge data platform for financial datasets. You'll help shape how complex market data is processed, structured, and made accessible for analytics, research, and insight generation. What you'll be doing: Designing and maintaining data pipelines Working with time-series and structured/unstructured datasets Applying your industry know-how within financial data to ensure continual product development Tech you'll work with: Functional programming, e.g. Ruby, Python, Elixir PostgreSQL OpenAPI integrations Various data streaming and processing tools What's in it for you: Fully remote working Flexible hours Trust and ownership Chance to build the future of financial data platforms! Apply now to learn more. Bright Purple are an equal opportunities employer: we are proud to …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
Data Engineer Location - Glasgow (hybrid, 3 days per week) Contract role (6 to 12 months) Skills/Qualifications: · 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. · 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines · 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions · 3+ years of experience in data development and solutions in highly complex data environments with large data volumes · Experience with code versioning tools (e.g., Git) · Knowledge of Linux operating systems · Familiarity with REST APIs and integration techniques · Familiarity with data visualization tools and libraries (e.g., Power BI) · Background in database administration or performance tuning · Familiarity with data orchestration tools, such as Apache Airflow · Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
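Since this listing (like several above) names Apache Airflow for orchestration, here is a hedged sketch of a minimal ETL DAG written against the Airflow 2.x API. The DAG id, schedule, and task bodies are invented placeholders; real tasks would call into the pipeline code.

```python
# Minimal Airflow 2.x DAG chaining extract -> transform -> load.
# Task logic is stubbed out; only the orchestration pattern is shown.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (placeholder logic).
    print("extracting")


def transform(**context):
    # Clean and reshape the extracted data (placeholder logic).
    print("transforming")


def load(**context):
    # Write the transformed data to the warehouse (placeholder logic).
    print("loading")


with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # `schedule_interval` on Airflow < 2.4
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```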
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
Senior Data Engineer – Contract | Edinburgh (2 days onsite) £500/day (Likely Outside IR35) 3 months initially Bright Purple is delighted to be working with an exciting, product-focused consultancy delivering some of the UK’s most high-profile and widely used consumer applications. Their client list features some of the biggest names in tech and beyond. We’re seeking an experienced Senior Data Engineer to join their growing Data Practice on a 3-month engagement, helping shape and deliver scalable, cloud-native data solutions for household-name clients. What you’ll be doing Designing, building and maintaining robust data pipelines Automating and orchestrating workflows (AWS Glue, Azure Data Factory, GCP Dataflow) Working across leading cloud platforms (AWS, Azure, or GCP) Implementing and optimising modern data architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we’re looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data …
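For the workflow-automation side this role describes, here is one hedged example: triggering and polling an AWS Glue job programmatically with boto3. The job name and region are invented; the same pattern applies whichever orchestration option (Glue, Data Factory, Dataflow) a client uses.

```python
# Sketch of programmatic workflow automation with AWS Glue via boto3.
# "nightly-ingest" is a hypothetical Glue job name.
import time

import boto3

glue = boto3.client("glue", region_name="eu-west-2")

run = glue.start_job_run(JobName="nightly-ingest")
run_id = run["JobRunId"]

# Poll until the run reaches a terminal state.
while True:
    state = glue.get_job_run(JobName="nightly-ingest", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue run {run_id} finished with state {state}")
```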
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NEC Software Solutions
Please note: ideally we would seek a candidate who can be hybrid from Glasgow or London, but this role can be remote subject to location. We are seeking an experienced Data Migration Engineer to support a large-scale migration project. The successful candidate will play a key role in designing, developing, and executing migration strategies to ensure data is transferred accurately, securely, and efficiently into the target environment. This role requires a hands-on engineer with strong SQL expertise, particularly in MySQL, who can take ownership of migration tasks, proactively identify risks, and collaborate closely with technical and business stakeholders. Key Responsibilities Design, develop, and execute ETL processes to migrate data from legacy systems to target platforms. Write and optimise complex SQL queries (preferably MySQL) to support data extraction, transformation, and validation. Apply the full data quality framework (accuracy, completeness, consistency, timeliness, validity, uniqueness, integrity) across all migration activities. Conduct data profiling, cleansing, and quality assurance checks to ensure accuracy and completeness of migrated data. Translate business and technical requirements into data transformation …
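To illustrate the kind of migration validation this role involves, here is a minimal sketch of a post-migration row-count reconciliation between source and target MySQL databases, assuming the mysql-connector-python package. Hosts, credentials, and table names are placeholders.

```python
# Post-migration reconciliation: compare row counts per table between the
# legacy and migrated databases. All connection details are hypothetical.
import mysql.connector


def row_count(conn, table: str) -> int:
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    (count,) = cur.fetchone()
    cur.close()
    return count


source = mysql.connector.connect(
    host="legacy-db", user="ro_user", password="...", database="legacy"
)
target = mysql.connector.connect(
    host="new-db", user="ro_user", password="...", database="migrated"
)

for table in ("customers", "orders"):  # hypothetical tables
    src, tgt = row_count(source, table), row_count(target, table)
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{table}: source={src} target={tgt} -> {status}")

source.close()
target.close()
```

In practice a full data quality check would extend this beyond counts to per-column checksums and constraint validation, in line with the framework the advert lists.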
I am recruiting for a Data Engineer to be based in Glasgow 3 days a week, 2 days remote. The role falls inside IR35, so you will need to work through an umbrella company for the duration of the contract. You must have several years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will also have several years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data development and solutions in highly complex data environments with large data volumes is also required. You will be responsible for collaborating with cross-functional teams to understand data requirements, and design efficient, scalable, and reliable ETL processes using Python and Databricks. You will also develop and deploy ETL jobs that extract data from various sources, transforming them to meet business needs.
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
KBC Technologies UK LTD
About the Role: We are looking for a Data Engineer for our Glasgow location. Mode of work: hybrid. Databricks being (primarily) a managed Spark engine, strong Spark experience is a must-have. We need Databricks (big data/Spark) and Snowflake specialists with general data engineering skills: RDBMS fundamentals, SQL, ETL.
Role Responsibilities You will be responsible for: Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing, transformation, and loading, ensuring accuracy and consistency Creating and managing data pipelines, ensuring proper error handling, monitoring, and performance optimization Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives Conducting code reviews, providing constructive feedback, and enforcing coding standards to maintain high quality Developing and maintaining tooling and automation scripts to streamline repetitive tasks Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes Utilizing REST APIs and other integration techniques to connect various data sources Maintaining documentation, including data flow diagrams, technical specifications, and processes. You Have: Proficiency in Python programming, including experience in writing efficient and maintainable code. Hands-on experience with cloud …
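Since this listing highlights REST API ingestion with proper error handling, here is a hedged sketch of that step. The endpoint URL, field names, and output path are invented for illustration.

```python
# Minimal extract-transform step over a REST API, with explicit error
# handling so failures surface to the scheduler rather than writing
# partial data. Endpoint, columns, and paths are hypothetical.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/orders"  # placeholder endpoint


def extract_orders() -> pd.DataFrame:
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()  # fail loudly on HTTP errors
    return pd.DataFrame(resp.json())


def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["order_id"]).drop_duplicates("order_id")
    df["amount"] = df["amount"].astype(float)
    return df


if __name__ == "__main__":
    try:
        transform(extract_orders()).to_parquet("/data/staging/orders.parquet")
    except requests.RequestException as exc:
        raise SystemExit(f"extract failed: {exc}")
```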
Role Responsibilities You will be responsible for: Collaborating with cross-functional teams to understand data requirements, and design efficient, scalable, and reliable ETL processes using Python and DataBricks Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs. Taking ownership of the end-to-end engineering lifecycle, including data extraction … cleansing, transformation, and loading, ensuring accuracy and consistency. Creating and manage data pipelines, ensuring proper error handling, monitoring and performance optimizations Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives. Conducting code reviews, provide constructive feedback, and enforce coding standards to maintain a high quality. Developing and maintain tooling and automation scripts to streamline … repetitive tasks. Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes Utilizing REST APls and other integration techniques to connect various data sources Maintaining documentation, including data flow diagrams, technical specifications, and processes. You Have: Proficiency in Python programming, including experience in writing efficient and maintainable code. Hands-on experience with cloud More ❯
Role Responsibilities You will be responsible for: Collaborating with cross-functional teams to understand data requirements, and design efficient, scalable, and reliable ETL processes using Python and DataBricks Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs. Taking ownership of the end-to-end engineering lifecycle, including data extraction … cleansing, transformation, and loading, ensuring accuracy and consistency. Creating and manage data pipelines, ensuring proper error handling, monitoring and performance optimizations Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives. Conducting code reviews, provide constructive feedback, and enforce coding standards to maintain a high quality. Developing and maintain tooling and automation scripts to streamline … repetitive tasks. Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes Utilizing REST APls and other integration techniques to connect various data sources Maintaining documentation, including data flow diagrams, technical specifications, and processes. You Have: Proficiency in Python programming, including experience in writing efficient and maintainable code. Hands-on experience with cloud More ❯
…the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world. YOUR ROLE We are seeking a skilled and hands-on AWS Data Engineer with strong coding expertise and deep experience in building scalable data solutions using AWS services. The ideal candidate will have a solid background in data engineering, Python development, and cloud-native architecture. YOUR PROFILE Design, develop, and maintain robust data pipelines and ETL workflows using AWS services. Implement scalable data processing solutions using PySpark and AWS Glue. Build and manage infrastructure as code using CloudFormation. Develop and deploy serverless applications using AWS Lambda, Step Functions, and S3. Perform data querying and analysis using Athena. Collaborate with Data Scientists to operationalize models using SageMaker. Ensure secure and compliant data handling using IAM, KMS, and VPC configurations. Containerize applications using ECS for scalable deployment. Write clean, testable code in Python, with a strong emphasis on unit testing. Use GitLab for version control, CI/CD, and collaboration. Strong coding background in …
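For the PySpark-on-Glue work this role centres on, here is a hedged skeleton of a Glue ETL script. The catalog database, table, and S3 path are hypothetical; the awsglue modules are supplied by the Glue runtime rather than installed locally.

```python
# Skeleton AWS Glue PySpark job: read from the Glue Data Catalog, filter,
# and write partitioned Parquet to S3. Names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

events = glue_context.create_dynamic_frame.from_catalog(
    database="raw", table_name="events"   # hypothetical catalog entries
).toDF()

events.filter("event_type IS NOT NULL").write.mode("overwrite").partitionBy(
    "event_date"
).parquet("s3://curated-bucket/events/")   # hypothetical output path

job.commit()
```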
Role Title: Sr. Databricks Engineer Location: Glasgow Duration: until 31/12/2026 Days on site: 2-3 MUST BE PAYE THROUGH UMBRELLA Role Description: We are currently migrating our data pipelines from AWS to Databricks and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering role focused on designing, building, and optimizing scalable data solutions using the Databricks platform. Key Responsibilities: • Lead the migration of existing AWS-based data pipelines to Databricks. • Design and implement scalable data engineering solutions using Apache Spark on Databricks. • Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines. • Optimize performance and cost-efficiency of Databricks workloads. • Develop and maintain CI/CD workflows for Databricks using GitLab or similar tools. • Ensure data quality and reliability through robust unit testing and validation frameworks. • Implement best practices for data governance, security, and access control within Databricks. • Provide technical mentorship and guidance to junior engineers. Must-Have Skills: • Strong hands-on experience …
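As a sketch of one step in the AWS-to-Databricks migration this role describes, the snippet below lands an existing S3 Parquet dataset as a managed Delta table. Paths and table names are invented, and `spark` is the session Databricks provides automatically in notebooks and jobs.

```python
# One migration step on Databricks: re-home a legacy S3 Parquet dataset as
# a managed Delta table so downstream jobs query the table, not raw files.
# `spark` is the Databricks-provided SparkSession; paths are hypothetical.
df = spark.read.parquet("s3://legacy-bucket/events/")  # former AWS pipeline output

(
    df.dropDuplicates(["event_id"])       # de-duplicate on the business key
      .write.format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.events")
)

# Sanity-check the migrated table.
spark.sql("SELECT COUNT(*) FROM analytics.events").show()
```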