Permanent Senior Data Engineer Jobs in Glasgow

2 of 2 Permanent Senior Data Engineer Jobs in Glasgow

Senior Data Engineer

Glasgow, Scotland, United Kingdom
Hybrid / WFH Options
Infoplus Technologies UK Limited
Job Title: Senior Data Engineer · Location: Glasgow, UK (Hybrid, 3 days onsite) · Duration: 6 months (Inside IR35)

Job Description:

Mandatory skills required: PySpark, Python, SQL, Snowflake

Role Responsibilities. You will be responsible for:
• Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks
• Developing … and deploying ETL jobs that extract data from various sources, transforming it to meet business needs
• Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing, transformation, and loading, ensuring accuracy and consistency
• Creating and managing data pipelines, ensuring proper error handling, monitoring, and performance optimizations
• Working in an agile environment, participating in sprint … to streamline repetitive tasks
• Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes
• Utilizing REST APIs and other integration techniques to connect various data sources
• Maintaining documentation, including data flow diagrams, technical specifications, and processes

You Have:
• Proficiency in Python programming, including experience in writing efficient and maintainable code
• Hands-on experience …
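The extract, cleanse, transform, and load lifecycle this role describes can be sketched in plain Python. In the job itself this logic would run as PySpark on Databricks; the record shape, column names, and cleansing rules below are purely illustrative assumptions, not part of the listing.

```python
# Minimal illustration of an ETL job's three stages, standard library only.
# In production this would be a PySpark job on Databricks; all field names
# and cleansing rules here are hypothetical.

def extract(raw_rows):
    """Extract: pull rows from a source (here, an in-memory list)."""
    return list(raw_rows)

def transform(rows):
    """Cleanse and transform: drop incomplete rows, normalise types."""
    cleaned = []
    for row in rows:
        if row.get("customer_id") is None:  # cleansing rule: id is required
            continue
        cleaned.append({
            "customer_id": int(row["customer_id"]),
            "amount_gbp": round(float(row.get("amount_gbp", 0.0)), 2),
        })
    return cleaned

def load(rows, target):
    """Load: append transformed rows to the target store."""
    target.extend(rows)
    return len(rows)

source = [
    {"customer_id": "101", "amount_gbp": "19.991"},
    {"customer_id": None, "amount_gbp": "5.00"},  # dropped by cleansing
    {"customer_id": "102"},                       # missing amount -> 0.0
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded)  # 2
```

The same separation of stages is what makes the "unit, integration, and other testing" bullet practical: each stage is a pure function that can be tested in isolation.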

Senior Data Engineer

Glasgow, Scotland, United Kingdom
Lorien
Role Title: Sr. Databricks Engineer · Location: Glasgow · Duration: 31/12/2026 · Days on site: 2-3 · MUST BE PAYE THROUGH UMBRELLA

Role Description: We are currently migrating our data pipelines from AWS to Databricks, and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering … role focused on designing, building, and optimizing scalable data solutions using the Databricks platform.

Key Responsibilities:
• Lead the migration of existing AWS-based data pipelines to Databricks.
• Design and implement scalable data engineering solutions using Apache Spark on Databricks.
• Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines.
• Optimize performance … and cost-efficiency of Databricks workloads.
• Develop and maintain CI/CD workflows for Databricks using GitLab or similar tools.
• Ensure data quality and reliability through robust unit testing and validation frameworks.
• Implement best practices for data governance, security, and access control within Databricks.
• Provide technical mentorship and guidance to junior engineers.

Must-Have Skills:
• Strong hands-on …
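The "data quality and reliability through robust unit testing and validation frameworks" bullet can be sketched as a small rule-runner. This is a stand-alone toy in plain Python; in the role the same checks would run against Spark DataFrames on Databricks, and the rule names, columns, and sample rows are illustrative assumptions.

```python
# Toy data-quality validation: run a set of rules over pipeline output and
# collect failures, mimicking what a Databricks validation step might assert
# before publishing a table. Rules and columns are hypothetical.

def check_not_null(rows, column):
    """Flag indices of rows where `column` is missing or None."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return ("not_null:" + column, bad)

def check_unique(rows, column):
    """Flag indices of rows whose `column` value was already seen."""
    seen, bad = set(), []
    for i, r in enumerate(rows):
        v = r.get(column)
        if v in seen:
            bad.append(i)
        seen.add(v)
    return ("unique:" + column, bad)

def validate(rows, checks):
    """Return a dict of rule name -> offending row indices (empty = pass)."""
    results = dict(check(rows) for check in checks)
    return {name: idx for name, idx in results.items() if idx}

rows = [
    {"order_id": 1, "status": "shipped"},
    {"order_id": 1, "status": "pending"},  # duplicate order_id
    {"order_id": 2, "status": None},       # null status
]
failures = validate(rows, [
    lambda r: check_not_null(r, "status"),
    lambda r: check_unique(r, "order_id"),
])
print(failures)  # {'not_null:status': [2], 'unique:order_id': [1]}
```

Wiring a runner like this into a GitLab pipeline stage is one way the CI/CD bullet and the data-quality bullet connect: the job fails fast if any rule returns offending rows.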