Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
Role: Data Engineer
Location: Glasgow (Hybrid, 3 days onsite)
Contract: 6-12 months with possible extensions (No Sponsorship Available)
Skills/Qualifications:
· 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
· 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
· 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions
· 3+ years of experience in data development and solutions in highly complex data environments with large data volumes
· Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices; familiarity with …
· Ability to work collaboratively in a fast-paced, dynamic environment
· Experience with code versioning tools (e.g., Git)
· Knowledge of Linux operating systems
· Familiarity with REST APIs and integration techniques
· Familiarity with data visualization tools and libraries (e.g., Power BI)
· Background in database administration or performance tuning
· Familiarity with data orchestration tools, such as Apache Airflow
· Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
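Several of the requirements above centre on building pipelines with Python, PySpark, and Databricks. Purely as an illustrative sketch (not part of the posting), a minimal PySpark ETL step of that kind might look like the following; the table names (raw.orders, analytics.orders_daily) and columns are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Extract: read a raw source table (hypothetical name).
raw = spark.read.table("raw.orders")

# Transform: de-duplicate, filter, and aggregate to a daily grain.
daily = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_status") != "CANCELLED")
       .groupBy(F.to_date("order_ts").alias("order_date"))
       .agg(
           F.sum("amount").alias("total_amount"),
           F.countDistinct("customer_id").alias("distinct_customers"),
       )
)

# Load: write the result as a managed table for downstream consumers.
daily.write.mode("overwrite").saveAsTable("analytics.orders_daily")
```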
About Synechron: Synechron is a leading digital transformation consulting firm, delivering innovative solutions across Financial Services, Insurance, and more. We empower organizations to harness the power of data, technology, and strategy to drive growth and efficiency.
Role Overview: We are seeking a skilled Data Engineer to join our UK team. In this role, you will be responsible for designing, developing, and maintaining scalable ETL data pipelines leveraging Python and Databricks. You will collaborate with cross-functional teams to understand data requirements and ensure the delivery of reliable, high-quality data solutions that support our business objectives.
Key Responsibilities:
· Collaborate with stakeholders and cross-functional teams to gather data requirements and translate them into efficient ETL processes.
· Develop, test, and deploy end-to-end ETL pipelines, extracting data from various sources, transforming it to meet business needs, and loading it into target systems.
· Take ownership of the full engineering lifecycle, including data extraction, cleansing, transformation, and loading, maintaining data accuracy and consistency.
· Manage and monitor data pipelines, implementing proper error handling.
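The responsibilities above describe owning extraction, cleansing, transformation, and loading with proper error handling. A minimal sketch of such a job in plain Python/Pandas is shown below, purely as an illustration; the file name, table name, column names, and connection URL are placeholders, not taken from the role.

```python
import logging
import pandas as pd
from sqlalchemy import create_engine

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl.customers")

def run_etl(source_csv: str, target_table: str, conn_url: str) -> None:
    """Extract a CSV, cleanse it, and load it into a target table."""
    try:
        # Extract
        df = pd.read_csv(source_csv)

        # Transform: basic cleansing to keep the target consistent
        df = df.dropna(subset=["customer_id"]).drop_duplicates(subset=["customer_id"])
        df["email"] = df["email"].str.strip().str.lower()

        # Load
        engine = create_engine(conn_url)
        df.to_sql(target_table, engine, if_exists="append", index=False)
        log.info("Loaded %d rows into %s", len(df), target_table)
    except Exception:
        log.exception("ETL run failed")
        raise

if __name__ == "__main__":
    # The connection URL is a placeholder; in practice it would point at the
    # target warehouse (e.g. via a Snowflake or Databricks SQL connector).
    run_etl("customers.csv", "stg_customers", "sqlite:///example.db")
```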
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Genpact
our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Data Engineer … to enhance collaboration. Flexible work arrangements (core hours and opportunities to work from home).
Role Responsibilities
You will be responsible for:
· Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks
· Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs
· Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing, transformation, and loading, ensuring accuracy and consistency
· Creating and managing data pipelines, ensuring proper error handling, monitoring, and performance optimization
· Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives
· Conducting code reviews, providing constructive feedback, and enforcing coding standards
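The posting above pairs pipeline development with code reviews and coding standards. One common way to support that is to factor transformations into small, unit-testable functions; the sketch below is an illustrative example of that pattern (the column names and cleansing rules are invented, not from the role). Keeping transformations pure (DataFrame in, DataFrame out) makes them straightforward to review and to exercise with pytest.

```python
import pandas as pd

def cleanse_transactions(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicate transactions, remove negative amounts, normalise currency codes."""
    out = df.drop_duplicates(subset=["txn_id"]).copy()
    out = out[out["amount"] >= 0]
    out["currency"] = out["currency"].str.upper()
    return out

def test_cleanse_transactions():
    raw = pd.DataFrame({
        "txn_id":   [1, 1, 2, 3],
        "amount":   [10.0, 10.0, -5.0, 7.5],
        "currency": ["gbp", "gbp", "usd", "eur"],
    })
    cleaned = cleanse_transactions(raw)
    # The duplicate of txn 1 and the negative-amount txn 2 are removed.
    assert list(cleaned["txn_id"]) == [1, 3]
    assert set(cleaned["currency"]) == {"GBP", "EUR"}
```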
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
Data Engineer
Location - Glasgow (hybrid), 3 days in a week
Contract role (6 to 12 months)
Skills/Qualifications:
· 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
· 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
· 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions
· 3+ years of experience in data development and solutions in highly complex data environments with large data volumes
· Experience with code versioning tools (e.g., Git)
· Knowledge of Linux operating systems
· Familiarity with REST APIs and integration techniques
· Familiarity with data visualization tools and libraries (e.g., Power BI)
· Background in database administration or performance tuning
· Familiarity with data orchestration tools, such as Apache Airflow
· Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
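The list above names Apache Airflow as the orchestration tool. As a hedged illustration of what orchestrating such a pipeline can look like, a minimal Airflow DAG is sketched below; the DAG id, schedule, and task bodies are hypothetical placeholders rather than anything specified by the role.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")

def transform():
    print("cleanse and reshape the data")

def load():
    print("write the result to the warehouse")

with DAG(
    dag_id="daily_sales_pipeline",     # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # `schedule_interval` on older Airflow 2.x releases
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the steps strictly in order: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```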
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
KBC Technologies UK LTD
We are looking for a Data Engineer for the Glasgow location. Mode of work - hybrid. Databricks being (primarily) a managed Spark engine, strong Spark experience is a must-have. We need Databricks (Big Data/Spark) and Snowflake specialists, alongside general data engineering skills: RDBMS fundamentals, SQL, and ETL.
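Since Databricks is described here as primarily a managed Spark engine, the Spark requirement usually means fluency in both the DataFrame API and Spark SQL. The snippet below is a small, self-contained illustration of the two equivalent styles on toy data; it is not specific to this role.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark_sql_demo").getOrCreate()

# A tiny in-memory DataFrame standing in for a real source table.
trades = spark.createDataFrame(
    [("GBP", 100.0), ("USD", 250.0), ("GBP", 40.0)],
    ["currency", "notional"],
)

# The same aggregation via the DataFrame API...
trades.groupBy("currency").sum("notional").show()

# ...and via Spark SQL over a temporary view.
trades.createOrReplaceTempView("trades")
spark.sql("SELECT currency, SUM(notional) AS total FROM trades GROUP BY currency").show()
```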