Glasgow, Scotland, United Kingdom Hybrid/Remote Options
NLB Services
Snowflake or similar cloud-based data warehousing solutions · 3+ years of experience in data development and solutions in highly complex data environments with large data volumes. · Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices. · Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment. · Experience with code …
database platforms. Cloud Architecture: Experience with Azure data services (Azure Data Factory, Azure Synapse, Azure Data Lake). Data Governance: Understanding of data quality, data lineage, and metadata management principles. ETL/ELT Processes: Experience designing and implementing data integration workflows. Business Intelligence: Knowledge of reporting and analytics platforms (Power BI, SSRS, or similar). Data Warehousing: Experience with dimensional modelling and …
Glasgow, Scotland, United Kingdom Hybrid/Remote Options
Vanloq - Workforce Solutions
days per week on-site), contributing to key data initiatives that support risk and analytics functions across the business. Key Responsibilities: Design, build, and optimise scalable data pipelines and ETL solutions. Work with complex datasets to enable data-driven insights and reporting. Collaborate closely with data scientists, analysts, and business stakeholders to deliver robust data solutions. Support the migration and …
data and validate by profiling in a data environment. Understand data structures and data model (dimensional & relational) concepts, such as star schema or fact & dimension tables, to design and develop ETL patterns/mechanisms to ingest, analyse, validate, normalise, and cleanse data. Understand and produce ‘Source to Target Mapping’ (STTM) documents containing data structures and business & data transformation logic. Liaise with data …
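The star schema the listing refers to can be sketched as one fact table joined to its dimension tables. This is a minimal illustrative example using SQLite; the table and column names are invented, not taken from any listing above.

```python
import sqlite3

# Illustrative star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    iso_date TEXT
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme Ltd')")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
cur.execute("INSERT INTO fact_sales VALUES (1, 20240101, 250.0)")

# A typical analytical query joins the fact table back to its dimensions.
row = cur.execute("""
    SELECT c.customer_name, d.iso_date, f.amount
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d ON d.date_key = f.date_key
""").fetchone()
print(row)  # ('Acme Ltd', '2024-01-01', 250.0)
```

The same pattern scales to warehouse platforms such as Snowflake or Databricks; only the DDL dialect changes.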
AWS services. The ideal candidate will have a solid background in data engineering, Python development, and cloud-native architecture. YOUR PROFILE Design, develop, and maintain robust data pipelines and ETL workflows using AWS services. Implement scalable data processing solutions using PySpark and AWS Glue. Build and manage infrastructure as code using CloudFormation. Develop and deploy serverless applications using AWS Lambda …
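A serverless ETL step of the kind mentioned above often takes the shape of a Lambda-style handler. The sketch below is a hedged illustration: the event shape and function name are assumptions, and a real deployment would read from S3 or Kinesis via boto3 rather than an inline payload.

```python
import json

# Minimal sketch of a serverless ETL step in the style of an AWS Lambda
# handler. Event shape and field names are hypothetical.
def handler(event, context=None):
    records = event.get("records", [])
    # Transform step: keep valid rows and normalise the amount field.
    cleaned = [
        {"id": r["id"], "amount": round(float(r["amount"]), 2)}
        for r in records
        if r.get("id") is not None and r.get("amount") is not None
    ]
    return {"statusCode": 200, "body": json.dumps({"loaded": len(cleaned)})}

result = handler({"records": [
    {"id": 1, "amount": "19.99"},
    {"id": None, "amount": "3.0"},  # dropped by the validation filter
]})
print(result["body"])  # {"loaded": 1}
```

Keeping the transform logic in a plain function like this also makes it straightforward to unit-test outside the Lambda runtime.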
and efficiency. Role Overview: We are seeking a skilled Data Engineer to join our UK team. In this role, you will be responsible for designing, developing, and maintaining scalable ETL data pipelines leveraging Python and Databricks. You will collaborate with cross-functional teams to understand data requirements and ensure the delivery of reliable, high-quality data solutions that support our business objectives. Key Responsibilities: Collaborate with stakeholders and cross-functional teams to gather data requirements and translate them into efficient ETL processes. Develop, test, and deploy end-to-end ETL pipelines, extracting data from various sources, transforming it to meet business needs, and loading it into target systems. Take ownership of the full engineering lifecycle, including data extraction, cleansing, transformation … Hands-on experience working with Databricks and cloud services for building scalable data pipelines. Strong knowledge of cloud data warehousing solutions such as Snowflake or similar. Solid understanding of ETL workflows, data modeling, data warehousing, and data integration best practices. Experience working within agile teams, demonstrating collaboration and adaptability in fast-paced environments. Familiarity with version control tools such as …
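The extract-transform-load lifecycle these listings describe can be shown end to end in a few lines. This is a stdlib-only sketch with invented sample data; a production pipeline would of course use Databricks/Spark and a real warehouse rather than SQLite.

```python
import csv
import io
import sqlite3

# Invented source data standing in for an upstream extract.
raw = io.StringIO("id,city,revenue\n1,Glasgow,100\n2,,40\n3,Edinburgh,60\n")

# Extract: read the raw rows.
rows = list(csv.DictReader(raw))

# Transform: cleanse (drop rows missing a city) and cast types.
clean = [(int(r["id"]), r["city"], float(r["revenue"])) for r in rows if r["city"]]

# Load: write into a target table and verify.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (id INTEGER, city TEXT, revenue REAL)")
conn.executemany("INSERT INTO revenue VALUES (?, ?, ?)", clean)
total = conn.execute("SELECT SUM(revenue) FROM revenue").fetchone()[0]
print(total)  # 160.0
```

Each stage (extract, cleanse/transform, load) maps directly onto the responsibilities listed above, whatever the scale of the actual platform.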
process. Strong specialist experience across several of the following areas: cloud services (SaaS, PaaS; AWS preferred); enterprise integration patterns and tooling (MuleSoft preferred); enterprise data, analytics and information management, ETL knowledge; high-volume transactional online systems; security and identity management; service and micro-service architecture; knowledge of continuous integration tools and techniques; design and/or development of applications using …
Glasgow, Scotland, United Kingdom Hybrid/Remote Options
Infoplus Technologies UK Limited
Glasgow, UK – Hybrid Duration: 6+ months Contract Job Description: You will be responsible for: • Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks • Developing and deploying ETL jobs that extract data from various sources and transform it to meet business needs • Taking ownership of the end-to-end engineering … to maintain high quality • Developing and maintaining tooling and automation scripts to streamline repetitive tasks • Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes • Utilising REST APIs and other integration techniques to connect various data sources • Maintaining documentation, including data flow diagrams, technical specifications, and processes. You Have: • Proficiency in Python programming, including … hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines • Proficiency in working with Snowflake or similar cloud-based data warehousing solutions • Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices • Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment • Experience with code …
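The unit-testing requirement above is easiest to meet when transforms are pure functions. This sketch shows one such function and a check against it; the function name and record shape are illustrative assumptions, not part of any listing.

```python
# Hypothetical ETL transform: trim whitespace and lower-case keys so that
# downstream steps always see a consistent record shape.
def normalise_record(record):
    return {
        key.strip().lower(): value.strip() if isinstance(value, str) else value
        for key, value in record.items()
    }

# A unit-style check on the pure transform, runnable without any cluster.
result = normalise_record({" City ": " Glasgow ", "ID": 7})
print(result)  # {'city': 'Glasgow', 'id': 7}
```

Because the transform takes and returns plain dictionaries, the same test passes whether the function later runs inside a Databricks job, a Lambda, or a local script.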
and visualizations. Typical tasks include: Assisting with data collection, cleaning, and preparation for analysis. Supporting the development of dashboards and reports using tools like Power BI or Tableau. Helping extract and organize data from databases and other sources. Performing basic data analysis and quality checks under supervision. Assisting in preparing client deliverables such as reports, charts, and presentations. Learning and applying data transformation techniques (ETL) and basic statistical analysis. Supporting the team in managing project documentation and maintaining accurate records. Collaborating with colleagues across Forensic and Technology teams to meet client objectives. Requirements: · Previous full-time experience of 2+ years in data analytics, business intelligence, data science, technology consulting, or a related profession; · Experience delivering consulting services in the context … due to challenging deadlines, changing deliverables, and evolving task priorities; and · Strong ability and desire to utilise technology to solve complex problems. Technical skills: · Basic understanding of information systems, ETL processes, automation, and their function within organisations; · Familiarity with programming languages (SQL, Python) and other database applications; · Understanding of the PC environment and related software, including Microsoft Office applications; · Knowledge of …
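A basic data-quality check of the kind this role describes is a missing-value profile run over a batch before it feeds a dashboard. The records below are invented sample data for illustration.

```python
from collections import Counter

# Invented batch of records standing in for data pulled from a source system.
records = [
    {"name": "Anna", "region": "West", "sales": 120},
    {"name": "Ben", "region": None, "sales": 80},
    {"name": "Cara", "region": "East", "sales": None},
]

# Profile: count missing values per field so gaps surface before reporting.
missing = Counter()
for record in records:
    for field, value in record.items():
        if value is None:
            missing[field] += 1

print(dict(missing))  # {'region': 1, 'sales': 1}
```

Even this small check catches the gaps that would otherwise silently skew a Power BI or Tableau report.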
such as Pandas, NumPy, PySpark, etc. You will also have a number of years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data development and solutions in highly complex data environments with large data volumes is also required. You will be responsible for collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and Databricks. You will also develop and deploy ETL jobs that extract data from various sources, transforming them to meet business needs. Please apply ASAP if this is of interest.
Knowledge and Skills: • A minimum of 7 years of experience developing Oracle SQL and PL/SQL. • A minimum of 5 years of experience developing ETL or ELT solutions. • A minimum of 5 years of work experience in Oracle Forms. • A minimum of 2 years of work experience in Pro*C. • Candidates …
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid/Remote Options
KBC Technologies UK LTD
About the Role: We are looking for a Data Engineer for the Glasgow location. Mode of work: hybrid. Databricks being (primarily) a managed Spark engine, strong Spark experience is a must-have. Databricks (Big Data/Spark) & Snowflake specialists – and general Data Engineers …