Permanent Data Flow Diagram Jobs in Glasgow


Data Engineer

Glasgow, Scotland, United Kingdom
Seargin
Through an umbrella company

Requirements ("must have"):
- Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
- 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
- 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
- 3+ years of proficiency with Snowflake or similar cloud-based data warehousing solutions
- 3+ years of experience in data development and solutions in highly complex data environments with large data volumes
- Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices
- Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment
- Experience with code versioning tools (e.g., Git)
- Knowledge of Linux operating systems
- Familiarity with REST APIs and integration techniques
- Familiarity with data visualization tools and libraries (e.g., Power BI)
- Background in …

Data Engineer (Databricks)

Glasgow, Scotland, United Kingdom
Synechron
About Synechron: Synechron is a leading digital transformation consulting firm, delivering innovative solutions across Financial Services, Insurance, and more. We empower organizations to harness the power of data, technology, and strategy to drive growth and efficiency.

Role Overview: We are seeking a skilled Data Engineer to join our UK team. In this role, you will be responsible for designing, developing, and maintaining scalable ETL data pipelines leveraging Python and Databricks. You will collaborate with cross-functional teams to understand data requirements and ensure the delivery of reliable, high-quality data solutions that support our business objectives.

Key Responsibilities:
- Collaborate with stakeholders and cross-functional teams to gather data requirements and translate them into efficient ETL processes.
- Develop, test, and deploy end-to-end ETL pipelines, extracting data from various sources, transforming it to meet business needs, and loading it into target systems.
- Take ownership of the full engineering lifecycle, including data extraction, cleansing, transformation, and loading, maintaining data accuracy and …
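The extract-cleanse-transform-load lifecycle described above can be sketched in miniature. This is an illustrative toy only, not code from the posting: the CSV data, table name, and cleansing rule (dropping records with a missing notional) are all hypothetical, and a real pipeline would use Databricks/PySpark rather than the in-memory SQLite target used here for portability.

```python
# Toy ETL sketch: extract from a CSV source, cleanse/transform, load to a target.
# All data and names are hypothetical; shown only to illustrate the lifecycle.
import csv
import io
import sqlite3

RAW_CSV = """trade_id,symbol,notional
1,ABC,1000.50
2,XYZ,
3,ABC,250.00
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse rows from the source (here, an in-memory CSV string)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cleanse (skip incomplete records) and cast to typed tuples."""
    out = []
    for row in rows:
        if not row["notional"]:
            continue  # data cleansing: drop records with a missing notional
        out.append((int(row["trade_id"]), row["symbol"], float(row["notional"])))
    return out

def load(records: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleansed records into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades "
        "(trade_id INTEGER, symbol TEXT, notional REAL)"
    )
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    print(conn.execute("SELECT COUNT(*), SUM(notional) FROM trades").fetchone())
    # -> (2, 1250.5)
```

In a Databricks/Snowflake stack, `extract` would typically read from cloud storage or an API, `transform` would be a PySpark DataFrame operation, and `load` would write to a warehouse table; the three-stage shape stays the same.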