ETL Jobs in Renfrewshire

2 of 2 ETL Jobs in Renfrewshire

Data Engineer

Paisley, Central Scotland, United Kingdom
Hybrid / WFH Options
Genpact
… and opportunities to work from home).

Role Responsibilities

You will be responsible for:

- Collaborating with cross-functional teams to understand data requirements, and designing efficient, scalable, and reliable ETL processes using Python and DataBricks
- Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs
- Taking ownership of the end-to-end engineering … to maintain a high quality
- Developing and maintaining tooling and automation scripts to streamline repetitive tasks
- Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes
- Utilizing REST APIs and other integration techniques to connect various data sources
- Maintaining documentation, including data flow diagrams, technical specifications, and processes

You have:

- Proficiency in Python programming, including …
- … on experience with cloud services, especially DataBricks, for building and managing scalable data pipelines
- Proficiency in working with Snowflake or similar cloud-based data warehousing solutions
- Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices
- Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment
- Experience with code …
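The listing above describes ETL jobs written in Python on DataBricks that pull data from REST APIs, transform it, and load it into a warehouse table. As a hedged illustration only (the endpoint, table name, and column names below are hypothetical and not taken from the posting), such a job might be sketched as:

```python
# Hypothetical sketch: ORDERS_API_URL, TARGET_TABLE, and column names are
# illustrative placeholders, not details from the job posting.
import requests
from pyspark.sql import SparkSession, functions as F

ORDERS_API_URL = "https://example.com/api/orders"   # placeholder REST endpoint
TARGET_TABLE = "analytics.orders_clean"             # placeholder Delta table


def extract(url: str) -> list[dict]:
    """Pull raw records from a REST API source."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def transform(spark: SparkSession, records: list[dict]):
    """Cleanse and reshape the raw records to meet downstream needs."""
    df = spark.createDataFrame(records)
    return (
        df.dropDuplicates(["order_id"])
          .withColumn("order_date", F.to_date("order_date"))
          .filter(F.col("amount") > 0)
    )


def load(df, table: str) -> None:
    """Append the transformed data to a table (a Databricks/Delta runtime is assumed)."""
    df.write.format("delta").mode("append").saveAsTable(table)


if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    load(transform(spark, extract(ORDERS_API_URL)), TARGET_TABLE)
```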

Data Engineer (DataBricks)

Paisley, Central Scotland, United Kingdom
Synechron
… and efficiency.

Role Overview: We are seeking a skilled Data Engineer to join our UK team. In this role, you will be responsible for designing, developing, and maintaining scalable ETL data pipelines leveraging Python and DataBricks. You will collaborate with cross-functional teams to understand data requirements and ensure the delivery of reliable, high-quality data solutions that support our … business objectives.

Key Responsibilities:

- Collaborate with stakeholders and cross-functional teams to gather data requirements and translate them into efficient ETL processes.
- Develop, test, and deploy end-to-end ETL pipelines extracting data from various sources, transforming it to meet business needs, and loading it into target systems.
- Take ownership of the full engineering lifecycle, including data extraction, cleansing, transformation …

- Hands-on experience working with DataBricks and cloud services for building scalable data pipelines.
- Strong knowledge of cloud data warehousing solutions such as Snowflake or similar.
- Solid understanding of ETL workflows, data modeling, data warehousing, and data integration best practices.
- Experience working within agile teams, demonstrating collaboration and adaptability in fast-paced environments.
- Familiarity with version control tools such as …
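Both listings stress testing of ETL transformations (unit and integration tests, agile ways of working). A minimal pytest-style sketch of that practice, assuming a hypothetical clean_orders transform and invented column names, could look like:

```python
# Hypothetical sketch: clean_orders and its column names are illustrative,
# not taken from either job posting.
import pandas as pd


def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate orders and drop rows with non-positive amounts."""
    deduped = raw.drop_duplicates(subset=["order_id"])
    return deduped[deduped["amount"] > 0].reset_index(drop=True)


def test_clean_orders_removes_duplicates_and_bad_rows():
    raw = pd.DataFrame(
        {"order_id": [1, 1, 2, 3], "amount": [10.0, 10.0, -5.0, 7.5]}
    )
    result = clean_orders(raw)
    assert list(result["order_id"]) == [1, 3]
    assert (result["amount"] > 0).all()
```

Run with `pytest` so transformation logic is exercised on small in-memory frames before a pipeline is deployed.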