data platforms (Microsoft Fabric as a preference) and modern data warehousing. Deep proficiency in data modelling (conceptual, logical, physical) across relational, dimensional, and NoSQL paradigms. Skilled in data integration, ETL/ELT design, and API-driven architectures. Experience with data governance, metadata management, and master data management. They're big on expertise, not hierarchy, so you'll be trusted with …
AWS services. The ideal candidate will have a solid background in data engineering, Python development, and cloud-native architecture. YOUR PROFILE Design, develop, and maintain robust data pipelines and ETL workflows using AWS services. Implement scalable data processing solutions using PySpark and AWS Glue. Build and manage infrastructure as code using CloudFormation. Develop and deploy serverless applications using AWS Lambda …
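To make that stack concrete, below is a minimal sketch of the kind of PySpark script AWS Glue runs: read raw files from S3, apply a transformation, and write curated Parquet back. The bucket names, columns, and job structure are illustrative assumptions, not details from the role, and the script only executes inside the Glue runtime.

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Standard Glue job bootstrap: resolve the job name and set up contexts
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read raw CSV landed in S3 (bucket and prefix are placeholders)
raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/orders/")

# Transform: type the amount column and drop incomplete rows
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["order_id", "amount"])
)

# Load: write partitioned Parquet to a curated zone (placeholder path)
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)

job.commit()
```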
to Have Experience in financial or banking data environments. Familiarity with AI/ML frameworks (e.g., LangChain, OpenAI API, Hugging Face). Exposure to large-scale data migration or ETL projects. What's Offered Competitive salary and benefits package. Opportunity to work on a high-impact data modernization initiative. Collaborative and innovative environment focused on cutting-edge technology.
and efficiency. Role Overview: We are seeking a skilled Data Engineer to join our UK team. In this role, you will be responsible for designing, developing, and maintaining scalable ETL data pipelines leveraging Python and Databricks. You will collaborate with cross-functional teams to understand data requirements and ensure the delivery of reliable, high-quality data solutions that support our … business objectives. Key Responsibilities: Collaborate with stakeholders and cross-functional teams to gather data requirements and translate them into efficient ETL processes. Develop, test, and deploy end-to-end ETL pipelines extracting data from various sources, transforming it to meet business needs, and loading it into target systems. Take ownership of the full engineering lifecycle, including data extraction, cleansing, transformation … Hands-on experience working with Databricks and cloud services for building scalable data pipelines. Strong knowledge of cloud data warehousing solutions such as Snowflake or similar. Solid understanding of ETL workflows, data modeling, data warehousing, and data integration best practices. Experience working within agile teams, demonstrating collaboration and adaptability in fast-paced environments. Familiarity with version control tools such as …
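As an illustration of the end-to-end pipelines this listing describes, here is a minimal Databricks-style PySpark sketch: ingest raw events, cleanse them, and persist a Delta table. Paths, table names, and the schema are hypothetical, and `spark` is assumed to be the session a Databricks notebook provides.

```python
from pyspark.sql import functions as F

# Extract: ingest raw JSON events from mounted cloud storage (placeholder path)
events = spark.read.json("/mnt/raw/events/")

# Transform: cleanse, conform timestamps, and de-duplicate on the event key
conformed = (
    events.filter(F.col("event_type").isNotNull())
          .withColumn("event_ts", F.to_timestamp("event_time"))
          .dropDuplicates(["event_id"])
)

# Load: persist as a Delta table for downstream consumers (placeholder name)
(conformed.write.format("delta")
          .mode("append")
          .saveAsTable("analytics.events_clean"))
```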
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NEC Software Solutions
SQL expertise, particularly in MySQL, who can take ownership of migration tasks, proactively identify risks, and collaborate closely with technical and business stakeholders. Key Responsibilities Design, develop, and execute ETL processes to migrate data from legacy systems to target platforms. Write and optimise complex SQL queries (preferably MySQL) to support data extraction, transformation, and validation. Apply the full data quality … structures and workflows. Qualifications Essential: Proven experience in public sector large-scale data migration projects. Strong SQL development skills within MySQL environments. Solid understanding of data transformations and ETL workflows. Knowledge and application of data quality dimensions in migration processes. Familiarity with case management systems and associated data structures. Strong problem-solving mindset, with the ability to take …
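By way of illustration, a common validation step in migrations like this is reconciling row counts between a legacy table and its migrated target. The sketch below uses mysql-connector-python; the connection details and table names are placeholders, not details from the role.

```python
import mysql.connector

def count_rows(conn, table: str) -> int:
    # Table names here are fixed constants, so interpolation is safe
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    (count,) = cur.fetchone()
    cur.close()
    return count

# Placeholder credentials for a local migration database
conn = mysql.connector.connect(
    host="localhost", user="etl_user", password="...", database="migration"
)

legacy = count_rows(conn, "legacy_cases")
target = count_rows(conn, "target_cases")
print(f"legacy={legacy} target={target} match={legacy == target}")
conn.close()
```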
environment that values expertise, innovation, and collaboration. Responsibilities: Design, develop, and maintain scalable data pipelines and transformation processes utilizing modern tools and frameworks Implement and optimise data workflows and ETL procedures within Snowflake Create robust data models to support advanced analytics and machine learning initiatives Collaborate with cross-functional stakeholders to understand business data requirements and deliver effective solutions Establish … inform business decisions Candidate Profile: strong background in building enterprise data solutions Extensive hands-on experience with Python and data transformation techniques Expertise in Snowflake cloud data platform and ETL process optimisation Familiarity with machine learning tools such as TensorFlow or scikit-learn Strong communication skills, capable of translating complex technical concepts to non-technical stakeholders Experience managing end-to …
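For flavour, one way ETL optimisation plays out in Snowflake is pushing set-based transformations down to the warehouse rather than pulling rows into Python. A minimal sketch using the Snowflake Python connector, with placeholder account, credentials, and object names:

```python
import snowflake.connector

# All connection parameters below are placeholders
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
# Run the aggregation inside Snowflake, writing straight to a curated table
cur.execute("""
    CREATE OR REPLACE TABLE CURATED.DAILY_SALES AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM STAGING.RAW_ORDERS
    GROUP BY order_date
""")
cur.close()
conn.close()
```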
Aberdeen, Aberdeenshire, Scotland, United Kingdom Hybrid / WFH Options
Bright Purple Resourcing
week into the office Pension scheme Career progression opportunities City centre parking What you will be doing: Developing and maintaining data integrations between internal and client systems. Building reliable ETL processes to manage and transform large data sets. Working closely with clients and internal teams to support smooth onboarding and data flows. Contributing to the ongoing development of our SaaS … clearly, and enjoys finding elegant technical solutions. You'll bring: Strong C#/.Net Core and SQL skills. Experience working with REST APIs, JSON, and data pipelines. An understanding of ETL processes and data transformation. The ability to collaborate across technical and non-technical teams. Experience with Azure. Knowledge of Power BI is a nice to have. APPLY NOW via the …
such as Pandas, NumPy, PySpark, etc. You will also have a number of years' hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data development and solutions in highly complex data environments … with large data volumes is also required. You will be responsible for collaborating with cross-functional teams to understand data requirements, and design efficient, scalable, and reliable ETL processes using Python and Databricks. You will also develop and deploy ETL jobs that extract data from various sources, transforming it to meet business needs. Please apply ASAP if this is of …
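As a small illustration of the Pandas/NumPy side of such pipelines, the sketch below cleanses a raw extract and aggregates it for loading. The source file and column names are hypothetical.

```python
import numpy as np
import pandas as pd

# Extract: read a raw file exported from a source system (placeholder name)
raw = pd.read_csv("raw_orders.csv")

# Transform: drop rows missing the key, coerce amounts to numeric,
# and normalise any infinite values to NaN
clean = (
    raw.dropna(subset=["order_id"])
       .assign(amount=lambda df: pd.to_numeric(df["amount"], errors="coerce"))
       .replace({np.inf: np.nan})
)

# Load: aggregate to daily totals and write Parquet for the warehouse
daily = clean.groupby("order_date", as_index=False)["amount"].sum()
daily.to_parquet("daily_orders.parquet", index=False)
```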
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
KBC Technologies UK LTD
About the Role: We are looking for a Data Engineer for the Glasgow location. Mode of Work - hybrid. Databricks being (primarily) a Managed Spark engine – strong Spark experience is a must-have. Databricks (BigData/Spark) & Snowflake specialists – and general Data Engineer …