Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
… as Azure Data Factory, Synapse Pipelines, and SQL/T-SQL, ensuring data quality, performance, and reliability. Contribute to the evolution of our cloud-native data architecture, leveraging Azure Databricks, Azure Data Lake, and Snowflake where appropriate. Apply strong data modelling and transformation skills to support analytics, regulatory reporting, and operational use cases. Promote and implement engineering best practices, including …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Bright Purple Resourcing
… based data platforms incorporating the full Azure data stack:
- ETL Pipelines
- Data Modelling (Logical, Physical, Conceptual)
- Data Mapping

The skills we are looking for include:
- Azure Data Factory, Azure Databricks, Blobs, Azure SQL, Synapse, etc.
- Python development for data engineering
- C# experience would be advantageous
- Solid experience with databases
- A curious mindset!

The role is mostly remote but we need …
… Scheduling
- Implementation knowledge of principles such as ACID, SOLID, OWASP
- Knowledge of UK GDPR, PII, and PCI-DSS data standards
- Implementation of designs for batch processing patterns involving Talend, Databricks, Snowflake, or similar
- Experience running GDPR-related projects involving Data Sourcing, Validations, Integration, Data Disposition, Auditing & Reporting
- Ability to treat data as an asset and architect and provide solutions aligned …
… as modern data platforms, data product engineering, data marketplace architecture, data developer portals, and platform engineering. Experience co-selling partner solutions with hyperscalers or platforms (e.g. AWS, Azure, GCP, Snowflake, Databricks). Outstanding communication skills: able to translate complex ideas for both technical and business audiences. Demonstrated thought leadership in AI/ML, such as speaking at industry events or contributing to …
… published data models and reports.

Experience required:
- Strong background in data engineering, warehousing, and data quality
- Proficiency in Microsoft 365, Power BI, and other BI tools
- Familiarity with Azure Databricks and Delta Lake is desirable
- Ability to work autonomously in a dynamic environment and contribute to team performance
- Strong communication and influencing skills, and a positive, can-do attitude
- Knowledge of …
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
Cathcart Technology
product managers to bring innovative AI ideas to life.

What you'll need:
- Strong experience with RAG architectures or similar retrieval-based AI systems
- Hands-on knowledge of Azure (Databricks), Python, and SQL
- A passion for building intelligent systems that drive tangible results
- Curiosity and creativity to explore new technologies and stay ahead of the curve

What's on offer …
… and solutions in highly complex data environments with large data volumes. SQL/PLSQL experience with the ability to write ad-hoc and complex queries to perform data analysis. Databricks experience is essential. Experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will be able to develop solutions in a hybrid data environment (on-prem and cloud). You must be able to collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, Azure, AWS, etc. Hands-on experience with developing data pipelines for structured, semi-structured, and unstructured data, and experience integrating with their supporting stores (e.g. RDBMS, NoSQL DBs, document DBs, log files, etc.). Please apply ASAP …
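The ad-hoc SQL analysis the listing above asks for can be sketched with Python's built-in sqlite3 module — the table, columns, and data below are invented for illustration; in practice the same query shape would run against Oracle (PL/SQL) or a Databricks/Snowflake warehouse:

```python
import sqlite3

# In-memory database with a hypothetical orders table, purely illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)],
)

# An ad-hoc aggregate query of the kind a data analyst writes daily:
# total spend per customer, largest first.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
).fetchall()
# rows -> [('acme', 150.0), ('globex', 75.0)]
```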
to work through an umbrella company for the duration of the contract. Responsibilities will include collecting and analysing requirements, analysing data, and building data pipelines. Strong experience with Python and Databricks is essential, as is experience of working in Unix environments. You must also have strong experience of SQL and relational databases. You will have extensive data modelling experience and experience …
with diverse datasets (including real-world data) to extract insights and support innovation. Develop data pipelines and analytical tools using SQL, Python or R, and modern data platforms (e.g. Databricks, Snowflake). Partner with engineering teams to ensure seamless data integration and delivery. Engage directly with clients to understand their technical challenges and translate them into data solutions. Support research … years of experience in data science, analytics, or data engineering. Solid coding skills in Python or R, and confidence with SQL. Experience with cloud-based data platforms (Databricks, Snowflake, AWS, or similar). A strong analytical mindset and a hands-on attitude: happy building dashboards as well as models. Excellent communication skills, comfortable working with clients and non-technical stakeholders. …
… on a greenfield data transformation programme. Their current processes offer limited digital customer interaction, and the vision is to modernise these processes by:
- Building a modern data platform in Databricks
- Creating a single customer view across the organisation
- Enabling new client-facing digital services through real-time and batch data pipelines

You will join a growing team of engineers and … a high-value greenfield initiative for the business, directly impacting customer experience and long-term data strategy.

Key Responsibilities:
- Design and build scalable data pipelines and transformation logic in Databricks
- Implement and maintain Delta Lake physical models and relational data models
- Contribute to design and coding standards, working closely with architects
- Develop and maintain Python packages and libraries to support … data model components
- Participate in Agile ceremonies (stand-ups, backlog refinement, etc.)

Essential Skills:
- PySpark and SparkSQL
- Strong knowledge of relational database modelling
- Experience designing and implementing in Databricks (DBX notebooks, Delta Lakes)
- Azure platform experience
- ADF or Synapse pipelines for orchestration
- Python development
- Familiarity with CI/CD and DevOps principles

Desirable Skills:
- Data Vault 2.0
- Data …
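The "single customer view" the programme above describes means merging per-customer records from multiple source systems into one consolidated record per customer. A minimal stdlib-only sketch of the merge logic — source names, field names, and the fill-the-gaps rule are invented for illustration; the real implementation would express this as PySpark transformations over Delta Lake tables:

```python
from collections import defaultdict

# Hypothetical sample records from two source systems.
crm = [{"customer_id": 1, "name": "A. Smith", "email": None}]
web = [
    {"customer_id": 1, "email": "a.smith@example.com"},
    {"customer_id": 2, "email": "b.jones@example.com"},
]

def single_customer_view(*sources):
    """Merge records keyed on customer_id; later non-null values fill earlier gaps."""
    merged = defaultdict(dict)
    for source in sources:
        for record in source:
            cid = record["customer_id"]
            for key, value in record.items():
                # Overwrite only with real values, so nulls never clobber data.
                if value is not None or key not in merged[cid]:
                    merged[cid][key] = value
    return dict(merged)

view = single_customer_view(crm, web)
# view[1] -> {'customer_id': 1, 'name': 'A. Smith', 'email': 'a.smith@example.com'}
```

The same pattern scales out in Spark as a sequence of outer joins plus `coalesce` over the conflicting columns; the in-memory dict version just makes the merge rule easy to see.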
modern data and technology solutions. This will involve providing technical direction, encouraging best practice, and cultivating a collaborative and supportive team environment. Their tech stack currently spans Databricks, Microsoft Azure, Power Platform, Power BI, M365, Co-pilot, and various applications such as Workday.

Requirements:
- Experience guiding data strategy and designing and delivering data and system architectures
- Experience leading small, high-performing teams in an agile environment
- Hands-on experience with Azure data technologies and Databricks
- Strong understanding of data integration, automation, and system design
- An interest in emerging technologies such as AI

Benefits:
- Salary up to £95,000 depending on experience
- Annual performance-based bonus
- 25 days annual leave plus bank holidays
- Private healthcare
- Life insurance
- Electric vehicle …