ELT principles, data architecture, and data warehouse concepts. Familiarity with APIs, RESTful services, and JSON/XML data handling. Experience with Azure Data Factory, Databricks, or AWS Glue. Familiarity with CI/CD, version control (Git), and DevOps practices. Knowledge of cloud platforms (Azure, AWS, or GCP). Basic understanding of …
About You: Deep expertise in data engineering, data pipelines, and semantic models. Hands-on experience building data platforms on Microsoft Azure (Data Factory, Synapse, Databricks, Purview). Skilled in deploying machine learning models and generative AI applications in production. Strong programming skills in Python, SQL, and YAML. Experience with CI/…
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
include, but will not be limited to: Design, build, and optimize high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implement scalable solutions to ingest, store, and transform vast datasets, ensuring data availability and quality across the organization. Write clean, efficient, and …
and Experience: Degree (BSc, MSc, or PhD) in Computer Science, Mathematics, or a related field. Familiarity with alternative cloud and data platforms such as Databricks, Snowflake, Azure, or AWS. Knowledge of DevOps/DataOps methodologies and experience with CI/CD pipelines. Understanding of monitoring, logging, and troubleshooting in cloud …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Net Talent
with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
solutions align with NHS requirements and best practices. Develop end-to-end data solutions leveraging Azure services, including Azure Synapse Analytics, Azure Data Factory, Databricks, and Azure SQL. Define data models, integration patterns, and governance frameworks to ensure efficient data management, interoperability, and compliance. Drive cloud migration strategies for NHS …
and processes. Technical Skills: Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g. Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Expertise in relational databases (e.g. Postgres …
field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency in SQL and …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
organisations, through e.g. the RFI/RFP process, as preferred bidder, documented bids, and face-to-face presentations. Experience of data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and machine learning frameworks (e.g. Keras, TensorFlow, PyTorch, scikit-learn). Cloud platforms: demonstrable experience of building and deploying solutions to the cloud (e.g. …
demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking, and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, and Python. Additional Information: At Version 1, we believe in providing our employees with a comprehensive benefits package …
include contributing to knowledge-sharing activities and data services. Essential technical experience you will demonstrate: Strong experience designing and delivering data solutions on the Databricks Data Intelligence Platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python, and …
with a focus on building scalable data systems and platforms. Strong expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency in SQL …
is seeking a Technical Architect to design and deliver cutting-edge solutions using Microsoft Fabric. This consultancy is both a Microsoft Partner and a Databricks Partner, specializing in helping clients manage and derive value from their data. They develop tailored, data-driven solutions to meet specific client needs, primarily within …
systems. Qualifications & Skills: 5+ years' experience with Python programming for data engineering tasks. Strong proficiency in SQL and database management. Hands-on experience with Databricks and Apache Spark. Familiarity with the Azure cloud platform and related services. Knowledge of data security best practices and compliance standards. Excellent problem-solving and communication …
data platform solutions • In-depth knowledge of the Snowflake platform and capabilities • Relevant experience of working with other cloud data platform solutions such as Databricks, GCP BigQuery, Microsoft Azure, or AWS offerings would also be advantageous • Practical knowledge of GenAI and LLM offerings in the market • Skilled in working on …
modern tech stacks (e.g., TypeScript, React, Node.js, Python). Review code and support continuous integration and deployment processes. · Cloud & DevOps: Leverage platforms like Azure, Databricks, and Kubernetes to build and deploy resilient cloud-native applications. Use CI/CD and DevOps pipelines for automation. · Cross-Functional Collaboration: Work closely with …
architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as it relates to data platforms. Experience in total cost of ownership …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
or Google Cloud Dataflow. Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake. Implementing Lakehouse architectures using tools like Databricks or Snowflake. Collaborating closely with engineers, analysts, and client teams to deliver value-focused data solutions. We'd love to talk to you if: You …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Xcede
Consultant will have the following: Solid technical ability in Python, SQL, and Azure (essential). Strong proficiency in Microsoft Azure and relevant tools/technologies (Databricks, Azure Data Factory, Azure Data Lake, Azure Synapse). Proven understanding of DevOps best practices: CI/CD (Azure DevOps preferred). Solid communication …
in solution delivery. Qualifications and Skills: Proficiency in SQL Server Business Intelligence Development Studio, SSRS, SSIS, SQL Server, Visual Studio, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Microsoft Fabric, and PySpark. Experience in developing reusable and dynamic ADF/Fabric pipelines. Strong communication skills for interaction with clients and …