Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
organisations, through e.g. the RFI/RFP process, as preferred bidder, documented bids and face-to-face presentations. Experience of data science platforms (e.g. Databricks, Dataiku, Azure ML, SageMaker) and machine learning frameworks (e.g. Keras, TensorFlow, PyTorch, scikit-learn). Cloud platforms: demonstrable experience of building and deploying solutions to the cloud (e.g. …
Aberdeen, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
organisations, through e.g. the RFI/RFP process, as preferred bidder, documented bids and face-to-face presentations. Experience of data science platforms (e.g. Databricks, Dataiku, Azure ML, SageMaker) and machine learning frameworks (e.g. Keras, TensorFlow, PyTorch, scikit-learn). Cloud platforms: demonstrable experience of building and deploying solutions to the cloud (e.g. …
demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking, and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, and Python. Additional Information: At Version 1, we believe in providing our employees with a comprehensive benefits package …
and processes. Technical Skills Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g. Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Expertise in relational databases (e.g. PostgreSQL …
include contributing to knowledge-sharing activities and data services. Essential technical experience you will demonstrate: Strong experience designing and delivering data solutions on the Databricks Data Intelligence Platform, on either Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python, and …
with a focus on building scalable data systems and platforms. Strong expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency in SQL …
is seeking a Technical Architect to design and deliver cutting-edge solutions using Microsoft Fabric. This consultancy is both a Microsoft Partner and a Databricks Partner, specializing in helping clients manage and derive value from their data. They develop tailored, data-driven solutions to meet specific client needs, primarily within …
systems Qualifications & Skills: 5+ years' experience with Python programming for data engineering tasks Strong proficiency in SQL and database management Hands-on experience with Databricks and Apache Spark Familiarity with the Azure cloud platform and related services Knowledge of data security best practices and compliance standards Excellent problem-solving and communication …
data platform solutions • In-depth knowledge of the Snowflake platform and capabilities • Relevant experience of working with other cloud data platform solutions such as Databricks, GCP BigQuery, Microsoft Azure, or AWS offerings would also be advantageous • Practical knowledge of GenAI and LLM offerings in the market • Skilled in working on …
modern tech stacks (e.g., TypeScript, React, Node.js, Python). Review code and support continuous integration and deployment processes. · Cloud & DevOps: Leverage platforms like Azure, Databricks, and Kubernetes to build and deploy resilient cloud-native applications. Use CI/CD and DevOps pipelines for automation. · Cross-Functional Collaboration: Work closely with …
architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as they relate to data platforms. Experience in total cost of ownership …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
or Google Cloud Dataflow Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake Implementing Lakehouse architectures using tools like Databricks or Snowflake Collaborating closely with engineers, analysts, and client teams to deliver value-focused data solutions We'd love to talk to you if: You …
developing and deploying generative AI solutions at scale. Familiarity with LLMs such as GPT and BERT, AI frameworks (TensorFlow, PyTorch), cloud solutions, and tools like Databricks and Snowflake. Experience managing stakeholders in complex environments. Skills: Expertise in designing, coding, and optimizing generative AI models, prompt engineering, understanding AI algorithms, and deploying …