…data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (a bonus). Knowledge of cloud security, networking, and cost optimization as they relate to data platforms. Experience in total cost …
…be technically skilled in most or all of the below:
- Expert knowledge of Python and SQL, including the following libraries: NumPy, pandas, PySpark and Spark SQL
- Expert knowledge of MLOps frameworks in the following categories: a) experiment tracking and model metadata management (e.g. MLflow); b) orchestration of ML …
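For illustration, a minimal sketch of the kind of MLflow experiment tracking this listing refers to, assuming MLflow and scikit-learn are installed; the experiment name, parameters, and model choice are placeholders, not taken from the listing:

```python
# Hypothetical MLflow experiment-tracking sketch: logs parameters,
# a metric, and the trained model as a run artifact.
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)

mlflow.set_experiment("demo-experiment")  # placeholder name
with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params).fit(X, y)
    mlflow.log_params(params)                           # model metadata
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")            # artifact store
```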
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Somerset Bridge
…experience in building ELT pipelines and working with large-scale datasets using Azure Data Factory (ADF) and Databricks. Strong proficiency in SQL (T-SQL, Spark SQL) for data extraction, transformation, and optimisation. Proficiency in Azure Databricks (PySpark, Delta Lake, Spark SQL) for big data processing. Knowledge of data … transactions, and time travel in Databricks. Strong Python (PySpark) skills for big data processing and automation. Experience with Scala (optional, but preferred for advanced Spark applications). Experience working with Databricks Workflows & Jobs for data orchestration. Strong knowledge of feature engineering and feature stores, particularly the Databricks Feature Store … training and inference. Experience with data modelling techniques to support analytics and reporting. Familiarity with real-time data processing and API integrations (e.g. Kafka, Spark Streaming). Proficiency in CI/CD pipelines for data deployment using Azure DevOps, GitHub Actions, or Terraform for Infrastructure as Code (IaC).
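As a rough illustration of the Delta Lake side of this stack, here is a minimal PySpark sketch of a transactional write and a time-travel read; the table paths, filter, and version number are invented for the example:

```python
# Hypothetical Delta Lake sketch: ACID overwrite plus time travel.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # supplied on Databricks

# Writes to a Delta table are transactional (ACID)
df = spark.read.format("delta").load("/mnt/lake/policies")
(df.filter("status = 'active'")
   .write.format("delta")
   .mode("overwrite")
   .save("/mnt/lake/policies_active"))

# Time travel: read the table as of an earlier version
previous = (spark.read.format("delta")
            .option("versionAsOf", 3)  # placeholder version
            .load("/mnt/lake/policies_active"))
```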
Manchester, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
…monitoring. Familiarity with cloud platforms such as Azure (e.g. Azure ML, Data Factory). Experience in big data environments and distributed computing frameworks (e.g. Spark). Knowledge of business intelligence tools and their integration with data science workflows. Prior experience mentoring or leading a team of data scientists. Why …
Hands-on experience in tools like Snowflake, dbt, SQL Server, and programming languages such as Python, Java, or Scala. Proficient in big data tools (e.g. Spark, Kafka), cloud platforms (AWS, Azure, GCP), and embedding AI/GenAI into scalable data infrastructures. Strong stakeholder engagement and the ability to translate technical …
…GCP). Strong proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at …
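For context, a minimal Airflow DAG sketch of the sort of ETL orchestration this listing mentions, assuming Airflow 2.4+; the DAG id, schedule, and task body are placeholders:

```python
# Hypothetical daily ETL DAG: one Python task on a daily schedule.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load():
    # Placeholder: pull from the source system, load the warehouse
    pass

with DAG(
    dag_id="daily_etl",              # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # `schedule` requires Airflow 2.4+
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load",
                   python_callable=extract_and_load)
```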
Manchester Area, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
A UK-based consultancy seeking a skilled professional to shape data strategies, mentor dynamic teams, and deliver cutting-edge solutions. With hands-on expertise in Spark, SQL, and cloud platforms like Azure, you’ll lead end-to-end projects, drive innovation, and collaborate with clients across industries. What You’ll … in ETL, data modelling, and Azure Data Services. Experience in designing and implementing data pipelines, data lakes, and data warehouses. Hands-on experience with Apache Spark, and bonus points for Microsoft Fabric. Any certifications are a bonus. Benefits: Competitive base salary. Hybrid work once a week into their …
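To make the Spark pipeline requirement concrete, a minimal PySpark batch-ETL sketch; the file paths and column names are invented placeholders, not details from the role:

```python
# Hypothetical batch ETL: read raw CSV, clean it, write curated Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("/data/landing/orders.csv")

cleaned = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["order_id"])
           .filter(F.col("amount") > 0))

# Curated zone of the lake, stored in a columnar format
cleaned.write.mode("overwrite").parquet("/data/curated/orders")
```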
Familiarity with CI/CD, containerisation, deployment technologies & cloud platforms (Jenkins, Kubernetes, Docker, AWS), or familiarity with Big Data and Machine Learning technologies (NumPy, PyTorch, TensorFlow, Spark). Excellent communication, collaboration & problem-solving skills, ideally with some experience in agile ways of working. Security clearance: you must be able to gain …
Manchester, England, United Kingdom Hybrid / WFH Options
Auto Trader UK
…technologies that our Data Scientists use (we don't expect applicants to have experience with all of these): Python and Databricks for Data Science; Spark, MLflow and Airflow for ML workflows; Google Cloud Platform for our analytics infrastructure; dbt and BigQuery for data modelling and warehousing. We are looking …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
…Databricks. Exposure to, or certifications in, MS Fabric. Strong communication skills, with the ability to translate technical terminology for non-technical stakeholders. Hands-on Apache Spark experience. What next? There are interview spots booked across the next couple of weeks, so please contact Adam Townsend on 07478213563 or …
…that meet the evolving needs of the business. Utilise your strong background in data engineering, combined with your existing experience using SQL, Python and Apache Spark in production environments. The role will entail strong problem-solving skills, attention to detail, and the ability to work independently while collaborating …
Experience in troubleshooting and problem resolution. Experience in system integration. Knowledge of the following: Hadoop, Flume, Sqoop, MapReduce, Hive/Impala, HBase, Kafka, Spark Streaming. Experience of ETL tools incorporating Big Data. Shell scripting, Python. Beneficial skills: understanding of LAN, WAN, VPN and SD networks; hardware and cabling …
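As an illustration of the Kafka and Spark Streaming pairing listed above, a minimal Structured Streaming sketch; the broker address and topic are placeholders, and the Kafka connector package (spark-sql-kafka) must be on the classpath:

```python
# Hypothetical stream: consume a Kafka topic and print decoded payloads.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
          .option("subscribe", "events")                     # placeholder
          .load())

# Kafka delivers binary key/value columns; cast before processing
query = (events.selectExpr("CAST(value AS STRING) AS payload")
         .writeStream
         .format("console")
         .outputMode("append")
         .start())

query.awaitTermination()
```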
Wakefield, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
…exchange connectivity. Scripting abilities in Python, Bash, or similar languages. Knowledge of monitoring tools and alerting frameworks. Exposure to data technologies such as Kafka, Spark or Delta Lake is useful but not mandatory. Bachelor's degree in Computer Science, Engineering, or a related technical field. This role offers competitive compensation …
Master's degree/PhD in Computer Science, Machine Learning, Applied Statistics, Physics, Engineering or a related field. Strong mathematical and statistical skills. Experience with Python, Spark and SQL. Experience implementing and validating a range of machine learning and optimization techniques. Effective scientific communication for varied audiences. Autonomy and ownership of …
…policies
- GCP or other cloud infrastructure providers
- Infrastructure as code (e.g. Terraform, Ansible, Puppet)
- Git
- CI/CD experience
- Network security
- Monitoring
- Database administration
- Spark
- Airflow
- Big Data ecosystem administration and tooling
- FinOps

What you can expect from us: We won’t just meet your expectations. We’ll defy …