Swansea, Wales, United Kingdom Hybrid / WFH Options
CPS Group (UK) Limited
my client will train you): Knowledge of Microsoft SQL Server and packaged BI tools (SSAS and SSIS). Docker, Kubernetes and cloud computing technologies. Apache Kafka and data streaming. Familiarity with Apache Spark or similar data processing tools. Experience developing and maintaining CI/CD pipelines, particularly Azure DevOps more »
language, ideally Python but can also be Java or C/C++. SQL experience. Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau). Get in touch with Ella Alcott - Ella@engagewithus.com more »
Surrey, England, United Kingdom Hybrid / WFH Options
Hawksworth
working in the world of Data Science You're more than capable with SQL & Python You have exposure to big data technologies such as Spark Ideally you will have experience with statistical analysis, machine learning algorithms, and data mining techniques You have excellent communication skills and can communicate well more »
Manchester, North West, United Kingdom Hybrid / WFH Options
TECHNOLOGY RECWORKS LIMITED
sponsors) Knowledge and experience of the following would be advantageous: Knowledge of Enterprise Architecture Frameworks Good knowledge of Azure DevOps Pipelines Strong experience in the Apache Spark framework Previous experience in designing and delivering data warehouse and business intelligence solutions using the on-premises Microsoft stack (SSIS, SSRS, SSAS) Knowledge more »
Cheltenham, England, United Kingdom Hybrid / WFH Options
Yolk Recruitment Ltd
software solutions. Skills Required: In-depth experience designing & building backend applications in Python. Familiarity with Big Data and Machine Learning technologies (NumPy, PyTorch, TensorFlow, Spark). Experience developing in a highly Agile/Scrum environment. Familiarity with CI/CD, containerisation, deployment technologies & cloud platforms (Jenkins, Kubernetes, Docker, AWS). Benefits more »
Birmingham, England, United Kingdom Hybrid / WFH Options
⭕️ Nimbus®
as Python, C#, .Net and/or JavaScript is highly desirable. Experience with cloud platforms (e.g., Azure) and data technologies (e.g., SQL, NoSQL, Hadoop, Spark). PLEASE NOTE: You must have either UK citizenship or permanent leave to remain in the UK. Due to the high volume of applications more »
Detail and accuracy in data analysis · Effective communication skills to convey complex findings to non-technical stakeholders · Experience with big data technologies (e.g., Hadoop, Spark). Could this be you? BNP Paribas Personal Finance believe it’s a positive attitude and passion to make things happen that matters most. more »
Pharmaceutical industry experience Experience in distributed system design Experience with Pure/Alloy Working knowledge of open-source tools such as AWS Lambda, Prometheus, Spark; Hadoop or Snowflake knowledge would be a plus. Additional Information Location: This role can be delivered in a hybrid nature from one of these offices more »
Manchester, England, United Kingdom Hybrid / WFH Options
Lorien
MongoDB/DynamoDB/etc.) Solid understanding of data governance principles and how to implement these across the business Knowledge of Big Data technology (Spark/Hadoop/etc.) Excellent communication skills across various levels of stakeholders Benefits: Salary available £120,000 Bonus scheme Enhanced pension contribution available Genuine more »
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Experian Ltd
Redshift, DynamoDB, Glue and SageMaker Infrastructure-as-Code tools and approaches (we use the AWS CDK with CloudFormation) Data processing frameworks such as pandas, Spark and PySpark Machine learning concepts like model training, model registry, model deployment and monitoring Development and CI/CD tools (we use GitHub, CodePipeline more »
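The pandas/Spark/PySpark processing the listing above asks for centres on grouped transformations. A minimal standard-library sketch of the same groupby-and-average pattern (the record fields and values here are hypothetical, purely for illustration):

```python
from collections import defaultdict

def mean_by_key(rows, key, value):
    """Group dict rows by `key` and return the mean of `value` per group,
    mimicking a pandas groupby().mean() or Spark groupBy().avg()."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for row in rows:
        sums[row[key]] += row[value]
        counts[row[key]] += 1
    return {k: sums[k] / counts[k] for k in sums}

# Hypothetical example records
records = [
    {"region": "EMEA", "score": 700},
    {"region": "EMEA", "score": 720},
    {"region": "APAC", "score": 680},
]
averages = mean_by_key(records, "region", "score")  # {"EMEA": 710.0, "APAC": 680.0}
```

In pandas the same operation is `df.groupby("region")["score"].mean()`; frameworks like Spark distribute exactly this shuffle-and-aggregate step across a cluster.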
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka. Experience with Big Data solutions or Relational DB. Demonstrated knowledge of software applications and technical processes within a cloud or microservice architecture. Hands more »
in programming languages commonly used in machine learning, preferably Python. Experience with machine learning frameworks and libraries, such as TensorFlow, PyTorch, scikit-learn, or Apache Spark. Proven track record of developing and implementing machine learning solutions in a professional setting. Passion for exploring new technologies and driving innovation in more »
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Experis
knowledge of security principles and best practices for cloud-based solutions. Preferred Skills : Certification in cloud platforms. Experience with big data technologies such as Apache Hadoop, Spark, or Kafka. Knowledge of data governance and compliance frameworks. Familiarity with DevOps practices and tools (e.g., Git, Jenkins, Terraform). All more »
major advantage Experience in building and enhancing compute, storage, and data platforms with exposure to open source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, Nate and the like. Hands-on with infrastructure-as-code tools and automation, such as Terraform, Ansible, or Helm. The role Tech Lead responsible more »
network programming. Experience in building and enhancing compute, storage, and data platforms with exposure to open source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, Nate and the like. Hands-on with infrastructure-as-code tools and automation, such as Terraform, Ansible, or Helm. The role Tech Lead responsible more »
or Rust. Experience in building and enhancing compute, storage, and data platforms with exposure to open source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, Nate etc. Hands-on exp. with IaC tools and automation, such as Terraform, Ansible, or Helm. Active engagement or contributions to the open-source more »
Luton, England, United Kingdom Hybrid / WFH Options
Ventula Consulting
science and analytics team in deploying pipelines. Coach and mentor the team to improve development standards. Key requirements: Strong hands-on experience with Databricks, Spark, SQL or Scala. Proven experience designing and building data solutions on a cloud-based, big data distributed system (AWS/Azure etc.) Hands-on … models and following best practices. The ability to develop pipelines using SageMaker, MLFlow or similar frameworks. Strong experience with data programming frameworks such as Apache Spark. Understanding of common Data Science and Machine Learning models, libraries and frameworks. This role provides a competitive salary plus excellent benefits package. In more »
data platform from a legacy system to one based on AWS EMR, with Amazon RDS and DynamoDB ingestion converted to Parquet files, interrogatable through Spark and MapReduce. This modern platform will support rapid data insight generation, data experiments for new product development, our live Machine Learning solutions and live … to-target mappings) to testing and service optimisation.) Good familiarity with our developing key services/applications - Amazon RDS, Amazon DynamoDB, AWS Glue, MapReduce, Hive, Spark, YARN, Airflow. Ability to work with a range of structured, semi-structured and unstructured file formats including Parquet, JSON, CSV, PDF, JPG. Accomplished data more »
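Working across structured and semi-structured formats, as the platform description above requires, means mixing tabular and nested parsers. A minimal standard-library sketch (the inline sample data is invented for illustration; Parquet itself would need an extra library such as pyarrow):

```python
import csv
import io
import json

# Hypothetical inline samples standing in for files landed on the platform.
csv_text = "id,amount\n1,9.50\n2,3.25\n"
json_text = '{"id": 3, "amount": 1.75, "tags": ["refund"]}'

# Structured: CSV rows parse into flat dicts via the csv module.
csv_rows = list(csv.DictReader(io.StringIO(csv_text)))

# Semi-structured: JSON preserves nested fields such as the "tags" list.
json_row = json.loads(json_text)

# Combine both sources into one figure, as an ingestion job might.
total = sum(float(r["amount"]) for r in csv_rows) + json_row["amount"]
```

The distinction matters downstream: flat CSV maps directly onto columnar formats like Parquet, while nested JSON fields must either be flattened or stored as complex column types.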
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
Senitor Associates Limited
within Software Engineering to explore new technologies. Contribute to a team culture that prioritizes diversity, equity, inclusion, and respect. Required Skills Expertise in Java, Spark, SQL, relational DBs, NoSQL, focusing on performance optimization. A thorough understanding of the Software Development Life Cycle and agile methodologies, including CI more »
in a technical and analytical role Experience of Data Lake/Hadoop platform implementation Hands-on experience implementing and performance-tuning Hadoop/Spark implementations Experience with Apache Hadoop and the Hadoop ecosystem Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr … Avro) Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto) Experience developing software code in one or more programming languages (Java, Python, etc.) Preferred Qualifications Masters or PhD in Computer Science, Physics, Engineering or Math Hands on experience leading large-scale global data warehousing more »
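The SQL-on-Hadoop engines named above (Hive, Impala, Spark SQL, Presto) all accept the same basic aggregate-query shape. A sketch using in-memory SQLite purely as a stand-in engine (the table and data are hypothetical):

```python
import sqlite3

# In-memory SQLite stands in for a SQL-on-Hadoop engine; the GROUP BY
# query is identical in Hive, Impala, Spark SQL or Presto, only the
# engine executing it differs.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, clicks INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("alice", 3), ("bob", 5), ("alice", 2)],
)

rows = conn.execute(
    "SELECT user, SUM(clicks) FROM events GROUP BY user ORDER BY user"
).fetchall()  # [("alice", 5), ("bob", 5)]
conn.close()
```

On a real cluster the table would be an external table over HDFS or S3 files (e.g. Parquet), and the engine would parallelise the scan and aggregation rather than run it in-process.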