a sense of trust with stakeholders. Preferred qualifications, capabilities and skills: Experience with deep learning frameworks (PyTorch, TensorFlow) Experience with big-data technologies (Spark, Hadoop) or distributed computation frameworks (Dask, Modin) Hands-on experience with Natural Language Processing (NLP) and Large Language Models (LLMs) Experience in creating and deploying more »
environments: Sybase ASE/IQ, Oracle or DB2. It would be great if you have: Experience in Cluster Computing and Big Data solutions: Spark, Hadoop, HDFS, XRS using public cloud. Our Commitment to Diversity & Inclusion: Did you know that Apexon has been Certified™ by Great Place To Work®, the more »
CLI and IAM etc. (required) Experience with distributed message brokers using Kafka (required) Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink etc. (required) Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm more »
ML frameworks (TensorFlow, PyTorch) Nice to have: Familiarity with Git or other Version Control Systems Computer Vision Library exposure Understanding of Big Data Technologies (Hadoop, Spark, etc.) Experience with Cloud platforms (AWS, GCP or Azure) This is a fully remote role, but may require very occasional travel (once a more »
Manchester, North West, United Kingdom Hybrid / WFH Options
Dupen Ltd
Systems: Linux, APIs, infrastructure design – load balancing, VMs, PostgreSQL, vector databases. Machine Learning Engineer – desirable skills: Version control (Git), computer vision libraries, Big Data (Hadoop, Spark), Cloud – AWS, Google Cloud, Azure, and a knowledge of secure coding techniques – PCI-DSS, PA-DSS, ISO27001. Note: as there are actually two more »
Employment Type: Permanent
Salary: £50000 - £60000/annum To £60,000 + range of benefits
Milton Keynes, Bedfordshire, South East, Woolstone, Buckinghamshire, United Kingdom Hybrid / WFH Options
Dupen Ltd
Systems: Linux, APIs, infrastructure design – load balancing, VMs, PostgreSQL, vector databases. Machine Learning Engineer – desirable skills: Version control (Git), computer vision libraries, Big Data (Hadoop, Spark), Cloud – AWS, Google Cloud, Azure, and a knowledge of secure coding techniques – PCI-DSS, PA-DSS, ISO27001. Note: as there are actually two more »
Employment Type: Permanent
Salary: £50000 - £60000/annum To £60,000 + range of benefits
Experience with modeling tools such as PyTorch, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy etc. Experience with large-scale distributed systems such as Hadoop, Spark etc. Experience in building speech recognition, machine translation and natural language processing systems (e.g., commercial speech products or government speech projects) Amazon is more »
London, England, United Kingdom Hybrid / WFH Options
Global Relay
JMeter or similar tools Web services technology such as REST, JSON or Thrift Testing web applications with Selenium WebDriver Big data technology such as Hadoop, MongoDB, Kafka or SQL Network principles and protocols such as HTTP, TLS and TCP Continuous integration systems such as Jenkins or Bamboo Continuous delivery more »
best-of-breed Java toolsets - focused on microservices architectures, powerful front-end and back-end frameworks, RESTful services, and everything from NoSQL databases like MongoDB and Hadoop, high-performance data grids like Hazelcast to multi-node relational systems. You will be working in a Scrum Team of cross-functional skills in more »
phases of projects through prototyping, architectural design and delivery. You will be working with Azure tools such as Databricks and Data Factory, as well as Hadoop, to create big data environments which, in turn, will help businesses gain greater insight into their big data repositories. RESPONSIBILITIES: Working on projects more »
system requirements reviews, design reviews and other types of technical meetings. Demonstrable knowledge of DevOps tool chains and processes. Experience with Big Data (e.g., Hadoop, NiFi, Spark, PySpark, Dask, etc.). Experience in application migration. Agile development experience. Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or more »
Jenkins and UrbanCode methodologies, demonstrating engineering excellence and a passion for automation Additionally, data domain technology experience including: Database technology such as Teradata, Oracle, Hadoop, DB2 and familiarity with NoSQL databases such as Cassandra and HBase Expertise in Container & Virtualization/Hypervisor technologies such as K8s, Firecracker/gVisor more »
About the role: A Payments FinTech is currently seeking a Data Engineering Lead (Python, Hadoop & SQL) to lead and mentor a talented team of data engineers and scientists as they look to simplify the bank through developing innovative data-driven solutions, allowing them to be commercially successful through insight, and more »
Greater London, England, United Kingdom Hybrid / WFH Options
Oliver Bernard
essential: -Proven experience as an Architect and excellent knowledge of Big Data -Great understanding of Cloud, e.g. Azure and/or AWS -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark etc. -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
within a typical retail trading environment is key. Experience required: A background in leveraging hands-on skills using tools such as Python, R, Spark, Hadoop, SQL and cloud-based platforms such as GCP, Azure and AWS to manipulate and analyse various data sets in large volumes Background in data more »
on experience with analytic tools like R and Python, and visualization tools like Tableau and Power BI Exposure to cloud platforms and big data systems such as Hadoop, HDFS, and Hive is a plus Ability to work with IT and Data Engineering teams to help embed analytic outputs in business processes Graduate more »
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases • Programming languages such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits : • Base Salary: £45,000 - £75,000 (DoE) • Discretionary Bonus: Circa 10% per annum • DV Bonus: Circa £5,000 • Flex Fund: £5000 • Health: Private more »
City of London, London, United Kingdom Hybrid / WFH Options
BeTechnology Group
scalable systems. Deep knowledge of distributed and scalable systems, including proficiency with PostgreSQL, Ray, RabbitMQ, and Cassandra. Familiarity with big data technologies such as Hadoop, Spark, or Kafka. Experience with CI/CD. Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems. Excellent more »