Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc. Observability - SRE. Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop ecosystem. Excellent knowledge of YAML or similar languages. The following Technical Skills & Experience would be desirable: JupyterHub awareness; RabbitMQ or other common queue technology, e.g. ActiveMQ; NiFi; Rego More ❯
Good to have. Interview includes coding test. Job Description: Scala/Spark • Good Big Data resource with the below skill set: Spark, Scala, Hive/HDFS/HQL • Linux-based Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.) • Experience in Big Data technologies; real-time data processing platform (Spark Streaming) experience would be an advantage • Consistently demonstrates clear and concise written More ❯
NoSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4j). Data exchange and processing skills (e.g. ETL, ESB, API). Development skills (e.g. Python). Big Data technologies knowledge (e.g. Hadoop stack). This position offers a lucrative benefits package, which includes but is not limited to: Salary: Circa £45,000 - £55,000 (depending on experience); Bonus scheme: Up to More ❯
mathematics or equivalent quantitative field - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, numpy, scipy etc. - Experience with large-scale distributed systems such as Hadoop, Spark etc. - Experience with Deep Learning for search and recommendation systems - Experience with NLP and LLM algorithms and tools a plus - Experience performing and interpreting A/B experiments More ❯
of influencing C-suite executives and driving organizational change • Bachelor's degree, or 7+ years of professional or military experience • Experience in technical design, architecture and databases (SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) • Experience implementing serverless distributed solutions • Software development experience with object-oriented languages and deep expertise in AI/ML PREFERRED QUALIFICATIONS • Proven ability to shape market More ❯
an asset: Testing performance with JMeter or similar tools; Web services technology such as REST, JSON or Thrift; Testing web applications with Selenium WebDriver; Big Data technology such as Hadoop, MongoDB, Kafka or SQL; Network principles and protocols such as HTTP, TLS and TCP; Continuous integration systems such as Jenkins or Bamboo. What you can expect: At Global Relay More ❯
Join to apply for the Machine Learning Engineer role at Lumilinks Group Ltd. About us: Turning the fantasy of analytics, data and A.I. into reality. In a More ❯
Internal use only - Grade E. About us. We are The Very Group and we're here to help families get more out of life. We know that our customers work hard for their families and have a lot to balance More ❯
a relevant discipline such as Computer Science, Statistics, Applied Mathematics, or Engineering - Strong experience with Python and R - A strong understanding of a number of the tools across the Hadoop ecosystem such as Spark, Hive, Impala & Pig - An expertise in at least one specific data science area such as text mining, recommender systems, pattern recognition or regression models - Previous More ❯
controls • Experience starting the front-end buildout from scratch by coordinating across multiple business and technology groups • Experience building complex single-page applications using Ab Initio/Hadoop/Hive/Kafka/Oracle and modern MOM technologies • Experienced with the Linux/Unix platform • Experience in SCMs like Git and tools like JIRA • Familiar More ❯
and machine learning PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, numpy, scipy etc. - Experience with large-scale distributed systems such as Hadoop, Spark etc. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. Los Angeles County applicants More ❯
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown
Expertise in implementing statistical techniques in production-grade code, with a focus on scalability and reliability. Experience with large-scale data analysis, manipulation, and distributed computing platforms (e.g., Hive, Hadoop). Familiarity with advanced machine learning methods, including neural networks, reinforcement learning, and other cutting-edge Gen AI approaches. Skilled in API development and deployment, with a focus on More ❯
Employment Type: Permanent, Part Time, Work From Home
take pride in the software you produce. Apache Maven. Desirable Skills and Experience: Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch; Docker; Apache Hadoop, Kafka or Camel; JavaScript; knowledge of both Windows and Linux operating systems; Kubernetes; NiFi; NSQ; Apache Ignite; Arango. Why BAE Systems? This is a place where you’ll be More ❯
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Anson McCade
Server, Oracle or similar - Experience with ETL, API, ESB or similar - Hands-on with coding languages such as Python, Bash or similar - Knowledge of Big Data tech such as Hadoop or similar - Knowledge in the following areas is highly desirable but not essential - AI, Natural Language Processing, OCR - Experience with containerisation via Docker is highly advantageous - Please note roles More ❯
London, England, United Kingdom Hybrid / WFH Options
Hays
focused environment, with a strong understanding of sustainability data. Programming skills in Python and/or R, with working knowledge of SQL and exposure to Big Data technologies (e.g. Hadoop). Solid background in financial services, preferably within asset management or investment analytics. Full-time office presence is required for the first 3 months. After this period, a hybrid More ❯
in software development with at least 2 server-side languages - Java being a must-have. Proven experience with microservices architecture and scalable, distributed systems. Proficient in data technologies like MySQL, Hadoop, or Cassandra. Experience with batch processing, data pipelines, and data integrity practices. Familiarity with AWS services (e.g., RDS, Step Functions, EC2, Kinesis) is a plus. Solid understanding of More ❯
Big Data projects developing in Kotlin or any other JVM language. Experience with any of the following: Oracle, Kubernetes/OpenShift, Redis, Memcached. Experience with Big Data technologies like Hadoop, Cassandra, Hive. We offer: the opportunity to work on cutting-edge projects; work with a highly motivated and dedicated team; competitive daily rate; benefits package including medical insurance and sports More ❯
maintaining, troubleshooting, tuning) of web architecture and related applications such as the following: Apache, Nginx, Python, MySQL, Postgres, MongoDB, Postfix, CDN integrations. Experience managing data warehouse platforms & tooling: ex. Hadoop, Kafka, Cassandra. Advanced knowledge and experience creating, maintaining and debugging shell scripts. The ideal candidate will be comfortable in "non-siloed" environments and have an appetite to research, test More ❯
We are looking for someone with experience managing System Integrators, Data Warehouse, ETL, and Hadoop Implementation companies, cloud, and data alliances worldwide, with a focus on the EMEA markets. The Skills You'll Need: You have at least 10-15+ years of relevant partner management and sales experience. You are a team player and enjoy making others successful. More ❯