such will have a large focus on training and building your skill set. You will have a genuine opportunity to build your knowledge around cloud and big data, particularly Hadoop and related data science technology. This is a fantastic opportunity to join one of the most innovative and exciting technology companies in London and build a career in the More ❯
Data Scientist - skills in statistics, physics, mathematics, Computer Science, Engineering, Data Mining, Big Data (Hadoop, Hive, MapReduce) This is an exceptional opportunity to work as a Data Scientist within a global analytics team, utilizing various big data technologies to develop complex behavioral models, analyze customer uptake of products, and foster new product innovation. Responsibilities include: Generating and reviewing large More ❯
NoSQL technologies (e.g., MS SQL, Oracle, MongoDB). Experience with data processing and exchange (ETL, ESB, API). Strong development skills, particularly with Python. Big data experience/knowledge (Hadoop, etc.). Advantageous to have knowledge of Generative AI, Natural Language Processing, Optical Character Recognition, and containerisation (Docker). This is a hybrid role, with a need for More ❯
Employment Type: Permanent
Salary: £50000 - £55000/annum bonus, overtime paid at 1.25, 16% pe
their own Cloud estate. Responsibilities include: DevOps tooling/automation written with Bash/Python/Groovy/Jenkins/Golang; provisioning software/frameworks (Elasticsearch/Spark/Hadoop/PostgreSQL); infrastructure management: Configuration as Code (CasC) and Infrastructure as Code (IaC) with Ansible, Terraform, Packer; log and metric aggregation with Fluentd, Prometheus, Grafana, Alertmanager; Public Cloud, primarily GCP, but also AWS and Azure More ❯
and machine learning PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. - Master's degree in math/statistics/engineering or other equivalent quantitative discipline, or PhD More ❯
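The preferred qualifications above name modeling tools such as scikit-learn and NumPy. As a purely illustrative sketch of that kind of tooling (the synthetic data, feature count and coefficients below are hypothetical, not drawn from any listing), a minimal regression fit might look like:

```python
# Purely illustrative regression fit with scikit-learn on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # 200 synthetic samples, 3 features
true_coef = np.array([1.5, -2.0, 0.5])  # made-up coefficients
y = X @ true_coef + rng.normal(scale=0.1, size=200)

# Hold out a test split and report generalisation quality.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("Held-out R^2:", model.score(X_test, y_test))
```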
an asset: Testing performance with JMeter or similar tools; Web services technology such as REST, JSON or Thrift; Testing web applications with Selenium WebDriver; Big data technology such as Hadoop, MongoDB, Kafka or SQL; Network principles and protocols such as HTTP, TLS and TCP; Continuous integration systems such as Jenkins or Bamboo. What you can expect: At Global Relay More ❯
Internal use only - Grade E About us. We are The Very Group and we're here to help families get more out of life. We know that our customers work hard for their families and have a lot to balance More ❯
a relevant discipline such as Computer Science, Statistics, Applied Mathematics, or Engineering - Strong experience with Python and R - A strong understanding of several tools across the Hadoop ecosystem such as Spark, Hive, Impala & Pig - Expertise in at least one specific data science area such as text mining, recommender systems, pattern recognition or regression models - Previous More ❯
in applied research PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. - PhD Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your More ❯
neural deep learning methods and machine learning PREFERRED QUALIFICATIONS - Experience with popular deep learning frameworks such as MXNet and TensorFlow - Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience More ❯
and machine learning PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. Our inclusive culture empowers More ❯
experience across AWS Glue, Lambda, Step Functions, RDS, Redshift, and Boto3. Proficient in one of Python, Scala or Java, with strong experience in Big Data technologies such as Spark, Hadoop, etc. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proven experience developing modern data architectures including Data Lakehouse and Data Warehousing. A … tooling, and data governance including GDPR. Bonus Points For: Expertise in Data Modelling, schema design, and handling both structured and semi-structured data. Familiarity with distributed systems such as Hadoop, Spark, HDFS, Hive, Databricks. Exposure to AWS Lake Formation and automation of ingestion and transformation layers. Background in delivering solutions for highly regulated industries. Passion for mentoring and enabling More ❯
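Several of the listings above ask for practical knowledge of real-time event streaming pipelines with Kafka and Spark Streaming. The following is a minimal, hedged sketch only: the broker address, topic name and JSON schema are placeholders, and it assumes the Spark Kafka connector package is available on the cluster.

```python
# Hedged sketch of a streaming read of JSON events from Kafka with Spark
# Structured Streaming. Broker, topic and schema are hypothetical placeholders,
# and the Spark Kafka connector is assumed to be on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("events-stream-sketch").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Write parsed events to the console purely for demonstration.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```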
to knowledge sharing across the team. What We're Looking For: Proficient in one of Python, Scala or Java, with strong experience in Big Data technologies such as Spark, Hadoop, etc. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proficiency in AWS cloud environments. Proven experience developing modern data architectures including Data … tooling, and data governance including GDPR. Bonus Points For: Expertise in Data Modelling, schema design, and handling both structured and semi-structured data. Familiarity with distributed systems such as Hadoop, Spark, HDFS, Hive, Databricks. Exposure to AWS Lake Formation and automation of ingestion and transformation layers. Background in delivering solutions for highly regulated industries. Passion for mentoring and enabling More ❯
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms e.g. relevant AWS and Azure platform services; Data tools: hands-on experience with Palantir ESSENTIAL; Data science approaches and tooling e.g. Hadoop, Spark; Software development methods and techniques e.g. Agile methods such as Scrum; Software change management, notably familiarity with git; Public sector best practice guidance, e.g. ITIL, OGC toolkit. Additional More ❯
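The role above mentions building scalable data infrastructure with tools like Hadoop and Spark. As an illustrative sketch under assumed inputs (the HDFS path and column names are hypothetical, not taken from the listing), a simple distributed aggregation in PySpark might look like:

```python
# Illustrative PySpark batch aggregation; the input path and column names are
# placeholders, not taken from any posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("aggregation-sketch").getOrCreate()

df = spark.read.parquet("hdfs:///data/events")  # hypothetical HDFS path
summary = (
    df.groupBy("category")                       # hypothetical column
      .agg(F.count("*").alias("events"),
           F.avg("value").alias("avg_value"))    # hypothetical column
)
summary.show()
spark.stop()
```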
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms e.g. relevant AWS and Azure platform services; Data tools: hands-on experience with Palantir ESSENTIAL; Data science approaches and tooling e.g. Hadoop, Spark; Software development methods and techniques e.g. Agile methods such as Scrum; Software change management, notably familiarity with git; Public sector best practice guidance, e.g. ITIL, OGC toolkit. Additional More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms e.g. relevant AWS and Azure platform services; Data tools: hands-on experience with Palantir ESSENTIAL; Data science approaches and tooling e.g. Hadoop, Spark; Software development methods and techniques e.g. Agile methods such as Scrum; Software change management, notably familiarity with git; Public sector best practice guidance, e.g. ITIL, OGC toolkit. Additional More ❯
should take pride in the software you produce. Apache Maven Desirable Skills and Experience Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch; Docker; Apache Hadoop, Kafka or Camel; JavaScript; Knowledge of both Windows and Linux operating systems; Kubernetes; NiFi; NSQ; Apache Ignite; Arango. Why BAE Systems? This is a place where you'll be More ❯
Guildford, Surrey, United Kingdom Hybrid / WFH Options
BAE Systems (New)
should take pride in the software you produce. Apache Maven Desirable Skills and Experience Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch; Docker; Apache Hadoop, Kafka or Camel; JavaScript; Knowledge of both Windows and Linux operating systems; Kubernetes; NiFi; NSQ; Apache Ignite; Arango. Why BAE Systems? This is a place where you'll be More ❯
Frimley, Surrey, United Kingdom Hybrid / WFH Options
BAE Systems (New)
should take pride in the software you produce. Apache Maven Desirable Skills and Experience Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch; Docker; Apache Hadoop, Kafka or Camel; JavaScript; Knowledge of both Windows and Linux operating systems; Kubernetes; NiFi; NSQ; Apache Ignite; Arango. Why BAE Systems? This is a place where you'll be More ❯
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
Server, Oracle or similar - Experience with ETL, API, ESB or similar - Hands-on with coding languages such as Python, Bash or similar - Knowledge of Big Data tech such as Hadoop or similar - Knowledge of the following areas is highly desirable but not essential - AI, Natural Language Processing, OCR - Experience with containerisation via Docker is highly advantageous - Please note roles More ❯
London, England, United Kingdom Hybrid / WFH Options
Hays
focused environment, with a strong understanding of sustainability data. Programming skills in Python and/or R, with working knowledge of SQL and exposure to Big Data technologies (e.g. Hadoop). Solid background in financial services, preferably within asset management or investment analytics. Full-time office presence is required for the first 3 months. After this period, a hybrid More ❯
London, South East England, United Kingdom Hybrid / WFH Options
Hays
focused environment, with a strong understanding of sustainability data. Programming skills in Python and/or R, with working knowledge of SQL and exposure to Big Data technologies (e.g. Hadoop). Solid background in financial services, preferably within asset management or investment analytics. Full-time office presence is required for the first 3 months. After this period, a hybrid More ❯
Slough, South East England, United Kingdom Hybrid / WFH Options
Hays
focused environment, with a strong understanding of sustainability data. Programming skills in Python and/or R, with working knowledge of SQL and exposure to Big Data technologies (e.g. Hadoop). Solid background in financial services, preferably within asset management or investment analytics. Full-time office presence is required for the first 3 months. After this period, a hybrid More ❯
Pulsar) and Kubernetes; Familiarity with graph databases and graph traversal languages like Cypher, and Infrastructure as Code tools (Pulumi, CloudFormation, Terraform); Hands-on experience with distributed computing technologies like Hadoop, Spark, Flink; Comfortable with independently diagnosing issues across the entire stack (network, application or server) using tools such as JVM profiling, Wireshark, Charles, debuggers More ❯
in applied research PREFERRED QUALIFICATIONS Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience More ❯