Internal use only - Grade E. About us: We are The Very Group and we're here to help families get more out of life. We know that our customers work hard for their families and have a lot to balance …
Analytic exposure is a big plus. Java is a must, but these will strengthen your case: Data Analytics development experience; Agile development experience; familiarity with/interest in Apache Hadoop MapReduce; Python experience; AWS Lambdas experience; Jira experience; Confluence experience; GitLab experience; exposure to or experience with NiFi; willingness/desire to work on high-visibility tasking; willingness/ability …
a relevant discipline such as Computer Science, Statistics, Applied Mathematics, or Engineering - Strong experience with Python and R - A strong understanding of a number of tools across the Hadoop ecosystem such as Spark, Hive, Impala & Pig - Expertise in at least one specific data science area such as text mining, recommender systems, pattern recognition or regression models - Previous …
enterprise-level systems; Excellent object-oriented design skills, including OOA/OOD; Experience with multi-tier architectures and service-oriented architecture; Exposure to and understanding of RDBMS, NoSQL, and Hadoop is desirable; Knowledge of the software development lifecycle and agile practices, including TDD/BDD; Strategic thinking, collaboration, and consensus-building skills. Please note: Familiarity with DevOps is important …
in applied research PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. - PhD Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your …
neural deep learning methods and machine learning PREFERRED QUALIFICATIONS - Experience with popular deep learning frameworks such as MXNet and TensorFlow - Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience …
experience across AWS Glue, Lambda, Step Functions, RDS, Redshift, and Boto3. Proficient in one of Python, Scala or Java, with strong experience in Big Data technologies such as Spark, Hadoop, etc. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proven experience developing modern data architectures including Data Lakehouse and Data Warehousing. A … tooling, and data governance including GDPR. Bonus Points For Expertise in Data Modelling, schema design, and handling both structured and semi-structured data. Familiarity with distributed systems such as Hadoop, Spark, HDFS, Hive, Databricks. Exposure to AWS Lake Formation and automation of ingestion and transformation layers. Background in delivering solutions for highly regulated industries. Passion for mentoring and enabling …
to knowledge sharing across the team. What We're Looking For Proficient in one of Python, Scala or Java, with strong experience in Big Data technologies such as Spark, Hadoop, etc. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proficiency in AWS cloud environments. Proven experience developing modern data architectures including Data … tooling, and data governance including GDPR. Bonus Points For Expertise in Data Modelling, schema design, and handling both structured and semi-structured data. Familiarity with distributed systems such as Hadoop, Spark, HDFS, Hive, Databricks. Exposure to AWS Lake Formation and automation of ingestion and transformation layers. Background in delivering solutions for highly regulated industries. Passion for mentoring and enabling …
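As an illustration of the kind of real-time event streaming pipeline described in the two listings above, here is a minimal PySpark Structured Streaming sketch that reads JSON events from a Kafka topic and writes them out as Parquet. The broker address, topic name, event schema, and S3 paths are invented placeholders rather than details from either role, and the job assumes the spark-sql-kafka connector is available on the cluster.

```python
# Minimal sketch: Kafka -> Spark Structured Streaming -> Parquet on S3.
# All names below (broker, topic, schema fields, bucket) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("event-stream-sketch").getOrCreate()

# Assumed event payload; a real pipeline would take this from a schema registry.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Kafka delivers the payload as bytes in the `value` column; parse it as JSON.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/events/")                     # placeholder sink
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/")  # required for streaming sinks
    .start()
)
query.awaitTermination()
```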
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms, e.g. relevant AWS and Azure platform services; Data tools - hands-on experience with Palantir is ESSENTIAL; Data science approaches and tooling, e.g. Hadoop, Spark; Software development methods and techniques, e.g. Agile methods such as SCRUM; Software change management, notably familiarity with git; Public sector best practice guidance, e.g. ITIL, OGC toolkit. Additional …
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms, e.g. relevant AWS and Azure platform services; Data tools - hands-on experience with Palantir is ESSENTIAL; Data science approaches and tooling, e.g. Hadoop, Spark; Software development methods and techniques, e.g. Agile methods such as SCRUM; Software change management, notably familiarity with git; Public sector best practice guidance, e.g. ITIL, OGC toolkit. Additional …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms, e.g. relevant AWS and Azure platform services; Data tools - hands-on experience with Palantir is ESSENTIAL; Data science approaches and tooling, e.g. Hadoop, Spark; Software development methods and techniques, e.g. Agile methods such as SCRUM; Software change management, notably familiarity with git; Public sector best practice guidance, e.g. ITIL, OGC toolkit. Additional …
should take pride in the software you produce. Apache Maven Desirable Skills and Experience Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch; Docker; Apache Hadoop, Kafka or Camel; JavaScript; knowledge of both Windows and Linux operating systems; Kubernetes; NiFi; NSQ; Apache Ignite; Arango. Why BAE Systems? This is a place where you'll be …
Guildford, Surrey, United Kingdom Hybrid / WFH Options
BAE Systems (New)
should take pride in the software you produce. Apache Maven Desirable Skills and Experience Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch; Docker; Apache Hadoop, Kafka or Camel; JavaScript; knowledge of both Windows and Linux operating systems; Kubernetes; NiFi; NSQ; Apache Ignite; Arango. Why BAE Systems? This is a place where you'll be …
Frimley, Surrey, United Kingdom Hybrid / WFH Options
BAE Systems (New)
should take pride in the software you produce. Apache Maven Desirable Skills and Experience Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch; Docker; Apache Hadoop, Kafka or Camel; JavaScript; knowledge of both Windows and Linux operating systems; Kubernetes; NiFi; NSQ; Apache Ignite; Arango. Why BAE Systems? This is a place where you'll be …
You will play a key role within a small data science team. The client is looking for hands-on experience developing solutions for complex data science problems using Python, R, Hadoop, and Greenplum (or other Massively Parallel Processing solutions). REQUIRED SKILLS: Bachelor's Degree in a quantitative or technical field of study, such as Statistics, Mathematics, Computer Science, or …
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
Server, Oracle or similar - Experience with ETL, API, ESL or similar - Hands-on with coding languages such as Python, Bash or similar - Knowledge of Big Data tech such as Hadoop or similar - Knowledge in the following areas is highly desirable but not essential - AI, Natural Language Processing, OCR - Experience with containerisation via Docker is highly advantageous - Please note roles …
SQL Server, PL/SQL developer tools and Oracle database; SSIS, SSAS, and SSRS; SAS Modeling; MS Visual Studio (.NET (ASP), C#, VB.NET); Tableau and Shell scripting; Python and Hadoop technologies - Hive, PySpark and Zeppelin. Employer will also accept a Bachelor's degree, or foreign equivalent, in Computer Science, Computer Engineering, or a directly related field plus five (5) years …
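For context on the Hive/PySpark tooling named in the listing above, below is a minimal sketch of querying a Hive-managed table from PySpark, the sort of snippet that might run in a Zeppelin notebook. The database, table, and column names are hypothetical, and it assumes Spark is configured against an existing Hive metastore.

```python
# Minimal sketch: aggregate a (hypothetical) Hive table with Spark SQL.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("hive-query-sketch")
    .enableHiveSupport()  # read tables registered in the Hive metastore
    .getOrCreate()
)

# Monthly totals from a fictional analytics.transactions table.
monthly = spark.sql("""
    SELECT date_format(txn_date, 'yyyy-MM') AS month,
           SUM(amount)                      AS total_amount
    FROM analytics.transactions
    GROUP BY date_format(txn_date, 'yyyy-MM')
    ORDER BY month
""")

monthly.show(truncate=False)
```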
London, England, United Kingdom Hybrid / WFH Options
Hays
focused environment, with a strong understanding of sustainability data. Programming skills in Python and/or R, with working knowledge of SQL and exposure to Big Data technologies (e.g. Hadoop). Solid background in financial services, preferably within asset management or investment analytics. Full-time office presence is required for the first 3 months. After this period, a hybrid …
London, South East England, United Kingdom Hybrid / WFH Options
Hays
focused environment, with a strong understanding of sustainability data. Programming skills in Python and/or R, with working knowledge of SQL and exposure to Big Data technologies (e.g. Hadoop). Solid background in financial services, preferably within asset management or investment analytics. Full-time office presence is required for the first 3 months. After this period, a hybrid …
Slough, South East England, United Kingdom Hybrid / WFH Options
Hays
focused environment, with a strong understanding of sustainability data. Programming skills in Python and/or R, with working knowledge of SQL and exposure to Big Data technologies (e.g. Hadoop). Solid background in financial services, preferably within asset management or investment analytics. Full-time office presence is required for the first 3 months. After this period, a hybrid …
Pulsar) and Kubernetes; familiarity with graph databases and graph traversal languages like Cypher, and infrastructure-as-code tools (Pulumi, CloudFormation, Terraform); hands-on experience with distributed computing technologies like Hadoop, Spark, Flink; comfortable with independently diagnosing issues across the entire stack (network, application or server) using tools such as JVM profiling, Wireshark, Charles, debuggers …
in applied research PREFERRED QUALIFICATIONS Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience …