South East London, England, United Kingdom Hybrid / WFH Options
Counter Terrorism Police
least three of the following: .NET (VB, C#, ASP.NET, .NET Core), MVC Framework, Python, JavaScript (React, Bootstrap frameworks), database design, SQL/SQL Server, NoSQL technologies (e.g., MongoDB, Hadoop, etc.). If you’re the right person for the role, you’ll bring experience of working on a range of applications across the development lifecycle more »
standard methodology and tooling used across the engineering team. Requirements: 5-7 years of Java experience; capital markets front-office experience; experience working with data lake (Hadoop) consumption, specifically Hive; Kafka experience; rules engine experience (ideally open source/vendor products, e.g. Drools or Camunda); Unix scripting knowledge; markets regulatory/trade control more »
Azure SQL Data Warehouse, Azure Data Lake, AWS S3, AWS RDS, AWS Lambda or similar Have experience with open-source big data products, e.g. Hadoop, Hive, Pig, Impala or similar Have experience with open-source non-relational or NoSQL data repositories such as MongoDB, Cassandra, Neo4j or similar Be more »
Version 1 has celebrated over 26 years in the Technology industry and continues to be trusted by global brands to deliver IT solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat more »
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases • Programming languages such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud, and distributed processing technologies such as Hadoop Benefits: • Base Salary: £45,000 - £75,000 (DoE) • Discretionary Bonus: circa 10% per annum • DV Bonus: circa £5,000 • Flex Fund: £5,000 • Health: Private more »
London, England, United Kingdom Hybrid / WFH Options
Global Relay
a Global Relay DevOps engineer you will be integrated with a software engineering team to develop on-premise ('on-prem') solutions, including working with Hadoop-based technologies. Your role will involve designing, implementing and supporting automated, scalable solutions. Your contribution will have an immediate impact by enabling efficient delivery … them from reoccurring. Deployments: Writing and running deployment automation tools using Helm, Ansible, or other configuration management systems Platform Integration: With technologies such as Hadoop and Kubernetes Some of the technologies that you will interact with include: Containerisation and virtualisation: Docker, Kubernetes, VMware Operating Systems: Linux Build and deployment … Jenkins, Bitbucket, Maven, Helm Instrumentation and monitoring: Loki, Prometheus, Grafana, Mimir Languages and frameworks: Bash, Java, Groovy, Go, Python Big data technologies: Cassandra, ArangoDB, Hadoop, Kafka, MongoDB, Ceph Where you have knowledge gaps, training and mentoring will be provided. About You: You have an automation-first mindset. You enjoy more »
Senior Vice President Data & AI London based We are searching for a Senior Vice President of Data and Artificial Intelligence: someone with hands-on experience designing AI solutions to solve complex business problems. Your new role is a leadership position more »
Woking, Surrey, South East, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Implement security best practices and ensure compliance with cybersecurity standards. Collaborate with development teams to integrate security practices … Familiarity with SQL and database management systems, including relational and NoSQL databases. Knowledge of data streaming technologies (e.g., Kafka) and big data platforms (e.g., Hadoop). Understanding of cybersecurity principles and best practices. Familiarity with architectural styles and experience in implementing DevSecOps practices. Excellent problem-solving skills and attention more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure Programming language experience, e.g. Java, Python, Node.js or SQL Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure Programming language experience, e.g. Java, Python, Node.js or SQL Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
large-scale data science/data analytics projects Ability to lead effectively across organizations Hands-on experience with data analytics technologies such as AWS, Hadoop, Spark, Spark SQL, MLlib or Storm/Samza Implementing AWS services in a variety of distributed computing, enterprise environments Proficiency with at least one more »
modeling, data access, and data storage techniques. Excellent problem-solving skills and the ability to think algorithmically. Desirable Skills: Knowledge of big data technologies (Hadoop, Spark, Kafka) is highly desirable. Familiarity with data governance and compliance requirements. more »
as DBT, FiveTran, etc. Understanding of Agile Delivery best practice Good knowledge of the relevant technologies e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong problem-solving and analytical abilities. Ability to present solutions and limitations to non-IT business experts ABOUT YOU Integrity, respect, intellectually curious more »
Guildford, England, United Kingdom Hybrid / WFH Options
Hawksworth
data warehousing and ETL frameworks Proficiency in working with relational databases (e.g., Oracle, PostgreSQL), Parquet/Delta files and big data technologies (e.g., Synapse, Hadoop, Spark, Kafka) Knowledge of Microsoft Azure and associated data services is a nice-to-have. Strong analytical and data interpretation skills, with the ability more »
experience in data science, preferably within the energy or utilities sector. Technical Skills: Proficiency in Python, R, SQL, and big data technologies such as Hadoop, Spark, or Kafka. Experience with machine learning frameworks such as TensorFlow or PyTorch. Analytical Skills: Strong problem-solving skills with the ability to derive more »
and classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau more »
Analytics. Strong SQL and Python skills. Experience with data modeling, ETL processes, and data warehousing. Knowledge of big data technologies such as Spark and Hadoop is a plus. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Experience in the healthcare sector is a plus more »
Surrey, England, United Kingdom Hybrid / WFH Options
The JM Longbridge Group
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Qualifications : Bachelor’s degree in computer science, Engineering, or related field (or equivalent experience). Experience with cloud more »
scaling, and troubleshooting of cloud systems. - Operational experience running a 24x7 production infrastructure at scale. - Proficiency working with data structures, schemas, and technologies like Hadoop, Hive, Redis, and MySQL. - Experience in using cloud-native services like GKE, EKS, AWS/GCP load balancing, AWS/GCP cloud storage platforms more »
manage multiple tasks and projects simultaneously. Preferred Qualifications AWS Certified Solutions Architect or other relevant AWS certifications. Experience with big data technologies such as Hadoop, Spark, or similar. Knowledge of data governance and data quality best practices. Familiarity with machine learning and AI concepts and tools. more »
ideas • Ability to set the direction and deliver on a vision with forward planning to achieve results • Technical knowledge of big data platforms (e.g., Hadoop and Hive) as well as knowledge of ML, Data science and advanced modelling techniques, technologies, and programming languages • Possess a high degree of self more »
Experience of Data Lake/Hadoop platform implementation Hands-on experience in implementation and performance tuning of Hadoop/Spark implementations Experience with Apache Hadoop and the Hadoop ecosystem Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro) Experience with one … or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto) Experience developing software code in one or more programming languages (Java, Python, etc.) Preferred Qualifications Masters or PhD in Computer Science, Physics, Engineering or Math Hands-on experience leading large-scale global data warehousing and analytics projects Ability more »