Greater London, England, United Kingdom Hybrid / WFH Options
Oliver Bernard
following skills/experience is essential: -Proven experience as an Architect and excellent knowledge of Big Data -Excellent experience across Azure -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark etc -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
decision making skills Exposure to working with REST APIs Any of the following skills would be an added bonus: Has run code across Hadoop/MapReduce clusters Has code running in a production environment Used SAS before (or at least can decipher SAS code) Worked with very large more »
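The "code across Hadoop/MapReduce clusters" requirement above refers to the MapReduce programming model. As a rough illustration only (plain Python standing in for a real Hadoop job, with a made-up word-count example):

```python
def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in every input line
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Reducer: sum the counts emitted for each distinct word
    counts = {}
    for word, n in pairs:
        counts[word] = counts.get(word, 0) + n
    return counts

if __name__ == "__main__":
    docs = ["big data big clusters", "data pipelines"]
    print(reduce_phase(map_phase(docs)))
```

On an actual cluster the framework distributes the map tasks and shuffles intermediate pairs to reducers; the sketch above only shows the map and reduce contracts a candidate would implement.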
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
G.Digital
experience applying ML models Python, SQL or R Any cloud data platforms would be great - AWS, Azure or GCP Any big data technologies like Hadoop would be great Strong communication and stakeholder engagement What’s on offer to you? 💰 £85-95k Bonus of up to 10% Flexible working more »
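The "applying ML models" requirement above can mean anything from deep learning to simple regression; as a minimal, self-contained sketch (a closed-form least-squares fit in plain Python, chosen here only for illustration):

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a*x + b, closed form
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

def predict(model, x):
    # Apply the fitted model to a new input
    a, b = model
    return a * x + b
```

In practice a role like this would use scikit-learn, Spark MLlib or similar rather than hand-rolled maths, but the fit/predict split shown here is the shape most ML APIs follow.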
As an IT Specialist, you'll need broad expertise across various areas of the technology/software domain. Proficiency in AWS or Big Data, Hadoop or other SQL databases, Lucene, Spark, web app development (JavaScript, Node.js), Docker, Jenkins, Git, Python, or Ruby would be highly beneficial. Key Responsibilities: Meet more »
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
BeTechnology Group
in Python development Deep knowledge of distributed and scalable systems, including proficiency with PostgreSQL, RabbitMQ, and Cassandra. Familiarity with big data technologies such as Hadoop, Spark, or Kafka. Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems. Excellent communication and collaboration skills, with more »
Cambridge, Cambridgeshire, East Anglia, United Kingdom Hybrid / WFH Options
Be Technology
in Python development Deep knowledge of distributed and scalable systems, including proficiency with PostgreSQL, RabbitMQ, and Cassandra. Familiarity with big data technologies such as Hadoop, Spark, or Kafka. Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems. Excellent communication and collaboration skills, with more »
Data Analytics stack (IS, AS, RS) Power BI, DAX MDS Azure Data Lakes Supporting: Azure ML .NET/HTML5 Azure infrastructure R, Python PowerShell Hadoop, Data Factory Principles: Data Modelling Data Warehouse Theory Data Architecture Master Data Management Data Science WHY ADATIS? There’s a long list of reasons more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure Programming language experience, e.g. Java, Python, Node.js or SQL Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure Programming language experience, e.g. Java, Python, Node.js or SQL Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
scalability, and extensibility. The following skills/experience is essential: -Proven experience as an Architect and excellent knowledge of Big Data -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark etc -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
Testing performance with JMeter or similar tools Web services technology such as REST, JSON or Thrift Testing web applications with Selenium WebDriver Big data technology such as Hadoop, MongoDB, Kafka or SQL Network principles and protocols such as HTTP, TLS and TCP Continuous integration systems such as Jenkins or Bamboo Continuous delivery concepts #LI-Hybrid What you more »
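The REST/JSON testing skills in the listing above boil down to asserting on structured payloads. A small stdlib-only illustration (the response body and field names here are entirely hypothetical, of the kind a JMeter or Selenium-driven suite might check):

```python
import json

# Hypothetical JSON body a REST endpoint under test might return
RESPONSE_BODY = '{"status": "ok", "items": [{"id": 1}, {"id": 2}]}'

def check_response(body):
    # Parse the payload and assert on its shape, as an API test would
    data = json.loads(body)
    assert data["status"] == "ok"
    return [item["id"] for item in data["items"]]

if __name__ == "__main__":
    print(check_response(RESPONSE_BODY))
```

A real suite would fetch the body over HTTP (e.g. with `urllib.request` or a test client) before running checks like this.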
rulemaking. What you'll need to succeed Extensive Business and Data Analysis experience Strong SQL and Excel skills Data Visualisation experience Experience with Python, Hadoop or Big Data What you'll get in return An exciting opportunity to join an international organisation working with a major financial services organisation. more »
City of London, London, United Kingdom Hybrid / WFH Options
ECS Resource Group
continuously enhancing our technical capabilities. Must-Have Skills: Proficiency in Python and Java, with experience in any flavour of Spark. In-depth knowledge of Hadoop ecosystem components. Expertise in CI/CD pipelines and a solid understanding of DevOps practices. Hands-on experience with containerization technologies. Strong experience in more »
standard methodology, methods and tooling used across the engineering team Requirements: 5-7 years of Java experience Capital markets front office experience Experience working with Data Lake (Hadoop) consumption, specifically Hive experience Kafka experience Rules engine experience (ideally open source/vendor products, e.g. Drools or Camunda) Unix scripting knowledge Markets Regulatory/Trade control more »
in your own skills development with support from the team and wider organisation GNU/Linux software customisation, configuration and integration Kubernetes Operators and Hadoop customisation, API integration and configuration with Ansible, Kustomize and Helm System performance analysis and improvement of multiple, high spec large clusters Required qualifications to more »
No Sponsorship provided Responsibilities & Skills: Strong Snowflake development experience Hands-on experience with data modelling/data migration and related tools Sound knowledge of Hadoop architecture Strong knowledge of SQL Best pointers for working with the client: - Opportunity to shape the future of a rapidly growing company. - more »
new platforms and into new customer bases. Currently exploring options including RAD Studio, Visual Studio, Delphi, C#, C++, Client/Server, n-tier, Hadoop and SaaS. They require a candidate with a strong computing background. You will be coding in Delphi and other languages. Any similar Object Oriented more »
Experience or ability to demonstrate mentoring experience – jQuery – Version control using git – Working knowledge of Linux – Experience with mobile applications would be beneficial – Currently Hadoop is used but not required. The environment is that of Facebook or Google: relaxed and open, with time to think and make the right decisions. The more »
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Leo Recruitment Limited
Are you passionate about exploring the intricate nuances hidden within vast datasets, using your expertise to uncover valuable insights that drive strategic decision-making and fuel business growth? Do you thrive in a high volume data-oriented environment, where every more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Implement security best practices and ensure compliance with cybersecurity standards. Collaborate with development teams to integrate security practices … Familiarity with SQL and database management systems, including relational and NoSQL databases. Knowledge of data streaming technologies (e.g., Kafka) and big data platforms (e.g., Hadoop). Understanding of cybersecurity principles and best practices. Familiarity with architectural styles and experience in implementing DevSecOps practices. Excellent problem-solving skills and attention more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Implement security best practices and ensure compliance with cybersecurity standards. Collaborate with development teams to integrate security practices … Familiarity with SQL and database management systems, including relational and NoSQL databases. Knowledge of data streaming technologies (e.g., Kafka) and big data platforms (e.g., Hadoop). Understanding of cybersecurity principles and best practices. Familiarity with architectural styles and experience in implementing DevSecOps practices. Excellent problem-solving skills and attention more »
your time for being hands-on to stay current with tools and technology. Traditionally our work has been implemented on-premise on EDH (LBG’s Hadoop platform) and we are also in the process of transitioning our teams onto Google Cloud Platform (GCP), which makes it a very exciting time … excellence and a passion for automation. Expertise in building, deploying, and maintaining large-scale ETL/ELT data solutions (ideally on GCP and/or Hadoop) Good understanding of metadata management and data quality concerns, as well as the ability to validate and challenge the team’s engineering designs, code and more »
Experience of Big Data technologies/Big Data analytics. C++, Java, Python, shell script, R, MATLAB, SAS Enterprise Miner Elasticsearch and understanding of the Hadoop ecosystem Experience working with large data sets; experience working with distributed computing tools like MapReduce, Hadoop, Hive, Pig etc. Advanced use more »