on experience with analytic tools like R & Python, and visualization tools like Tableau & Power BI. Exposure to cloud platforms and big data systems such as Hadoop, HDFS, and Hive is a plus. Ability to work with IT and Data Engineering teams to help embed analytic outputs in business processes. Graduate
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases
• Programming languages such as Spark or Python
• Amazon Web Services, Microsoft Azure or Google Cloud, and distributed processing technologies such as Hadoop
Benefits:
• Base Salary: £45,000 - £75,000 (DoE)
• Discretionary Bonus: circa 10% per annum
• DV Bonus: circa £5,000
• Flex Fund: £5,000
• Health: Private
City of London, London, United Kingdom Hybrid / WFH Options
BeTechnology Group
scalable systems. Deep knowledge of distributed and scalable systems, including proficiency with PostgreSQL, Ray, RabbitMQ, and Cassandra. Familiarity with big data technologies such as Hadoop, Spark, or Kafka. Experience with CI/CD. Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems. Excellent
scaling, and troubleshooting of cloud systems.
- Operational experience running a 24x7 production infrastructure at scale.
- Proficiency working with data structures, schemas, and technologies like Hadoop, Hive, Redis, and MySQL.
- Experience using cloud-native services like GKE, EKS, AWS/GCP load balancing, and AWS/GCP cloud storage platforms
The following skills/experience is essential:
- Proven experience as a Lead Big Data Engineer with excellent knowledge of Big Data
- Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark, etc.
- Excellent experience of ETL, data warehousing and handling a variety of data types
- Very
Greater London, England, United Kingdom Hybrid / WFH Options
Oliver Bernard
following skills/experience is essential:
- Proven experience as an Architect and excellent knowledge of Big Data
- Excellent experience across Azure
- Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark, etc.
- Excellent experience of ETL, data warehousing and handling a variety of data types
- Very
decision-making skills
Exposure to working with REST APIs
Any of the following skills would be an added bonus:
- Has run code across Hadoop/MapReduce clusters
- Has code running in a production environment
- Used SAS before (or at least can decipher SAS code)
- Worked with very large
As an IT Specialist, you'll need broad expertise across various areas of the technology/software domain. Proficiency in AWS or Big Data, Hadoop or other SQL databases, Lucene, Spark, web app development (JavaScript, Node.js), Docker, Jenkins, Git, Python, or Ruby would be highly beneficial. Key Responsibilities: Meet
scalability, and extensibility. The following skills/experience is essential:
- Proven experience as an Architect and excellent knowledge of Big Data
- Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark, etc.
- Excellent experience of ETL, data warehousing and handling a variety of data types
- Very
Testing performance with JMeter or similar tools
Web services technology such as REST, JSON or Thrift
Testing web applications with Selenium WebDriver
Big data technology such as Hadoop, MongoDB, Kafka or SQL
Network principles and protocols such as HTTP, TLS and TCP
Continuous integration systems such as Jenkins or Bamboo
Continuous delivery concepts
What you
standard methodologies and tooling used across the engineering team
Requirements:
- 5-7 years of Java experience
- Capital markets front office experience
- Experience working with Data Lake (Hadoop) consumption, specifically Hive experience
- Kafka experience
- Rules engine experience (ideally open source/vendor products, e.g. Drools or Camunda)
- Unix scripting knowledge
- Markets Regulatory/Trade control
No sponsorship provided.
Responsibilities & Skills:
- Strong Snowflake development experience
- Hands-on experience with data modelling/data migration and related tools
- Sound knowledge of Hadoop architecture
- Strong knowledge of SQL
Best pointers for working with the client:
- Opportunity to shape the future of a rapidly growing company.
new platforms and into new customer bases. Currently exploring options including RAD Studio, Visual Studio, Delphi, C#, C++, Client/Server, n-tier, Hadoop and SaaS. They require a candidate with a strong computing background. You will be coding in Delphi and other languages. Any similar Object Oriented
Experience or ability to demonstrate mentoring experience
– jQuery
– Version control using git
– Working knowledge of Linux
– Experience with mobile applications would be beneficial
– Currently Hadoop is used but not required
The environment is that of Facebook or Google: relaxed and open, with time to think and make the right decisions. The
your time for being hands-on to stay current with tools and technology. Traditionally our work has been implemented on-premise on EDH (LBG’s Hadoop platform) and we are also in the process of transitioning our teams onto Google Cloud Platform (GCP), which makes it a very exciting time … excellence and a passion for automation.
Expertise in building, deploying, and maintaining large-scale ETL/ELT data solutions (ideally on GCP and/or Hadoop)
Good understanding of metadata management and data quality concerns, as well as the ability to validate and challenge the team’s engineering designs, code and
Experience of Big Data technologies/Big Data Analytics.
C++, Java, Python, Shell Script
R, Matlab, SAS Enterprise Miner
Elasticsearch and understanding of the Hadoop ecosystem
Experience working with large data sets; experience working with distributed computing tools like MapReduce, Hadoop, Hive, Pig, etc.
Advanced use
Principal Data Engineer (SVP) - AI We’re searching for a Principal Data Engineer/Scientist with experience in generative AI to join a business that provides crucial services, often focusing on saving programs and businesses from the brink of collapse.
responsibilities will include building microservices, using Docker and Kubernetes, and 3rd-party API integrations. You will also be working with big data technologies like Hadoop, Kafka and Cassandra.
This Senior Java Engineer (Multithreading) role will be a strong fit if you:
- Have extensive Core Java, multithreading, low-latency, concurrency experience
- Enjoy … working with big data technologies like Hadoop (or if you are keen to learn!)
- Have messaging experience with Kafka, RabbitMQ or MongoDB
As the Senior Java Engineer, you will be a strong advocate for modern development ways of working, including pair programming, Agile and BDD/TDD. This company (during
Data and Artificial Intelligence, Senior Vice President We are searching for a Senior Vice President of Data and Artificial Intelligence - someone with hands-on experience designing AI solutions to solve complex business problems. Your new role is a leadership position
Job Description Senior Data Scientist PhD - Canary Wharf We urgently require a Senior Data Scientist with a proven track record of 5 to 10 years as a Data Scientist. Must have a PhD, ideally in Maths or Physics, and an excellent