within a typical retail trading environment is key. Experience required: A background in leveraging hands-on skills using tools such as Python, R, Spark, Hadoop and SQL, and cloud-based platforms such as GCP, Azure and AWS, to manipulate and analyse various data sets in large volumes Background in data more »
London, South East England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases Programming languages such as Spark or Python Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits: Base Salary: £45,000 - £75,000 (DoE) Discretionary Bonus DV Bonus Flex Fund: £5,000 Health: Private Medical Insurance Annual Leave: 25 Days plus more »
on experience with analytic tools like R & Python, and visualization tools like Tableau & Power BI Exposure to cloud platforms and big data systems such as Hadoop HDFS and Hive is a plus Ability to work with IT and Data Engineering teams to help embed analytic outputs in business processes Graduate more »
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases • Programming languages such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits: • Base Salary: £45,000 - £75,000 (DoE) • Discretionary Bonus: Circa 10% per annum • DV Bonus: Circa £5,000 • Flex Fund: £5,000 • Health: Private more »
experience Demonstrate in-depth knowledge of large-scale data platforms (Databricks, Snowflake) and cloud-native tools (Azure Synapse, Redshift) Experience of analytics technologies (Spark, Hadoop, Kafka) Have familiarity with Data Lakehouse architecture, SQL Server, DataOps, and data lineage concepts Demonstrate in-depth knowledge of large-scale data platforms (Databricks more »
London, England, United Kingdom Hybrid / WFH Options
Global Relay
JMeter or similar tools Web services technology such as REST, JSON or Thrift Testing web applications with Selenium WebDriver Big data technology such as Hadoop, MongoDB, Kafka or SQL Network principles and protocols such as HTTP, TLS and TCP Continuous integration systems such as Jenkins or Bamboo Continuous delivery more »
decision-making skills Exposure to working with REST APIs Any of the following skills would be an added bonus: Has run code across Hadoop/MapReduce clusters Has code running in a production environment Used SAS before (or at least can decipher SAS code) Worked with very large more »
As an IT Specialist, you'll need broad expertise across various areas of the technology/software domain. Proficiency in AWS or Big Data, Hadoop or other SQL databases, Lucene, Spark, web app development (JavaScript, Node.js), Docker, Jenkins, Git, Python, or Ruby would be highly beneficial. Key Responsibilities: Meet more »
share knowledge with the team. Qualifications You will have expertise within the following: Java and Python development knowledge (Essential) Previous experience with Spark or Hadoop (Essential) Trino or Airflow (Desirable) Architecture and capabilities. Designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration skills. more »
about the customer's value as the primary goal of everything we do Desirable • Experience with the following: Unix CLI, Python, GoLang, Java, Dagster, MLFlow, Hadoop, Snowflake • Experience building fraud/scam/money laundering models. • Knowledge of the payments ecosystem • Knowledge of blockchain/crypto ecosystem Corporate Security Responsibility more »
new platforms and into new customer bases. Currently exploring options including RAD Studio, Visual Studio, Delphi, C#, C++, Client/Server, n-tier, Hadoop and SaaS. They require a candidate with a strong computing background. You will be coding in Delphi and other languages. Any similar Object Oriented more »
Founding Data Engineer Role update - this role is now contract only. And really a mix of data science and engineering... Central London 3 days a week, hybrid NB – we are looking for someone with specific experience around fintech, finance, banking more »
Experience with RDS like MySQL or PostgreSQL or others, and experience with NoSQL like Redis or MongoDB or others; * Experience with Big Data technologies like the Hadoop ecosystem is a plus. * Excellent writing, proof-reading and editing skills. Able to create documentation that can express cloud architectures using text and more »
Version 1 has celebrated over 26 years in the Technology industry and continues to be trusted by global brands to deliver IT solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat more »
Experience of Big Data technologies/Big Data Analytics. C++, Java, Python, Shell Script, R, MATLAB, SAS Enterprise Miner Elasticsearch and understanding of the Hadoop ecosystem Experience working with large data sets, experience working with distributed computing tools like MapReduce, Hadoop, Hive, Pig etc. Advanced use more »
London, England, United Kingdom Hybrid / WFH Options
Version 1
BASED WITH VERY OCCASIONAL TRAVEL TO CLIENT SITES AND OFFICE. Would you like the opportunity to expand your skillset across Java, Python, Spark, Hadoop, Trino & Airflow across the Banking & Financial Services industries? How about if you worked with an Innovation Partner of the Year Winner (2023 Oracle EMEA … with the latest trends and best practices, and share knowledge with the team. Qualifications You will have expertise within the following: Java, Python, Spark, Hadoop (Essential) Trino, Airflow (Desirable) Architecture and capabilities. Designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration skills. more »
Leading ecommerce client are now searching for a Senior Data Engineer to contribute towards the delivery of their data strategy. This engineer will shape our client’s data function, delivering end to end solutions for an array of customer data more »