Data Analytics stack (IS, AS, RS): Power BI, DAX, MDS, Azure Data Lakes. Supporting: Azure ML, .NET/HTML5, Azure infrastructure, R, Python, PowerShell, Hadoop, Data Factory. Principles: Data Modelling, Data Warehouse Theory, Data Architecture, Master Data Management, Data Science. WHY ADATIS? There's a long list of reasons more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure. Programming language experience, e.g. Java, Python, Node.js or SQL. Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure. Programming language experience, e.g. Java, Python, Node.js or SQL. Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Manchester, England, United Kingdom Hybrid / WFH Options
Lorien
DynamoDB/etc.). Solid understanding of data governance principles and how to implement these across the business. Knowledge of Big Data technology (Spark/Hadoop/etc.). Excellent communication skills across various levels of stakeholders. Benefits: Salary available £120,000, bonus scheme, enhanced pension contribution, genuine opportunity to more »
Experience of Data Lake/Hadoop platform implementation. Hands-on experience in implementation and performance tuning of Hadoop/Spark implementations. Experience with Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro). Experience with one … or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto). Experience developing software code in one or more programming languages (Java, Python, etc.). Preferred Qualifications: Master's or PhD in Computer Science, Physics, Engineering or Maths. Hands-on experience leading large-scale global data warehousing and analytics projects. Ability more »
Months Location: Milton Keynes Key responsibilities: Demonstrable experience as a Kafka Developer (ideally Kafka Streams). Hands-on experience in Big Data technologies (Hadoop, Hue, Hive, Impala, Spark), with particular strength in Kafka. Knowledge and experience using key-value databases. Experience developing microservices using Spring. Design and more »
new platforms and into new customer bases. Currently exploring options including RAD Studio, Visual Studio, Delphi, C#, C++, Client/Server, n-tier, Hadoop and SaaS. They require a candidate with a strong computing background. You will be coding in Delphi and other languages. Any similar Object Oriented more »
Founding Data Engineer Role update - this role is now contract only. And really a mix of data science and engineering... Central London 3 days a week, hybrid NB – we are looking for someone with specific experience around fintech, finance, banking more »
Experience with relational databases such as MySQL or PostgreSQL, and with NoSQL stores such as Redis or MongoDB; * Experience with Big Data technologies like the Hadoop ecosystem is a plus. * Excellent writing, proofreading and editing skills. Able to create documentation that can express cloud architectures using text and more »
Key Skills: 3+ years of Python experience. Highly statistical and analytical. Exposure to Google Cloud Platform (BigQuery, GCS, Datalab, Dataproc, Cloud ML) desirable. Spark and Hadoop experience. Strong communication skills. Good problem-solving skills. Qualifications: Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering … and classification techniques, and algorithms. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau). This is a permanent position, and offers flexibility with hybrid working, 2-3 days per week in the office, depending on workload more »
London, England, United Kingdom Hybrid / WFH Options
Global Relay
a Global Relay DevOps engineer you will be integrated with a software engineering team to develop on-premises ('on-prem') solutions, including working with Hadoop-based technologies. Your role will involve designing, implementing and supporting automated, scalable solutions. Your contribution will have an immediate impact, enabling efficient delivery … them from reoccurring. Deployments: Writing and running deployment automation tools using Helm, Ansible, or other configuration management systems. Platform Integration: With technologies such as Hadoop and Kubernetes. Some of the technologies that you will interact with include: Containerisation and virtualisation: Docker, Kubernetes, VMware. Operating Systems: Linux. Build and deployment … Jenkins, Bitbucket, Maven, Helm. Instrumentation and monitoring: Loki, Prometheus, Grafana, Mimir. Languages and frameworks: Bash, Java, Groovy, Go, Python. Big data technologies: Cassandra, ArangoDB, Hadoop, Kafka, MongoDB, Ceph. Where you have knowledge gaps, training and mentoring will be provided. About You: You have an automation-first mindset. You enjoy more »
Version 1 has celebrated over 26 years in the Technology industry and continues to be trusted by global brands to deliver IT solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat more »
Milton Keynes, Buckinghamshire, South East, United Kingdom
Maclean Moore Ltd
Months Location: Milton Keynes Key responsibilities: Demonstrable experience as a Kafka Developer (ideally Kafka Streams). Hands-on experience in Big Data technologies (Hadoop, Hue, Hive, Impala, Spark), with particular strength in Kafka. Knowledge and experience using key-value databases. Experience developing microservices using Spring. Design and … data services for different applications, usually 24/7 applications with high performance requirements. Key skills/knowledge/experience: Big Data Hadoop - Hive and Spark/Scala solid experience. Advanced SQL knowledge - able to test changes and issues properly, replicating the code functionality into SQL more »
Day rate: £450 Job Purpose and Primary Objectives We are seeking a highly experienced Kafka Developer with expertise in Kafka Streams and Big Data technologies (Hadoop, Hue, Hive, Impala, Spark). The ideal candidate will have strong knowledge of key-value databases and experience in developing microservices using Spring. The … time data services for various 24/7 applications with high-performance requirements. Key Skills/Knowledge: Solid experience in Big Data technologies, specifically Hadoop, Hive, Java and Spark/Scala. Advanced SQL knowledge for testing changes and replicating code functionality. Proficient with code repositories like Git and more »
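Several of the Kafka listings above centre on Kafka Streams-style stateful stream processing. As a rough single-process illustration of that pattern (plain Python, not the actual Kafka Streams API; the sample records are hypothetical), a grouped running count over a stream of records looks like this:

```python
from collections import defaultdict

def streaming_word_count(records):
    """Consume an iterable of (key, text) records and maintain a running
    per-word count, yielding each updated (word, count) pair -- the same
    shape of computation a Kafka Streams groupBy/count topology performs."""
    counts = defaultdict(int)
    for _key, text in records:
        for word in text.lower().split():
            counts[word] += 1
            yield word, counts[word]

# Hypothetical sample stream of (key, value) records
stream = [("k1", "kafka streams"), ("k2", "kafka")]
updates = list(streaming_word_count(stream))
print(updates)  # [('kafka', 1), ('streams', 1), ('kafka', 2)]
```

The key point the sketch captures is that state (the counts) lives alongside the stream and every incoming record emits an updated aggregate downstream, rather than recomputing over the full dataset.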
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Implement security best practices and ensure compliance with cybersecurity standards. Collaborate with development teams to integrate security practices … Familiarity with SQL and database management systems, including relational and NoSQL databases. Knowledge of data streaming technologies (e.g., Kafka) and big data platforms (e.g., Hadoop). Understanding of cybersecurity principles and best practices. Familiarity with architectural styles and experience in implementing DevSecOps practices. Excellent problem-solving skills and attention more »
Horsell, England, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Implement security best practices and ensure compliance with cybersecurity standards. Collaborate with development teams to integrate security practices … Familiarity with SQL and database management systems, including relational and NoSQL databases. Knowledge of data streaming technologies (e.g., Kafka) and big data platforms (e.g., Hadoop). Understanding of cybersecurity principles and best practices. Familiarity with architectural styles and experience in implementing DevSecOps practices. Excellent problem-solving skills and attention more »
Woking, Surrey, South East, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Implement security best practices and ensure compliance with cybersecurity standards. Collaborate with development teams to integrate security practices … Familiarity with SQL and database management systems, including relational and NoSQL databases. Knowledge of data streaming technologies (e.g., Kafka) and big data platforms (e.g., Hadoop). Understanding of cybersecurity principles and best practices. Familiarity with architectural styles and experience in implementing DevSecOps practices. Excellent problem-solving skills and attention more »
integrity can be maintained as part of business improvement plans which affect the organisation Review, manage and lead the development of data frameworks (e.g. Hadoop) and analysis of data to ensure accuracy of sources and data resilience Communications and Engagement Identify and understand the business needs, prioritise, and design … recognising entities in free text Experience creating and developing SQL server queries and/or stored procedures Experience using and developing data frameworks (e.g. Hadoop) Experience developing and scripting dashboards and data visualisations using tools such as QlikView, QlikSense and Tableau Experience interpreting and analysing complex data sets from more »
Developer Duration: 6 Months Location: Milton Keynes Key responsibilities: Demonstrable experience as a Kafka Developer (ideally Kafka Streams). Hands-on experience in Big Data technologies (Hadoop, Hue, Hive, Impala, Spark), with particular strength in Kafka. Knowledge and experience using key-value databases. Experience developing microservices using Spring. Design and develop … of data services for different applications, usually 24/7 applications with high performance requirements. Key skills/knowledge/experience: Big Data Hadoop - Hive and Spark/Scala solid experience. Advanced SQL knowledge - able to test changes and issues properly, replicating the code functionality into SQL with more »
Experience of Big Data technologies/Big Data Analytics. C++, Java, Python, Shell Script, R, MATLAB, SAS Enterprise Miner. Elasticsearch and understanding of the Hadoop ecosystem. Experience working with large data sets; experience working with distributed computing tools like Map/Reduce, Hadoop, Hive, Pig etc. Advanced use more »
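The Map/Reduce model that recurs throughout these listings can be shown in miniature with plain Python: a map phase emits (key, 1) pairs, a shuffle groups them by key, and a reduce phase sums each group. This is a single-process sketch of the idea, not a distributed Hadoop job; the input lines are made up for illustration:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    return [(word, 1) for line in lines for word in line.split()]

def reduce_phase(pairs):
    # Shuffle: sort by key so identical words become adjacent, then
    # Reduce: sum the counts within each key's group.
    pairs = sorted(pairs, key=itemgetter(0))
    return {word: sum(count for _, count in group)
            for word, group in groupby(pairs, key=itemgetter(0))}

counts = reduce_phase(map_phase(["big data big", "data"]))
print(counts)  # {'big': 2, 'data': 2}
```

In Hadoop the same three stages run across many machines, with the framework handling the shuffle between mappers and reducers; the per-phase logic is what the developer writes.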