instrumental in ensuring the security and efficiency of the data handling and reporting processes. Key Responsibilities: Data Processing: Utilize Apache Spark, AWS RDS, and Hadoop to process large datasets efficiently and securely. Reporting: Generate comprehensive and insightful reports using Tableau. Business Rules Management: Implement and manage business rules with …
Data Lake/Hadoop platform implementation Good level of hands-on experience implementing and performance-tuning Hadoop/Spark deployments Experience with Apache Hadoop and the Hadoop ecosystem Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro) Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto) Experience developing software in one or more programming languages (Java, Python, etc.) Preferred Qualifications: Master's or PhD in Computer Science, Physics, Engineering or Maths Hands-on experience leading large-scale global data warehousing and analytics projects Ability …
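As a brief, hedged illustration of the SQL-on-Hadoop tools the listing names: the appeal of Hive, Impala, Spark SQL and Presto is that a standard aggregate query like the one below runs largely unchanged on any of them over data stored in HDFS. The table and column names here are invented for the example, not taken from any employer's schema:

```sql
-- Hypothetical event table stored in HDFS (e.g. as Parquet or Avro);
-- the same query works in Hive, Impala, Spark SQL, or Presto.
SELECT
    user_id,
    COUNT(*)        AS page_views,
    MAX(event_time) AS last_seen
FROM web_events
WHERE event_date = '2024-01-01'
GROUP BY user_id
ORDER BY page_views DESC
LIMIT 10;
```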
computing platforms - preferably in GCP - and experience with container orchestration technologies such as Kubernetes. Strong background in distributed computing and familiarity with technologies like Hadoop, Spark, Kafka, and distributed cache systems (Hazelcast, Redis). Experience with database management and proficiency in SQL and NoSQL databases. Knowledge of monitoring and …
consulting environment • Current or previous consulting experience highly desirable • Experience of working with companies in the finance sector highly desirable • Platform implementation experience (Apache Hadoop, Kafka, Storm and Spark, Elasticsearch and others) • Experience around data integration & migration, data governance, data mining, data visualisation, database modelling in an agile delivery …
EC1N, Farringdon Without, Greater London, United Kingdom
Damia Group Ltd
your key responsibilities will be to: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. Skills/Experience Required of the SC Cleared DevSecOps Engineer: Strong operational procedures knowledge. …
Employment Type: Permanent
Salary: £50000 - £65000/annum 15% cash flex and 10% bonus
following: .NET (VB, C#, ASP.NET, .NET CORE) MVC Framework Python JavaScript (REACT, Bootstrap Frameworks) Database design SQL/SQL Server NoSQL technologies e.g., MongoDB, Hadoop, etc. If you’re the right person for the role, you’ll bring experience of working on a range of applications across the development …
PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. - Work with big data technologies such as Hadoop to manage and analyse large datasets. Qualifications: - Bachelor's Degree in Computer Science or Engineering. - Experience with cloud technologies, particularly Azure and AWS. - Proficiency …
Employment Type: Permanent
Salary: £52000 - £62000/annum Bonus + Full Benefits
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Bright Purple
have: A passion for manipulation and visualization of data. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop/Splunk. Experience with network security products and solutions. Experience working with Python, HTML, CSS and JavaScript. If you are a driven and …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bright Purple
have: A passion for manipulation and visualization of data. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop/Splunk. Experience with network security products and solutions. Experience working with Python, HTML, CSS and JavaScript. If you are a driven and …
Glue, AWS Redshift, and Python Experience with ETL processes, data integration, and data warehousing. Strong SQL skills Experience with Big Data technologies such as Hadoop, Spark, and Kafka Familiarity with cloud platforms (AWS, Azure, Google Cloud) Working knowledge of data visualisation tools (PowerBI, Tableau, Qlik Sense) Additional Skills: Client …
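The ETL experience this listing asks for can be sketched in miniature. The following is a plain-Python illustration of the extract-transform-load pattern only, not a representation of any actual AWS Glue or Redshift pipeline; the field names, currencies and threshold are invented for the example:

```python
import csv
import io

# Extract: raw CSV rows (an in-memory string standing in for an
# S3 object or a database export).
RAW = """order_id,amount,currency
1001,250.00,GBP
1002,99.50,GBP
1003,1200.00,USD
"""

def extract(source: str) -> list[dict]:
    """Parse CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast types and keep only GBP orders over 100.00."""
    out = []
    for row in rows:
        if row["currency"] == "GBP" and float(row["amount"]) > 100.00:
            out.append({"order_id": int(row["order_id"]),
                        "amount": float(row["amount"])})
    return out

def load(rows: list[dict]) -> dict[int, float]:
    """'Load' into a keyed store (a dict standing in for a warehouse table)."""
    return {r["order_id"]: r["amount"] for r in rows}

warehouse = load(transform(extract(RAW)))
print(warehouse)  # → {1001: 250.0}
```

In a production pipeline the same three stages would typically be distributed (e.g. Spark jobs reading from S3 and writing to Redshift), but the separation of extract, transform and load shown here is the shape interviewers usually probe.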
and analytical abilities. Preferred Skills: Experience with cloud databases (e.g., AWS, Azure, Google Cloud Platform). Knowledge of big data tools and frameworks (e.g., Hadoop, Spark). Certification in database management (e.g., Microsoft Certified: Azure Data Engineer Associate). What We Offer: Competitive salary and comprehensive benefits package. Opportunity …