and Application/Platform Lifecycle Management in the area of Production Data Warehouse & Reporting Solutions Continuously enhance our Production Reporting Platforms using various technologies (Oracle, Hadoop, Denodo, …) Define, design, build and enhance business intelligence solutions Partner with management, application owners, key business customers and team members for the delivery … Knowledge of SQL and PL/SQL in Oracle and working knowledge of MS SQL databases Experience with Big Data platforms/development (e.g. Hadoop, Spark, Impala, Hive) Experience in data warehousing projects (an advantage) Good analytical, troubleshooting and problem-solving skills The ability to work independently with minimal More ❯
LiveRamp is the data collaboration platform of choice for the world's most innovative companies. A groundbreaking leader in consumer privacy, data ethics, and foundational identity, LiveRamp is setting the new standard for building a connected customer view with unmatched More ❯
Senior Data Analytics Consultant - Public Sector and Defence Are you passionate about harnessing data to drive strategic decision-making? Join a leading technology consultancy delivering tailored solutions to high-profile clients in National Security, Defence, and the UK Civil Service. More ❯
what you enjoy doing, you feel good in the job. You have a bachelor's or master's degree in IT and are fully motivated to get to know Cloud, Mainframe, Hadoop, Kubernetes and Docker infrastructure better. You speak and write English fluently. Offer At KBC IT you are assured of a More ❯
Groovy, Python, and/or shell scripting • JavaScript development experience with Angular, React, ExtJS and/or Node.js • Experience with distributed computing technologies including Hadoop, HBase, Cassandra, Elasticsearch and Apache Spark a plus • Hands-on experience working with Elasticsearch, MongoDB, Node, Hadoop, MapReduce, Spark, Rabbit More ❯
Disk (SSD). • NFS/CIFS based server/storage appliance. • HPSE. • Data Domain and similar deduplication products. • Cloud-based storage solutions such as Hadoop and IBM BigInsights. • Trouble ticket management utilizing Remedy. Requirements IAT Level II Certification Required EQUAL OPPORTUNITY EMPLOYER VETERANS DISABLED More ❯
leading cloud big data platform for petabyte-scale data processing, interactive analytics, and machine learning using open-source frameworks such as Apache Spark, Trino, Hadoop, Hive, and HBase. Amazon Athena is a serverless query service that simplifies analyzing data directly in Amazon S3 using standard SQL. The ODA Fundamentals … designing or architecting (design patterns, reliability, and scaling) of new and existing systems Master's degree in computer science or equivalent Experience with Apache Hadoop ecosystem applications: Hadoop, Hive, Presto, Spark, and more Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you More ❯
City of London, England, United Kingdom Hybrid / WFH Options
Henderson Scott
Tech You'll Use: Languages & Tools: SQL, Python, Power BI/Tableau, XML, JavaScript Platforms & Frameworks: Azure Data Services, Microsoft Fabric (nice to have), Hadoop, Spark Reporting & Visualization: Power BI, Tableau, Business Objects Methodologies: Agile/Scrum, CI/CD pipelines What You'll Be Doing: Designing and building … SQL, Python, and BI platforms like Tableau or Power BI Strong background in data warehousing, data modelling, and statistical analysis Experience with distributed computing (Hadoop, Spark) and data profiling Skilled at explaining complex technical concepts to non-technical audiences Hands-on experience with Azure Data Services (or similar cloud More ❯
family). Experience installing, administering, and operating two or more of the following technologies: SUSE Rancher and Kubernetes clusters, Elasticsearch, Cloudera Private Cloud Platform, Hadoop components (Hadoop, YARN, HBase, Impala, etc.), VMware vSphere virtualization platforms. Knowledge or experience in high-availability solutions, load balancers, relational databases (PostgreSQL), monitoring More ❯
Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit Job Title: Sr. Hadoop with SQL, Hive Work Location: Tampa, FL Duration: Full time Job Description: Mandatory Certificate: Databricks Certified Developer - Apache Spark 3.0 Skills: Python, PySpark, Spark … SQL, Hadoop, Hive Responsibilities: Ensure effective Design, Development, Validation and Support activities in line with client needs and architectural requirements Ensure continual knowledge management Adherence to organizational guidelines and processes As part of the delivery team, your primary role would be to ensure effective Design, Development, Validation and … navigate their next in their digital transformation journey Requirements: A good professional with at least 6-10 years of experience in Big Data, PySpark, Hive, Hadoop and PL/SQL Good knowledge of AWS and Snowflake Good understanding of CI/CD and system design Candidate with prior experience working on technologies on fund transfer More ❯
supporting IC or DoD in the Cyber Security Domain Familiarity with the RMF process Experience with Relational Database Management Systems (RDBMS) Experience with Apache Hadoop and the Hadoop Distributed File System Experience with Amazon Elastic MapReduce (EMR) and SageMaker Experience with Machine Learning or Artificial Intelligence Travel Security More ❯
Experience employing spreadsheets for data manipulation and visualization. Developer (MCSD)/Microsoft Certified Solution Expert (MCSE)/Private Cloud/Certified Administrator for Apache Hadoop (CCAH) (Cloudera) Experience with programs and projects and other enterprise initiatives for efforts in the RDT&E phase of the acquisition life cycle Experience … Cloud Security (CCSK)/CompTIA A+/CompTIA Security+/EMC Data Science Associate (EMCDSA)/Cloudera Certified Data Scientist (CCDH)/Certified Apache Hadoop Developer (HCAHD) (Hortonworks)/Certified Information System Security Professional (CISSP)/Certified Cloud Professional (CCP) (Cloudera)/Microsoft Certified Professional Developer (MCPD)/Microsoft More ❯
Analytics Consultant, A2C Job ID: Amazon Web Services Korea LLC Are you a Data Analytics specialist? Do you have Data Warehousing and/or Hadoop experience? Do you like to solve the most complex and high scale data challenges in the world today? Would you like a career that … with AWS services - Hands on experience leading large-scale global data warehousing and analytics projects. - Experience using some of the following: Apache Spark/Hadoop, Flume, Kinesis, Kafka, Oozie, Hue, ZooKeeper, Ranger, Elasticsearch, Avro, Hive, Pig, Impala, Spark SQL, Presto, PostgreSQL, Amazon EMR, Amazon Redshift. Our inclusive culture More ❯
Python, Java, AWS Infrastructure, Linux, Kubernetes, Hadoop, CI/CD, Big Data Platform, Agile, JIRA, Confluence, GitHub, GitLab, Puppet, Ansible, Maven, virtualization, oVirt, Proxmox, VMware, Shell/Bash scripting Due to federal contract requirements, United States citizenship and an active TS/SCI security clearance and polygraph are required … accredited college or university. Prior experience or familiarity with DISA's Big Data Platform or other Big Data systems (e.g. Cloudera's Distribution of Hadoop, Hortonworks Data Platform, MapR, etc.) is a plus. Experience with CI/CD pipelines (e.g. GitLab CI, Travis CI, etc.). Understanding of agile More ❯
such as HBase, CloudBase/Accumulo, Bigtable, etc.; Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS); Shall have demonstrated work experience with Serialization such More ❯
upon documented requirements for the Data Transport System (DTS) • DTS products include but are not limited to: Cloud storage areas: Apache Accumulo (Apache ZooKeeper, Apache Hadoop), Oracle DBMS Real-time streaming: Storm Distributed in-memory data cache/storage: Redis, Graph compute engine/query interface: Apache TinkerPop/Gremlin. Rules … DTS portfolio encompasses transport streams, messages and files with content size ranging from bytes to terabytes • Candidates should have experience writing analytics using Apache Hadoop, HDFS, and MapReduce • Experience processing large data sets or high-volume data ingest is a plus • Experience monitoring, maintaining and troubleshooting Apache Accumulo, Apache Hadoop, and Apache ZooKeeper deployments is required • Knowledge of the Spring Framework and Dependency Injection • Linux proficiency is required; all development is done on Linux systems • Working knowledge of Git, Maven, Gradle • Use configuration management tools and repositories (e.g. Maven, Eclipse, Git, Redmine) • Ability to support multi-threaded applications More ❯
Solving: Proven ability to troubleshoot and solve complex problems. Nice to Haves: AWS certification or Security+ certification. Relevant IT discipline certifications (e.g., Java, .NET, Hadoop, Spring). Cloud Experience: Familiarity with cloud technologies such as Hadoop, HBase, or MongoDB. Independent and Collaborative Worker: Ability to function effectively both More ❯