Hadoop
Apache Hadoop is an open-source software framework for distributed storage and distributed processing of very large data sets on computer clusters built from commodity hardware. All the modules in Hadoop are designed with a fundamental assumption that hardware failures are common and should be automatically handled by the framework.
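The processing model Hadoop popularized is MapReduce: a map phase emits key-value pairs from input splits, and a reduce phase aggregates the values grouped by key. The framework-free Python sketch below illustrates the idea with the classic word-count example (the function names and data here are illustrative only, not part of Hadoop's API):

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every input split.
    # In Hadoop, each mapper would process one split of the data.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Reduce: sum the counts for each key. Hadoop shuffles and groups
    # pairs by key between the map and reduce phases; here a dict
    # stands in for that grouping.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the quick brown fox", "the lazy dog", "The fox"]
result = reduce_phase(map_phase(docs))
print(result)  # e.g. {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, ...}
```

In a real Hadoop job, the map and reduce functions run in parallel across the cluster, and the framework handles data distribution, shuffling, and re-execution of tasks on failed nodes.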
Version: 2.7.2
From: Apache Software Foundation

Deployment Settings
Compute node configuration: HPC nodes (1 head node* + compute nodes)
Configurable options: storage space, RDMA
Pricing: billed per hour, based on the number of CPU cores
* The head node configuration is 4 cores and 14 GB RAM. It manages the cluster and schedules services on it.
Total cost is calculated on an hourly basis according to actual usage; the optimal price is applied automatically.


Description:

Distributed Storage And Distributed Processing.