Cloud Architecture and Infrastructure Lab

The project adopts a distributed computing model built on commodity servers, with processing jobs executed in parallel using the Apache Hadoop MapReduce framework. Hadoop MapReduce is a programming model and software framework for writing applications that rapidly process vast amounts of data in parallel on large clusters of compute nodes.
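To illustrate the programming model, below is a minimal single-process sketch of the classic MapReduce word-count pattern in Python. This is not Hadoop itself; the function names (map_phase, shuffle, reduce_phase) and the sample documents are illustrative assumptions, and a real Hadoop job would distribute these phases across the cluster's compute nodes.

```python
from collections import defaultdict

def map_phase(documents):
    # Mapper: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the framework
    # does between the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reducer: sum the counts collected for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

# Hypothetical input documents standing in for files on the cluster.
docs = ["the quick brown fox", "the lazy dog", "the fox"]
word_counts = reduce_phase(shuffle(map_phase(docs)))
```

In a real deployment, Hadoop runs many mapper and reducer tasks concurrently on different nodes and handles the shuffle over the network; the sketch only shows the data flow between the phases.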

Hadoop Cloud

HBase Cluster and Cloud Infrastructure