According to Gartner, 2016 saw Hadoop move into early-mainstream status. The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage.
Malcolm Kavalsky has started a series of blog posts about building Hadoop on Oracle Solaris. He begins by building Hadoop's latest stable version, 2.7.3, on Solaris 11.3 on both x86 and SPARC. His x86 machine is a VirtualBox image with 5 GB of RAM; his SPARC machine is a Solaris 11.3 zone on a T5-2. The required tools and the build procedure are identical on both platforms. He covers which tools are needed, where to get them, and the steps for building Hadoop. His next post will cover installing and running Hadoop. Read the blog for details.
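For readers who want a sense of what such a build involves before reading the blog, the outline below is a minimal sketch of the generic upstream source-build procedure for Hadoop 2.7.3 (as described in the project's BUILDING.txt), not necessarily the exact steps Malcolm uses on Solaris. It assumes a JDK, Apache Maven, and protobuf 2.5.0 are already installed; paths and versions are illustrative.

```shell
#!/bin/sh
# Sketch: build Hadoop 2.7.3 from the official source release.
# Assumes: JDK 7+, Maven 3.x, and protoc 2.5.0 on PATH (Hadoop 2.x
# requires protobuf 2.5.0 exactly). Network access is needed for
# the download and for Maven to fetch dependencies.

HADOOP_VERSION=2.7.3

# Fetch and unpack the source tarball from the Apache archive.
wget "https://archive.apache.org/dist/hadoop/common/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}-src.tar.gz"
tar xzf "hadoop-${HADOOP_VERSION}-src.tar.gz"
cd "hadoop-${HADOOP_VERSION}-src"

# Build the binary distribution tarball, skipping tests.
# Dropping -Pnative avoids the platform-specific native-code
# compile, which is the part most likely to need Solaris tweaks.
mvn package -Pdist -DskipTests -Dtar

# The resulting distribution lands under:
#   hadoop-dist/target/hadoop-${HADOOP_VERSION}.tar.gz
```

Because the build is pure Java plus Maven (when the native profile is skipped), the same commands apply on x86 and SPARC alike, which matches the blog's observation that the procedure is identical on both platforms.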