Oracle Technology Network (OTN) has published the "How to Set Up a Hadoop Cluster Using Oracle Solaris" hands-on lab from Oracle OpenWorld 2013.
This hands-on lab presents exercises that demonstrate how to set up an Apache Hadoop cluster using Oracle Solaris
11 technologies such as Oracle Solaris Zones, ZFS, and network virtualization. Key topics include the Hadoop Distributed File System
(HDFS) and the Hadoop MapReduce programming model.
We will also cover the Hadoop installation process and the cluster's building blocks:
the NameNode, the secondary NameNode, and the DataNodes. In addition, you will see how to combine Oracle Solaris 11 technologies for better
scalability and data security, and you will learn how to load data into the Hadoop cluster and run a MapReduce job.
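As a preview of the data-loading and job-running steps, the session can be sketched with the standard Hadoop 1.x command-line tools; the file names, HDFS paths, and the examples jar location are illustrative assumptions:

```shell
# Copy a local text file into HDFS (paths are illustrative).
hadoop fs -mkdir /user/hadoop/input
hadoop fs -put /tmp/sample.txt /user/hadoop/input

# Run the wordcount example that ships with Hadoop 1.x
# (the exact jar name varies by release).
hadoop jar $HADOOP_HOME/hadoop-examples-*.jar wordcount \
    /user/hadoop/input /user/hadoop/output

# Inspect the result.
hadoop fs -cat /user/hadoop/output/part-r-00000
```

These commands assume a running cluster; the exercises below build up to that point.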
Summary of Lab Exercises
This hands-on lab walks through the following Oracle Solaris and Apache Hadoop exercises:
1. Edit the Hadoop configuration files.
2. Configure the Network Time Protocol.
3. Create the virtual network interfaces (VNICs).
4. Create the NameNode and the secondary NameNode zones.
5. Set up the DataNode zones.
6. Configure the NameNode.
7. Set up SSH.
8. Format HDFS from the NameNode.
9. Start the Hadoop cluster.
10. Run a MapReduce job.
11. Secure data at rest using ZFS encryption.
12. Use Oracle Solaris DTrace for performance monitoring.
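For the configuration-file exercise, a minimal Hadoop 1.x-style sketch looks like the following; the NameNode hostname (name-node), the port, and the replication factor are assumptions, not values from the lab:

```shell
# Point all nodes at the NameNode (hostname "name-node" is an assumption).
cat > $HADOOP_HOME/conf/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://name-node:8020</value>
  </property>
</configuration>
EOF

# Replicate each HDFS block across the DataNode zones.
cat > $HADOOP_HOME/conf/hdfs-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
EOF
```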
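Time synchronization matters for HDFS, since the nodes coordinate via timestamps. On Oracle Solaris 11 the NTP service can be enabled from the shipped template; using the server template in the global zone is an assumption about the lab's topology:

```shell
# Use the shipped template and act as the cluster's time server.
cp /etc/inet/ntp.server /etc/inet/ntp.conf
svcadm enable ntp

# Verify the service state and the configured peers.
svcs ntp
ntpq -p
```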
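The VNIC exercise uses Solaris 11 network virtualization to give each zone its own virtual link over one physical NIC. A sketch with dladm, assuming the physical link is net0 and one DataNode VNIC per zone (the link names are illustrative and must end in a digit):

```shell
# One VNIC per zone, all on the same physical link (net0 is an assumption).
dladm create-vnic -l net0 name_node1
dladm create-vnic -l net0 sec_name_node1
dladm create-vnic -l net0 data_node1
dladm create-vnic -l net0 data_node2
dladm create-vnic -l net0 data_node3

# List the virtual links just created.
dladm show-vnic
```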
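Creating the NameNode zone can be sketched with zonecfg and zoneadm; the zone name, zonepath, and the binding to the name_node1 VNIC from the previous step are assumptions, and the default automatic anet resource is removed so the pre-created VNIC is used instead:

```shell
# Exclusive-IP zone bound to a pre-created VNIC (names are assumptions).
zonecfg -z name-node "create; set zonepath=/zones/name-node; set ip-type=exclusive; remove anet; add net; set physical=name_node1; end"
zoneadm -z name-node install
zoneadm -z name-node boot

# Repeat analogously for the secondary NameNode and DataNode zones.
```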
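Hadoop's control scripts start the daemons over SSH, so the hadoop user needs passwordless logins between the zones. A common sketch (user name, key type, and hostnames are assumptions):

```shell
# Generate a passphrase-less key and authorize it on every node.
ssh-keygen -t dsa -P "" -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Each node must be reachable without a password prompt, e.g.:
ssh name-node true
```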
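Formatting HDFS and starting the cluster use the standard Hadoop 1.x scripts; this is a sketch of that sequence, run from the NameNode zone as the hadoop user:

```shell
# One-time initialization of the HDFS metadata.
hadoop namenode -format

# Hadoop 1.x control scripts: HDFS first, then MapReduce.
start-dfs.sh
start-mapred.sh

# The Java daemons (NameNode, SecondaryNameNode, DataNode,
# JobTracker, TaskTracker) should now appear in jps output.
jps
```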
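Securing data at rest relies on Oracle Solaris 11 ZFS encryption, which is set per dataset at creation time. A sketch, where the pool and mountpoint names are assumptions:

```shell
# Create an encrypted dataset for the DataNode's HDFS blocks.
# With the default keysource, zfs prompts for a passphrase.
zfs create -o encryption=on -o mountpoint=/hdfs-data rpool/hdfs-data

# Confirm encryption is active on the dataset.
zfs get encryption rpool/hdfs-data
```

Because encryption is a create-time property, the dataset must exist before HDFS data is written to it.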
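Finally, DTrace can observe the cluster while a MapReduce job runs. Two illustrative one-liners (which probes to watch is a choice, not prescribed by the lab):

```shell
# Count system calls made by the Hadoop JVMs.
dtrace -n 'syscall:::entry /execname == "java"/ { @[probefunc] = count(); }'

# Sum bytes read per process at the file-system layer (fsinfo provider).
dtrace -n 'fsinfo:::read { @[execname] = sum(arg1); }'
```

Press Ctrl-C to stop tracing and print the aggregations.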