Wednesday Sep 17, 2014

Hadoop Hands-on Lab Oracle OpenWorld 2014

In a few days, the largest IT event in the world will start: Oracle OpenWorld 2014. As in the past, and growing each year, we will host over 2,000 sessions and many hands-on labs.

These labs are a unique opportunity to familiarize yourself with the Oracle products that address the entire IT portfolio.

Our specific interest is in big data, Oracle Solaris, and virtualization.  This year Jeff Taylor and I will present the following lab:  Set Up a Hadoop 2 Cluster with Oracle Solaris Zones, Oracle Solaris ZFS, and Unified Archive [HOL2086]
This hands-on lab addresses all the requirements and demonstrates how to set up an Apache Hadoop 2 (YARN) cluster using Oracle Solaris 11 technologies such as Oracle Solaris Zones, Oracle Solaris ZFS, and Unified Archive.  Key topics include the Hadoop Distributed File System (HDFS) and the Hadoop MapReduce programming model.
It also covers the Hadoop installation process and the cluster building blocks, namely: NameNode, Resource Manager, History Server, and DataNodes.

In addition, you will learn how to combine Oracle Solaris 11 technologies for better scalability and data security, how to enable an HDFS high-availability cluster, and how to run a MapReduce job.
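Before the lab, the MapReduce programming model itself can be previewed without any cluster at all: a plain shell pipeline mimics the map, shuffle, and reduce stages. This is only an analogy for the model, not a Hadoop command.

```shell
# A miniature "word count" in the MapReduce style:
printf 'big data big cluster\n' |
  tr ' ' '\n' |   # "map": emit one (word) record per line
  sort |          # "shuffle": bring identical keys together
  uniq -c         # "reduce": count each group of keys
```

On a real YARN cluster the same logic runs as a MapReduce job, with the map and reduce stages distributed across the DataNodes.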

Please register by using the link below:

Set Up a Hadoop 2 Cluster with Oracle Solaris Zones, Oracle Solaris ZFS, and Unified Archive [HOL2086]

See you at OpenWorld!

Orgad

Thursday Aug 22, 2013

Hadoop Cluster with Oracle Solaris Hands-on Lab at Oracle OpenWorld 2013

If you want to learn how to build a Hadoop cluster using Oracle Solaris 11 technologies, please join me at the following Oracle OpenWorld 2013 lab.
How to Set Up a Hadoop Cluster with Oracle Solaris [HOL10182]



In this hands-on lab we will present and demonstrate, through exercises, how to set up a Hadoop cluster using Oracle Solaris 11 technologies such as Zones, ZFS, DTrace, and network virtualization.
Key topics include the Hadoop Distributed File System and MapReduce.
We will also cover the Hadoop installation process and the cluster building blocks: a NameNode, a secondary NameNode, and DataNodes.
In addition, we will show how to combine Oracle Solaris 11 technologies for better scalability and data security.
During the lab, users will learn how to load data into the Hadoop cluster and run a MapReduce job.
This hands-on training lab is for system administrators and others responsible for managing Apache Hadoop clusters in production or development environments.
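As a taste of the configuration-file step, here is a minimal core-site.xml sketch that points every cluster member at the NameNode. The property name is from the Hadoop 1.x era that this lab targets; the hostname and port are illustrative placeholders, not values from the lab.

```xml
<!-- core-site.xml: tell every node where HDFS lives.
     "name-node" and port 8020 are illustrative placeholders. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://name-node:8020</value>
  </property>
</configuration>
```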

This Lab will cover the following topics:

    1. Install Hadoop

    2. Edit the Hadoop configuration files

    3. Configure the Network Time Protocol

    4. Create the virtual network interfaces

    5. Create the NameNode and secondary NameNode zones

    6. Configure the NameNode

    7. Set up SSH between the Hadoop cluster members

    8. Format the HDFS file system

    9. Start the Hadoop cluster

   10. Run a MapReduce job

   11. Secure data at rest using ZFS encryption

   12. Monitor performance using Oracle Solaris DTrace
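Step 7 relies on passwordless SSH so that the NameNode can start and stop daemons on the other zones. A minimal sketch of the key-generation half is below; the key path is an illustrative placeholder, and distributing the public key is shown only as a comment because the hostnames are site-specific.

```shell
# Generate an RSA key pair with an empty passphrase (-N "") so the
# Hadoop user can log in to the other cluster zones non-interactively.
# The key path below is an illustrative placeholder.
rm -f /tmp/hadoop_lab_key /tmp/hadoop_lab_key.pub
ssh-keygen -t rsa -N "" -f /tmp/hadoop_lab_key

# Then append the public key to authorized_keys on every cluster member,
# for example (hostname is a placeholder):
#   cat /tmp/hadoop_lab_key.pub | ssh hadoop@data-node1 'cat >> ~/.ssh/authorized_keys'
```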

Register Now


About

This blog covers cloud computing, big data and virtualization technologies
