Thursday, May 22, 2014

Three essential things to do while building a Hadoop environment

Last year I set up a Hadoop environment using Cloudera Manager. (Basically I followed this video tutorial: http://www.youtube.com/watch?v=CobVqNMiqww)
I used CDH4 (Cloudera's Hadoop distribution), which included HDFS, MapReduce, Hive, ZooKeeper, HBase, Flume, and other essential components. It also included YARN (MapReduce 2), but it was not stable at the time, so I used classic MapReduce instead.
I installed CDH4 on 10 CentOS nodes, configured Flume to collect Twitter data, and used cron (crontab) to schedule indexing of the Twitter data in Hive.
Anyway, I want to share some of the experiences and challenges I faced.
First, here are solutions to a few problems that almost everyone hits while using Hadoop.

1. vm.swappiness warning on hadoop nodes

It is easy to get rid of this warning by simply running this shell command on every node:
>sysctl -w vm.swappiness=0
More details are available on Cloudera's site.
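Note that `sysctl -w` only changes the value until the next reboot. A minimal sketch of making the setting permanent as well (assuming a standard CentOS layout with `/etc/sysctl.conf`; run as root):

```shell
# Apply the setting immediately (lost on reboot)
sysctl -w vm.swappiness=0

# Persist it across reboots
echo "vm.swappiness = 0" >> /etc/sysctl.conf

# Verify the value currently in effect
cat /proc/sys/vm/swappiness
```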

2. Make sure to synchronize time on all nodes (otherwise services will fail with clock offset errors)

In Linux there is a service (NTP) that syncs time over the internet. Do the following on all nodes.
Install ntp: 
>yum install ntp
Run these commands in order:
>chkconfig ntpd on
>ntpdate pool.ntp.org
>/etc/init.d/ntpd start
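After starting ntpd, it is worth checking that each node has actually synchronized. A quick sanity check using tools that ship with the ntp package (assuming the default pool servers are reachable from your nodes):

```shell
# List the peers ntpd is using; a '*' in the first column
# marks the server currently selected as the sync source
ntpq -p

# Print the local time so you can eyeball the offset across nodes
date
```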

3. Problem uploading files to HDFS (hadoop fs -put localfiles destinationOnHdfs)

The upload fails with this error message: "INFO hdfs.DFSClient: Exception in createBlockOutputStream"

This is usually caused by firewall settings. Here is the solution:
>service iptables save
>service iptables stop
>chkconfig iptables off
Do this on all nodes. Now you should be able to upload files to HDFS without any problems.
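Disabling iptables entirely is the quickest fix, but on a cluster exposed to other networks you may prefer to open only the Hadoop ports instead. A sketch using the CDH4-era default ports (these are assumptions; check your own configuration files for the actual port numbers):

```shell
# Allow HDFS traffic through the firewall instead of disabling it
iptables -I INPUT -p tcp --dport 8020  -j ACCEPT   # NameNode RPC
iptables -I INPUT -p tcp --dport 50010 -j ACCEPT   # DataNode data transfer
iptables -I INPUT -p tcp --dport 50020 -j ACCEPT   # DataNode IPC
iptables -I INPUT -p tcp --dport 50070 -j ACCEPT   # NameNode web UI
iptables -I INPUT -p tcp --dport 50075 -j ACCEPT   # DataNode web UI

# Persist the rules across reboots
service iptables save
```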
