I have a script file where I'm validating the input file and storing the validated records on HDFS.
I wanted to load the data from HDFS into HBase using a Pig script. For that I created an HBase table and wrote a Pig script that loads the data from HDFS into HBase, which is working fine. Now I want those scripts to be written in a .sh file.
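One way to chain the two Pig scripts from a single .sh file is a small wrapper like the sketch below. The script names (`validate.pig`, `load_to_hbase.pig`) and the `input` parameter are assumptions for illustration; substitute your own file names.

```shell
#!/bin/sh
# Hypothetical wrapper: validate.pig and load_to_hbase.pig stand in for
# whatever your two Pig scripts are actually called.
set -u

INPUT="${1:-input.txt}"   # file to validate (default name is illustrative)

run_pipeline() {
    # Step 1: validate the raw input and store the good records on HDFS
    pig -param input="$INPUT" validate.pig || return 1
    # Step 2: load the validated records from HDFS into the HBase table
    pig load_to_hbase.pig
}

# Only run the pipeline when pig is actually on the PATH
if command -v pig >/dev/null 2>&1; then
    run_pipeline
else
    echo "pig not found on PATH"
fi
```

The `|| return 1` after the first step stops the pipeline if validation fails, so the HBase load only runs on validated data.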
I have had a 10-node HBase cluster up and running for the past 4 months. The cluster was set up on VMs in a corporate environment which I do not control, but everything has been working great...until today.
I want to access an HBase table from Hadoop MapReduce. I'm using Windows XP and Cygwin.
I'm using hadoop-0.20.2 and hbase-0.92.0.
The Hadoop cluster is working fine: I am able to run the MapReduce wordcount example successfully on 3 PCs.
HBase is also working: I can create a table from the shell.
I have tried many examples, but they are not working when I try to compile them using
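A common cause of compile failures here is a classpath that is missing the HBase jar and its dependencies. A sketch of building one, assuming the default jar names from the hadoop-0.20.2 and hbase-0.92.0 release tarballs and `HADOOP_HOME`/`HBASE_HOME` install locations (both are assumptions; adjust to your layout):

```shell
# Assumed install locations; the release tarballs for these versions ship
# hadoop-0.20.2-core.jar and hbase-0.92.0.jar at their top level.
HADOOP_HOME="${HADOOP_HOME:-/usr/local/hadoop-0.20.2}"
HBASE_HOME="${HBASE_HOME:-/usr/local/hbase-0.92.0}"

CLASSPATH="$HADOOP_HOME/hadoop-0.20.2-core.jar"
CLASSPATH="$CLASSPATH:$HBASE_HOME/hbase-0.92.0.jar"

# HBase's own dependencies (zookeeper, guava, ...) live under lib/
for jar in "$HBASE_HOME"/lib/*.jar; do
    CLASSPATH="$CLASSPATH:$jar"
done

echo "$CLASSPATH"
# Compile step (commented out here since it needs the jars present):
# javac -classpath "$CLASSPATH" MyHBaseMapReduce.java
```

Under Cygwin, `javac` is a Windows program, so the classpath may additionally need converting to Windows form (semicolon-separated, drive-letter paths) with `cygpath -wp "$CLASSPATH"`.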
It is possible to reference Windows environment variables in a .bat script by writing e.g. %JAVA_HOME%. It looks like this is not possible using Cygwin and shell scripts. Is it possible to reference Windows environment variables in shell scripts? This would be handy for configuring and running a Unix package on Windows.
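Cygwin actually imports the Windows environment when it starts, so a variable set as %JAVA_HOME% in Windows is already visible as $JAVA_HOME in a Cygwin shell script; the usual catch is that its value is a Windows path (C:\...), which Unix tools want converted via `cygpath`. A sketch, where the fallback path is purely illustrative:

```shell
#!/bin/sh
# Cygwin inherits the Windows environment, so %JAVA_HOME% shows up as $JAVA_HOME.
# The fallback value below is only an example for machines where it is unset.
JAVA_HOME="${JAVA_HOME:-C:\\Program Files\\Java\\jdk1.6.0}"

if command -v cygpath >/dev/null 2>&1; then
    # Convert the Windows path to Unix form, e.g. /cygdrive/c/Program Files/...
    JAVA_HOME_UNIX=$(cygpath -u "$JAVA_HOME")
else
    # Outside Cygwin there is nothing to convert
    JAVA_HOME_UNIX="$JAVA_HOME"
fi

echo "Java home: $JAVA_HOME_UNIX"
```

So the variable itself needs no special syntax in the shell script; only the path format may need the `cygpath -u` (to Unix) or `cygpath -w` (to Windows) conversion depending on which side consumes it.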