There is a commercial open-source company, Cloudera, that supports and evangelizes Hadoop, and mainstream organizations and businesses ranging from the New York Times to IBM are using it...
Made popular on 05/26/2010
I need to modify a single-node Hadoop "cluster" (Cloudera pseudo-distributed) so that it can be accessed remotely. I have successfully installed Hadoop and updated the localhost identifiers in the configs to the IP address of the machine. I can run hadoop fs -ls / and all is good. I have created a passphraseless key and I can ssh to the Hadoop machine.
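The "localhost to IP" change described above typically means editing the NameNode URI in core-site.xml. A sketch, where 192.168.1.10 is a placeholder for the machine's real address and 8020 is the conventional NameNode port:

```xml
<!-- core-site.xml: point the filesystem URI at the machine's address
     rather than localhost, so remote clients can reach the NameNode.
     192.168.1.10 is a placeholder IP; substitute your own. -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://192.168.1.10:8020</value>
</property>
```

After restarting the daemons, a remote client pointed at that URI should be able to list the filesystem the same way hadoop fs -ls / does locally.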
Cloudera has raised $65 million to further fuel Hadoop adoption and expand its European operations. The new round, led by Accel Partners, brings the total raised to $140 million. Existing investors Greylock Partners, Ignition Partners, In-Q-Tel and Meritech Capital Partners all participated in the round.
Cloudera has established itself as one of the true leaders of the big data movement.
Ok, this is the first script I've ever attempted to write, so go easy on me. I'm just trying to simply copy a tar.gz file from the directory the script is located in to /usr/local/filename.tar.gz. The script is run as root, so it shouldn't be a permissions issue.
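A minimal sketch of such a script, assuming the archive is named filename.tar.gz (the real name will differ). The key detail is resolving the directory the script itself lives in, rather than the caller's working directory, so the copy works no matter where the script is invoked from:

```shell
#!/bin/sh
# Sketch: copy an archive that sits next to this script into /usr/local.
# "filename.tar.gz" is the placeholder name from the question.

# $0 is the path used to invoke the script; dirname strips the filename,
# and the cd/pwd pair turns a possibly-relative path into an absolute one.
SCRIPT_DIR=$(cd "$(dirname "$0")" && pwd)

cp "$SCRIPT_DIR/filename.tar.gz" /usr/local/filename.tar.gz
```

A common beginner pitfall this avoids: using a bare `cp filename.tar.gz /usr/local/`, which breaks as soon as the script is run from a different directory than the one it lives in.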
I've set up Hadoop to use Kerberos (following the Cloudera security guide), but it is unclear how I connect to Hadoop with regular users. Currently I have authenticated myself with Kerberos as my Kerberos admin user (via kinit kerbadmin/admin), but that doesn't seem to help. Do I need to tell Hadoop that the Kerberos user "kerbadmin" is allowed to use Hadoop?
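For context, the kinit-then-access flow being described looks roughly like this. The realm EXAMPLE.COM and the user principal alice are placeholders, not values from the question:

```shell
# Obtain a ticket for an ordinary user principal (placeholder names):
kinit alice@EXAMPLE.COM

# Verify which principal the ticket cache currently holds:
klist

# Hadoop client commands pick up the Kerberos ticket from the cache,
# so HDFS access is attempted as that principal:
hadoop fs -ls /user/alice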
Virtualization giant VMware has unveiled Spring Hadoop, which integrates its Spring Framework with the Apache Hadoop platform. Spring provides a comprehensive, lightweight framework that will make it easier for developers to build solutions around the Hadoop platform, according to the company. Spring Hadoop is available under the open-source Apache 2.0 license and can be downloaded for free.