Access HDFS from Matlab
2 views (last 30 days)
Hi
we have installed Hadoop on two Linux (Ubuntu) machines (2 DataNodes / 1 NameNode). Now we want to access the data from a third computer, running Windows, on which MATLAB R2014b is installed.
We have two questions:
1. How should we set the HADOOP_PREFIX environment variable on our Windows machine?
2. Do we need to install Hadoop on our Windows machine?
Thanks for your support.
2 Comments
Siddharth Sundar
2014-10-13
The error suggests that datastore hasn't been able to read the folder that contains your files. My suggestion for a first step is to check the permissions in HDFS. HDFS, the filesystem that is part of Hadoop, has POSIX-like permissions. The folder will be owned by the user 'hadoop', and it is possible that permissions are set such that other users cannot access it.
What username are you running MATLAB as? If it is not 'hadoop', then run the following in a Linux terminal window:
/home/hadoop/hadoop-1.2.1/bin/hadoop fs -ls /user/hadoop
If this fails, or if it returns something like:
drwx------ - hadoop supergroup ... /user/hadoop/airline
then you need to correct the permissions in your filesystem.
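If the listing does show restrictive permissions like the above, a sketch of the fix (run as the 'hadoop' user on a cluster machine; the folder path is taken from the listing above and may differ in your setup):

```shell
# Grant read/execute on the data folder to other users, recursively.
/home/hadoop/hadoop-1.2.1/bin/hadoop fs -chmod -R o+rx /user/hadoop/airline

# Confirm the change took effect.
/home/hadoop/hadoop-1.2.1/bin/hadoop fs -ls /user/hadoop
```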
Does this work for you?
Accepted Answer
Javensius Sembiring
2014-10-21
Hi Kalsi,
Thanks for your feedback. Ludwig and I are currently working on this project. The problem was that the fs.default.name entry in core-site.xml, which holds the NameNode address, still referred to a local IP address, whereas MATLAB needs an address that actually reaches the HDFS filesystem. After changing fs.default.name to the public IP address, MATLAB is now able to connect to the HDFS storage system.
The MATLAB-Hadoop setup we are developing consists of three computers connected to each other over a private network. All three run Ubuntu with Hadoop installed. One of them has two network cards: one for the local connection and one for the public connection.
The public network card is used by the client computer to access the Hadoop cluster. The problem is that when we change fs.default.name (the NameNode address) to the public IP address, Hadoop can no longer start the other two DataNodes, since they refer to the NameNode's local IP address. I know this is not a MATLAB-related problem, but do you know how to configure it correctly?
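One common way to make a single configuration work on both sides of a multi-homed NameNode is to put a hostname, rather than a raw IP address, into fs.default.name, and resolve that hostname differently on each machine (private IP on the cluster nodes, public IP on the client), e.g. via /etc/hosts. A sketch, with placeholder host name and port:

```xml
<!-- core-site.xml (same on all nodes): refer to the NameNode by hostname -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://namenode-host:9000</value>
</property>
```

Then map namenode-host to the NameNode's private IP in /etc/hosts on the cluster machines, and to its public IP on the client machine, so each side connects over the appropriate interface.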
Thanks in advance,
2 Comments
More Answers (2)
Aaditya Kalsi
2014-10-15
You do need to install Hadoop on your Windows machine and provide that installation path to MATLAB on the same machine through the HADOOP_PREFIX environment variable.
To set the environment variable on your Windows machine and open the data, try:
setenv('HADOOP_PREFIX', 'C:\path\to\hadoop_installation')
ds = datastore('hdfs://host/path/to/file.txt', ...)
3 Comments
Aaditya Kalsi
2014-10-16
Could you provide the configuration details? Are the host and port correct, and is the path known to exist?
It might also help to ensure that the server name and port are exactly the same as the fs.default.name in your Hadoop configuration file.
If you're not in the same network, you may have to fully qualify the hostname.
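A minimal sketch with a fully qualified URI (the host, port, and path here are hypothetical; the host:port must match fs.default.name in core-site.xml exactly):

```matlab
% Hypothetical values -- substitute your own install path, host, port, and file.
setenv('HADOOP_PREFIX', 'C:\path\to\hadoop_installation');
ds = datastore('hdfs://namenode.example.com:9000/user/hadoop/airline/1987.csv');
```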
Hope this helps.