Getting stuck installing Spark [4]

I tried installing the latest pre-built release.

$ wget http://people.apache.org/~pwendell/spark-1.0.0-rc3/spark-1.0.0-bin-hadoop2.tgz
$ tar xf spark-1.0.0-bin-hadoop2.tgz
$ chown -R spark:spark spark-1.0.0-bin-hadoop2
$ ln -sv spark-1.0.0-bin-hadoop2 spark
`spark' -> `spark-1.0.0-bin-hadoop2'
$ sudo su spark
$ cd spark
$ cat RELEASE
Spark 1.0.0 built for Hadoop 2.2.0
$ SPARK_YARN=true HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop SPARK_JAR=./lib/spark-assembly-1.0.0-hadoop2.2.0.jar ./bin/spark-submit ./lib/spark-examples-1.0.0-hadoop2.2.0.jar --master yarn --class org.apache.spark.examples.SparkPi yarn-client
14/05/11 21:04:41 INFO spark.SecurityManager: Changing view acls to: hadoop
14/05/11 21:04:41 INFO spark.SecurityManager: SecurityManager, is authentication enabled: false are ui acls enabled: false users with view permissions: Set(hadoop)
14/05/11 21:04:42 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/05/11 21:04:42 INFO Remoting: Starting remoting
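As an aside, `spark-submit` expects its flags *before* the application jar; anything after the jar is passed through to the application as arguments. So in the command above, `--master yarn --class ... yarn-client` likely never reached spark-submit itself. For Spark 1.0 on YARN, a more conventional form of the same invocation would be something like the following (same paths and environment variables as above; an untested sketch, not a verified fix):

```shell
# Sketch: flags before the application jar, and the YARN client mode
# selected via --master yarn-client (the Spark 1.0-era syntax).
HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop \
SPARK_JAR=./lib/spark-assembly-1.0.0-hadoop2.2.0.jar \
  ./bin/spark-submit \
    --master yarn-client \
    --class org.apache.spark.examples.SparkPi \
    ./lib/spark-examples-1.0.0-hadoop2.2.0.jar
```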

It looks like it's making progress... but then it stops at this line.

INFO client.RMProxy: Connecting to ResourceManager at myresoucemanager/192.168.1.4:8040

On the ResourceManager side, an error shows up.

 WARN org.apache.hadoop.ipc.Server: Incorrect header or version mismatch from 192.168.1.9:56819 got version 9 expected version 7

This time it seems the Spark version has gotten too far ahead: the client is speaking Hadoop IPC version 9, while the ResourceManager only understands version 7, so the cluster's Hadoop is older than the 2.2.0 this Spark build was compiled against...
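One way to catch this before submitting a job is to compare the Hadoop version the Spark build targets (recorded in its RELEASE file, as shown above) with the version the cluster actually runs. A minimal sketch, with the version strings hard-coded as stand-ins for the output of `cat $SPARK_HOME/RELEASE` and `hadoop version | head -n 1`:

```shell
# Sketch only: in practice, parse these from `cat $SPARK_HOME/RELEASE`
# and from `hadoop version | head -n 1` on the cluster.
built_for="2.2.0"   # Hadoop version this Spark build targets (from RELEASE)
cluster="2.0.5"     # hypothetical version the cluster actually runs

# A plain string comparison is enough to flag an exact-version mismatch;
# it does not tell you whether the IPC protocols are actually compatible.
if [ "$built_for" != "$cluster" ]; then
  echo "mismatch: Spark built for Hadoop $built_for, cluster runs Hadoop $cluster"
fi
```

A mismatch here does not always mean failure (minor-version skew can work), but a gap as large as the one in the log above is a strong hint to rebuild Spark against the cluster's Hadoop version.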