Do you have to install Scala in order to install Spark?
2015-11-09
Hi, you don't have to install it separately: the pre-built Spark distribution bundles the Scala runtime it needs, so Spark can run without a system-wide Scala installation.
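A quick way to see this (a minimal check, assuming you have already extracted the pre-built tarball; the assembly jar name depends on the exact build):
cd spark-1.3.0-bin-hadoop2.3
./bin/spark-shell                                      # starts the Scala REPL bundled with Spark, no separate Scala install needed
jar tf lib/spark-assembly-*.jar | grep 'scala/Predef' | head -n 1   # the Scala classes ship inside the assembly jar ('jar' needs a JDK on the PATH)
A separate Scala installation mainly matters if you want to develop and compile your own Scala applications with a local scalac/sbt toolchain.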
2015-08-21 · Zhidao partner, software expert
Install Spark:
tar -zxvf spark-1.3.0-bin-hadoop2.3.tgz
mkdir /usr/local/spark
mv spark-1.3.0-bin-hadoop2.3 /usr/local/spark
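The same extraction can also be done in one step by pointing tar at the target directory (a minor variation, assuming the tarball sits in the current directory):
mkdir -p /usr/local/spark
tar -zxvf spark-1.3.0-bin-hadoop2.3.tgz -C /usr/local/spark   # extract straight into /usr/local/spark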
vim /etc/bashrc
export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export PATH=$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH
source /etc/bashrc
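After sourcing /etc/bashrc, a quick check that the variables took effect (note the PATH line also references $SCALA_HOME, which is presumably exported by an earlier Scala setup step not shown here):
echo $SPARK_HOME          # should print /usr/local/spark/spark-1.3.0-bin-hadoop2.3
which spark-submit        # should resolve to $SPARK_HOME/bin/spark-submit
spark-submit --version    # prints the Spark version banner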
cd /usr/local/spark/spark-1.3.0-bin-hadoop2.3/conf/
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
export JAVA_HOME=/java
export SCALA_HOME=/usr/lib/scala/scala-2.10.5
export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export SPARK_MASTER_IP=192.168.137.101
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/home/hadoop/hadoop/etc/hadoop
export SPARK_LIBRARY_PATH=$SPARK_HOME/lib
export SCALA_LIBRARY_PATH=$SPARK_LIBRARY_PATH
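Optionally, sanity-check that the paths referenced above actually exist on this node (these paths mirror the example configuration; adjust them to your own layout):
ls /java/bin/java                                   # JAVA_HOME
ls /usr/lib/scala/scala-2.10.5/bin/scala            # SCALA_HOME
ls /home/hadoop/hadoop/etc/hadoop/core-site.xml     # HADOOP_CONF_DIR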
cp slaves.template slaves
vim slaves
hd1
hd2
hd3
hd4
hd5
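The start scripts log in to every host listed in slaves over SSH, so each name should resolve and allow passwordless login from hd1. A quick check, using the hostnames from the slaves file above:
for h in hd1 hd2 hd3 hd4 hd5; do ssh -o BatchMode=yes $h hostname; done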
7. Distribute the environment file and the Spark directory to the other nodes (a loop version is sketched after the commands):
scp /etc/bashrc hd2:/etc
scp /etc/bashrc hd3:/etc
scp /etc/bashrc hd4:/etc
scp /etc/bashrc hd5:/etc
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd2:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd3:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd4:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd5:/usr/local/spark/
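The same distribution written as a loop, which is easier to keep in sync as nodes are added (assumes passwordless SSH to each host):
for h in hd2 hd3 hd4 hd5; do
  ssh $h mkdir -p /usr/local/spark                                     # make sure the target directory exists on the remote host
  scp /etc/bashrc $h:/etc
  scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 $h:/usr/local/spark/
done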
8. Start
On hd1, start the cluster (a quick verification follows the commands):
cd $SPARK_HOME/sbin
./start-all.sh
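To verify that the cluster came up (the IP, port, and jar name follow the example configuration above; the standalone master defaults to port 7077 and its web UI to 8080):
jps                       # hd1 should show a Master process
ssh hd2 jps               # each slave should show a Worker process (assuming jps is on the remote PATH)
# master web UI: http://192.168.137.101:8080
# optional smoke test with the bundled SparkPi example:
$SPARK_HOME/bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://192.168.137.101:7077 \
  $SPARK_HOME/lib/spark-examples-*.jar 10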
This answer was accepted by the asker.