A Hadoop node went down; how do I restart the NodeManager?
1 answer
Log in to bigtop1: vagrant ssh bigtop1
Then replace the contents of /usr/lib/hadoop/libexec/init-hdfs.sh with the following script:
#!/bin/bash -ex
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

### Script requires package bigtop-groovy to be installed
#
# Use this script to initialize HDFS directory structure for various components
# to run. This script can be run from any node in the Hadoop cluster but should
# only be run once by one node only. If you are planning on using oozie, we
# recommend that you run this script from a node that has hive, pig, sqoop, etc.
# installed. Unless you are using psuedo distributed cluster, this node is most
# likely NOT your namenode
#
# Steps to be performed before running this script:
# 1. Stop the namenode and datanode services if running.
# 2. Format namenode (su -s /bin/bash hdfs hdfs namenode -format).
# 3. Start the namenode and datanode services on appropriate nodes.

# Autodetect JAVA_HOME if not defined
if [ -f /usr/lib/bigtop-utils/bigtop-detect-javahome ]; then
  . /usr/lib/bigtop-utils/bigtop-detect-javahome
fi

HADOOP_LIB_DIR=/usr/lib/hadoop/lib
HDFS_LIB_DIR=/usr/lib/hadoop-hdfs/lib
HADOOP_DEPENDENCIES="commons-logging*.jar guava*.jar commons-configuration*.jar commons-collections*.jar slf4j-api*.jar protobuf-java*.jar commons-lang*.jar"
HDFS_DEPENDENCIES="htrace-core*.jar"

for i in /usr/lib/hadoop/*.jar; do CLASSPATH=$CLASSPATH:$i; done
for i in /usr/lib/hadoop-yarn/lib/*.jar; do CLASSPATH=$CLASSPATH:$i; done
CLASSPATH=/etc/hadoop/conf:$CLASSPATH:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar

pushd .
cd $HADOOP_LIB_DIR
for d in $HADOOP_DEPENDENCIES; do CLASSPATH=$CLASSPATH:$HADOOP_LIB_DIR/$d; done
for d in $HDFS_DEPENDENCIES; do CLASSPATH=$CLASSPATH:$HDFS_LIB_DIR/$d; done
popd

su -s /bin/bash hdfs -c "/usr/lib/bigtop-groovy/bin/groovy -classpath $CLASSPATH /usr/lib/hadoop/libexec/init-hcfs.groovy /usr/lib/hadoop/libexec/init-hcfs.json"
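For reference, the preparation steps listed in the script header (stop the HDFS services, format the NameNode, start them again) can be written out as commands. This is a rough sketch only; the hadoop-hdfs-namenode and hadoop-hdfs-datanode service names are assumptions based on standard Bigtop packaging and are not part of the answer itself:

# 1. Stop the NameNode and DataNode services if they are running.
sudo service hadoop-hdfs-namenode stop
sudo service hadoop-hdfs-datanode stop
# 2. Format the NameNode as the hdfs user (as described in the script header).
sudo su -s /bin/bash hdfs -c "hdfs namenode -format"
# 3. Start the services again on the appropriate nodes.
sudo service hadoop-hdfs-namenode start
sudo service hadoop-hdfs-datanode start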
On bigtop1, run the following command:
puppet apply -d --modulepath=/bigtop-home/bigtop-deploy/puppet/modules:/etc/puppet/modules:/usr/share/puppet/modules /bigtop-home/bigtop-deploy/puppet/manifests/site.pp
After it finishes, use the jps command to check the running processes.
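If the goal is simply to bring the NodeManager on this node back up, checking for it and restarting its service directly may be enough. A minimal sketch, assuming the standard Bigtop packaging, which installs a hadoop-yarn-nodemanager service script (the service name is an assumption, not taken from the answer above):

# List local JVM daemons; run as root so processes owned by the hdfs and yarn users are visible.
sudo jps
# If no NodeManager entry appears, restart the packaged service and check its status
# (hadoop-yarn-nodemanager is the assumed Bigtop service name).
sudo service hadoop-yarn-nodemanager restart
sudo service hadoop-yarn-nodemanager status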