
Hadoop Source Code Analysis (3): Startup and Script Walkthrough

# some variables

export HADOOP_LOGFILE=hadoop-$HADOOP_IDENT_STRING-$command-$HOSTNAME.log
export HADOOP_ROOT_LOGGER=${HADOOP_ROOT_LOGGER:-"INFO,RFA"}
export HADOOP_SECURITY_LOGGER=${HADOOP_SECURITY_LOGGER:-"INFO,RFAS"}
export HDFS_AUDIT_LOGGER=${HDFS_AUDIT_LOGGER:-"INFO,NullAppender"}
log=$HADOOP_LOG_DIR/hadoop-$HADOOP_IDENT_STRING-$command-$HOSTNAME.out
pid=$HADOOP_PID_DIR/hadoop-$HADOOP_IDENT_STRING-$command.pid
HADOOP_STOP_TIMEOUT=${HADOOP_STOP_TIMEOUT:-5}

# Set default scheduling priority
if [ "$HADOOP_NICENESS" = "" ]; then
  export HADOOP_NICENESS=0
fi

case $startStop in

  (start)

    [ -w "$HADOOP_PID_DIR" ] || mkdir -p "$HADOOP_PID_DIR"

    if [ -f $pid ]; then
      if kill -0 `cat $pid` > /dev/null 2>&1; then
        echo $command running as process `cat $pid`.  Stop it first.
        exit 1
      fi
    fi

    if [ "$HADOOP_MASTER" != "" ]; then
      echo rsync from $HADOOP_MASTER
      rsync -a -e ssh --delete --exclude=.svn --exclude='logs/*' --exclude='contrib/hod/logs/*' $HADOOP_MASTER/ "$HADOOP_PREFIX"
    fi

    hadoop_rotate_log $log
    echo starting $command, logging to $log
    cd "$HADOOP_PREFIX"
    case $command in
      namenode|secondarynamenode|datanode|journalnode|dfs|dfsadmin|fsck|balancer|zkfc)
        if [ -z "$HADOOP_HDFS_HOME" ]; then
          hdfsScript="$HADOOP_PREFIX"/bin/hdfs
        else
          hdfsScript="$HADOOP_HDFS_HOME"/bin/hdfs
        fi
        # HDFS commands are launched in the background through bin/hdfs
        nohup nice -n $HADOOP_NICENESS $hdfsScript --config $HADOOP_CONF_DIR $command "$@" > "$log" 2>&1 < /dev/null &
      ;;
      (*)
        # everything else goes through the generic hadoop script
        nohup nice -n $HADOOP_NICENESS $hadoopScript --config $HADOOP_CONF_DIR $command "$@" > "$log" 2>&1 < /dev/null &
      ;;
    esac
    echo $! > $pid
    sleep 1
    head "$log"

    # capture the ulimit output
    if [ "true" = "$starting_secure_dn" ]; then
      echo "ulimit -a for secure datanode user $HADOOP_SECURE_DN_USER" >> $log
      # capture the ulimit info for the appropriate user
      su --shell=/bin/bash $HADOOP_SECURE_DN_USER -c 'ulimit -a' >> $log 2>&1
    elif [ "true" = "$starting_privileged_nfs" ]; then
      echo "ulimit -a for privileged nfs user $HADOOP_PRIVILEGED_NFS_USER" >> $log
      su --shell=/bin/bash $HADOOP_PRIVILEGED_NFS_USER -c 'ulimit -a' >> $log 2>&1
    else
      echo "ulimit -a for user $USER" >> $log
      ulimit -a >> $log 2>&1
    fi

    sleep 3;
    if ! ps -p $! > /dev/null ; then
      exit 1
    fi
    ;;

  (stop)

    if [ -f $pid ]; then
      TARGET_PID=`cat $pid`
      if kill -0 $TARGET_PID > /dev/null 2>&1; then
        echo stopping $command
        kill $TARGET_PID
        sleep $HADOOP_STOP_TIMEOUT
        if kill -0 $TARGET_PID > /dev/null 2>&1; then
          echo "$command did not stop gracefully after $HADOOP_STOP_TIMEOUT seconds: killing with kill -9"
          kill -9 $TARGET_PID
        fi
      else
        echo no $command to stop
      fi
      rm -f $pid
    else
      echo no $command to stop
    fi
    ;;

  (*)
    echo $usage
    exit 1
    ;;

esac

The key part of this code is the case statement near the end (roughly line 131 of the original script onward): this is where services are actually started. When this file (hadoop-daemon.sh) is invoked, it receives two important arguments, start/stop and a command name such as namenode, which determine whether a service is started or stopped. Taking start as an example, the crucial step is the nohup line (around line 153 of the original), which runs the script held in hdfsScript. That variable is defined a few lines earlier (around line 155), and as the code shows it actually points to the hdfs file under Hadoop's bin directory ("$HADOOP_PREFIX"/bin/hdfs, or "$HADOOP_HDFS_HOME"/bin/hdfs when that variable is set).
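For orientation, here is a minimal usage sketch of how hadoop-daemon.sh is typically invoked; the paths assume a standard Hadoop 2.x layout and the example is illustrative rather than quoted from this document:

# Start a NameNode: "$startStop" receives "start" and "$command" receives "namenode",
# so the (start) branch launches bin/hdfs through nohup and records the new pid.
$HADOOP_PREFIX/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR start namenode

# Stop it again: the (stop) branch reads the pid file, sends SIGTERM, and
# escalates to kill -9 after $HADOOP_STOP_TIMEOUT seconds.
$HADOOP_PREFIX/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR stop namenode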

The contents of that file are as follows:

#!/usr/bin/env bash

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Environment Variables
#
#   JSVC_HOME      home directory of jsvc binary.  Required for starting secure
#                  datanode.
#
#   JSVC_OUTFILE   path to jsvc output file.  Defaults to
#                  $HADOOP_LOG_DIR/jsvc.out.
#
#   JSVC_ERRFILE   path to jsvc error file.  Defaults to $HADOOP_LOG_DIR/jsvc.err.

bin=`which $0`
bin=`dirname ${bin}`
bin=`cd "$bin" > /dev/null; pwd`

DEFAULT_LIBEXEC_DIR="$bin"/../libexec

HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR}
. $HADOOP_LIBEXEC_DIR/hdfs-config.sh

function print_usage(){
  echo "Usage: hdfs [--config confdir] [--loglevel loglevel] COMMAND"
  echo "       where COMMAND is one of:"
  echo "  dfs                  run a filesystem command on the file systems supported in Hadoop."
  echo "  classpath            prints the classpath"
  echo "  namenode -format     format the DFS filesystem"
  echo "  secondarynamenode    run the DFS secondary namenode"
  echo "  namenode             run the DFS namenode"
  echo "  journalnode          run the DFS journalnode"
  echo "  zkfc                 run the ZK Failover Controller daemon"
  echo "  datanode             run a DFS datanode"
  echo "  dfsadmin             run a DFS admin client"
  echo "  haadmin              run a DFS HA admin client"
  echo "  fsck                 run a DFS filesystem checking utility"
  echo "  balancer             run a cluster balancing utility"
  echo "  jmxget               get JMX exported values from NameNode or DataNode."
  echo "  mover                run a utility to move block replicas across"
  echo "                       storage types"
  echo "  oiv                  apply the offline fsimage viewer to an fsimage"
  echo "  oiv_legacy           apply the offline fsimage viewer to an legacy fsimage"
  echo "  oev                  apply the offline edits viewer to an edits file"
  echo "  fetchdt              fetch a delegation token from the NameNode"
  echo "  getconf              get config values from configuration"
  echo "  groups               get the groups which users belong to"
  echo "  snapshotDiff         diff two snapshots of a directory or diff the"
  echo "                       current directory contents with a snapshot"
  echo "  lsSnapshottableDir   list all snapshottable dirs owned by the current user"
  echo "                       Use -help to see options"
  echo "  portmap              run a portmap service"
  echo "  nfs3                 run an NFS version 3 gateway"
  echo "  cacheadmin           configure the HDFS cache"
  echo "  crypto               configure HDFS encryption zones"
  echo "  storagepolicies      list/get/set block storage policies"
  echo "  version              print the version"
  echo ""
  echo "Most commands print help when invoked w/o parameters."
  # There are also debug commands, but they don't show up in this listing.
}

if [ $# = 0 ]; then
  print_usage
  exit
fi

COMMAND=$1
shift

case $COMMAND in
  # usage flags
  --help|-help|-h)
    print_usage
    exit
    ;;
esac

# Determine if we're starting a secure datanode, and if so, redefine appropriate variables
if [ "$COMMAND" == "datanode" ] && [ "$EUID" -eq 0 ] && [ -n "$HADOOP_SECURE_DN_USER" ]; then
  if [ -n "$JSVC_HOME" ]; then
    if [ -n "$HADOOP_SECURE_DN_PID_DIR" ]; then
      HADOOP_PID_DIR=$HADOOP_SECURE_DN_PID_DIR
    fi

    if [ -n "$HADOOP_SECURE_DN_LOG_DIR" ]; then
      HADOOP_LOG_DIR=$HADOOP_SECURE_DN_LOG_DIR
      HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=$HADOOP_LOG_DIR"
    fi

    HADOOP_IDENT_STRING=$HADOOP_SECURE_DN_USER
    HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.id.str=$HADOOP_IDENT_STRING"
    starting_secure_dn="true"
  else
    echo "It looks like you're trying to start a secure DN, but \$JSVC_HOME"\
      "isn't set. Falling back to starting insecure DN."
  fi
fi

# Determine if we're starting a privileged NFS daemon, and if so, redefine appropriate variables
if [ "$COMMAND" == "nfs3" ] && [ "$EUID" -eq 0 ] && [ -n "$HADOOP_PRIVILEGED_NFS_USER" ]; then
  if [ -n "$JSVC_HOME" ]; then
    if [ -n "$HADOOP_PRIVILEGED_NFS_PID_DIR" ]; then
      HADOOP_PID_DIR=$HADOOP_PRIVILEGED_NFS_PID_DIR
    fi

    if [ -n "$HADOOP_PRIVILEGED_NFS_LOG_DIR" ]; then
      HADOOP_LOG_DIR=$HADOOP_PRIVILEGED_NFS_LOG_DIR
      HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=$HADOOP_LOG_DIR"
    fi

    HADOOP_IDENT_STRING=$HADOOP_PRIVILEGED_NFS_USER
    HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.id.str=$HADOOP_IDENT_STRING"
    starting_privileged_nfs="true"
  else
    echo "It looks like you're trying to start a privileged NFS server, but"\
      "\$JSVC_HOME isn't set. Falling back to starting unprivileged NFS server."
  fi
fi

# Map the command name to the Java class to run and to command-specific JVM options
if [ "$COMMAND" = "namenode" ] ; then
  CLASS='org.apache.hadoop.hdfs.server.namenode.NameNode'
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_NAMENODE_OPTS"
elif [ "$COMMAND" = "zkfc" ] ; then
  CLASS='org.apache.hadoop.hdfs.tools.DFSZKFailoverController'
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_ZKFC_OPTS"
elif [ "$COMMAND" = "secondarynamenode" ] ; then
  CLASS='org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode'
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_SECONDARYNAMENODE_OPTS"
elif [ "$COMMAND" = "datanode" ] ; then
  CLASS='org.apache.hadoop.hdfs.server.datanode.DataNode'
  if [ "$starting_secure_dn" = "true" ]; then
    HADOOP_OPTS="$HADOOP_OPTS -jvm server $HADOOP_DATANODE_OPTS"
  else
    HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"
  fi
elif [ "$COMMAND" = "journalnode" ] ; then
  CLASS='org.apache.hadoop.hdfs.qjournal.server.JournalNode'
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_JOURNALNODE_OPTS"
elif [ "$COMMAND" = "dfs" ] ; then
  CLASS=org.apache.hadoop.fs.FsShell
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_CLIENT_OPTS"
elif [ "$COMMAND" = "dfsadmin" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.DFSAdmin
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_CLIENT_OPTS"
elif [ "$COMMAND" = "haadmin" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.DFSHAAdmin
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_CLIENT_OPTS"
elif [ "$COMMAND" = "fsck" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.DFSck
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_CLIENT_OPTS"
elif [ "$COMMAND" = "balancer" ] ; then
  CLASS=org.apache.hadoop.hdfs.server.balancer.Balancer
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_BALANCER_OPTS"
elif [ "$COMMAND" = "mover" ] ; then
  CLASS=org.apache.hadoop.hdfs.server.mover.Mover
  HADOOP_OPTS="${HADOOP_OPTS} ${HADOOP_MOVER_OPTS}"
elif [ "$COMMAND" = "storagepolicies" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.StoragePolicyAdmin
elif [ "$COMMAND" = "jmxget" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.JMXGet
elif [ "$COMMAND" = "oiv" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.offlineImageViewer.OfflineImageViewerPB
elif [ "$COMMAND" = "oiv_legacy" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.offlineImageViewer.OfflineImageViewer
elif [ "$COMMAND" = "oev" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.offlineEditsViewer.OfflineEditsViewer
elif [ "$COMMAND" = "fetchdt" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.DelegationTokenFetcher
elif [ "$COMMAND" = "getconf" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.GetConf
elif [ "$COMMAND" = "groups" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.GetGroups
elif [ "$COMMAND" = "snapshotDiff" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.snapshot.SnapshotDiff
elif [ "$COMMAND" = "lsSnapshottableDir" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.snapshot.LsSnapshottableDir
elif [ "$COMMAND" = "portmap" ] ; then
  CLASS=org.apache.hadoop.portmap.Portmap
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_PORTMAP_OPTS"
elif [ "$COMMAND" = "nfs3" ] ; then
  CLASS=org.apache.hadoop.hdfs.nfs.nfs3.Nfs3
  HADOOP_OPTS="$HADOOP_OPTS $HADOOP_NFS3_OPTS"
elif [ "$COMMAND" = "cacheadmin" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.CacheAdmin
elif [ "$COMMAND" = "crypto" ] ; then
  CLASS=org.apache.hadoop.hdfs.tools.CryptoAdmin
elif [ "$COMMAND" = "version" ] ; then
  CLASS=org.apache.hadoop.util.VersionInfo
elif [ "$COMMAND" = "debug" ]; then
  CLASS=org.apache.hadoop.hdfs.tools.DebugAdmin
elif [ "$COMMAND" = "classpath" ]; then
  if [ "$#" -gt 0 ]; then
    CLASS=org.apache.hadoop.util.Classpath
  else
    # No need to bother starting up a JVM for this simple case.
    if $cygwin; then
      CLASSPATH=$(cygpath -p -w "$CLASSPATH" 2>/dev/null)
    fi
    echo $CLASSPATH
    exit 0
  fi
else
  CLASS="$COMMAND"
fi

# cygwin path translation
if $cygwin; then
  CLASSPATH=$(cygpath -p -w "$CLASSPATH" 2>/dev/null)
  HADOOP_LOG_DIR=$(cygpath -w "$HADOOP_LOG_DIR" 2>/dev/null)
  HADOOP_PREFIX=$(cygpath -w "$HADOOP_PREFIX" 2>/dev/null)
  HADOOP_CONF_DIR=$(cygpath -w "$HADOOP_CONF_DIR" 2>/dev/null)
  HADOOP_COMMON_HOME=$(cygpath -w "$HADOOP_COMMON_HOME" 2>/dev/null)
  HADOOP_HDFS_HOME=$(cygpath -w "$HADOOP_HDFS_HOME" 2>/dev/null)
  HADOOP_YARN_HOME=$(cygpath -w "$HADOOP_YARN_HOME" 2>/dev/null)
  HADOOP_MAPRED_HOME=$(cygpath -w "$HADOOP_MAPRED_HOME" 2>/dev/null)
fi

export CLASSPATH=$CLASSPATH

HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.security.logger=${HADOOP_SECURITY_LOGGER:-INFO,NullAppender}"

# Check to see if we should start a secure datanode
if [ "$starting_secure_dn" = "true" ]; then
  if [ "$HADOOP_PID_DIR" = "" ]; then
    HADOOP_SECURE_DN_PID="/tmp/hadoop_secure_dn.pid"
  else
    HADOOP_SECURE_DN_PID="$HADOOP_PID_DIR/hadoop_secure_dn.pid"
  fi

  JSVC=$JSVC_HOME/jsvc
  if [ ! -f $JSVC ]; then
    echo "JSVC_HOME is not set correctly so jsvc cannot be found. jsvc is required to run secure datanodes."
    echo "Please download and install jsvc from http://archive.apache.org/dist/commons/daemon/binaries/ "\
      "and set JSVC_HOME to the directory containing the jsvc binary."
    exit
  fi

  if [[ ! $JSVC_OUTFILE ]]; then
    JSVC_OUTFILE="$HADOOP_LOG_DIR/jsvc.out"
  fi

  if [[ ! $JSVC_ERRFILE ]]; then
    JSVC_ERRFILE="$HADOOP_LOG_DIR/jsvc.err"
  fi

  # Secure datanodes are started through jsvc so they can bind privileged ports
  exec "$JSVC" \
    -Dproc_$COMMAND -outfile "$JSVC_OUTFILE" \
    -errfile "$JSVC_ERRFILE" \
    -pidfile "$HADOOP_SECURE_DN_PID" \
    -nodetach \
    -user "$HADOOP_SECURE_DN_USER" \
    -cp "$CLASSPATH" \
    $JAVA_HEAP_MAX $HADOOP_OPTS \
    org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter "$@"
elif [ "$starting_privileged_nfs" = "true" ] ; then
  if [ "$HADOOP_PID_DIR" = "" ]; then
    HADOOP_PRIVILEGED_NFS_PID="/tmp/hadoop_privileged_nfs3.pid"
  else
    HADOOP_PRIVILEGED_NFS_PID="$HADOOP_PID_DIR/hadoop_privileged_nfs3.pid"
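The quoted preview of bin/hdfs breaks off at this point. For context, in a stock Hadoop 2.x release the ordinary (non-jsvc) path ends by handing control to the JVM with the class selected by the elif chain above; roughly the following, offered as a sketch of the standard script rather than a quotation from this document:

else
  # run it: start the JVM on the selected class, forwarding any remaining arguments
  exec "$JAVA" -Dproc_$COMMAND $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$@"
fi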
