
    A Simple Shell Script

    Writing a shell script like this today didn't actually take much effort, so I'm not excited because I finally succeeded after many setbacks. I'm excited because I can at last settle down to do what I love and focus on learning what I need to learn. Before this I studied all sorts of scattered things, and because my goal was unclear, it was painful. The root cause was that I didn't know what career I would pursue; I only knew I wanted to work in IT, without knowing the specific direction, so I felt I had to study everything, and for me that process was very painful. I'm someone who likes to do things solidly: either don't do it at all, or do it well. I once read a brilliantly written article about restlessness among programmers, and many of the restless habits it described were confirmed in me, which depressed me. Now the job is settled, I know what to study, and my goal is focused. Life is good.

    To borrow Steve Jobs's words:

    The only way to be truly satisfied is to do what you believe is great work, and the only way to do great work is to love what you do!

    I think anyone who gets to this point is truly fortunate: to work hard, to strive, to realize your own value, and to be satisfied with your own performance. That last part is something I often say to myself.

    As things stand, my job is settled and so is my girlfriend, which is to say my future wife. What I need to do now is to fight, to work, to strive.

    I'm very grateful to have met a partner like her, someone who supports me and cares about me. I don't know whether I will be very successful in the future, but I do know that with such a good helper at home, I can do anything with a steady heart. With her I'm very happy, and I will bring her happiness too, I promise!

     

    All right, here's the code, heh:

    #!/bin/sh

    # Collect every 2008-dated line from the Hadoop logs, tag each line with
    # its node type, and sort the result chronologically.
    cd /hadoop/logs || exit 1

    file=log_name.txt

    # Start from a clean output file.
    if [ -e "$file" ]; then
        rm "$file"
    fi

    for cur in *.log
    do
        # Log names look like hadoop-<user>-<nodetype>-<host>.log, so the
        # third dash-separated field is the node type.
        name=$(echo "$cur" | cut -d'-' -f3)

        # Append [nodetype] to every line that starts with a 2008 date.
        grep '^2008' "$cur" | sed "s/\$/[$name]/" >> "$file"
    done

    # The timestamps sort lexically in chronological order.
    sort -o "$file" "$file"

    The output of a run:

    2008-11-14 10:08:47,671 INFO org.apache.hadoop.dfs.NameNode: STARTUP_MSG: [namenode]
    2008-11-14 10:08:48,140 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=NameNode, port=9000[namenode]
    2008-11-14 10:08:48,171 INFO org.apache.hadoop.dfs.NameNode: Namenode up at: bacoo/192.168.1.34:9000[namenode]
    2008-11-14 10:08:48,171 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=NameNode, sessionId=null[namenode]
    2008-11-14 10:08:48,234 INFO org.apache.hadoop.dfs.NameNodeMetrics: Initializing NameNodeMeterics using context object:org.apache.hadoop.metrics.spi.NullContext[namenode]
    2008-11-14 10:08:48,875 INFO org.apache.hadoop.dfs.FSNamesystemMetrics: Initializing FSNamesystemMeterics using context object:org.apache.hadoop.metrics.spi.NullContext[namenode]
    2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: fsOwner=Zhaoyb,None,root,Administrators,Users,Debugger,Users[namenode]
    2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: isPermissionEnabled=true[namenode]
    2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: supergroup=supergroup[namenode]
    2008-11-14 10:08:48,890 INFO org.apache.hadoop.fs.FSNamesystem: Registered FSNamesystemStatusMBean[namenode]
    2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Edits file edits of size 4 edits # 0 loaded in 0 seconds.[namenode]
    2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Image file of size 80 loaded in 0 seconds.[namenode]
    2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Number of files = 0[namenode]
    2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Number of files under construction = 0[namenode]
    2008-11-14 10:08:48,953 INFO org.apache.hadoop.fs.FSNamesystem: Finished loading FSImage in 657 msecs[namenode]
    2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* Leaving safe mode after 0 secs.[namenode]
    2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes[namenode]
    2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks[namenode]
    2008-11-14 10:08:49,609 INFO org.mortbay.util.Credential: Checking Resource aliases[namenode]
    2008-11-14 10:08:50,015 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[namenode]
    2008-11-14 10:08:50,015 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][namenode]
    2008-11-14 10:08:50,015 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][namenode]
    2008-11-14 10:08:54,656 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@17f11fb[namenode]
    2008-11-14 10:08:55,453 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][namenode]
    2008-11-14 10:08:55,468 INFO org.apache.hadoop.fs.FSNamesystem: Web-server up at: 0.0.0.0:50070[namenode]
    2008-11-14 10:08:55,468 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50070[namenode]
    2008-11-14 10:08:55,468 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@61a907[namenode]
    2008-11-14 10:08:55,484 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting[namenode]
    2008-11-14 10:08:55,484 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting[namenode]
    2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9000: starting[namenode]
    2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 9000: starting[namenode]
    2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 9000: starting[namenode]
    2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9000: starting[namenode]
    2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 9000: starting[namenode]
    2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9000: starting[namenode]
    2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9000: starting[namenode]
    2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 9000: starting[namenode]
    2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 9000: starting[namenode]
    2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 9000: starting[namenode]
    2008-11-14 10:08:56,015 INFO org.apache.hadoop.dfs.NameNode.Secondary: STARTUP_MSG: [secondarynamenode]
    2008-11-14 10:08:56,156 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=SecondaryNameNode, sessionId=null[secondarynamenode]
    2008-11-14 10:08:56,468 WARN org.apache.hadoop.dfs.Storage: Checkpoint directory \tmp\hadoop-SYSTEM\dfs\namesecondary is added.[secondarynamenode]
    2008-11-14 10:08:56,546 INFO org.mortbay.util.Credential: Checking Resource aliases[secondarynamenode]
    2008-11-14 10:08:56,609 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[secondarynamenode]
    2008-11-14 10:08:56,609 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][secondarynamenode]
    2008-11-14 10:08:56,609 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][secondarynamenode]
    2008-11-14 10:08:56,953 INFO org.mortbay.jetty.servlet.XMLConfiguration: No WEB-INF/web.xml in file:/E:/cygwin/hadoop/webapps/secondary. Serving files and default/dynamic servlets only[secondarynamenode]
    2008-11-14 10:08:56,953 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@b1a4e2[secondarynamenode]
    2008-11-14 10:08:57,062 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][secondarynamenode]
    2008-11-14 10:08:57,078 INFO org.apache.hadoop.dfs.NameNode.Secondary: Secondary Web-server up at: 0.0.0.0:50090[secondarynamenode]
    2008-11-14 10:08:57,078 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50090[secondarynamenode]
    2008-11-14 10:08:57,078 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@18a8ce2[secondarynamenode]
    2008-11-14 10:08:57,078 WARN org.apache.hadoop.dfs.NameNode.Secondary: Checkpoint Period   :3600 secs (60 min)[secondarynamenode]
    2008-11-14 10:08:57,078 WARN org.apache.hadoop.dfs.NameNode.Secondary: Log Size Trigger    :67108864 bytes (65536 KB)[secondarynamenode]
    2008-11-14 10:08:59,828 INFO org.apache.hadoop.mapred.JobTracker: STARTUP_MSG: [jobtracker]
    2008-11-14 10:09:00,015 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=JobTracker, port=9001[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 9001: starting[jobtracker]
    2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9001: starting[jobtracker]
    2008-11-14 10:09:00,125 INFO org.mortbay.util.Credential: Checking Resource aliases[jobtracker]
    2008-11-14 10:09:01,703 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[jobtracker]
    2008-11-14 10:09:01,703 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][jobtracker]
    2008-11-14 10:09:01,703 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][jobtracker]
    2008-11-14 10:09:02,312 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@1cd280b[jobtracker]
    2008-11-14 10:09:08,359 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][jobtracker]
    2008-11-14 10:09:08,375 INFO org.apache.hadoop.mapred.JobTracker: JobTracker up at: 9001[jobtracker]
    2008-11-14 10:09:08,375 INFO org.apache.hadoop.mapred.JobTracker: JobTracker webserver: 50030[jobtracker]
    2008-11-14 10:09:08,375 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=[jobtracker]
    2008-11-14 10:09:08,375 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50030[jobtracker]
    2008-11-14 10:09:08,375 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@16a9b9c[jobtracker]
    2008-11-14 10:09:12,984 INFO org.apache.hadoop.mapred.JobTracker: Starting RUNNING[jobtracker]
    2008-11-14 10:09:56,894 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG: [datanode]
    2008-11-14 10:10:02,516 INFO org.apache.hadoop.mapred.TaskTracker: STARTUP_MSG: [tasktracker]
    2008-11-14 10:10:08,768 INFO org.apache.hadoop.dfs.Storage: Formatting ...[datanode]
    2008-11-14 10:10:08,768 INFO org.apache.hadoop.dfs.Storage: Storage directory /hadoop/hadoopfs/data is not formatted.[datanode]
    2008-11-14 10:10:11,343 INFO org.apache.hadoop.dfs.DataNode: Registered FSDatasetStatusMBean[datanode]
    2008-11-14 10:10:11,347 INFO org.apache.hadoop.dfs.DataNode: Opened info server at 50010[datanode]
    2008-11-14 10:10:11,352 INFO org.apache.hadoop.dfs.DataNode: Balancing bandwith is 1048576 bytes/s[datanode]
    2008-11-14 10:10:16,430 INFO org.mortbay.util.Credential: Checking Resource aliases[tasktracker]
    2008-11-14 10:10:17,976 INFO org.mortbay.util.Credential: Checking Resource aliases[datanode]
    2008-11-14 10:10:20,068 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[datanode]
    2008-11-14 10:10:20,089 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][datanode]
    2008-11-14 10:10:20,089 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][datanode]
    2008-11-14 10:10:20,725 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[tasktracker]
    2008-11-14 10:10:20,727 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][tasktracker]
    2008-11-14 10:10:20,727 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][tasktracker]
    2008-11-14 10:10:27,078 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/localhost[jobtracker]
    2008-11-14 10:10:32,171 INFO org.apache.hadoop.dfs.StateChange: BLOCK* NameSystem.registerDatanode: node registration from 192.168.1.167:50010 storage DS-1556534590-127.0.0.1-50010-1226628640386[namenode]
    2008-11-14 10:10:32,187 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/192.168.1.167:50010[namenode]
    2008-11-14 10:13:57,171 WARN org.apache.hadoop.dfs.Storage: Checkpoint directory \tmp\hadoop-SYSTEM\dfs\namesecondary is added.[secondarynamenode]
    2008-11-14 10:13:57,187 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 5 Total time for transactions(ms): 0 Number of syncs: 3 SyncTimes(ms): 4125 [namenode]
    2008-11-14 10:13:57,187 INFO org.apache.hadoop.fs.FSNamesystem: Roll Edit Log from 192.168.1.34[namenode]
    2008-11-14 10:13:57,953 INFO org.apache.hadoop.dfs.NameNode.Secondary: Downloaded file fsimage size 80 bytes.[secondarynamenode]
    2008-11-14 10:13:57,968 INFO org.apache.hadoop.dfs.NameNode.Secondary: Downloaded file edits size 288 bytes.[secondarynamenode]
    2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: fsOwner=Zhaoyb,None,root,Administrators,Users,Debugger,Users[secondarynamenode]
    2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: isPermissionEnabled=true[secondarynamenode]
    2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: supergroup=supergroup[secondarynamenode]
    2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Edits file edits of size 288 edits # 5 loaded in 0 seconds.[secondarynamenode]
    2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Number of files = 0[secondarynamenode]
    2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Number of files under construction = 0[secondarynamenode]
    2008-11-14 10:13:58,718 INFO org.apache.hadoop.dfs.Storage: Image file of size 367 saved in 0 seconds.[secondarynamenode]
    2008-11-14 10:13:58,796 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 0 Total time for transactions(ms): 0 Number of syncs: 0 SyncTimes(ms): 0 [secondarynamenode]
    2008-11-14 10:13:58,921 INFO org.apache.hadoop.dfs.NameNode.Secondary: Posted URL 0.0.0.0:50070putimage=1&port=50090&machine=192.168.1.34&token=-16:145044639:0:1226628551796:1226628513000[secondarynamenode]
    2008-11-14 10:13:59,078 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 0 Total time for transactions(ms): 0 Number of syncs: 0 SyncTimes(ms): 0 [namenode]
    2008-11-14 10:13:59,078 INFO org.apache.hadoop.fs.FSNamesystem: Roll FSImage from 192.168.1.34[namenode]
    2008-11-14 10:13:59,265 WARN org.apache.hadoop.dfs.NameNode.Secondary: Checkpoint done. New Image Size: 367[secondarynamenode]
    2008-11-14 10:29:02,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 0 time(s).[secondarynamenode]
    2008-11-14 10:29:04,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 1 time(s).[secondarynamenode]
    2008-11-14 10:29:06,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 2 time(s).[secondarynamenode]
    2008-11-14 10:29:08,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 3 time(s).[secondarynamenode]
    2008-11-14 10:29:10,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 4 time(s).[secondarynamenode]
    2008-11-14 10:29:11,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 5 time(s).[secondarynamenode]
    2008-11-14 10:29:13,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 6 time(s).[secondarynamenode]
    2008-11-14 10:29:15,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 7 time(s).[secondarynamenode]
    2008-11-14 10:29:17,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 8 time(s).[secondarynamenode]
    2008-11-14 10:29:19,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 9 time(s).[secondarynamenode]
    2008-11-14 10:29:21,078 ERROR org.apache.hadoop.dfs.NameNode.Secondary: Exception in doCheckpoint: [secondarynamenode]
    2008-11-14 10:29:21,171 ERROR org.apache.hadoop.dfs.NameNode.Secondary: java.io.IOException: Call failed on local exception[secondarynamenode]
    2008-11-14 10:34:23,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 0 time(s).[secondarynamenode]
    2008-11-14 10:34:25,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 1 time(s).[secondarynamenode]
    2008-11-14 10:34:27,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 2 time(s).[secondarynamenode]
    2008-11-14 10:34:29,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 3 time(s).[secondarynamenode]
    2008-11-14 10:34:31,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 4 time(s).[secondarynamenode]
    2008-11-14 10:34:32,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 5 time(s).[secondarynamenode]
    2008-11-14 10:34:34,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 6 time(s).[secondarynamenode]
    2008-11-14 10:34:36,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 7 time(s).[secondarynamenode]
    2008-11-14 10:34:38,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 8 time(s).[secondarynamenode]
    2008-11-14 10:34:40,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 9 time(s).[secondarynamenode]
    2008-11-14 10:34:41,468 ERROR org.apache.hadoop.dfs.NameNode.Secondary: Exception in doCheckpoint: [secondarynamenode]
    2008-11-14 10:34:41,468 ERROR org.apache.hadoop.dfs.NameNode.Secondary: java.io.IOException: Call failed on local exception[secondarynamenode]
    2008-11-14 10:38:43,359 INFO org.apache.hadoop.dfs.NameNode.Secondary: SHUTDOWN_MSG: [secondarynamenode]

    With this, the generated logs are laid out in proper chronological order, and every line is also tagged with the node type it came from.
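The whole merge can also be collapsed into a single pipeline. Here is a minimal, self-contained sketch of the same technique; the file names and log lines below are fabricated for the demo (the real script reads /hadoop/logs), but they follow the same hadoop-<user>-<nodetype>-<host>.log naming convention:

```shell
#!/bin/sh
# Demo: tag each 2008-dated log line with its node type, then sort.
# Works in a throwaway directory with made-up files for illustration.
dir=$(mktemp -d)
printf '%s\n' '2008-11-14 10:09:00 INFO JobTracker up' > "$dir/hadoop-user-jobtracker-host.log"
printf '%s\n' '2008-11-14 10:08:47 INFO NameNode up'   > "$dir/hadoop-user-namenode-host.log"

merged=$(
    for cur in "$dir"/*.log; do
        name=$(basename "$cur" | cut -d'-' -f3)    # node type from the file name
        grep '^2008' "$cur" | sed "s/\$/[$name]/"  # tag every 2008-dated line
    done | sort                                    # timestamps sort chronologically
)
printf '%s\n' "$merged"
rm -r "$dir"
```

Because the `YYYY-MM-DD hh:mm:ss` timestamps sort the same way lexically and chronologically, a plain `sort` is enough to interleave the lines from all node types in time order.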

    posted on 2008-11-15 01:23 by so true, Reads(1507), Comments(0), Category: Linux
