Error when loading data from Hive into Elasticsearch (the command used was INSERT OVERWRITE TABLE doc SELECT s.id,s.name FROM user_f s;).
The error output is as follows:
16/03/24 13:22:54 [main]: INFO exec.Utilities: File not found: File does not exist: /tmp/hive/hadoop/cf07a2cb-f401-440b-b230-3adb69d7ce9a/hive_2016-03-24_13-22-52_349_3866738858790764474-1/-mr-10001/4166b8bf-0706-4bda-9912-3cab4e82bcde/reduce.xml
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:587)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
16/03/24 13:22:54 [main]: INFO exec.Utilities: No plan file found: hdfs://ubuntu:9000/tmp/hive/hadoop/cf07a2cb-f401-440b-b230-3adb69d7ce9a/hive_2016-03-24_13-22-52_349_3866738858790764474-1/-mr-10001/4166b8bf-0706-4bda-9912-3cab4e82bcde/reduce.xml
The disk on the ubuntu host had run out of space. After deleting some files, the job was able to continue and produced results, even though that reduce.xml file still did not exist.
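Since the root cause here was a full disk, a quick way to confirm the same situation is to check both the local filesystem and HDFS capacity before rerunning the query. This is a minimal diagnostic sketch, assuming the Hive scratch directory is under /tmp and that a Hadoop client is installed; the hostname and port come from the log above.

```shell
# Check local disk usage where Hive keeps its local scratch files
# (defaults under /tmp unless hive.exec.scratchdir et al. are changed).
df -h /tmp

# Check HDFS capacity -- the plan file reduce.xml from the log is written
# to the HDFS scratch directory, so a full DataNode can trigger the same
# "File does not exist" / "No plan file found" messages.
# (Uncomment if the hdfs client is available; URI is from the log above.)
# hdfs dfs -df -h hdfs://ubuntu:9000/
```

If either filesystem is near 100%, freeing space (as was done here) and rerunning the INSERT is usually enough; the stale scratch path is regenerated on the next run.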