Table of Contents
1. Environment preparation
2. Install Hive and configure environment variables
3. Install MySQL
4. Create the Hive metastore database in MySQL and grant privileges
5. Install the JDBC jar into Hive
6. Configure hive-site.xml
7. Initialize the metastore
8. Start and verify Hive
9. Errors and solutions
1. Environment preparation
Prepare a Hadoop cluster first; see the earlier article 【hadoop分布式集群的安裝教程】 (Hadoop distributed cluster installation tutorial). Then start Hadoop:
# cd /opt/hadoop-2.6.4/sbin
# ./start-all.sh
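Before moving on, it is worth confirming that the cluster actually came up. A quick sanity check (assuming the JDK's jps tool is on the PATH and a standard master/slave layout):
# jps        // on the master: NameNode, SecondaryNameNode and ResourceManager should be listed
# jps        // on each slave: DataNode and NodeManager should be listed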
2. Install Hive and configure environment variables
# cd /opt
# tar zxvf apache-hive-2.1.0-bin.tar.gz
# ln -s apache-hive-2.1.0-bin hive        // create a symbolic link so future version upgrades are easier
# vim /etc/bashrc                          // configure the environment variables
export HIVE_HOME=/opt/hive
export PATH=$PATH:$HIVE_HOME/bin
# source /etc/bashrc                       // make the changes take effect
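To confirm the environment variables took effect, check that the hive binary resolves from the new PATH (this does not touch the metastore yet, so it works even before MySQL is set up):
# which hive           // should print /opt/hive/bin/hive
# hive --version       // should report Hive 2.1.0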
3. Install MySQL
# rpm -q mysql-server               // check whether MySQL is already installed
# yum install -y mysql-server       // install it via yum
# service mysqld start
# chkconfig mysqld on               // start MySQL on boot
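A quick check that the MySQL service is really running and registered for boot (using the same service/chkconfig tools as above):
# service mysqld status        // should report that mysqld is running
# chkconfig --list mysqld      // runlevels 2-5 should show "on"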
4. Create the Hive metastore database in MySQL and grant privileges
# mysql -uroot
create database if not exists hive_metadata;
grant all privileges on hive_metadata.* to 'root'@'%' identified by '123456';
grant all privileges on hive_metadata.* to 'root'@'localhost' identified by '123456';
grant all privileges on hive_metadata.* to 'root'@'hadoop1' identified by '123456';
flush privileges;        // make the grants take effect
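To double-check the grants before touching Hive, you can list them from the mysql client (the password below is the '123456' set by the GRANT statements above):
# mysql -uroot -p123456 -e "show grants for 'root'@'hadoop1';"
# mysql -uroot -p123456 -e "select user, host from mysql.user;"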
5. Install the JDBC jar into Hive
# yum install -y mysql-connector-java                                       // install mysql-connector-java
# cp /usr/share/java/mysql-connector-java-5.1.17.jar /opt/hive/lib/         // copy the MySQL connector into Hive's lib directory
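The connector version shipped by yum is not always 5.1.17, so before copying it is safer to check what actually landed in /usr/share/java, and afterwards confirm the jar really sits in Hive's lib directory:
# ls /usr/share/java/ | grep mysql-connector        // note the exact version installed by yum
# ls /opt/hive/lib/ | grep mysql-connector          // the driver jar must end up here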
6. Configure hive-site.xml
# cd /opt/hive/conf/
# cp hive-default.xml.template hive-site.xml
# vim hive-site.xml
-------------------------------------------------------------------------
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://hadoop1:3306/hive_metadata?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>   <!-- I am using the root account here; it must match the account granted privileges earlier -->
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123456</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
</property>
<property>
  <name>datanucleus.schema.autoCreateAll</name>
  <value>true</value>   <!-- my install failed without this; adding it solved the error completely -->
  <description>creates necessary schema on a startup if one doesn't exist. set this to false, after creating it once</description>
</property>
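One step that is easy to forget: hive.metastore.warehouse.dir points at an HDFS path, and Hive expects it (plus a group-writable /tmp) to exist before you create tables. A minimal sketch, assuming the warehouse path configured above:
# hdfs dfs -mkdir -p /tmp
# hdfs dfs -mkdir -p /user/hive/warehouse
# hdfs dfs -chmod g+w /tmp
# hdfs dfs -chmod g+w /user/hive/warehouse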
7. Metastore initialization
Initialize Hive's metastore database (stored in MySQL):
# cd /opt/hive/bin
# schematool -initSchema -dbType mysql
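If the initialization succeeded, the metastore tables now exist in MySQL and the schema version can be queried back. Two quick checks (the metastore schema contains tables such as DBS, TBLS and COLUMNS_V2):
# mysql -uroot -p123456 -e "use hive_metadata; show tables;"
# schematool -info -dbType mysql        // prints the metastore schema version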
8. Start and verify Hive
# hive
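Once the hive> prompt appears, a minimal smoke test is to create and drop a throwaway table (test_tbl is just an illustrative name):
hive> show databases;
hive> create table test_tbl (id int, name string);
hive> show tables;
hive> drop table test_tbl;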
Done!!! It wasn't easy. I hit a pile of errors and had to search my way through them one at a time (and let me complain about a certain search engine while I'm at it: practically everything it turned up was useless). The pitfalls I ran into and how I fixed them are listed below.
Supplement 1: remote (separated) metastore mode, client-side configuration file:
<?xml version="1.0"?> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?> <configuration> <property> <name>hive.metastore.warehouse.dir</name> <value>/user/hive/warehouse</value> </property> <property> <name>hive.metastore.local</name> <value>false</value> </property> <property> <name>hive.metastore.uris</name> <value>thrift://hadoop1:9083</value> </property> </configuration>
On the master, start the metastore service:
# hive --service metastore &
On the slave, run hive to verify the connection.
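To confirm the metastore service is really listening before pointing clients at it, check port 9083 on the master (netstat is provided by the net-tools package):
# netstat -nltp | grep 9083        // should show a java process listening on port 9083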
Supplement 2: Hive Thrift client:
Run on the master:
# nohup hive --service hiveserver2 &
Run on the slave:
# beeline
beeline> !connect jdbc:hive2://hadoop1:10000 root 123456
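For scripted checks, the same connection can be made non-interactively with beeline's standard flags (-u URL, -n user, -p password, -e statement):
# beeline -u jdbc:hive2://hadoop1:10000 -n root -p 123456 -e "show databases;"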
9. Errors and solutions
As the saying goes: when something goes wrong, read the logs. By default Hive writes its log to /tmp/root/hive.log (more generally /tmp/<user>/hive.log for the user running Hive).
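When something fails, tailing that file while re-running the failing command usually reveals the real root cause rather than the truncated message printed on the console:
# tail -f /tmp/root/hive.log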
Error 1: Right after installation, starting hive threw an error. It turned out to be a binding conflict between the slf4j jars pulled in by Hadoop and HBase; removing one of the duplicate bindings is enough. Solution:
# rm -rf /opt/hive/lib/log4j-slf4j-impl-2.4.1.jar
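If it is not obvious which jar is the duplicate, the SLF4J warning itself prints the paths of both bindings; you can also hunt them down manually (adjust the search root to where your Hadoop/HBase/Hive installs live):
# find /opt -name "*slf4j*.jar" 2>/dev/null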
Error 2: Of course it couldn't be that easy; no sooner had I fixed error 1 than another one showed up. Hive could not connect to the MySQL database; it turned out the account/password in hive-site.xml did not match the account that was granted privileges in MySQL. Solution: make them identical and it works~
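An easy way to rule out the database side is to try the exact credentials from hive-site.xml by hand; if this fails, the problem is in the MySQL grants rather than in Hive:
# mysql -h hadoop1 -uroot -p123456 hive_metadata -e "select 1;"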
Error 3: Stay calm. Three errors in a row; by now I'm as used to error messages as I am to being single~. I found the fix online, and after applying it everything finally worked. Solution: tracing through the Hive source shows the configuration to change is as follows:
<property>
  <name>datanucleus.readOnlyDatastore</name>
  <value>false</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>false</value>
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.autoCreateTables</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.autoCreateColumns</name>
  <value>true</value>
</property>
Alternatively, change:
<property>
  <name>datanucleus.schema.autoCreateAll</name>
  <value>false</value>
  <description>creates necessary schema on a startup if one doesn't exist. set this to false, after creating it once</description>
</property>
to:
<property>
  <name>datanucleus.schema.autoCreateAll</name>
  <value>true</value>
  <description>creates necessary schema on a startup if one doesn't exist. set this to false, after creating it once</description>
</property>