hadoop-2.8.1.tar.gz
apache-maven-3.3.9
mysql-5.1
Host plan (xx.xx.xx.xx stands for each node's IP address):
xx.xx.xx.xx  NN (NameNode)  hadoop01
xx.xx.xx.xx  DN (DataNode)  hadoop02
xx.xx.xx.xx  DN (DataNode)  hadoop03
xx.xx.xx.xx  DN (DataNode)  hadoop04
xx.xx.xx.xx  DN (DataNode)  hadoop05
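The host plan above maps straight onto /etc/hosts entries. A minimal sketch with placeholder IPs (substitute your real addresses); it writes to a scratch file first so the mapping can be reviewed before appending it to /etc/hosts as root:

```shell
# Placeholder IPs for illustration only - replace with your real addresses.
cat > /tmp/hosts.hadoop <<'EOF'
192.168.1.101 hadoop01
192.168.1.102 hadoop02
192.168.1.103 hadoop03
192.168.1.104 hadoop04
192.168.1.105 hadoop05
EOF
# Review the file, then append it as root:
#   cat /tmp/hosts.hadoop >> /etc/hosts
grep -c hadoop /tmp/hosts.hadoop
```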
This walkthrough is a pseudo-distributed deployment and only needs host hadoop01; for the base software installation, refer to the earlier pseudo-distributed deployment guide.
Before compiling Hive, read the README.txt in the Hive source tree so that we download the matching software versions.

Requirements
============
- Java 1.6, 1.7  -- the JDK must be 1.6 or 1.7; 1.8 will not work
- Hadoop 1.x, 2.x
2. Install the JDK
mkdir /usr/java && cd /usr/java/
tar -zxvf /tmp/server-jre-7u80-linux-x64.tar.gz
chown -R root:root /usr/java/jdk1.7.0_80/
echo 'export JAVA_HOME=/usr/java/jdk1.7.0_80'>>/etc/profile
source /etc/profile
3. Install Maven
cd /usr/local/
unzip /tmp/apache-maven-3.3.9-bin.zip
chown root: /usr/local/apache-maven-3.3.9 -R
echo 'export MAVEN_HOME=/usr/local/apache-maven-3.3.9'>>/etc/profile
echo 'export MAVEN_OPTS="-Xms256m -Xmx512m"'>>/etc/profile
echo 'export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH'>>/etc/profile
source /etc/profile
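A quick sanity check that the JAVA_HOME, MAVEN_HOME, and PATH exports above took effect; this sketch prints each tool's version banner, and warns instead of failing when a tool is not yet on the PATH:

```shell
check_tool() {
  # Print the first line of the tool's version banner, or a warning.
  if command -v "$1" >/dev/null 2>&1; then
    "$1" -version 2>&1 | head -n 1
  else
    echo "WARN: $1 not on PATH - re-run 'source /etc/profile'"
  fi
}
check_tool java
check_tool mvn
```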
-- For JDK and Maven deployment details, see the earlier post on deploying, installing, compiling, and packaging Hadoop.
4. Install MySQL
yum -y install mysql-server mysql
/etc/init.d/mysqld start
chkconfig mysqld on
mysqladmin -u root password 123456
mysql -uroot -p123456
use mysql;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost' IDENTIFIED BY '123456' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'127.0.0.1' IDENTIFIED BY '123456' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY '123456' WITH GRANT OPTION;
update user set password=password('123456') where user='root';
delete from user where not (user='root') ;
delete from user where user='root' and password='';
drop database test;
DROP USER ''@'%';
flush privileges;
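The grant and cleanup statements above can also be kept in one reusable script so the hardening step is repeatable. A sketch, assuming the root password 123456 chosen earlier (/tmp/mysql_secure.sql is a hypothetical path):

```shell
cat > /tmp/mysql_secure.sql <<'EOF'
GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost' IDENTIFIED BY '123456' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY '123456' WITH GRANT OPTION;
DELETE FROM mysql.user WHERE user <> 'root';
DROP DATABASE IF EXISTS test;
FLUSH PRIVILEGES;
EOF
# Apply it in one shot:
#   mysql -uroot -p123456 < /tmp/mysql_secure.sql
grep -c ';' /tmp/mysql_secure.sql
```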
#
# Pick the Hive source package that matches your CDH version:
# hive-1.1.0-cdh6.7.1-src.tar.gz
# After unpacking, build the binary distribution with Maven:
cd /tmp/
tar -xf hive-1.1.0-cdh6.7.1-src.tar.gz
cd /tmp/hive-1.1.0-cdh6.7.1
mvn clean package -DskipTests -Phadoop-2 -Pdist
# The compiled package is produced at:
# packaging/target/apache-hive-1.1.0-cdh6.7.1-bin.tar.gz
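Builds of this size can take a while; once Maven reports BUILD SUCCESS, it is worth confirming the tarball actually exists before moving on. A small sketch (run from the source directory):

```shell
TARBALL=packaging/target/apache-hive-1.1.0-cdh6.7.1-bin.tar.gz
if [ -f "$TARBALL" ]; then
  echo "build artifact present: $TARBALL"
else
  echo "WARN: $TARBALL missing - check the mvn log for BUILD SUCCESS"
fi
```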
cd /usr/local/
tar -xf /tmp/apache-hive-1.1.0-cdh6.7.1-bin.tar.gz
ln -s apache-hive-1.1.0-cdh6.7.1-bin hive
chown -R hadoop:hadoop apache-hive-1.1.0-cdh6.7.1-bin
chown -R hadoop:hadoop hive
echo 'export HIVE_HOME=/usr/local/hive'>>/etc/profile
echo 'export PATH=$HIVE_HOME/bin:$PATH'>>/etc/profile
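One step the Hive tarball does not do for you: the MySQL JDBC driver (Connector/J) is not bundled, and the metastore connection configured in hive-site.xml below needs it in $HIVE_HOME/lib. A sketch, assuming the jar was downloaded to /tmp (the 5.1.47 filename is a placeholder; use whichever 5.1.x jar you fetched):

```shell
JDBC_JAR=/tmp/mysql-connector-java-5.1.47.jar  # hypothetical download path
if [ -f "$JDBC_JAR" ]; then
  cp "$JDBC_JAR" /usr/local/hive/lib/
else
  echo "WARN: $JDBC_JAR not found - download Connector/J first"
fi
```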
su - hadoop
cd /usr/local/hive
cd conf
1. hive-env.sh
cp hive-env.sh.template hive-env.sh && vi hive-env.sh
HADOOP_HOME=/usr/local/hadoop
2. hive-site.xml
vi hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/vincent_hive?createDatabaseIfNotExist=true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>root</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>123456</value>
</property>
</configuration>
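With the metastore config in place, a first-run smoke test (assuming MySQL is running and the hadoop user's environment picks up HIVE_HOME); on Hive 1.1 the metastore tables are created in the MySQL database on first use:

```shell
if command -v hive >/dev/null 2>&1; then
  hive -e 'show databases;'
else
  echo "WARN: hive not on PATH - log in again as hadoop or source /etc/profile"
fi
```

If the connection settings are wrong, this is where a "Communications link failure" or access-denied error will surface.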