Big Data with Hadoop: Deploying a Hadoop + Hive + MySQL Environment (Linux)
The following are the basic steps for deploying a Hadoop, Hive, and MySQL environment. Note that the steps may vary depending on your Linux distribution and on the versions of Hadoop, Hive, and MySQL you use.
- Install Java
sudo apt-get update
sudo apt-get install openjdk-8-jdk
- Configure environment variables
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64' | sudo tee -a /etc/profile
echo 'export PATH=$PATH:$JAVA_HOME/bin' | sudo tee -a /etc/profile
source /etc/profile
- Verify the Java installation
java -version
- Install Hadoop
wget https://downloads.apache.org/hadoop/common/hadoop-3.2.2/hadoop-3.2.2.tar.gz
tar -xzf hadoop-3.2.2.tar.gz
sudo mv hadoop-3.2.2 /usr/local/hadoop
echo 'export HADOOP_HOME=/usr/local/hadoop' | sudo tee -a /etc/profile
echo 'export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin' | sudo tee -a /etc/profile
source /etc/profile
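You can confirm that the Hadoop binaries are now on the PATH:
hadoop version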
- Configure Hadoop
Edit /usr/local/hadoop/etc/hadoop/hadoop-env.sh and set JAVA_HOME:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
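The hadoop-env.sh edit alone is not enough for a runnable single-node (pseudo-distributed) setup: HDFS also needs a default filesystem and a replication factor. Below is a minimal sketch of core-site.xml and hdfs-site.xml; localhost:9000 and a replication factor of 1 are common single-node assumptions rather than values from this article, and start-dfs.sh additionally assumes passwordless SSH to localhost.
cat <<'EOF' | sudo tee /usr/local/hadoop/etc/hadoop/core-site.xml
<configuration>
  <!-- Assumed single-node setup: HDFS NameNode on localhost -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF
cat <<'EOF' | sudo tee /usr/local/hadoop/etc/hadoop/hdfs-site.xml
<configuration>
  <!-- Single node, so keep only one copy of each block -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF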
- Install MySQL
sudo apt-get update
sudo apt-get install mysql-server
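The hive-site.xml shown below connects to MySQL as user hive with password password, so that account must exist before Hive is initialized. A minimal sketch, assuming the MySQL root account is reachable via sudo (the default auth_socket configuration on Ubuntu); the database name hive_metadata matches the JDBC URL used later:
sudo mysql <<'EOF'
-- Match the database name and credentials used in hive-site.xml
CREATE DATABASE IF NOT EXISTS hive_metadata;
CREATE USER IF NOT EXISTS 'hive'@'localhost' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON hive_metadata.* TO 'hive'@'localhost';
FLUSH PRIVILEGES;
EOF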
- Install Hive
wget https://downloads.apache.org/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz
tar -xzf apache-hive-3.1.2-bin.tar.gz
sudo mv apache-hive-3.1.2-bin /usr/local/hive
echo 'export HIVE_HOME=/usr/local/hive' | sudo tee -a /etc/profile
echo 'export PATH=$PATH:$HIVE_HOME/bin' | sudo tee -a /etc/profile
source /etc/profile
- Configure Hive
Edit /usr/local/hive/conf/hive-site.xml (create it if it does not exist) and configure MySQL as the metastore:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive_metadata?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <!-- Connector/J 8.x uses com.mysql.cj.jdbc.Driver; the old com.mysql.jdbc.Driver is deprecated -->
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>password</value>
  </property>
</configuration>
- Add the MySQL JDBC driver to Hive's lib directory
wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.23/mysql-connector-java-8.0.23.jar
sudo cp mysql-connector-java-8.0.23.jar /usr/local/hive/lib/
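A known pitfall with this version pairing: Hive 3.1.2 bundles an older Guava jar than Hadoop 3.2.x, which commonly makes schematool fail with a NoSuchMethodError on com.google.common.base.Preconditions. A frequently used workaround is to replace Hive's Guava with Hadoop's copy; the exact jar versions below are what these releases typically ship and may differ in your downloads:
sudo rm /usr/local/hive/lib/guava-19.0.jar
sudo cp /usr/local/hadoop/share/hadoop/common/lib/guava-27.0-jre.jar /usr/local/hive/lib/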
- Initialize the Hive metastore schema (for a fresh install use -initSchema; -upgradeSchema is a separate, mutually exclusive action for upgrading an existing schema)
schematool -dbType mysql -initSchema
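Before starting HDFS for the first time, format the NameNode. Do this only once; reformatting destroys existing HDFS metadata:
hdfs namenode -format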
- Start Hadoop and Hive
start-dfs.sh
hive
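As a quick end-to-end check that the Hive CLI can reach the MySQL-backed metastore:
hive -e "SHOW DATABASES;"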
Note that these commands may need to be adjusted for your specific environment, for example for different versions of Hadoop, Hive, and MySQL, or for a different Linux distribution.