Installing and Configuring MySQL, Hive, and Sqoop

Installing and Configuring MySQL

1. Check which MariaDB packages are installed on the system

rpm -qa|grep mariadb

2. Remove the MariaDB packages found above; mariadb-libs-5.5.68-1.el7.x86_64 here is the package name reported by the previous step

rpm -e --nodeps mariadb-libs-5.5.68-1.el7.x86_64

3. Run the following commands to download and install MySQL 5.7

wget http://dev.mysql.com/get/mysql57-community-release-el7-10.noarch.rpm
yum -y install mysql57-community-release-el7-10.noarch.rpm
yum install mysql-community-server

If you get an error saying the public key for mysql-community-client-5.7.40-1.el7.x86_64.rpm is not installed, run:

rpm --import https://repo.mysql.com/RPM-GPG-KEY-mysql-2022

Then run the install again:

yum install mysql-community-server

4. Start the MySQL service

systemctl start mysqld.service

5. Check the MySQL service status

systemctl status mysqld.service

6. Retrieve the auto-generated initial password from the MySQL log: look for the line containing root@localhost: — the string after the colon is the temporary password (a sample line is shown after the command)

grep "password" /var/log/mysqld.log


7. Log in to MySQL (the string after -p is the temporary password from step 6)

mysql -u root -pW#gfyW.y7,v

8. Change the password (the first two statements relax the password policy and minimum length so that a short password such as 123456 is accepted)

mysql> set global validate_password_policy=0;
mysql> set global validate_password_length=4;
mysql> set password=password("123456");

9. Allow root to log in to MySQL remotely

mysql> grant all privileges on *.* to 'root'@'%' identified by '123456';
mysql> flush privileges;
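Optionally, you can check remote access from another machine on the network. This is only a quick sanity check, assuming the MySQL host resolves as hadoop0 (the hostname used later in hive-site.xml) and that port 3306 is reachable through the firewall:

mysql -h hadoop0 -u root -p123456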

10. Exit

mysql>exit

11. Log back in to MySQL with the new password

mysql -u root -p123456

Installing and Configuring Hive

1. Go to the directory where the Hive package is located and extract it

cd /opt/packages
tar -zxvf apache-hive-1.2.2-bin.tar.gz -C /opt/programs/

2. Log in to MySQL and create a database named hive for the metastore

mysql -u root -p123456
mysql> create database hive character set latin1;
exit

3. Upload the local mysql-connector-java-5.1.48.jar to Hive's lib directory via Xftp

cd /opt/programs/apache-hive-1.2.2-bin/lib

4. Go to Hive's conf directory, create a new hive-site.xml, and add the configuration below

cd /opt/programs/apache-hive-1.2.2-bin/conf

hive-site.xml

"1.0" encoding="UTF-8" standalone="no"?>
type="text/xsl" href="configuration.xsl"?>

    javax.jdo.option.ConnectionURL</name>
    jdbc:mysql://hadoop0:3306/hive?useSSL=false</value>
</property>

    javax.jdo.option.ConnectionDriverName</name>
    com.mysql.jdbc.Driver</value>
</property>

    javax.jdo.option.ConnectionUserName</name>
    root</value>
</property>

    javax.jdo.option.ConnectionPassword</name>
    123456</value>
</property>

    hive.metastore.schema.verification</name>
    false</value>
</property>
</configuration>

5. Set the environment variables: edit /etc/profile, append the export lines below, then reload the file with source

vim /etc/profile
export HIVE_HOME=/opt/programs/apache-hive-1.2.2-bin
export PATH=$PATH:$HIVE_HOME/bin
export HIVE_CONF_DIR=$HIVE_HOME/conf
source /etc/profile

6. Initialize the metastore schema

cd /opt/programs/apache-hive-1.2.2-bin/bin
schematool -initSchema  -dbType mysql -verbose

7. Run the hive command to check that the installation succeeded

hive
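If the hive> prompt appears, the CLI is up. As an optional sanity check of the MySQL-backed metastore, you can list the databases:

hive> show databases;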

Installing and Configuring Sqoop

1. Go to the directory where the Sqoop package is located and extract it

cd /opt/packages
tar -zxvf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz -C /opt/programs/

2. Upload the local mysql-connector-java-5.1.48.jar to Sqoop's lib directory via Xftp

cd /opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0/lib

3. Go to the conf folder under the Sqoop directory and copy sqoop-env-template.sh to sqoop-env.sh

cd /opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0/conf
cp sqoop-env-template.sh sqoop-env.sh

4. Edit the sqoop-env.sh file

vim sqoop-env.sh

Append the following to the end of the file:

export HADOOP_COMMON_HOME=/opt/programs/hadoop-2.7.2
export HADOOP_MAPRED_HOME=/opt/programs/hadoop-2.7.2
export HIVE_HOME=/opt/programs/apache-hive-1.2.2-bin

5. Set the environment variables: edit /etc/profile, append the lines below, then reload the file with source

vim /etc/profile
export SQOOP_HOME=/opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0
export PATH=$PATH:${SQOOP_HOME}/bin
export CLASSPATH=$CLASSPATH:${SQOOP_HOME}/lib
source /etc/profile

6. Run the following command to check that the installation succeeded

sqoop version

Output similar to the following indicates success:

22/11/13 13:50:59 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Sqoop 1.4.7
git commit id 2328971411f57f0cb683dfb79d19d4d19d185dd8
Compiled by maugli on Thu Dec 21 15:59:58 STD 2017

Importing Data from MySQL into Hive

1. Log in to MySQL and run the following statements

mysql -u root -p123456
create database test;

use test;

create table user(user_id int,user_name varchar(64));

insert into user values (1,'Justin');
insert into user values (2,'Mars');
insert into user values (3,'Alano');
insert into user values (4,'Alex');

2. Start Hive and run the following statement

hive
create table user_mysql(user_id int, user_name varchar(64)) row format delimited fields terminated by ",";

3. Import the data from the MySQL user table into the Hive user_mysql table with sqoop import (a full command sketch follows below)

sqoop import
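The options are not spelled out above; the following is a minimal sketch based on the setup from the previous steps (MySQL on hadoop0, database test, table user, user root with password 123456, Hive table user_mysql with "," as field delimiter). The exact flags used in the original run are not shown, so treat this as one reasonable way to do it:

# import the MySQL table 'user' into the Hive table 'user_mysql'
sqoop import \
  --connect jdbc:mysql://hadoop0:3306/test \
  --username root \
  --password 123456 \
  --table user \
  --fields-terminated-by ',' \
  --hive-import \
  --hive-table user_mysql \
  -m 1

-m 1 runs a single map task, which avoids the need for a --split-by column since the user table has no primary key. After the job finishes, select * from user_mysql; in Hive should return the four rows inserted earlier.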

Exporting Data from Hive to MySQL

1. Log in to MySQL and run the following statements

mysql -u root -p123456
create table user2 like user;

2. Export the data from the Hive user_mysql table into the MySQL user2 table with sqoop export (a full command sketch follows below)

sqoop export
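As with the import, only the bare command is given; here is a minimal sketch, assuming the user_mysql data sits at the default Hive warehouse location /user/hive/warehouse/user_mysql (this path is an assumption, not confirmed above):

# export the Hive table's files back into the MySQL table 'user2'
sqoop export \
  --connect jdbc:mysql://hadoop0:3306/test \
  --username root \
  --password 123456 \
  --table user2 \
  --export-dir /user/hive/warehouse/user_mysql \
  --input-fields-terminated-by ',' \
  -m 1

--input-fields-terminated-by must match the "," delimiter declared for the Hive table, otherwise the rows are parsed incorrectly during the export.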

Possible Errors

1. If you see this error:

ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

This means the HIVE_CONF_DIR environment variable is missing.
Solution:

vim /etc/profile

export HIVE_CONF_DIR=$HIVE_HOME/conf

source /etc/profile

2. If you see these errors:

ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf

Solution:
Copy hive-common-1.2.2.jar and hive-exec-1.2.2.jar from Hive's lib directory into Sqoop's lib directory:

cp /opt/programs/apache-hive-1.2.2-bin/lib/hive-common-1.2.2.jar  /opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0/lib/
cp /opt/programs/apache-hive-1.2.2-bin/lib/hive-exec-1.2.2.jar  /opt/programs/sqoop-1.4.7.bin__hadoop-2.6.0/lib/

3. If you see this error:

ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs:

Solution:
The directory /user/mysql already exists in HDFS and must be deleted first:

hdfs dfs -rm -r  /user/mysql

Original article: https://blog.csdn.net/weixin_45942827/article/details/127927558 (author: 山乀)
