SpringBoot Integration with Hive (with Kerberos Authentication) as a Third-Party Data Source

Hive Database Connection Notes

  • 1. Without Kerberos authentication, a normal JDBC URL plus a username and password is enough to obtain a Connection (a minimal sketch follows this list).
  • 2. With Kerberos authentication enabled, no password is needed; instead you need a keytab file (the key material) and a krb5 configuration file (krb5.conf).
  • 3. Both files are obtained from the Hive cluster administrator.
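For the non-Kerberos case in point 1, obtaining the connection is plain JDBC. A minimal sketch follows; the host, database, and credentials are placeholders, not values from the original project.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class PlainHiveConnectionExample {
    public static void main(String[] args) throws SQLException, ClassNotFoundException {
        // Placeholder host, database and credentials for illustration only
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-server.example.com:10000/db_demo", "user", "password");
        System.out.println(conn.getMetaData().getDatabaseProductName());
        conn.close();
    }
}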

Pitfalls When Connecting with Kerberos Enabled

  • 1. Authentication fails outright: usually the account, the krb5 file, or the keytab file is wrong.
  • 2. Authentication succeeds but no connection can be obtained: the connection used an IP address while the krb5 file is configured with domain names, so negotiation fails. Use the domain name consistently (see the URL sketch after this list).
  • 3. The connection succeeds but SQL execution fails: the Hive database name was wrong. The connection still works; the database simply contains no tables.
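To make the second pitfall concrete, here is a minimal sketch of what the Kerberos JDBC URL should look like. The host name hive-server.example.com and realm EXAMPLE.COM are placeholders, not values from the original project.

public class KerberosUrlExample {
    public static void main(String[] args) {
        // Placeholder host and realm, for illustration only.
        // The host in the URL must be the same domain name that krb5.conf maps to the realm;
        // a raw IP here typically fails GSSAPI negotiation even though the keytab login succeeded.
        String kerberosUrl = "jdbc:hive2://hive-server.example.com:10000/db_demo;"
                + "principal=hive/hive-server.example.com@EXAMPLE.COM";
        System.out.println(kerberosUrl);
    }
}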

Writing the Hive JDBC Connection Parameter Classes

These include the JDBC connection base class; other data sources such as Oracle, MySQL, MaxCompute, and DataWorks (covered in earlier articles) will also be integrated on top of it later. A sketch of binding these parameters from Spring Boot configuration follows the two classes below.


import lombok.Data;
import lombok.EqualsAndHashCode;

import java.io.Serializable;

/**
 * Common JDBC connection parameters shared by all data sources.
 */
@Data
public class BaseJdbcConnParam implements Serializable {

    /** JDBC driver class name */
    private String driverName;

    /** database IP or domain name */
    private String ip;

    /** database port */
    private Integer port;

    /** database name */
    private String dbName;

    /** username */
    private String username;

    /** password */
    private String password;
}

/**
 * Hive-specific connection parameters, extending the base class with Kerberos settings.
 */
@Data
@EqualsAndHashCode(callSuper = false)
public class HiveJdbcConnParam extends BaseJdbcConnParam {

    /** whether Kerberos authentication is enabled */
    private boolean enableKerberos;

    /** Kerberos principal */
    private String principal;

    /** path to the krb5.conf file */
    private String kbr5FilePath;

    /** path to the keytab file */
    private String keytabFilePath;
}
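Since the goal is a Spring Boot integration, these parameters would normally be bound from configuration rather than hard-coded. Below is a minimal sketch assuming a hypothetical hive.jdbc prefix; the property names simply mirror the fields above and are not part of the original project.

import com.itdl.properties.HiveJdbcConnParam;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

/**
 * Hedged sketch: binds the Hive connection parameters from application.yml, e.g.
 *
 *   hive:
 *     jdbc:
 *       driver-name: org.apache.hive.jdbc.HiveDriver
 *       ip: hive-server.example.com
 *       port: 10000
 *       db-name: db_demo
 *       enable-kerberos: true
 *       principal: hive/hive-server.example.com@EXAMPLE.COM
 *       kbr5-file-path: /etc/krb5.conf
 *       keytab-file-path: /etc/security/keytabs/user.keytab
 *
 * The prefix and sample values are assumptions for illustration only.
 */
@Component
@ConfigurationProperties(prefix = "hive.jdbc")
public class HiveJdbcProperties extends HiveJdbcConnParam {
}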

Writing the Hive Connection Utility Class

This class is responsible for obtaining Hive connections, both plain connections and Kerberos-authenticated ones.


import com.itdl.common.base.ResultCode;
import com.itdl.exception.BizException;
import com.itdl.properties.HiveJdbcConnParam;
import lombok.extern.slf4j.Slf4j;
import org.apache.hadoop.security.UserGroupInformation;

import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

@Slf4j
public class HiveConnUtil {

    /** Hive connection parameters */
    private final HiveJdbcConnParam connParam;

    /** connection built from the parameters */
    private final Connection connection;

    public HiveConnUtil(HiveJdbcConnParam connParam) {
        this.connParam = connParam;
        this.connection = buildConnection();
    }

    public Connection getConnection() {
        return connection;
    }

    private Connection buildConnection(){
        try {
            // Load the Hive JDBC driver class (org.apache.hive.jdbc.HiveDriver)
            Class.forName(connParam.getDriverName());
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            throw new BizException(ResultCode.HIVE_DRIVE_LOAD_ERR);
        }

        String jdbcUrl = "jdbc:hive2://%s:%s/%s";
        String ip = connParam.getIp();
        String port = connParam.getPort() + "";
        String dbName = connParam.getDbName();
        final String username = connParam.getUsername();
        final String password = connParam.getPassword();

        final boolean enableKerberos = connParam.isEnableKerberos();

        Connection connection;

        try {
            if (!enableKerberos) {
                jdbcUrl = String.format(jdbcUrl, ip, port, dbName);
                connection = DriverManager.getConnection(jdbcUrl, username, password);
            } else {
                final String principal = connParam.getPrincipal();
                final String kbr5FilePath = connParam.getKbr5FilePath();
                final String secretFilePath = connParam.getKeytabFilePath();

                // The Kerberos URL must carry the HiveServer2 service principal
                String format = "jdbc:hive2://%s:%s/%s;principal=%s";
                jdbcUrl = String.format(format, ip, port, dbName, principal);

                // Point the JVM at the krb5.conf provided by the Hive administrator
                System.setProperty("java.security.krb5.conf", kbr5FilePath);
                System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

                // Switch Hadoop's UserGroupInformation to Kerberos authentication
                org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
                conf.set("hadoop.security.authentication", "Kerberos");
                conf.set("keytab.file", secretFilePath);
                conf.set("kerberos.principal", principal);
                UserGroupInformation.setConfiguration(conf);
                try {
                    // Log in with the keytab; failure here usually means the account, krb5 or keytab file is wrong
                    UserGroupInformation.loginUserFromKeytab(username, secretFilePath);
                } catch (IOException e) {
                    e.printStackTrace();
                    throw new BizException(ResultCode.KERBEROS_AUTH_FAIL_ERR);
                }
                try {
                    connection = DriverManager.getConnection(jdbcUrl);
                } catch (SQLException e) {
                    e.printStackTrace();
                    throw new BizException(ResultCode.KERBEROS_AUTH_SUCCESS_GET_CONN_FAIL_ERR);
                }
            }
            log.info("=====>>>Hive connection obtained successfully: username: {}, jdbcUrl: {}", username, jdbcUrl);
            return connection;
        } catch (SQLException e) {
            e.printStackTrace();
            throw new BizException(ResultCode.HIVE_CONN_USER_PWD_ERR);
        } catch (BizException e) {
            throw e;
        } catch (Exception e) {
            e.printStackTrace();
            throw new BizException(ResultCode.HIVE_CONN_ERR);
        }
    }

}
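One caveat: System.setProperty("java.security.krb5.conf", ...) and UserGroupInformation.setConfiguration(...) are JVM-global, so if the application ever connects to Hive clusters in different Kerberos realms, the setup-and-login sequence should not run concurrently. A minimal sketch of serializing it under that assumption (the guard class is hypothetical, not part of the original project):

import com.itdl.properties.HiveJdbcConnParam;
import com.itdl.util.HiveConnUtil;

import java.sql.Connection;

public class KerberosLoginGuard {

    /** Lock guarding the JVM-global Kerberos configuration and login. */
    private static final Object KERBEROS_LOCK = new Object();

    /** Serialize Kerberos setup so logins for different realms cannot interleave. */
    public static Connection openGuarded(HiveJdbcConnParam param) {
        synchronized (KERBEROS_LOCK) {
            return new HiveConnUtil(param).getConnection();
        }
    }
}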

Writing the SQL Operation Utility Class

Executes SQL against a given connection. It is used here for testing; in the actual third-party data source integration it serves as the utility class that runs SQL statements against those sources.

package com.itdl.util;

import com.alibaba.fastjson.JSONObject;
import com.google.common.collect.Lists;
import com.itdl.common.base.ResultCode;
import com.itdl.exception.BizException;
import com.itdl.properties.HiveJdbcConnParam;

import java.sql.*;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;

public class SqlUtil {

    private final Connection connection;

    public SqlUtil(Connection connection) {
        this.connection = connection;
    }

    public static SqlUtil build(Connection connection){
        return new SqlUtil(connection);
    }

    /** Execute a query and map every row to an ordered column -> value map. */
    public List<LinkedHashMap<String, Object>> querySql(String sql){

        Statement statement = null;
        ResultSet resultSet = null;
        try {
            statement = connection.createStatement();
            resultSet = statement.executeQuery(sql);
            return buildListMap(resultSet);
        } catch (SQLException e) {
            e.printStackTrace();
            throw new BizException(ResultCode.SQL_EXEC_ERR.getCode(), e.getMessage());
        }finally {

            close(resultSet, statement);
        }
    }

    /** Close any combination of ResultSet, Statement and Connection objects, ignoring individual failures. */
    public void close(Object ...objs){
        if (objs == null || objs.length == 0){
            return;
        }

        for (Object obj : objs) {
            if (obj instanceof Statement){
                try {
                    ((Statement) obj).close();
                }catch (Exception e){
                    e.printStackTrace();
                }
            }

            if (obj instanceof ResultSet){
                try {
                    ((ResultSet) obj).close();
                }catch (Exception e){
                    e.printStackTrace();
                }
            }

            if (obj instanceof Connection){
                try {
                    ((Connection) obj).close();
                }catch (Exception e){
                    e.printStackTrace();
                }
            }
        }
    }

    private List<LinkedHashMap<String, Object>> buildListMap(ResultSet resultSet) throws SQLException {
        if (resultSet == null) {
            return Lists.newArrayList();
        }

        List<LinkedHashMap<String, Object>> resultList = new ArrayList<>();

        ResultSetMetaData metaData = resultSet.getMetaData();
        while (resultSet.next()) {

            int columnCount = metaData.getColumnCount();
            LinkedHashMap<String, Object> map = new LinkedHashMap<>();
            for (int i = 0; i < columnCount; i++) {
                String columnName = metaData.getColumnName(i + 1);

                // Skip the internal row-number helper column if it is present
                if("mm.row_num_01".equalsIgnoreCase(columnName)
                        || "row_num_01".equalsIgnoreCase(columnName)){
                    continue;
                }

                // Strip the "mm." alias prefix so callers see the plain column name
                if (columnName.startsWith("mm.")){
                    columnName = columnName.substring(columnName.indexOf(".") + 1);
                }

                Object object = resultSet.getObject(columnName);
                map.put(columnName, object);
            }

            resultList.add(map);
        }
        return resultList;
    }
}
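The class above runs raw SQL through a Statement, which is fine for internally generated queries such as "show databases". If query conditions ever come from user input, a parameterized variant is safer. A minimal sketch, not part of the original project (the Hive JDBC driver provides a PreparedStatement implementation for basic parameter types):

import java.sql.*;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;

public class ParameterizedQueryExample {

    /** Same row-to-map idea as SqlUtil, but with bound parameters instead of string concatenation. */
    public static List<LinkedHashMap<String, Object>> query(Connection connection,
                                                            String sql,
                                                            Object... params) throws SQLException {
        List<LinkedHashMap<String, Object>> rows = new ArrayList<>();
        try (PreparedStatement ps = connection.prepareStatement(sql)) {
            for (int i = 0; i < params.length; i++) {
                ps.setObject(i + 1, params[i]); // JDBC parameters are 1-indexed
            }
            try (ResultSet rs = ps.executeQuery()) {
                ResultSetMetaData meta = rs.getMetaData();
                while (rs.next()) {
                    LinkedHashMap<String, Object> row = new LinkedHashMap<>();
                    for (int col = 1; col <= meta.getColumnCount(); col++) {
                        row.put(meta.getColumnName(col), rs.getObject(col));
                    }
                    rows.add(row);
                }
            }
        }
        return rows;
    }
}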

Test Method

public static void main(String[] args) {
    final HiveJdbcConnParam connParam = new HiveJdbcConnParam();
    connParam.setDriverName("org.apache.hive.jdbc.HiveDriver");
    connParam.setIp("IP or domain name");
    connParam.setPort(10000);
    connParam.setDbName("database name");
    connParam.setUsername("account");
    // the password is not actually used once Kerberos is enabled
    connParam.setPassword("1212121221");
    // Kerberos settings
    connParam.setEnableKerberos(true);
    connParam.setPrincipal("hive/hostname@REALM");
    connParam.setKbr5FilePath("C:\\workspace\\krb5.conf");
    connParam.setKeytabFilePath("C:\\workspace\\username.keytab");

    final Connection connection = new HiveConnUtil(connParam).getConnection();
    final SqlUtil sqlUtil = SqlUtil.build(connection);
    final List<LinkedHashMap<String, Object>> tables = sqlUtil.querySql("show databases");
    for (LinkedHashMap<String, Object> table : tables) {
        final String s = JSONObject.toJSONString(table);
        System.out.println(s);
    }

    sqlUtil.close(connection);
}

The connection is in hand, SQL executes, and the utility classes are in place. What part of building third-party data source management could still stump a genius like you?

Test Log

18:04:14.719 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, about=, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
18:04:14.729 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, about=, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)
18:04:14.729 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, about=, type=DEFAULT, value=[GetGroups], valueName=Time)
18:04:14.736 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
18:04:14.796 [main] DEBUG org.apache.hadoop.security.Groups -  Creating new Groups object
18:04:14.799 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...

18:04:14.802 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: C:\workspace\software\hadoop\winutils\hadoop-3.0.1\bin\hadoop.dll: Can't load AMD 64-bit .dll on a IA 32-bit platform
18:04:14.803 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - java.library.path=C:\Program Files (x86)\Java\jdk1.8.0_271\bin;C:\windows\Sun\Java\bin;C:\windows\system32;C:\windows;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\windows\system32;C:\windows;C:\windows\System32\Wbem;C:\windows\System32\WindowsPowerShell\v1.0\;C:\windows\System32\OpenSSH\;C:\Program Files\Docker\Docker\resources\bin;C:\ProgramData\DockerDesktop\version-bin;C:\Program Files (x86)\NetSarang\Xshell 7\;C:\Program Files (x86)\NetSarang\Xftp 7\;C:\Program Files\TortoiseGit\bin;C:\Program Files\MIT\Kerberos\bin;C:\workspace\software\python\Scripts\;C:\workspace\software\python\;C:\Users\donglin.he\AppData\Local\Programs\Python\Launcher\;C:\Users\donglin.he\AppData\Local\Microsoft\WindowsApps;C:\workspace\software\Git\cmd;C:\workspace\software\maven\apache-maven-3.6.3\bin;C:\Program Files (x86)\Java\jdk1.8.0_271\bin;C:\workspace\software\PyCharm 2022.1.2\bin;;C:\workspace\software\hadoop\winutils\hadoop-3.0.1\bin;C:\workspace\software\python;C:\workspace\software\python\Scripts;;.

18:04:14.803 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18:04:14.803 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Falling back to shell based
18:04:14.803 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
18:04:14.876 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18:04:15.626 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
18:04:15.627 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
18:04:15.628 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using kerberos user:your-username
18:04:15.628 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Using user: "your-username" with name your-username
18:04:15.628 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "your-username"
18:04:15.628 [main] INFO org.apache.hadoop.security.UserGroupInformation - Login successful for user your-username using keytab file C:\workspace\zhouyu.keytab
18:04:15.641 [main] INFO org.apache.hive.jdbc.Utils - Supplied authorities: hive-domain-name:10000
18:04:15.642 [main] INFO org.apache.hive.jdbc.Utils - Resolved authority: hive-domain-name:10000
18:04:15.656 [main] DEBUG org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge - Current authMethod = KERBEROS
18:04:15.656 [main] DEBUG org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge - Not setting UGI conf as passed-in authMethod of kerberos = current.

18:04:15.678 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:your-username (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
18:04:15.678 [main] DEBUG org.apache.thrift.transport.TSaslTransport - opening transport org.apache.thrift.transport.TSaslClientTransport@1b9f5a4
18:04:15.812 [main] DEBUG org.apache.thrift.transport.TSaslClientTransport - Sending mechanism name GSSAPI and initial response of length 567
18:04:15.819 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status START and payload length 6
18:04:15.820 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status OK and payload length 567
18:04:15.820 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Start message handled
18:04:15.952 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status OK and payload length 104
18:04:15.954 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status OK and payload length 0
18:04:15.993 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status OK and payload length 50
18:04:15.994 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Writing message with status COMPLETE and payload length 50
18:04:15.994 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Main negotiation loop complete
18:04:15.994 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: SASL Client receiving last message
18:04:16.034 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: Received message with status COMPLETE and payload length 0
18:04:16.053 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 67
18:04:16.155 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
18:04:16.239 [main] INFO com.itdl.util.HiveConnUtil - =====>>>Hive connection obtained successfully: username: your-username, jdbcUrl: jdbc:hive2://hive-domain-name:10000/db-name;principal=hive/your-principal
18:04:16.247 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 132
18:04:35.551 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 109
18:04:35.560 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 100
18:04:35.618 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 255
18:04:35.629 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 102
18:04:35.668 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 136
18:04:35.699 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 112
18:04:35.755 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 325
18:04:35.775 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.hive.jdbc.HiveQueryResultSet - Fetched row string:
18:04:35.776 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 112
18:04:35.816 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 96
18:04:35.820 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 96
18:04:35.873 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 42
{"database_name":"db_01"}
{"database_name":"db_02"}
{"database_name":"db_03"}
{"database_name":"db_04"}
{"database_name":"communication_bank"}
18:04:35.965 [main] DEBUG org.apache.thrift.transport.TSaslTransport - writing data length: 83
18:04:36.007 [main] DEBUG org.apache.thrift.transport.TSaslTransport - CLIENT: reading data length: 40

Project Repository

https://github.com/HedongLin123/db-connection-demo

Original: https://blog.csdn.net/qq_35267557/article/details/126276262
Author: IT_DLin
Title: SpringBoot Integration with Hive (with Kerberos Authentication) as a Third-Party Data Source

