This article is shared from the Huawei Cloud community post "Connecting DBeaver to FusionInsight MRS Spark2x", author: Jin Hongqing.
Connecting to Spark2x with a custom JDBC driver
- Create a new jaas.conf file in the C:\ecotesting\Fiber\conf directory, with the following content:
Client {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="C:\\ecotesting\\Fiber\\conf\\user.keytab"
principal="developuser"
useTicketCache=false
storeKey=true
debug=true;
};
Note: the principal parameter is the authentication user name and the keyTab parameter is the path to the corresponding authentication (keytab) file.
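Optionally, the keytab and jaas.conf can be sanity-checked outside DBeaver with a minimal JAAS login sketch like the one below. This is not part of the original guide; it assumes a local JDK and that the cluster's krb5.conf has been copied to a hypothetical path such as C:\ecotesting\Fiber\conf\krb5.conf. The login entry name "Client" matches the entry defined above.

import javax.security.auth.login.LoginContext;

public class JaasLoginCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical local paths; adjust to your environment.
        System.setProperty("java.security.krb5.conf", "C:\\ecotesting\\Fiber\\conf\\krb5.conf");
        System.setProperty("java.security.auth.login.config", "C:\\ecotesting\\Fiber\\conf\\jaas.conf");
        // "Client" is the login entry name defined in jaas.conf above.
        LoginContext lc = new LoginContext("Client");
        lc.login();
        System.out.println("Kerberos login OK: " + lc.getSubject().getPrincipals());
        lc.logout();
    }
}

If the login succeeds, the keytab, principal, and Kerberos configuration are usable and the same files can be referenced from dbeaver.ini in the steps below.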
- Download the DBeaver software and complete the installation.
- Specify the JDK virtual machine for DBeaver. In the DBeaver installation directory, open dbeaver.ini and set the value of the -vm parameter. A line break is required between the parameter and its value.
Examples are as follows:
-vm
C:\java64\jdk\bin

Add the following at the end of the configuration file:
-Djava.security.auth.login.config=C:\\ecotesting\\Fiber\\conf\\jaas.conf
-Dzookeeper.sasl.clientconfig=Client
-Dzookeeper.auth.type=kerberos
-Dzookeeper.server.principal=zookeeper/hadoop.hadoop.com
- Restart DBeaver
After modifying dbeaver.ini, you need to restart DBeaver for the changes to take effect.
- (Important, FusionInsight 6.5.1) Prepare the Spark2x JDBC connection driver JAR packages
Log in to the Spark2x client on the Linux side and locate the JDBC-related dependencies, for example: /opt/145_651hdclient/hadoopclient/Spark2x/spark/jars/jdbc
Copy all JAR packages under this path to a local Windows directory, such as E:\145config\spark2xjars. Note that the directory also contains a jdbc_pom.xml file, which needs to be deleted.
Continue on the Linux side to the Spark2x client path /opt/145_651hdclient/hadoopclient/Spark2x/spark/jars, find the following four JAR packages, and copy them to the same local Windows directory, such as E:\145config\spark2xjars:
log4j-1.2.17.jar
woodstox-core-5.0.3.jar
stax2-api-3.1.4.jar
commons-configuration2-2.1.1.jar
Note: for MRS 8.0.2, the dependencies to prepare are:
- Log in to the Spark2x client on the Linux side and collect all JDBC-related dependencies under, for example, /opt/80_135_hadoopclient2/hadoopclient/Spark2x/spark/jars/jdbc
- Continue on the Linux side to the Spark2x client path /opt/80_135_hadoopclient2/hadoopclient/Spark2x/spark/jars and find the following five JAR packages:
log4j-1.2.17-atlassian-13.jar
commons-lang-2.6.jar
woodstox-core-5.0.3.jar
stax2-api-3.1.4.jar
commons-configuration2-2.1.jar
- In the DBeaver interface, select Database -> Driver Manager from the menu and click New in the pop-up dialog box.
- Name the new driver FI-spark2x-651-direct. The connection information is as follows; click OK when finished.
1. Class Name: org.apache.hive.jdbc.HiveDriver
2. URL Template: jdbc:hive2://172.16.4.141:24002,172.16.4.142:24002,172.16.4.143:24002/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=sparkthriftserver2x;saslQop=auth-conf;auth=KERBEROS;principal=spark2x/hadoop.hadoop.com@HADOOP.COM;user.principal=developuser;user.keytab=E:/145config/user.keytab
3. Category: Hadoop
4. Click Add File and add all of the Spark2x connection JAR packages prepared in the steps above.
Note: the user.principal=developuser and user.keytab=E:/145config/user.keytab parameters in the connection URL must be included and must be correct.
For reference, the connection URL string can be obtained by running the spark-beeline command on the Linux client.
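The same URL can also be tested outside DBeaver with a short JDBC program. The sketch below is an illustrative assumption, not part of the original guide: it presumes the Spark2x JAR packages prepared above are on the classpath, that jaas.conf and the keytab are at the paths used earlier, and it reuses the JVM options from dbeaver.ini as system properties.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Spark2xJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Same settings as passed to DBeaver via dbeaver.ini (paths are examples).
        System.setProperty("java.security.auth.login.config", "C:\\ecotesting\\Fiber\\conf\\jaas.conf");
        System.setProperty("zookeeper.sasl.clientconfig", "Client");
        System.setProperty("zookeeper.auth.type", "kerberos");
        System.setProperty("zookeeper.server.principal", "zookeeper/hadoop.hadoop.com");
        // java.security.krb5.conf may also need to point to the cluster's krb5.conf.

        // Connection URL copied from the driver configuration above.
        String url = "jdbc:hive2://172.16.4.141:24002,172.16.4.142:24002,172.16.4.143:24002/;"
                + "serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=sparkthriftserver2x;"
                + "saslQop=auth-conf;auth=KERBEROS;"
                + "principal=spark2x/hadoop.hadoop.com@HADOOP.COM;"
                + "user.principal=developuser;user.keytab=E:/145config/user.keytab";

        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW DATABASES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If this program lists the databases, the URL, Kerberos settings, and driver JAR packages are all correct, and the remaining DBeaver steps below should succeed with the same configuration.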
- Select File -> New -> Database Connection from the menu bar and click Next.
- Select FI-spark2x-651-direct and click Next.
- Click Finish
- Right-click FI-spark2x-651-direct and click Edit Connection
- Click Test Connection
The test results for MRS 8.0.2 are as follows:
- View result data
The test results for MRS 8.0.2 are as follows: