
How to solve DataGrip connection errors with HiveServer2


This article mainly introduces how to solve errors when connecting DataGrip to HiveServer2. The article is quite detailed and has reference value; interested friends are encouraged to read it!

DataGrip reports errors when connecting to HiveServer2

1. Heap memory

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

DataGrip can open several sessions; I have four sessions open here.

So if the executed statement is too large, heap memory cannot be reclaimed, resulting in a heap memory overflow. In DataGrip, this shows up as the executing statement hanging with no response; after a long wait, a timeout error is finally reported.

We can appropriately increase the heap memory:

# In hive-config.sh under the hive/bin directory, the last line sets the heap size.
# Default to use 256MB
export HADOOP_HEAPSIZE=${HADOOP_HEAPSIZE:-2048}
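After restarting HiveServer2, one way to confirm that the new heap size took effect is to inspect the Java command line of the HiveServer2 process (a minimal sketch; the pid placeholder is yours to fill in, and this assumes HiveServer2 shows up as a RunJar process, as in the jps output later in this article):

jps -lm                                              # locate the RunJar pid belonging to HiveServer2
ps -o command= -p <pid> | tr ' ' '\n' | grep -i xmx  # the -Xmx value should reflect HADOOP_HEAPSIZE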

2. Too many open files

[lzq@hadoop102 ~]$ ulimit -a
-t: cpu time (seconds)              unlimited
-f: file size (blocks)              unlimited
-d: data seg size (kbytes)          unlimited
-s: stack size (kbytes)             8192
-c: core file size (blocks)         0
-m: resident set size (kbytes)      unlimited
-u: processes                       10000
-n: file descriptors                4096
-l: locked-in-memory size (kbytes)  64
-v: address space (kbytes)          unlimited
-x: file locks                      unlimited
-i: pending signals                 31830
-q: bytes in POSIX msg queues       819200
-e: max nice                        0
-r: max rt priority                 0
-N 15:                              unlimited

Modify the maximum number of open files

In Linux, any external device (socket, file) can be regarded as a file, and a file can be read and written through a file descriptor.
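A quick way to see this in practice (a minimal sketch; $$ expands to the pid of the current shell):

ls -l /proc/$$/fd         # every file the shell has open appears as a symlink here
ls /proc/$$/fd | wc -l    # count the shell's open file descriptors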

The line "-n: file descriptors 4096" describes the maximum number of files a single process can open.

The default is 1024. We can raise the maximum number of open files to 4096 with ulimit -n 4096, but this setting is restored to the default after a restart.
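For example (these commands only affect the current shell session):

ulimit -n         # show the current soft limit on open files
ulimit -n 4096    # raise it for this shell only; lost after logging out or rebooting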

Permanent setting method

vim /etc/security/limits.conf
# add at the end:
* soft nofile 65535
* hard nofile 65535
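The new limits apply to new login sessions. After logging in again, they can be verified like this:

ulimit -Sn    # soft limit, should now print 65535
ulimit -Hn    # hard limit, should also print 65535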

Install lsof

sudo yum install lsof -y

View the number of files currently open by the system

lsof | wc -l
watch "lsof | wc -l"

[lzq@hadoop102 ~]$ jps
25696 Jps
1522 NameNode
22627 RunJar
1716 DataNode
3140 Kafka
2309 NodeManager
2647 QuorumPeerMain
22889 RunJar
23322 RunJar
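Since jps shows several RunJar processes, a small loop can count the open files of each one in a single pass (a minimal sketch; the pids will differ on your machine):

for pid in $(jps | awk '/RunJar/ {print $1}'); do
    echo "$pid: $(lsof -p "$pid" | wc -l)"
done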

Check the number of open files for a process

# RunJar is the HiveServer2 process
lsof -p pid | wc -l       # replace pid with the target process id
lsof -p 22627 | wc -l
lsof -p 22889 | wc -l
lsof -p 23322 | wc -l

That is all the content of this article on "How to solve DataGrip connection errors with HiveServer2". Thank you for reading! I hope the content shared is helpful to you.
