
How to insert a large amount of data in bulk with MyBatis


This article explains in detail how MyBatis inserts a large amount of data in batches. The editor thinks it is very practical and shares it here as a reference; I hope you will get something out of it after reading.

Problem background: only MyBatis's foreach is used to insert data in batches, and when more than a thousand rows are inserted at a time, MyBatis starts to report an error. Technologies used in the project: Spring Boot, MyBatis.

Problems encountered in bulk insertion:

java.lang.StackOverflowError: null

This problem is caused by inserting 10,000 rows of data at one time. The specific insertion code is as follows:

UserDao.batchInsert(list);

<insert id="batchInsert">
    INSERT INTO USER (ID, AGE, NAME, EMAIL)
    SELECT A.* FROM (
    <foreach collection="list" item="user" separator="UNION ALL">
        SELECT sys_guid(), #{user.age}, #{user.name}, #{user.email} FROM dual
    </foreach>
    ) A
</insert>

The above insertion code is also a form of batch insertion, but its upper limit is not high. When inserting too much data, we may need to split it and run the insert in several smaller batches, as sketched below. Of course, you can use it directly if no more than about 5,000 rows are inserted at a time.
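As a rough illustration, here is a minimal sketch of such application-level splitting; the ChunkedBatchInsert class name, the insertInChunks method, and the CHUNK_SIZE constant are illustrative, and UserDao.batchInsert is the foreach mapper shown above.

import java.util.List;

// A sketch of splitting a large list into chunks so each foreach insert stays small.
public class ChunkedBatchInsert {

    // 5,000 follows the rough upper limit mentioned in the text; it is not a hard MyBatis limit.
    private static final int CHUNK_SIZE = 5000;

    public static void insertInChunks(UserDao userDao, List<User> list) {
        for (int from = 0; from < list.size(); from += CHUNK_SIZE) {
            int to = Math.min(from + CHUNK_SIZE, list.size());
            // Each call splices at most CHUNK_SIZE rows into one INSERT ... SELECT statement.
            userDao.batchInsert(list.subList(from, to));
        }
    }
}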

Inserting 10,000 rows produced the error because the amount of data is too large and the stack memory overflows. Inserting data directly with foreach in MyBatis is equivalent to splicing all the SQL together in advance and then submitting it as one statement. This is a batch insertion solution in itself, but it does not meet our requirements, mainly because the insertion has an upper limit. If we need to import even more data, we need a different way to solve the problem: the use of ExecutorType in MyBatis.

The use of ExecutorType in MyBatis

There are three kinds of ExecutorType built into MyBatis: SIMPLE, REUSE, and BATCH, with SIMPLE as the default. In SIMPLE mode a new prepared statement is created for every statement execution and each SQL statement is submitted on its own, while BATCH mode reuses the prepared statements and executes all update statements in batches. Obviously, BATCH performs better. However, BATCH mode also has its own problems: for example, for an insert operation there is no way to obtain the auto-generated id before the transaction is committed, which may not meet the business requirements in certain situations.
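To make the three modes concrete, here is a minimal sketch (the SqlSessionFactory wiring and the ExecutorTypeDemo class name are assumptions for illustration) of how each ExecutorType is selected when opening a session:

import org.apache.ibatis.session.ExecutorType;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class ExecutorTypeDemo {

    public static void openSessions(SqlSessionFactory sqlSessionFactory) {
        // SIMPLE (the default): a new prepared statement for every statement execution
        try (SqlSession simple = sqlSessionFactory.openSession(ExecutorType.SIMPLE)) {
            // single-statement work goes here
        }

        // REUSE: prepared statements are cached and reused across executions
        try (SqlSession reuse = sqlSessionFactory.openSession(ExecutorType.REUSE)) {
            // repeated statements benefit from statement reuse
        }

        // BATCH: update statements are queued and sent as JDBC batches;
        // autoCommit is false, so we decide when to flush and commit
        try (SqlSession batch = sqlSessionFactory.openSession(ExecutorType.BATCH, false)) {
            // batched inserts go here, committed manually
            batch.commit();
        }
    }
}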

A solution for inserting large amounts of data, using ExecutorType

In order to be efficient and to solve the above problem, we use ExecutorType.BATCH and insert in batches. The code is as follows:

// We use a sqlSessionTemplate that can be injected by Spring Boot
@Autowired
private SqlSessionTemplate sqlSessionTemplate;

public void insertExcelData(List<User> list) {
    // autoCommit is set to false; if it were true, the number of rows per commit could not be
    // controlled and everything would be committed together at the end, which may lead to a memory overflow
    SqlSession session = sqlSessionTemplate.getSqlSessionFactory()
            .openSession(ExecutorType.BATCH, false); // do not commit automatically
    try {
        UserDao userDao = session.getMapper(UserDao.class);
        for (int i = 0; i < list.size(); i++) {
            userDao.insert(list.get(i));
            if (i % 400 == 0 || i == list.size() - 1) {
                // commit manually every 400 rows; committed data cannot be rolled back
                session.commit();
                // clear the cache to prevent overflow
                session.clearCache();
            }
        }
    } catch (Exception e) {
        // data that has not been committed yet can be rolled back
        session.rollback();
    } finally {
        session.close();
    }
}

The SQL mapped to userDao.insert(User user):

insert into USER (id, age, name, email)
values (sys_guid(), #{age}, #{name}, #{email})
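For completeness, here is a hypothetical sketch of the UserDao mapper interface referenced above. The interface itself is not shown in the article, so the annotation-based mapping for insert is an assumption, and batchInsert is assumed to map to the XML foreach statement shown earlier.

import java.util.List;
import org.apache.ibatis.annotations.Insert;
import org.apache.ibatis.annotations.Mapper;

@Mapper
public interface UserDao {

    // single-row insert used in the ExecutorType.BATCH loop above;
    // the same SQL could equally live in the XML mapper instead of an annotation
    @Insert("insert into USER (id, age, name, email) values (sys_guid(), #{age}, #{name}, #{email})")
    int insert(User user);

    // foreach-based bulk insert, assumed to map to the <insert id="batchInsert"> statement shown earlier
    int batchInsert(List<User> list);
}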

Here a single-row insert is used in a plain for loop, but because ExecutorType.BATCH is used with auto-commit disabled, nothing is persisted until we commit manually, which is exactly the effect we need. So inside the loop we check whether the 400th row has been reached; if it has, we commit immediately and then clear the cache to prevent overflow. In this way batch insertion is implemented effectively while the overflow problem is avoided at the same time.

This is the end of the article on "how MyBatis inserts a large amount of data in bulk". I hope the above content is of some help to you and that you have learned something from it. If you think the article is good, please share it so that more people can see it.
