
How to implement C++ memory Pool

2025-02-14 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)05/31 Report--

This article introduces how to implement a C++ memory pool. The content is detailed, easy to understand, and practical to follow, and should serve as a useful reference. I believe you will gain something from reading it; let's take a look.

1. Basic knowledge of memory pools

1. What is a memory pool

1.1 Pooling technology

Pooling is a design pattern in which resources that a program uses frequently are requested in advance and managed by the program itself; when a resource is needed, the program takes one directly from the "pool". This both guarantees the program a certain quantity of resources and reduces the time spent requesting and releasing them. Common pooling techniques include memory pools, thread pools, and connection pools.

1.2 memory pool

A memory pool is a dynamic memory allocation and management technique. Its core idea is to request a region of memory in advance, manage it with an efficient data structure (a hash table or linked list), hand a piece of it to the program whenever memory is needed, and take the piece back into the pool when the program is done with it. This reduces the time spent calling new/delete, malloc/free, and similar APIs to request and release memory, improving program efficiency. At the same time, every direct request through new/delete or malloc/free contributes to memory fragmentation; because a memory pool requests memory in large chunks, it reduces fragmentation.

2. The role of the memory pool

2.1 Efficiency

Normally, memory is requested directly from the heap through the new/delete and malloc/free interfaces, and released directly back to the heap. Frequent requests and releases inevitably consume a lot of time and reduce program efficiency.

For example, suppose each linked-list node is 16 bytes. When the list inserts nodes frequently, memory must be requested frequently, and every 16-byte request from the heap takes time, as does every release. With a memory pool, we can request "a batch of nodes" at once and hand the pre-requested memory to the program on demand, without going to the heap each time.
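The batching idea can be sketched as a toy example (illustrative only; `Node`, `NodeBatch`, and the batch size are assumptions, not code from this article): instead of one heap call per 16-byte node, one allocation holds a whole batch and slots are handed out from it.

```cpp
#include <cstddef>

// A 16-byte list node (on a typical 64-bit system).
struct Node {
    char payload[8];
    Node* next;
};

// One big allocation holds the whole batch; Take() hands out slots.
struct NodeBatch {
    static const size_t kBatch = 1024;
    Node slots[kBatch];  // the pre-requested "batch of nodes"
    size_t used = 0;

    // Return the next unused slot, or nullptr when the batch is exhausted.
    Node* Take() { return used < kBatch ? &slots[used++] : nullptr; }
};
```

Consecutive `Take()` calls return adjacent slots, so insertion no longer pays a heap-allocation cost per node.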

2.2 memory fragmentation

Frequent requests for small chunks of memory lead to memory fragmentation. Fragmentation comes in two kinds: internal and external.

1) External fragmentation

External fragmentation is what we usually mean by memory fragmentation. For example, if we repeatedly request 16-byte blocks from memory, it fills with many 16-byte blocks, and releasing them can leave the free space fragmented.

For instance, the total free memory might be 88 bytes while the largest contiguous block that can be requested is only 21 bytes.

2) Internal fragmentation

Internal fragmentation refers to small unused pieces inside allocated memory. Although memory-pool technology alleviates external fragmentation, it introduces internal fragmentation. Internal fragmentation is unavoidable, but it can be reduced through careful design.

For example: the program actually needs 10 bytes, but a fixed-length memory pool may align the allocation and hand out 16 bytes at once; the extra 6 bytes are never used, and those 6 bytes are internal fragmentation.
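The aligned size, and therefore the internal fragment, can be computed with a small helper (`AlignUp` is an illustrative sketch, not part of the article's pool):

```cpp
#include <cstddef>

// Round bytes up to the next multiple of alignment (alignment must be a power of two).
size_t AlignUp(size_t bytes, size_t alignment) {
    return (bytes + alignment - 1) & ~(alignment - 1);
}
```

`AlignUp(10, 16)` yields 16, so a 10-byte request carries 6 bytes of internal fragmentation.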

3. The evolution of memory pool technology

1) The most "naive" memory pool

Maintain a linked list that points to free memory. Allocation pops a piece off the list; release pushes the memory back onto it. Care must be taken with merging, marking, and protection to prevent double-free problems.
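A minimal sketch of such a free list (names are illustrative; the merging and double-free protection mentioned above are omitted): each free block's first bytes store the pointer to the next free block.

```cpp
struct FreeList {
    void* head = nullptr;

    // "Release": push a block onto the list (head insertion).
    void Push(void* block) {
        *(void**)block = head;  // reuse the block's first bytes as a next pointer
        head = block;
    }

    // "Allocate": pop a block off the list (head deletion).
    void* Pop() {
        if (head == nullptr) return nullptr;
        void* block = head;
        head = *(void**)head;
        return block;
    }
};
```

Blocks come back in LIFO order, which also tends to keep recently touched memory warm in cache.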

2) fixed-length memory pool

Implement a FreeList class, which is essentially a linked list whose nodes are fixed-size pieces of memory; head insertion and head deletion serve as release and request. Each fixed-length allocator keeps two lists: OpenList, which stores unallocated free memory objects (FreeList objects), and CloseList, which stores allocated ones.

Allocating memory takes an object from OpenList and gives it to the program; freeing memory pushes the object onto CloseList. When OpenList runs out of memory, it requests a large chunk and cuts it into fixed-length pieces.

3) The memory pool in the C++ STL library

The problem with a fixed-length memory pool is that it can only serve requests of one fixed size, while in practice the sizes we need may not be fixed. The C++ STL library implements a memory pool by combining a hash table with fixed-length memory pools, as follows.

Construct multiple fixed-length memory pools, aligned to a fixed granularity (for example, 8 bytes). The first pool's object size is 8 bytes (large enough to hold a next pointer on both 64-bit and 32-bit systems), the second's is 16 bytes, and so on, up to 128 bytes for the last pool. When the requested size exceeds 128 bytes, the request goes through the secondary space configurator (i.e., directly to memory).
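With 8-byte classes up to 128 bytes, the mapping from request size to hash-table slot can be sketched like this (a simplification of the scheme described above, not the actual STL source):

```cpp
#include <cstddef>

const size_t kAlign    = 8;    // alignment granularity of the size classes
const size_t kMaxBytes = 128;  // requests above this go to the secondary configurator

// Slot index for a request: 1..8 -> 0, 9..16 -> 1, ..., 121..128 -> 15.
size_t FreelistIndex(size_t bytes) {
    return (bytes + kAlign - 1) / kAlign - 1;
}
```

Requests below 8 bytes map to index 0, matching the rule that they are still served with an 8-byte object.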

Construct a hash table and hang the memory objects of different sizes off its slots.

Requesting memory: to request 8 bytes, allocate a piece directly from the list at index = 0 (requests smaller than 8 bytes are also served with 8 bytes). When free_list[index] is nullptr, request a chunk of memory, cut it into fixed-size pieces, and hang them at free_list[index].

Freeing memory: compute the index into the hash table from the size of the memory object, then insert the object at that index.

2. The principle of a simple memory pool

1. Overall design

1.1 Memory pool structure

There are two linked lists: RequestMemory and ReleaseMemory.

The RequestMemory list stores blocks of memory requested from physical memory with new or malloc that have not yet been handed out; its nodes are blockNode nodes.

The ReleaseMemory linked list stores fixed-size blocks of memory that have been released after use.

1.2 request memory

Look in ReleaseMemory first. If it has memory, pop a block and use it directly.

When ReleaseMemory is nullptr, look in RequestMemory.

The head node of RequestMemory is the most recently requested block, so only the head needs to be checked when requesting memory: compare its useCount with its sumCount. When useCount equals sumCount, the head block is used up and new memory must be requested from physical memory; otherwise, hand out the next piece directly from the head block.

When requesting memory from physical memory, the new block is twice the size of the previously requested block, and it is pushed to the head of RequestMemory.

1.3 release memory

When freeing memory, push the memory to be freed directly to the head of the ReleaseMemory.

2. Detailed analysis

2.1 The blockNode structure

A blockNode manages one newly requested block of memory with a single structure. Its members are as follows:

void* _memory: the starting address of the newly requested memory block

BlockNode* _next: pointer to the next node

size_t _objNum: the number of objects in the memory block

Note: each blockNode is twice the size of the previous one, i.e. exponential growth, so an upper limit should be set so that growth levels off once blocks reach a certain size. Here the maximum block size is 100000 * sizeof(T), where T is the type of the requested node.
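The doubling-with-a-cap growth rule amounts to the following (sizes in object counts; the function and parameter names are assumptions for illustration):

```cpp
#include <cstddef>

// The next block holds twice as many objects as the current one, capped at maxObjs.
size_t NextBlockObjs(size_t currentObjs, size_t maxObjs) {
    size_t next = 2 * currentObjs;
    return next >= maxObjs ? maxObjs : next;
}
```

Starting from 32 objects, successive blocks hold 64, 128, 256, ... objects until the cap is reached.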

2.2 the size of a single object

The single object here refers to a node of ReleaseMemory. When the user-requested size sizeof(T) is smaller than sizeof(T*), the object must be allocated as sizeof(T*) so that it can still be linked into ReleaseMemory.

3. Performance comparison

Requesting and releasing 110000 blocks of memory with malloc/free, new/delete, and memPool respectively took the times shown in the original article's figure (not reproduced here).

3. Complete source code for the simple memory pool

```cpp
#include <iostream>
#include <vector>
#include <new>      // placement new
#include <cstdlib>  // malloc/free
using namespace std;

template<class T>
class MemPool {
private:
    // memory block structure
    struct BlockNode {
        void* _memory;     // address of the memory block
        BlockNode* _next;  // next blockNode
        size_t _objNum;    // number of objects in this memory block

        // constructor - num is the number of objects to allocate space for
        BlockNode(size_t num)
            : _next(nullptr)
            , _objNum(num)
        {
            _memory = malloc(_objNum * _size);
        }
        ~BlockNode()
        {
            free(_memory);
            _memory = nullptr;
            _next = nullptr;
            _objNum = 0;
        }
    };

protected:
    static size_t _size;          // size of a single object
    T* _releaseMemory = nullptr;  // list of freed memory
    BlockNode* _requestMemory;    // requested memory blocks
    size_t _maxNum;               // maximum size of a memory block
    size_t _useCount;             // objects already used in the current block

    // set the size of a single object
    static size_t setSize()
    {
        return sizeof(T) >= sizeof(T*) ? sizeof(T) : sizeof(T*);
    }

public:
    MemPool()
        : _releaseMemory(nullptr)
        , _maxNum(100000 * _size)
        , _useCount(0)
    {
        // initially request space for 32 objects of size _size
        _requestMemory = new BlockNode(32);
    }
    ~MemPool()
    {
        BlockNode* cur = _requestMemory;
        while (cur)
        {
            BlockNode* del = cur;
            cur = cur->_next;
            delete del;  // automatically calls ~BlockNode()
        }
    }
    T* New()
    {
        // look in releaseMemory first
        if (_releaseMemory)
        {
            T* obj = _releaseMemory;
            // the first bytes of a freed object store the address of the next node
            _releaseMemory = *((T**)_releaseMemory);
            return obj;
        }
        else
        {
            // check whether requestMemory still has free objects
            if (_requestMemory->_objNum == _useCount)
            {
                // request a new block from physical memory
                size_t size = 2 * _useCount >= _maxNum ? _maxNum : 2 * _useCount;
                BlockNode* newBlock = new BlockNode(size);
                newBlock->_next = _requestMemory;
                _requestMemory = newBlock;
                _useCount = 0;
            }
            // at this point there must be free memory
            T* obj = (T*)((char*)_requestMemory->_memory + _useCount * _size);
            _useCount++;
            return new(obj) T();  // initialize the space with placement new
        }
    }
    void Delete(T* obj)
    {
        if (obj)
        {
            obj->~T();
            *((T**)obj) = _releaseMemory;
            _releaseMemory = obj;
        }
    }
};

// static member variable
template<class T>
size_t MemPool<T>::_size = MemPool<T>::setSize();

struct TreeNode
{
    int _val;
    TreeNode* _left;
    TreeNode* _right;
};

void test1()
{
    MemPool<TreeNode> mp;
    vector<TreeNode*> v;
    for (int i = 0; i < 10; i++)
    {
        TreeNode* mem = mp.New();
        v.push_back(mem);
    }
    for (int i = 0; i < 10; i++)
    {
        mp.Delete(v[i]);
    }
}
```

This is the end of the article on "How to implement C++ memory Pool". Thank you for reading! I believe you now have a working understanding of how to implement a C++ memory pool. If you want to learn more, you are welcome to follow the industry information channel.
