
How to Locate and Fix a Memory Leak in Go


This article explains in detail how to locate and fix a memory leak in Go. It is quite practical, so it is shared here for your reference; I hope you gain something from reading it.

Google Cloud Go client libraries [1] typically use gRPC in the background to connect to Google Cloud APIs. When you create an API client, the library initializes a connection to the API and keeps that connection open until you call Client.Close.

client, err := api.NewClient()
// Check err.
defer client.Close()

Clients are safe for concurrent use, so you should create one Client and keep reusing it until your work is done. But what happens if you don't Close the client when you should?

You get a memory leak: the underlying connections are never cleaned up.
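To make the failure mode concrete, here is a minimal sketch in Go. The Secret Manager client is an assumption borrowed from the pprof output later in this article, and the function names are illustrative, not the proxy's actual code:

package example

import (
	"context"

	secretmanager "cloud.google.com/go/secretmanager/apiv1"
)

// leaky creates a fresh client on every call and never closes it, so each
// call leaves its underlying gRPC connection open.
func leaky(ctx context.Context) error {
	client, err := secretmanager.NewClient(ctx)
	if err != nil {
		return err
	}
	_ = client // ... use client; client.Close() is never called ...
	return nil
}

// fixed creates one client, reuses it for all of its work, and closes it
// exactly once when that work is done.
func fixed(ctx context.Context) error {
	client, err := secretmanager.NewClient(ctx)
	if err != nil {
		return err
	}
	defer client.Close()
	// ... reuse client for every request ...
	return nil
}

Call something like leaky from a request handler and the server's heap grows a little with every request.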

Google runs a number of GitHub automation bots to help manage hundreds of GitHub repositories. Some of our bots proxy their requests through a Go server [3] running on Cloud Run [2]. Our memory usage looked like a classic zigzag memory leak: steady growth until the instance restarts, then steady growth all over again.

I started debugging by adding the pprof.Index handler to the server:

mux.HandleFunc("/debug/pprof/", pprof.Index)
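For context, a minimal, self-contained version of that wiring looks like the sketch below; only the HandleFunc line comes from the original, and the rest of the server setup is an assumption:

package main

import (
	"log"
	"net/http"
	"net/http/pprof"
)

func main() {
	mux := http.NewServeMux()

	// pprof.Index serves the /debug/pprof/ index page and, via the URL
	// suffix, the named profiles beneath it, such as /debug/pprof/heap.
	mux.HandleFunc("/debug/pprof/", pprof.Index)

	log.Fatal(http.ListenAndServe(":8080", mux))
}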

`pprof` [4] provides runtime profiling data, such as memory usage. For more information, see Profiling Go Programs on the official Go blog [5].

Then I built and started the server locally:

$ go build
$ PROJECT_ID=my-project PORT=8080 ./serverless-scheduler-proxy

Then I sent some requests to the server:

for i in {1..5}; do
  curl --header "Content-Type: application/json" --request POST --data '{"name": "HelloHTTP", "type": "testing", "location": "us-central1"}' localhost:8080/v0/cron
  echo " -- $i"
done

The exact payloads and endpoints are specific to our server and are irrelevant to this article.

To get a baseline of memory being used, I collected some initial pprof data:

curl http://localhost:8080/debug/pprof/heap > heap.0.pprof
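As an aside, go tool pprof can also fetch the profile straight from the running server, skipping the intermediate file:

$ go tool pprof http://localhost:8080/debug/pprof/heap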

Examining the output, you can see some memory usage, but nothing immediately stands out as a problem (which is good! We just started the server!). In the listing below, flat is memory attributed to the function itself, while cum also includes the functions it calls:

$ go tool pprof heap.0.pprof
File: serverless-scheduler-proxy
Type: inuse_space
Time: May 4, 2021 at 9:33am (EDT)
Entering interactive mode (type "help" for commands, "o" for options)
(pprof) top10
Showing nodes accounting for 2129.67kB, 100% of 2129.67kB total
Showing top 10 nodes out of 30
      flat  flat%   sum%        cum   cum%
 1089.33kB 51.15% 51.15%  1089.33kB 51.15%  google.golang.org/grpc/internal/transport.newBufWriter (inline)
  528.17kB 24.80% 75.95%   528.17kB 24.80%  bufio.NewReaderSize (inline)
  512.17kB 24.05%   100%   512.17kB 24.05%  google.golang.org/grpc/metadata.Join
         0     0%   100%   512.17kB 24.05%  cloud.google.com/go/secretmanager/apiv1.(*Client).AccessSecretVersion
         0     0%   100%   512.17kB 24.05%  cloud.google.com/go/secretmanager/apiv1.(*Client).AccessSecretVersion.func1
         0     0%   100%   512.17kB 24.05%  github.com/googleapis/gax-go/v2.Invoke
         0     0%   100%   512.17kB 24.05%  github.com/googleapis/gax-go/v2.invoke
         0     0%   100%   512.17kB 24.05%  google.golang.org/genproto/googleapis/cloud/secretmanager/v1.(*secretManagerServiceClient).AccessSecretVersion
         0     0%   100%   512.17kB 24.05%  google.golang.org/grpc.(*ClientConn).Invoke
         0     0%   100%  1617.50kB 75.95%  google.golang.org/grpc.(*addrConn).createTransport

The next step is to send a bunch of requests to the server to see if we can (1) reproduce the possible memory leak and (2) determine what the leak is.

Send 500 requests:

for i in {1..500}; do
  curl --header "Content-Type: application/json" --request POST --data '{"name": "HelloHTTP", "type": "testing", "location": "us-central1"}' localhost:8080/v0/cron
  echo " -- $i"
done

Collect and analyze more pprof data:

$ curl http://localhost:8080/debug/pprof/heap > heap.6.pprof
$ go tool pprof heap.6.pprof
File: serverless-scheduler-proxy
Type: inuse_space
Time: May 4, 2021 at 9:50am (EDT)
Entering interactive mode (type "help" for commands, "o" for options)
(pprof) top10
Showing nodes accounting for 94.74MB, 94.49% of 100.26MB total
Dropped 26 nodes (cum
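The captured session cuts off above. With two heap snapshots in hand, one natural next step (not shown in the original output) is pprof's -base flag, which subtracts the baseline profile so that top reports only what has grown since the baseline was taken:

$ go tool pprof -base heap.0.pprof heap.6.pprof
(pprof) top10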
