2025-01-17 Update From: SLTechnology News&Howtos
Shulou(Shulou.com) 06/03 Report--
This article explains in detail how to use Go to crawl 100 pages of Cnblogs news in about a second. The editor finds it very practical and shares it here as a reference; I hope you will get something out of it after reading.
By taking advantage of Go's concurrency, the crawl is very fast: fetching the news headlines from all 100 pages of the Cnblogs news site takes only about one second.
```go
package main

import (
	"bytes"
	"fmt"
	"log"
	"net/http"
	"runtime"
	"strconv"
	"sync"

	"github.com/PuerkitoBio/goquery"
)

func Scraper(page string) string {
	// Request the HTML page.
	scrapeURL := "https://news.cnblogs.com/n/page/" + page
	client := &http.Client{}
	req, _ := http.NewRequest("GET", scrapeURL, nil)
	req.Header.Set("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8")
	req.Header.Set("Accept-Charset", "GBK,utf-8;q=0.7,*;q=0.3")
	// Accept-Encoding is left unset so net/http transparently decompresses gzip.
	// req.Header.Set("Accept-Encoding", "gzip,deflate,sdch")
	req.Header.Set("Accept-Language", "zh-CN,zh;q=0.8")
	req.Header.Set("Cache-Control", "max-age=0")
	req.Header.Set("Connection", "keep-alive")
	req.Header.Set("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.75 Safari/537.36")
	res, err := client.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer res.Body.Close()
	if res.StatusCode != 200 {
		log.Fatalf("status code error: %d %s", res.StatusCode, res.Status)
	}

	// Load the HTML document.
	doc, err := goquery.NewDocumentFromReader(res.Body)
	if err != nil {
		log.Fatal(err)
	}

	// Find the news items; for each one, collect the title and the link.
	var buffer bytes.Buffer
	buffer.WriteString("* Scraped page " + page + " *\n")
	doc.Find(".content .news_entry").Each(func(i int, s *goquery.Selection) {
		title := s.Find("a").Text()
		url, _ := s.Find("a").Attr("href")
		buffer.WriteString("Review " + strconv.Itoa(i) + ": " + title + "\nhttps://news.cnblogs.com" + url + "\n")
	})
	return buffer.String()
}

func main() {
	runtime.GOMAXPROCS(runtime.NumCPU())
	ch := make(chan string)
	wg := &sync.WaitGroup{}
	for i := 1; i < 101; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			// Convert inside the goroutine so each worker has its own
			// page value (a shared variable here would be a data race).
			page := strconv.Itoa(i)
			fmt.Printf("Scraping page %s...\n", page)
			ch <- Scraper(page)
		}(i)
	}
	// Close the channel only after every scraper has finished,
	// so the range loop below terminates.
	go func() {
		wg.Wait()
		close(ch)
	}()
	for s := range ch {
		fmt.Print(s)
	}
}
```
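The concurrent structure that makes this fast is a standard Go fan-in: N worker goroutines send their results into one channel, a `sync.WaitGroup` tracks when they are all done, and a helper goroutine closes the channel so the consumer's range loop can exit. Here is a minimal, self-contained sketch of just that pattern (the `fanIn` name and the placeholder result strings are illustrative, not from the scraper above):

```go
package main

import (
	"fmt"
	"sync"
)

// fanIn launches n workers that each send one result into a shared
// channel, closes the channel once all workers have finished, and
// collects everything received into a slice.
func fanIn(n int) []string {
	ch := make(chan string)
	wg := &sync.WaitGroup{}
	for i := 1; i <= n; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			ch <- fmt.Sprintf("result %d", i) // stands in for Scraper(page)
		}(i)
	}
	// Close ch only after every sender is done; closing earlier
	// would make a worker panic on send.
	go func() {
		wg.Wait()
		close(ch)
	}()
	var out []string
	for s := range ch {
		out = append(out, s)
	}
	return out
}

func main() {
	results := fanIn(5)
	fmt.Println(len(results)) // prints 5
}
```

Because the workers run concurrently, the results arrive in arbitrary order; that is why the scraper tags each chunk of output with its page number rather than relying on ordering.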