2025-04-13 Update From: SLTechnology News&Howtos
In this article I would like to share a categorized reference of wget's command-line options with worked examples. Many people are not very familiar with them, so I hope you will find this a useful reference after reading it.
A categorized list of wget options
Startup
-V, --version            display the version of wget and exit
-h, --help               print this help
-b, --background         go to background after startup
-e, --execute=COMMAND    execute a `.wgetrc'-style command; see /etc/wgetrc or ~/.wgetrc for the wgetrc format
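As a quick sketch of how the startup options combine (the URL is a placeholder): `-b` detaches wget into the background, and `-e` applies a wgetrc-style setting for this run only.

```shell
# Run in the background (log goes to wget-log by default) and
# disable robots.txt processing via a wgetrc-style command.
wget -b -e "robots = off" https://example.com/big-file.iso
```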
Logging and input files
-o, --output-file=FILE      log messages to FILE
-a, --append-output=FILE    append log messages to FILE
-d, --debug                 print debug output
-q, --quiet                 quiet mode (no output)
-v, --verbose               verbose mode (this is the default)
-nv, --non-verbose          turn off verbose mode, without being completely quiet
-i, --input-file=FILE       download the URLs found in FILE
-F, --force-html            treat the input file as HTML
-B, --base=URL              prepend URL to relative links read from the file given by -F -i
--sslcertfile=FILE          optional client certificate
--sslcertkey=KEYFILE        optional keyfile for the client certificate
--egd-file=FILE             file name of the EGD socket
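A common pattern is driving wget from a list of URLs while keeping a log. Here `urls.txt` and `download.log` are hypothetical file names:

```shell
# urls.txt holds one URL per line; -i reads it, -o writes the log
# to download.log, and -nv keeps the log terse but not silent.
wget -nv -o download.log -i urls.txt
```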
Download
--bind-address=ADDRESS    bind to ADDRESS (hostname or IP) on the local host; useful when the machine has several IPs or names
-t, --tries=NUMBER        set the number of retries (0 means unlimited)
-O, --output-document=FILE    write documents to FILE
-nc, --no-clobber         don't overwrite existing files or use numbered suffixes
-c, --continue            resume getting a partially-downloaded file
--progress=TYPE           select the progress gauge type
-N, --timestamping        don't re-retrieve a file unless it is newer than the local copy
-S, --server-response     print the server response
--spider                  don't download anything
-T, --timeout=SECONDS     set the read timeout to SECONDS
-w, --wait=SECONDS        wait SECONDS between retrievals
--waitretry=SECONDS       wait 1...SECONDS between retries of a retrieval
--random-wait             wait 0...2*WAIT seconds between retrievals
-Y, --proxy=on/off        turn proxy use on or off
-Q, --quota=NUMBER        set the retrieval quota
--limit-rate=RATE         limit the download rate to RATE
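The download options above are most useful together when fetching a large file over an unreliable link (the URL is a placeholder):

```shell
# Resume an interrupted download (-c), retry up to 5 times (-t 5),
# cap bandwidth at 200 KB/s, and give up on a stalled read after 30s.
wget -c -t 5 --limit-rate=200k -T 30 https://example.com/big-file.iso
```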
Directories
-nd, --no-directories         don't create directories
-x, --force-directories       force the creation of directories
-nH, --no-host-directories    don't create host directories
-P, --directory-prefix=PREFIX    save files to PREFIX/...
--cut-dirs=NUMBER             ignore NUMBER remote directory components
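A sketch of how the directory options shape the local layout, assuming a hypothetical remote path `/pub/software/tool.tar.gz`:

```shell
# -x forces directory creation even for a single file; -nH drops the
# host-name directory and --cut-dirs=1 drops the first remote path
# component ("pub"), so the file lands in downloads/software/.
wget -x -nH --cut-dirs=1 -P downloads https://example.com/pub/software/tool.tar.gz
```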
HTTP options
--http-user=USER        set the HTTP user name to USER
--http-passwd=PASS      set the HTTP password to PASS
-C, --cache=on/off      allow/disallow server-side data caching (normally allowed)
-E, --html-extension    save all text/html documents with the .html extension
--ignore-length         ignore the `Content-Length' header field
--header=STRING         insert STRING among the request headers
--proxy-user=USER       set the proxy user name to USER
--proxy-passwd=PASS     set the proxy password to PASS
--referer=URL           include a `Referer: URL' header in the HTTP request
-s, --save-headers      save the HTTP headers to the file
-U, --user-agent=AGENT  identify as AGENT instead of Wget/VERSION
--no-http-keep-alive    disable HTTP keep-alive (persistent connections)
--cookies=off           don't use cookies
--load-cookies=FILE     load cookies from FILE before the session
--save-cookies=FILE     save cookies to FILE after the session
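Putting a few HTTP options together, here is a sketch of fetching a page that expects a browser-like request and session cookies (the URL, user-agent string, and `cookies.txt` are placeholders):

```shell
# Send a custom request header and user agent, and persist cookies
# across runs by loading and saving the same cookie file.
wget --header="Accept-Language: en" \
     -U "Mozilla/5.0" \
     --load-cookies=cookies.txt --save-cookies=cookies.txt \
     https://example.com/members/page.html
```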
FTP options
-nr, --dont-remove-listing    don't remove `.listing' files
-g, --glob=on/off             turn file-name globbing on or off
--passive-ftp                 use passive transfer mode (the default)
--active-ftp                  use active transfer mode
--retr-symlinks               when recursing, retrieve linked-to files (not directories)
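For FTP, globbing lets one command fetch several files at once; the server name below is a placeholder:

```shell
# Fetch all .tar.gz files from an FTP directory; file-name globbing
# expands the wildcard (quoting keeps the shell from expanding it),
# and passive mode usually works through firewalls and NAT.
wget --passive-ftp "ftp://ftp.example.com/pub/*.tar.gz"
```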
Recursive download
-r, --recursive          recursive download -- use with care!
-l, --level=NUMBER       maximum recursion depth (inf or 0 for unlimited)
--delete-after           delete downloaded files locally after retrieval
-k, --convert-links      convert non-relative links to relative ones
-K, --backup-converted   before converting file X, back it up as X.orig
-m, --mirror             shortcut for -r -N -l inf -nr
-p, --page-requisites    download all files needed to display the HTML page (images, etc.)
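The classic use of the recursive options is mirroring a site for offline reading (the URL is a placeholder):

```shell
# -m mirrors recursively with timestamping, -k rewrites links so
# the local copy browses correctly, -p grabs the images/CSS each
# page needs, and -w 1 waits a second between requests to be polite.
wget -m -k -p -w 1 https://example.com/docs/
```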
Recursive accept/reject
-A, --accept=LIST             comma-separated list of accepted extensions
-R, --reject=LIST             comma-separated list of rejected extensions
-D, --domains=LIST            comma-separated list of accepted domains
--exclude-domains=LIST        comma-separated list of rejected domains
--follow-ftp                  follow FTP links from HTML documents
--follow-tags=LIST            comma-separated list of followed HTML tags
-G, --ignore-tags=LIST        comma-separated list of ignored HTML tags
-H, --span-hosts              go to foreign hosts when recursing
-L, --relative                follow relative links only
-I, --include-directories=LIST    list of allowed directories
-X, --exclude-directories=LIST    list of excluded directories
-np, --no-parent              don't ascend to the parent directory
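The accept/reject options turn a recursive crawl into a targeted fetch. As a sketch, downloading only the PDFs from one section of a site (the URL is a placeholder):

```shell
# -r recurses, -A pdf keeps only .pdf files (HTML pages are still
# fetched to find links, then deleted), and -np stays below /docs/.
wget -r -np -A pdf https://example.com/docs/
```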
That is the whole of this categorized list of wget options. Thank you for reading! I hope this overview has been helpful; if you would like to learn more, you are welcome to follow the industry information channel.