2025-04-03 Update From: SLTechnology News&Howtos
This article shows how to use the wget command in Linux to download files. The content is easy to follow, and we hope it resolves your doubts. Read on to learn how to download files with wget in Linux.
Wget command
The wget command downloads files from a specified URL. wget is very stable: it copes well with narrow bandwidth and unstable networks, and if a download fails for network reasons it keeps retrying until the entire file has been fetched. If the server interrupts the download, wget reconnects and resumes from where it stopped. This is useful for downloading large files from servers that limit connection time.
Syntax

wget (options) (parameters)

Options

-a: append a record of the run to the specified log file
-A: comma-separated list of file suffixes to accept for download
-b: run wget in the background
-B: set the base URL for relative links
-c: continue a previously interrupted download
-C: server cache flag; on enables it, off disables it (default: on)
-d: run in debug mode
-D: comma-separated list of domains to follow
-e: execute the given command as if it were part of ".wgetrc"
-h: display help information
-i: read the URLs to download from the specified file
-l: comma-separated list of directories to follow
-L: follow relative links only
-r: download recursively
-nc: do not overwrite an existing file of the same name
-nv: show only updates and error messages, not the detailed run
-q: quiet mode; show no output
-nh: do not perform host-name lookup
-v: show the detailed execution process
-V: show version information
--passive-ftp: connect to the FTP server in passive (PASV) mode
--follow-ftp: follow FTP links found in HTML files

Parameters

URL: the URL to download.
Example
Download a single file using wget
wget http://www.linuxde.net/testfile.zip
This example downloads a file from the network and saves it in the current directory. A progress bar is shown during the download, including the percentage completed, bytes downloaded so far, the current download speed, and the remaining download time.
Download and save under a different file name
wget -O wordpress.zip http://www.linuxde.net/download.aspx?id=1080
By default, wget names the saved file after the part of the URL following the last /, which usually produces a wrong file name for dynamically generated download links.

Wrong: the following command downloads a file and saves it under the name download.aspx?id=1080:

wget http://www.linuxde.net/download.aspx?id=1080

Even though the downloaded file is a zip archive, it is still saved as download.aspx?id=1080.

Correct: to solve this problem, use the -O option to specify a file name:

wget -O wordpress.zip http://www.linuxde.net/download.aspx?id=1080
Limit the download speed with wget
wget --limit-rate=300k http://www.linuxde.net/testfile.zip
When you run wget, it uses all available bandwidth by default. A rate limit is useful when you are downloading a large file and still need bandwidth for other downloads.
Resume an interrupted download with wget
wget -c http://www.linuxde.net/testfile.zip
Restarting an interrupted transfer with wget -c is very helpful when a large download is suddenly cut off by network problems: we can continue the download instead of starting the whole file over. Use the -c option whenever you need to resume an interrupted download.
Download in the background with wget
wget -b http://www.linuxde.net/testfile.zip
Continuing in background, pid 1840.
Output will be written to `wget-log'.
For very large files, use the -b option to download in the background. You can check the download progress with:
tail -f wget-log
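Because a background transfer writes its messages to wget-log, a script can detect that the download finished by scanning the log. A minimal sketch; the log content below is fabricated for illustration, and real wget-log output varies by wget version:

```shell
# Fabricate a wget-log that looks like a completed background download.
cat > wget-log <<'EOF'
Saving to: 'testfile.zip'
2025-04-03 10:00:00 (1.2 MB/s) - 'testfile.zip' saved [1048576/1048576]
EOF

# wget logs a "... 'FILE' saved [bytes/bytes]" line when a download
# completes, so grepping for "saved" is a simple completion check.
if grep -q "saved" wget-log; then
  echo "download finished"
else
  echo "still downloading"
fi
```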
Spoof the user agent for a download
wget --user-agent="Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16" http://www.linuxde.net/testfile.zip
Some websites reject download requests when the user-agent string does not look like a browser. You can use the --user-agent option to disguise wget as one.
Test download link
Before scheduling a download, you should test whether the download link is still valid. Add the --spider option to check.
wget --spider URL
If the download link is correct, it will display:
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
This confirms that the scheduled download will work. If you give a broken link, the following error is shown instead:
wget --spider url
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response... 404 Not Found
Remote file does not exist -- broken link!
You can use the --spider option in the following situations:

Check a link before a scheduled download.
Check whether a website is available.
Check for dead links on a website's pages.
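These checks can be scripted by relying on wget's exit status: wget exits 0 when --spider finds the remote file and non-zero otherwise. A sketch, where check_link is a hypothetical helper name:

```shell
# Report whether a URL is alive, based on wget's exit status.
# --spider only checks that the remote file exists; -q suppresses output.
check_link() {
  if wget --spider -q "$1"; then
    echo "link OK"
  else
    echo "broken link"
  fi
}

# Example (requires network access):
# check_link http://www.linuxde.net/testfile.zip
```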
Increase the number of retries
wget --tries=40 URL
A download may fail because of network problems or because the file is large. By default, wget retries the connection 20 times; if necessary, use --tries to raise the retry count.
Download multiple files
wget -i filelist.txt
First, save a download link file:
cat > filelist.txt
url1
url2
url3
url4
Then pass this file to wget with the -i option to download every URL in it.
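Typing the list with `cat > filelist.txt` is interactive (you end the input with Ctrl-D); in a script, a heredoc is more convenient. A sketch with placeholder URLs:

```shell
# Write the URL list non-interactively with a heredoc
# (the URLs below are placeholders).
cat > filelist.txt <<'EOF'
http://example.com/file1.zip
http://example.com/file2.zip
http://example.com/file3.zip
EOF

wc -l < filelist.txt   # the list holds 3 URLs

# Then download them all:
# wget -i filelist.txt
```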
Mirror website
wget --mirror -p --convert-links -P ./LOCAL URL
Download the entire website locally.
--mirror: turn on mirroring.
-p: download all files needed to display the HTML page correctly.
--convert-links: after downloading, convert links so they point to the local copies.
-P ./LOCAL: save all files and directories under the specified local directory.
Filter specified format downloads
wget --reject=gif url
Use this command when you want to download a website but skip the images.
Save the download information in the log file
wget -o download.log URL
Use this when you do not want download information printed on the terminal, but written to a log file you can inspect later.
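Once everything goes to download.log, failures can be found afterwards by scanning the log for wget's ERROR lines. A sketch; the log content below is fabricated for illustration:

```shell
# Fabricate a download.log containing a failed request.
cat > download.log <<'EOF'
--2025-04-03 10:00:01--  http://example.com/missing.zip
2025-04-03 10:00:01 ERROR 404: Not Found.
EOF

# Count the errors recorded in the log.
grep -c "ERROR" download.log
```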
Limit the total size of downloaded files
wget -Q5m -i filelist.txt
Use this when you want wget to quit once the total amount downloaded exceeds 5 MB. Note: this option does not apply to a single file download; it only takes effect for recursive downloads.
Download files in the specified format
wget -r -A.pdf url
You can use this feature in the following situations:
Download all the images on a website.
Download all the videos on a website.
Download all the PDF files on a website.
FTP download
wget ftp-url
wget --ftp-user=USERNAME --ftp-password=PASSWORD url
You can use wget to download from FTP links.
Anonymous FTP download with wget:
wget ftp-url
FTP download authenticated with a username and password:
wget --ftp-user=USERNAME --ftp-password=PASSWORD url

That is all of "how to download files using wget commands in Linux". Thank you for reading! We hope this article helped; if you want to learn more, welcome to follow the industry information channel.