2025-04-11 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report--
This article explains how to download files from the Linux terminal using wget and curl. It is fairly detailed and should serve as a useful reference; interested readers are encouraged to read it through.
Use the wget command to download files from the Linux terminal
Wget is probably the most frequently used command line download manager in Linux and UNIX-like systems. You can use wget to download a file, multiple files, an entire directory, or even an entire website.
Wget is non-interactive and can easily work in the background. This means you can use it in scripts, or even build tools like the uGet download manager on top of it.
Let's take a look at how to download files from a terminal using wget.
Install wget
Most Linux distributions come pre-installed with wget. It can also be found in the repositories of most distributions, and you can easily install it using the distribution's package manager.
On Ubuntu and other Debian-based distributions, you can use the apt package manager command:
sudo apt install wget
Download a file or web page using wget
You only need to provide the URL of the file or web page, and wget will download it under its original name into the current directory.
wget URL
To download multiple files, save their URLs in a text file and provide that file to wget as input with the -i option, like this:
wget -i download_files.txt
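As a minimal sketch of the steps above (the URLs are hypothetical placeholders, and the wget call itself is only printed here because running it needs network access):

```shell
# Build a text file with one URL per line (placeholder URLs for illustration)
printf '%s\n' \
  'https://example.com/a.tar.gz' \
  'https://example.com/b.tar.gz' \
  > download_files.txt

# wget reads the list with -i and fetches each URL in turn.
# The actual download needs network access, so only print the command:
echo "wget -i download_files.txt"
```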
Download files with different names with wget
You will notice that wget almost always saves web pages as index.html. It is a good idea to provide custom names for downloaded files.
You can use the -O (uppercase O) option when downloading to provide the output file name:
wget -O filename URL
Download a folder with wget
Suppose you are browsing an FTP server and need to download an entire directory. You can use the recursive option -r:
wget -r ftp://server-address.com/directory
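In practice you often want to limit how far wget wanders when recursing. A sketch with two common companion options (the server address is the placeholder from the example above, and the command is printed rather than executed since it needs network access):

```shell
# Common companions of -r when pulling a directory:
#   -l 2          limit recursion depth to 2 levels
#   --no-parent   never ascend above the starting directory
OPTS="-r -l 2 --no-parent"
URL="ftp://server-address.com/directory"
echo "wget $OPTS $URL"
```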
Download the entire website using wget
Yes, you can do that. You can mirror the entire site with wget. When I say download the whole website, I mean the whole structure of the website facing the public.
Although you can use the mirror option -m directly, it is best to also add --convert-links and --page-requisites:
wget -m --convert-links --page-requisites website_address
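A sketch of a slightly politer mirror run (website_address is the placeholder from above; the --wait option is an optional extra, not part of the original command, and the command is printed rather than run since it needs network access):

```shell
# -m turns on mirroring (recursion with timestamping);
# --wait=1 pauses a second between requests to go easy on the server
CMD="wget -m --convert-links --page-requisites --wait=1 website_address"
echo "$CMD"
```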
Additional hint: resume incomplete downloads
If you abort a download by pressing CTRL-C for some reason, you can resume it with the -c option:
wget -c URL
Use curl to download files from the Linux command line
Like wget, curl is one of the most commonly used commands for downloading files in Linux terminals. There are many ways to use curl, but I only focus on simple downloads here.
Install curl
Although curl is not pre-installed, it is available in the official repository of most distributions. You can use your distribution's package manager to install it.
To install curl on Ubuntu and other Debian-based distributions, use the following command:
sudo apt install curl
Download a file or web page using curl
If you use the curl command on a URL without any options, it will read the file and print it to the terminal.
To download files using the curl command in the Linux terminal, you must use the -O (uppercase O) option:
curl -O URL
In Linux, it is relatively easy to download multiple files with curl, but note that you must repeat the -O option for each URL:
curl -O URL1 -O URL2 -O URL3
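To make the per-URL -O requirement concrete, here is a small sketch that assembles such a command from a list of placeholder URLs (printed instead of run, since actually downloading needs network access):

```shell
# Each URL gets its own -O so curl saves it under its remote name
CMD="curl"
for u in https://example.com/one.zip https://example.com/two.zip; do
  CMD="$CMD -O $u"
done
echo "$CMD"
```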
Keep in mind that curl is not as forgiving as wget here. Where wget saves a web page as index.html, curl complains that the remote file has no name for the web page, so you must save it under a custom name, as described in the next section.
Download the file under a different name
This may be confusing, but if you want to give the downloaded file a custom name (instead of its original name), you must use the -o (lowercase o) option:
curl -o filename URL
Sometimes curl does not download the file you expect, and you have to use the -L option (for location) to download it correctly. This happens when the link redirects to another URL; with -L, curl follows the redirects through to the final link.
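Shortened links and "latest release" URLs are typical examples of such redirects. A hedged sketch (the URL and file name are placeholders, and the command is printed rather than run since it needs network access):

```shell
# Without -L, curl would save the short redirect response itself;
# with -L it follows Location headers to the real file.
CMD="curl -L -o latest.tar.gz https://example.com/latest"
echo "$CMD"
```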
Pause and resume downloads with curl
Like wget, you can resume a paused download with curl, but the option is -C - (the trailing dash tells curl to work out the resume offset by itself):
curl -C - -O URL
That covers the article "how to download files from the Linux terminal". Thank you for reading! We hope the content shared here helps you; for more related knowledge, welcome to follow the industry information channel.