How to use curl Command in Linux system

2025-01-19 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/01 Report --

This article explains how to use the curl command on a Linux system. The method introduced here is simple, fast, and practical; let's walk through it together.

Command: curl

In Linux, curl is a command-line tool for transferring files using URL syntax, and it can fairly be called a very powerful HTTP command-line tool. It supports both uploading and downloading files in one integrated transfer tool, although traditionally curl is thought of as a download tool.

Syntax: # curl [option] [url]

Common parameters:

-A/--user-agent sets the user agent string sent to the server

-b/--cookie reads cookies from a string or file

-c/--cookie-jar writes cookies to this file after the operation completes

-C/--continue-at resumes a transfer from a breakpoint

-D/--dump-header writes the header information to a file

-e/--referer sets the source (referer) URL

-f/--fail does not display HTTP errors when the request fails

-o/--output writes the output to the given file

-O/--remote-name writes the output to a file, keeping the remote file's name

-r/--range retrieves a byte range from an HTTP/1.1 or FTP server

-s/--silent silent mode; outputs nothing

-T/--upload-file uploads a file

-u/--user sets the server user and password

-w/--write-out [format] defines what to output after completion

-x/--proxy uses an HTTP proxy on the given port

-#/--progress-bar displays a progress bar showing the current transfer status
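Several of the options above can be combined in a single invocation. The following is a minimal, self-contained sketch (the temp-file paths are placeholders, and a file:// URL is used so the example needs no network access):

```shell
#!/bin/sh
# Combine -s (silent), -o (output file) and -w (write-out) in one call.
# A file:// URL is used here so the sketch runs without network access.
src=$(mktemp)          # placeholder source file
dst=$(mktemp)          # placeholder destination file
printf 'hello curl\n' > "$src"

# -s suppresses the progress meter; -o saves the body to $dst;
# -w prints the number of bytes transferred after completion.
bytes=$(curl -s -o "$dst" -w '%{size_download}' "file://$src")
echo "downloaded $bytes bytes"
```

The same combination works with any http:// URL; file:// is only used here to keep the sketch runnable anywhere.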

Example:

1. Basic usage

The code is as follows:

# curl http://www.linux.com

After execution, the HTML of www.linux.com is displayed on the screen.

Ps: since Linux servers are often installed without a desktop, and therefore without a browser, this method is often used to test whether a server can reach a website

2. Save the visited web page

2.1: use Linux's redirect feature to save

The code is as follows:

# curl http://www.linux.com >> linux.html

2.2: you can use curl's built-in option:-o (lowercase) to save web pages

The code is as follows:

# curl -o linux.html http://www.linux.com

After execution, the following display appears; 100% indicates that the save succeeded.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684    0 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k

2.3: you can use curl's built-in option:-O (uppercase) to save files in web pages

Note that the URL here must point to a specific file; otherwise there is nothing to fetch.

The code is as follows:

# curl -O http://www.linux.com/hello.sh

3. Test the returned value of the web page

The code is as follows:

# curl -o /dev/null -s -w %{http_code} www.linux.com

Ps: in scripts, this is a common way to test whether a website is working properly
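The command above can be wrapped in a small script. The function name and the failure branch below are illustrative assumptions, not part of the original command:

```shell
#!/bin/sh
# Hypothetical health-check sketch built on the -o/-s/-w trick above.
check_site() {
    # Fetch only the HTTP status code, discarding the body ( -o /dev/null )
    # and the progress output ( -s ).
    code=$(curl -o /dev/null -s -w '%{http_code}' "$1")
    if [ "$code" = "200" ]; then
        echo "OK"
    else
        # curl prints 000 when no HTTP response was received at all
        echo "DOWN ($code)"
    fi
}

# Usage (placeholder URL):
# check_site http://www.linux.com
```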

4. Specify the proxy server and its port

You often need a proxy server to access the Internet (for example, when you are behind a corporate proxy, or when a site has blocked your IP address because you fetched it with curl). Fortunately, curl supports setting a proxy with the built-in option -x.

The code is as follows:

# curl -x 192.168.100.100:1080 http://www.linux.com

5. Cookies

Some websites use cookies to record session information. Browsers like Chrome handle cookie information easily, but it is also easy to handle cookies in curl by adding the relevant parameters.

5.1: save the cookie information from the HTTP response. Built-in option: -c (lowercase)

The code is as follows:

# curl -c cookiec.txt http://www.linux.com

After execution, the cookie information is stored in the cookiec.txt.

5.2: save the header information from the HTTP response. Built-in option: -D

The code is as follows:

# curl -D cookied.txt http://www.linux.com

After execution, the header information (including the cookies) is stored in cookied.txt.

Note: the cookie file generated by -c (lowercase) is different from the header dump produced by -D.

5.3: use cookies

Many websites check your cookie information to determine whether you are visiting their site according to the rules, so we need to send the saved cookie information. Built-in option: -b

The code is as follows:

# curl -b cookiec.txt http://www.linux.com

6. Imitate the browser

Some sites require a specific browser, or even a specific browser version, to access them. curl's built-in option -A lets us specify the browser to present when visiting the website.

The code is as follows:

# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com

In this way, the server side will think that it is accessed using IE8.0.

7. Forge the referer (hotlinking)

Many servers check the HTTP referer of a request to control access. For example, you first visit the home page and then visit the mailbox page linked from it; the referer of the mailbox request is then the home page's address. If the server finds that the referer of a request for the mailbox page is not the home page's address, it concludes the request is hotlinked.

curl's built-in option -e lets us set the referer.

The code is as follows:

# curl -e "www.linux.com" http://mail.linux.com

This will make the server think that you clicked a link from www.linux.com.

8. Download the file

8.1: download files using curl.

The code is as follows:

# use built-in option -o (lowercase)

# curl -o dodo1.jpg http://www.linux.com/dodo1.JPG

# use built-in option -O (uppercase)

# curl -O https://cache.yisu.com/upload/information/20210312/303/123386.JPG

This saves the file locally with the name on the server

8.2: circular download

Sometimes the pictures to download share the same name prefix and differ only in a numeric suffix.

The code is as follows:

# curl -O http://www.linux.com/dodo[1-5].JPG

In this way, all of dodo1 through dodo5 will be saved.

8.3: download rename

The code is as follows:

# curl -O http://www.linux.com/{hello,bb}/dodo[1-5].JPG

Because the file names under both hello and bb are dodo1 through dodo5, the second set of downloads overwrites the first, so the downloaded files need to be renamed.

The code is as follows:

# curl -o #1_#2.JPG http://www.linux.com/{hello,bb}/dodo[1-5].JPG

In this way, the file downloaded as hello/dodo1.JPG becomes hello_dodo1.JPG, and so on for the other files, effectively preventing them from being overwritten.

8.4: block download

Sometimes the download is relatively large; in that case we can download it in segments. Use built-in option: -r

The code is as follows:

# curl -r 0-100 -o dodo1_part1.JPG https://cache.yisu.com/upload/information/20210312/303/123386.JPG

# curl -r 100-200 -o dodo1_part2.JPG https://cache.yisu.com/upload/information/20210312/303/123386.JPG

# curl -r 200- -o dodo1_part3.JPG https://cache.yisu.com/upload/information/20210312/303/123386.JPG

# cat dodo1_part* > dodo1.JPG

This way you can view the complete contents of dodo1.JPG.

8.5: download files through ftp

curl can download files over FTP, and it provides two syntaxes for downloading from an FTP server.

The code is as follows:

# curl -O -u username:password ftp://www.linux.com/dodo1.JPG

# curl -O ftp://username:password@www.linux.com/dodo1.JPG

8.6: display the download progress bar

The code is as follows:

# curl -# -O https://cache.yisu.com/upload/information/20210312/303/123386.JPG

8.7: do not display download progress information

The code is as follows:

# curl -s -O https://cache.yisu.com/upload/information/20210312/303/123386.JPG

9. Breakpoint continuation

On Windows, we can use software like Xunlei to resume interrupted downloads. curl can achieve the same effect with the built-in option -C.

If your connection drops while downloading dodo1.JPG, you can resume the download as follows

The code is as follows:

# curl -C - -O https://cache.yisu.com/upload/information/20210312/303/123386.JPG

10. Upload files

curl can not only download files but also upload them. This is done with the built-in option -T.

The code is as follows:

# curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/

This uploads the file dodo1.JPG to the ftp server

11. Display fetch errors

The code is as follows:

# curl -f http://www.linux.com/error

Other parameters (reproduced here for reference):

-a/--append appends to the target file when uploading

--anyauth can use "any" authentication method

--basic uses HTTP basic authentication

-B/--use-ascii uses ASCII text transfer

-d/--data transfers data in HTTP POST mode

--data-ascii posts data as ASCII

--data-binary posts data as binary

--negotiate uses HTTP Negotiate authentication

--digest uses HTTP digest authentication

-- disable-eprt prohibits the use of EPRT or LPRT

-- disable-epsv prohibits the use of EPSV

-- egd-file sets EGD socket path for random data (SSL)

-- tcp-nodelay uses TCP_NODELAY option

-E/--cert client certificate file and password (SSL)

-- cert-type certificate file type (DER/PEM/ENG) (SSL)

-- key private key file name (SSL)

-- key-type private key file type (DER/PEM/ENG) (SSL)

-- pass private key password (SSL)

--engine crypto engine to use (SSL); "--engine list" shows the available engines

--cacert CA certificate (SSL)

--capath CA directory (made using c_rehash) to verify the peer against (SSL)

--ciphers SSL ciphers to use

--compressed requests a compressed response (using deflate or gzip)

--connect-timeout sets the maximum time allowed for the connection

--create-dirs creates the local directory hierarchy

--crlf converts LF to CRLF when uploading

-- ftp-create-dirs create a remote directory if it does not exist

-- ftp-method [multicwd/nocwd/singlecwd] controls the use of CWD

--ftp-pasv uses PASV/EPSV instead of PORT

--ftp-skip-pasv-ip ignores the IP address when using PASV

-- ftp-ssl attempts to use SSL/TLS for ftp data transmission

-- ftp-ssl-reqd requires SSL/TLS for ftp data transmission

-F/--form simulates http form submission data

-form-string simulates http form submission data

-g/--globoff disables URL sequences and ranges using {} and []

-G/--get sends data in get mode

-h/--help help

-H/--header custom header information is passed to the server

--ignore-content-length ignores the HTTP Content-Length header

-i/--include output includes header information

-I/--head displays only document information

-j/--junk-session-cookies ignores session cookie when reading files

-- interface uses the specified network interface / address

-- krb4 uses krb4 with a specified security level

-k/--insecure allows connections to SSL sites without certificate verification

-K/--config reads the specified configuration file

-l/--list-only lists the names of files in the ftp directory

-- limit-rate sets the transmission speed

-- local-port forces the use of local port numbers

-m/--max-time sets the maximum transfer time

--max-redirs sets the maximum number of redirects to follow

--max-filesize sets the maximum size of a downloaded file

-M/--manual displays the full manual

-n/--netrc reads username and password from netrc file

--netrc-optional uses .netrc or the URL; overrides -n

-- ntlm uses HTTP NTLM authentication

-N/--no-buffer disables buffered output

-p/--proxytunnel tunnels through the HTTP proxy

--proxy-anyauth chooses any proxy authentication method

--proxy-basic uses basic authentication on the proxy

--proxy-digest uses digest authentication on the proxy

--proxy-ntlm uses NTLM authentication on the proxy

-P/--ftp-port uses port addresses instead of PASV

-Q/--quote sends a command to the server before the file transfer

--random-file file to read random data from (SSL)

-R/--remote-time keeps the remote file's timestamp when generating the local file

--retry sets the number of retries when the transfer has problems

--retry-delay sets the retry interval when the transfer has problems

--retry-max-time sets the maximum retry time when the transfer has problems

-S/--show-error shows errors

-- socks4 uses socks4 to proxy a given host and port

-- socks5 uses socks5 to proxy a given host and port

-t/--telnet-option Telnet option setting

--trace writes a debug trace to the specified file

--trace-ascii like --trace but without hex output

--trace-time adds timestamps to trace/verbose output

--url specifies the URL to work with

-U/--proxy-user sets the proxy username and password

-V/--version displays version information

-X/--request specifies the request command (method) to use

-y/--speed-time the time the speed must stay below the limit before the transfer is abandoned; the default is 30

-Y/--speed-limit the minimum transfer speed; below this for speed-time seconds, the transfer stops

-z/--time-cond transfer time setting

-0/--http1.0 uses HTTP 1.0

-1/--tlsv1 uses TLSv1 (SSL)

-2/--sslv2 uses SSLv2 (SSL)

-3/--sslv3 uses SSLv3 (SSL)

-- 3p-quote like-Q for the source URL for 3rd party transfer

-- 3p-url uses url for third party transmission

-- 3p-user uses username and password for third-party transmission

-4/--ipv4 uses IPv4

-6/--ipv6 uses IPv6
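Several of the POST-related options in the table above (-d, -H, -X) are commonly combined. The sketch below is a hypothetical example; the URL, header name, and field names are placeholders, not a real API:

```shell
#!/bin/sh
# Hypothetical POST sketch combining -d, -H and -X from the table above.
post_form() {
    # $1: target URL (placeholder). -s silences progress output,
    # -o /dev/null discards the body, -w prints the status code,
    # -H adds a custom header, -d supplies the form body.
    # -X POST is shown for illustration; -d already implies POST.
    curl -s -o /dev/null \
         -w '%{http_code}' \
         -H 'X-Demo-Header: 1' \
         -X POST \
         -d 'user=dodo&action=login' \
         "$1"
}

# Usage (placeholder URL):
# post_form http://www.linux.com/login
```

For multipart form uploads, -F can take the place of -d, e.g. -F "img=@dodo1.JPG".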

Use the curl command to get the file download speed

You can download web content with curl; to find out the download speed curl achieved, use the following command:

The code is as follows:

# curl -Lo /dev/null -skw "%{speed_download}\n" http://mirrors.163.com/ubuntu/ls-lR.gz

226493.000

Of course, you can also get more data such as connection time, redirection time, and so on:

The code is as follows:

# curl -Lo /dev/null -skw "time_connect: %{time_connect}s\ntime_namelookup: %{time_namelookup}s\ntime_pretransfer: %{time_pretransfer}s\ntime_starttransfer: %{time_starttransfer}s\ntime_redirect: %{time_redirect}s\nspeed_download: %{speed_download}B/s\ntime_total: %{time_total}s\n\n" http://www.sina.com

time_connect: 0.154s

time_namelookup: 0.150s

time_pretransfer: 0.154s

time_starttransfer: 0.163s

time_redirect: 0.157s

speed_download: 324679.000B/s

time_total: 1.692s

At this point, I believe you have a deeper understanding of how to use the curl command on a Linux system. Try it out in practice, and follow us to keep learning!
