
What is the use of the curl command in Linux


This article introduces the use of the curl command in Linux. It is fairly detailed and should be a useful reference; if you are interested, read it through to the end.

curl is a very practical tool for transferring data to and from servers. It supports many protocols, including DICT, FILE, FTP, FTPS, HTTP and HTTPS, which makes it a very powerful command-line HTTP client. The following sections walk through the specific uses of the curl command in Linux.

Syntax:

# curl [option] [url]

Common parameters:

-A/--user-agent <string>      set the User-Agent sent to the server
-b/--cookie <name=value/file> cookie string, or file to read cookies from
-c/--cookie-jar <file>        write cookies to this file after the operation ends
-C/--continue-at <offset>     resume a transfer at the given offset
-D/--dump-header <file>       write the received header information to this file
-e/--referer <URL>            referer (source) URL
-f/--fail                     do not show output on HTTP errors; fail silently
-o/--output <file>            write output to this file
-O/--remote-name              write output to a local file named after the remote file
-r/--range <range>            retrieve only a byte range from an HTTP/1.1 or FTP server
-s/--silent                   silent mode; do not output anything
-T/--upload-file <file>       upload a file
-u/--user <user:password>     set the server user and password
-w/--write-out [format]       what to output after the transfer completes
-x/--proxy <host[:port]>      use an HTTP proxy on the given port
-#/--progress-bar             show the current transfer status as a progress bar

Examples:

1. Basic usage

# curl http://www.linux.com

After execution, the HTML of www.linux.com is printed to the screen. Note: since Linux servers are often installed without a desktop, and therefore without a browser, this method is frequently used to test whether a server can reach a given website.
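If you only want to check reachability without printing the whole page, a minimal sketch is to fetch just the response headers with -I/--head and give up quickly with --connect-timeout (both options appear in the full parameter list at the end of this article):

# curl -I --connect-timeout 5 http://www.linux.com

Here 5 seconds is an arbitrary timeout chosen for illustration; -I sends a HEAD request, so only the headers are returned.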

2. Save a visited web page

2.1: Save it using Linux's output redirection

# curl http://www.linux.com >> linux.html

2.2: Save the web page using curl's built-in -o option (lowercase)

$ curl -o linux.html http://www.linux.com

When execution completes, the following output is displayed; 100% indicates that the save succeeded.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684  100 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k

2.3: Use curl's built-in -O option (uppercase) to save a file from a web page. Note that the URL here must point to a specific file, otherwise nothing will be fetched.

# curl -O http://www.linux.com/hello.sh

3. Test a page's HTTP return code

# curl -o /dev/null -s -w %{http_code} www.linux.com

Note: this is a common way for scripts to test whether a website is working properly.
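As a minimal sketch of how this is typically used in a script (the URL and the expected status code of 200 are assumptions for illustration):

#!/bin/bash
# Fetch only the HTTP status code; discard the body (-o /dev/null) and stay quiet (-s).
status=$(curl -o /dev/null -s -w "%{http_code}" http://www.linux.com)

if [ "$status" -eq 200 ]; then
    echo "site is up (HTTP $status)"
else
    echo "site check failed (HTTP $status)"
fi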

4. Specify a proxy server and its port

You often need to go through a proxy server to access the Internet (for example, your network requires one, or your IP address has been blocked by a site you were fetching with curl). Fortunately, curl supports setting a proxy with the built-in -x option.

# curl -x 192.168.100.100:1080 http://www.linux.com
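If the proxy is a SOCKS5 proxy rather than an HTTP proxy, the --socks5 option (listed in the full parameter list at the end of this article) can be used instead; the address below is the same placeholder as above:

# curl --socks5 192.168.100.100:1080 http://www.linux.com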

5. Cookies

Some websites use cookies to record session information. A browser such as Chrome handles cookies easily, and curl can handle them just as easily by adding the relevant options.

5.1: Save the cookie information from the HTTP response. Built-in option: -c (lowercase)

# curl -c cookiec.txt http://www.linux.com

After execution, the cookie information is stored in cookiec.txt.

5.2: Save the header information from the HTTP response. Built-in option: -D

# curl -D cookied.txt http://www.linux.com

After execution, the header (including cookie) information is stored in cookied.txt.

Note: the cookie file generated by -c (lowercase) is in a different format from the headers saved with -D.

5.3: Use cookies. Many websites check your cookies to determine whether you are visiting the site according to their rules, so we need to send the saved cookie information. Built-in option: -b

# curl -b cookiec.txt http://www.linux.com
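Putting 5.1 and 5.3 together, a typical flow is to save the cookies from a first request and then send them back with the next one (member.html is a hypothetical page used only for illustration):

# curl -c cookiec.txt http://www.linux.com
# curl -b cookiec.txt http://www.linux.com/member.html

The second request carries whatever session cookies the first response set, so the server treats both requests as belonging to the same session.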

6. Imitating a browser

Some websites can only be accessed with a specific browser, or a specific browser version. curl's built-in -A option lets us specify a User-Agent when visiting them.

# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com

The server will then assume the request came from IE 8.0.

7. Fake a referer (anti-hotlinking)

Many servers check the HTTP Referer header to control access. For example, you visit the home page first and then open the mailbox page from it, so the Referer of the mailbox request is the home page's address. If the server finds that the Referer on a request for the mailbox page is not the home page's address, it concludes that the request is a hotlinked (illegitimate) access. curl's built-in -e option lets us set the Referer.

# curl -e "www.linux.com" http://mail.linux.com

The server will then believe you clicked a link on www.linux.com to get there.

8. Download files

8.1: Download a file with curl, using the built-in -o option (lowercase):

# curl -o dodo1.jpg http://www.linux.com/dodo1.JPG

Or use the built-in -O option (uppercase):

# curl -O http://www.linux.com/dodo1.JPG

The file is then saved locally under its name on the server.

8.2: Batch download

Sometimes the images to download share the same name prefix and differ only in a trailing suffix or number.

# curl -O http://www.linux.com/dodo[1-5].JPG

This saves all of dodo1.JPG through dodo5.JPG.

8.3: Rename while downloading

# curl -O http://www.linux.com/{hello,bb}/dodo[1-5].JPG

Because the files under both hello and bb are named dodo1.JPG through dodo5.JPG, the second batch of downloads overwrites the first, so the files need to be renamed.

# curl -o "#1_#2.JPG" http://www.linux.com/{hello,bb}/dodo[1-5].JPG

Here #1 expands to the {hello,bb} part and #2 to the [1-5] part, so hello/dodo1.JPG is saved as hello_1.JPG, bb/dodo1.JPG as bb_1.JPG, and so on, which effectively prevents the files from overwriting each other.

8.4: Segmented (range) download

Sometimes the file being downloaded is fairly large, and we can download it in segments using the built-in -r option.

# curl -r 0-100 -o dodo1_part1.JPG http://www.linux.com/dodo1.JPG
# curl -r 100-200 -o dodo1_part2.JPG http://www.linux.com/dodo1.JPG
# curl -r 200- -o dodo1_part3.JPG http://www.linux.com/dodo1.JPG
# cat dodo1_part* > dodo1.JPG

After concatenating the parts you can view the complete dodo1.JPG.
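To decide where to split, you can first check the file's size; a small sketch using -s and -I (assuming the server reports a Content-Length header for the file):

# curl -s -I http://www.linux.com/dodo1.JPG | grep -i Content-Length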

8.5: Download files over FTP

curl can also download files over FTP, and it provides two syntaxes for doing so:

# curl -O -u username:password ftp://www.linux.com/dodo1.JPG
# curl -O ftp://username:password@www.linux.com/dodo1.JPG

8.6: Display a download progress bar

# curl -# -O http://www.linux.com/dodo1.JPG

8.7: Do not display any download progress information

# curl -s -O http://www.linux.com/dodo1.JPG

9. Resume an interrupted download

On Windows we can use software such as Xunlei to resume interrupted downloads. curl achieves the same effect with the built-in -C option. If the connection drops while you are downloading dodo1.JPG, you can resume the download as follows:

# curl -C - -O http://www.linux.com/dodo1.JPG

10. Upload files

curl can not only download files but also upload them, using the built-in -T option.

# curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/

This uploads dodo1.JPG to the FTP server.

11. Display fetch errors

# curl -f http://www.linux.com/error
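A minimal sketch of how -f is typically combined with -s in a script, so that an HTTP error surfaces as a non-zero exit code rather than an error page (the URL is the article's placeholder):

#!/bin/bash
# -f: fail (non-zero exit) on HTTP errors instead of printing the server's error page
# -s: silent mode, no progress meter
if curl -f -s -o /dev/null http://www.linux.com/error; then
    echo "fetch succeeded"
else
    echo "fetch failed, curl exit code $?"
fi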

Other parameters (this section is reproduced from another source):

-a/--append                 when uploading, append to the target file
--anyauth                   allow any authentication method
--basic                     use HTTP Basic authentication
-B/--use-ascii              use ASCII/text transfer
-d/--data <data>            send data with an HTTP POST
--data-ascii <data>         POST the data as ASCII
--data-binary <data>        POST the data as binary
--negotiate                 use HTTP Negotiate authentication
--digest                    use Digest authentication
--disable-eprt              do not use EPRT or LPRT
--disable-epsv              do not use EPSV
--egd-file <file>           EGD socket path for random data (SSL)
--tcp-nodelay               use the TCP_NODELAY option
-E/--cert <cert[:passwd]>   client certificate file and password (SSL)
--cert-type <type>          certificate file type (DER/PEM/ENG) (SSL)
--key <key>                 private key file name (SSL)
--key-type <type>           private key file type (DER/PEM/ENG) (SSL)
--pass <pass>               private key password (SSL)
--engine <eng>              crypto engine to use (SSL); "--engine list" shows the available engines
--cacert <file>             CA certificate to verify the peer against (SSL)
--capath <directory>        CA directory (made using c_rehash) to verify the peer against (SSL)
--ciphers <list>            SSL ciphers to use
--compressed                request a compressed response (deflate or gzip)
--connect-timeout <seconds> maximum time allowed for the connection
--create-dirs               create the necessary local directory hierarchy
--crlf                      convert LF to CRLF on upload
--ftp-create-dirs           create remote directories if they do not exist
--ftp-method [multicwd/nocwd/singlecwd]  control how CWD is used
--ftp-pasv                  use PASV/EPSV instead of PORT
--ftp-skip-pasv-ip          when using PASV, ignore the IP address returned by the server
--ftp-ssl                   try SSL/TLS for the FTP transfer
--ftp-ssl-reqd              require SSL/TLS for the FTP transfer
-F/--form <name=content>    simulate an HTTP form submission
--form-string <name=string> simulate an HTTP form submission with a literal string value
-g/--globoff                disable URL sequences and ranges using {} and []
-G/--get                    send the data with a GET request
-h/--help                   help
-H/--header <line>          custom header to pass to the server
--ignore-content-length     ignore the Content-Length of the HTTP header
-i/--include                include the header information in the output
-I/--head                   show document (header) information only
-j/--junk-session-cookies   ignore session cookies when reading from a file
--interface <interface>     use the specified network interface/address
--krb4 <level>              use krb4 with the specified security level
-k/--insecure               allow connections to SSL sites without certificates
-K/--config <file>          read configuration from the specified file
-l/--list-only              list only the names of files in an FTP directory
--limit-rate <rate>         limit the transfer speed
--local-port <num>          force use of a specific local port number
-m/--max-time <seconds>     maximum transfer time
--max-redirs <num>          maximum number of redirects to follow
--max-filesize <bytes>      maximum size of a file to download
-M/--manual                 show the full manual
-n/--netrc                  read username and password from the .netrc file
--netrc-optional            use .netrc or the URL; overrides -n
--ntlm                      use HTTP NTLM authentication
-N/--no-buffer              disable buffering of the output
-p/--proxytunnel            tunnel through the HTTP proxy
--proxy-anyauth             pick any proxy authentication method
--proxy-basic               use Basic authentication on the proxy
--proxy-digest              use Digest authentication on the proxy
--proxy-ntlm                use NTLM authentication on the proxy
-P/--ftp-port <address>     use a PORT address instead of PASV
-Q/--quote <cmd>            send a command to the server before the transfer
--random-file <file>        file to read random data from (SSL)
-R/--remote-time            when creating the local file, keep the remote file's timestamp
--retry <num>               number of retries when the transfer has problems
--retry-delay <seconds>     interval between retries when the transfer has problems
--retry-max-time <seconds>  maximum total time to spend retrying when the transfer has problems
-S/--show-error             show errors
--socks4 <host[:port]>      use a SOCKS4 proxy at the given host and port
--socks5 <host[:port]>      use a SOCKS5 proxy at the given host and port
-t/--telnet-option <OPT=val> set a telnet option
--trace <file>              write a debug trace to the specified file
--trace-ascii <file>        like --trace, but without the hex output
--trace-time                add timestamps to trace/verbose output
--url <URL>                 the URL to work with
-U/--proxy-user <user:password>  set the proxy username and password
-V/--version                show version information
-X/--request <command>      specify the request command (method) to use
-y/--speed-time <seconds>   time the transfer must stay below the speed limit before it is abandoned; the default is 30
-Y/--speed-limit <speed>    stop the transfer if it runs slower than this limit for speed-time seconds
-z/--time-cond <time>       transfer only if the time condition is met
-0/--http1.0                use HTTP 1.0
-1/--tlsv1                  use TLSv1 (SSL)
-2/--sslv2                  use SSLv2 (SSL)
-3/--sslv3                  use SSLv3 (SSL)
--3p-quote                  like -Q, for the source URL in a third-party transfer
--3p-url                    source URL for a third-party transfer
--3p-user                   username and password for the third-party transfer source
-4/--ipv4                   use IPv4
-6/--ipv6                   use IPv6

That is all of "What is the use of the curl command in Linux". Thank you for reading! We hope the content shared here helps you; for more related knowledge, you are welcome to follow the industry information channel.
