
Power Tools

tips

  • Simple calculator
    # usage: c 60/36
    # 1.66666
    c() {
        echo "$@" | bc -l
    }
  • Open a shell in another encoding: luit
    luit -encoding big5 telnet ptt.cc
    luit -encoding gbk telnet bbs.sjtu.edu.cn
  • bash -x script.sh prints each command as it runs (trace a script)
  • Set the time zone: CST - China Standard Time, UTC+08:00
    • Debian:
      $ sudo dpkg-reconfigure tzdata
  • Chinese: How to set up a clean UTF-8 environment in Linux
    # Problem
    # locale: Cannot set LC_CTYPE to default locale: No such file or directory
    # locale: Cannot set LC_ALL to default locale: No such file or directory
     
    # Fix
    $ sudo aptitude install locales
    $ sudo dpkg-reconfigure locales
    $ vi ~/.bashrc
    export LC_ALL=en_US.UTF-8
    export LANG=en_US.UTF-8
    export LANGUAGE=en_US.UTF-8
    $ . ~/.bashrc
  • Make sudo use the current user's exported environment variables: sudo -E (e.g. lets sudo use http_proxy)
  • Make sudo recognize the current user's aliases: alias sudo='sudo ' (see the sketch after this list)
  • Prefix a command with a space and it won't be saved to history (with HISTCONTROL=ignorespace)
  • logger $MSG writes $MSG to syslog
  • Tclip does smart image cropping with OpenCV; it has a PHP extension and a shell command
  • Convert flv to mp4: avconv -i input.flv -codec copy output.mp4
  • uptime shows how long the machine has been up; who -b shows when it booted
    $ uptime
    11:01  up 7 days, 18:36, 7 users, load averages: 1.03 7.74 7.49
    $ who -b
    reboot   ~        Mar 12 16:25
  • About ANSI color
  • xargs turns stdin into arguments for a command
    # deploy.sh for some node programs I wrote:
    #!/bin/bash
    cp -r deb/etc/* /etc/
     
    # and remove.sh:
    #!/bin/bash
    find deb/etc -type f | sed 's|deb/etc|/etc|' | xargs sudo rm
     
    # xargs -p asks for confirmation
    $ find deb/etc -type f | sed 's|deb/etc|/etc|' | xargs -p rm
    rm /etc/monit/conf.d/foo /etc/cron.d/foo ?...
    # runs only after y + Enter
     
     
    # xargs -I @ replaces a placeholder with each argument, so arguments can sit in the middle of the command (not just at the end)
    $ ls | xargs -n 1 -I @ echo @ hehe
    a hehe
    b hehe
    c.csv hehe
     
    # to run multiple commands, use sh -c
    $ find . -type d -name "*log*" | xargs -I {} sh -c "echo {};ls -la {} | tail -2"
  • seq 1 100 generates a number sequence, like range() in some languages
    # ping an entire network range
    for i in `seq 1 254`; do ping -t 3 -c 5 10.10.0.$i; done
  • How to use 'cp' command to exclude a specific directory?
    $ rsync -av --progress sourcefolder destinationfolder --exclude thefoldertoexclude
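
Why the trailing-space alias for sudo works (a minimal sketch): bash expands aliases only for the first word of a command, but when an alias's value ends in a space, the next word is checked for alias expansion too.

$ alias ll='ls -la'
$ alias sudo='sudo '
$ sudo ll
# now runs: sudo ls -la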

paste

paste merges several files column by column

$ paste -d, file1 file2
Linux,Suse
Unix,Fedora
Solaris,CentOS
HPUX,OEL
AIX,Ubuntu

Note: if lines end with \r\n, paste misbehaves; run dos2unix first! The same line-ending caveat applies to every text processing tool!

tee - read from standard input and write to standard output and files

# tcpdump to both screen and file (tcpdump -w output is not directly human-readable)
$ tcpdump -n -i eth0 host 111.111.111.111 | tee -a /tmp/111.log

grep

To print lines of text that match one or more patterns. This is often the first stage in a pipeline that does further processing on matched data.

grep [ options ... ] pattern-spec [ files ... ]

Major options

Pattern options

  • -P
    Use Perl regex syntax, which supports .*? for non-greedy matching; see regex - Non greedy grep - Stack Overflow
  • -E
    Match using extended regular expressions. grep -E replaces the traditional egrep command.
  • -F
    Match using fixed strings instead of regular expressions. grep -F replaces the traditional fgrep command.
  • -e pattern
    When the pattern itself starts with a dash (e.g. -.*), grep treats it as an option; use -e '-.*', though escaping it as \-.* is more intuitive
  • -i
    Ignore lettercase when doing pattern matching.
  • -f pat-file
    Read patterns from the file pat-file.
  • multiple patterns can be quoted together, entered one per line

Output options

  • -o
    Print only the matched part, not the whole line
  • -v
    Print lines that don't match the pattern.
  • -l
    Print only the names of matching files
  • -h
    Don't show the matching filename (when searching multiple files)
  • -H
    Show the matching filename (even when searching a single file)
  • -q
    Be quiet. Instead of writing lines to standard output, grep exits successfully if it matches the pattern, unsuccessfully otherwise.
  • -s
    Suppress error messages. This is often used together with -q.

zgrep

search possibly compressed files for a regular expression

zgrep has no -r option; recursive searching needs find -exec
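
A minimal sketch of a recursive zgrep built on find (the path and pattern here are placeholders):

$ find /var/log -name '*.gz' -exec zgrep -H 'ERROR' {} \;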

grep OR

# SEND or RECV

$ grep 'SEND\|RECV' deviceg.log

# extended regexp
$ grep -E 'SEND|RECV' deviceg.log

# PERL regexp
$ grep -P '(SEND|RECV)' deviceg.log

7 Linux Grep OR, Grep AND, Grep NOT Operator Examples

-R + exclude dir

GNU grep (>= 2.5.2) provides:

--exclude-dir=dir
    Exclude directories matching the pattern dir from recursive directory searches. 
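
For example, a recursive search that skips .git (assuming GNU grep):

$ grep -R --exclude-dir=.git TODO .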

date

# current time as a Unix timestamp
$ date +%s
 
# timestamp to a readable time
$ date -d@1234567890
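
GNU date also takes a format string, e.g. (-u prints UTC):

$ date -u -d @1234567890 '+%Y-%m-%d %H:%M:%S'
2009-02-13 23:31:30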

sed

Some ways to anonymize an SQL dump:

# mask phone numbers
$ sed -i "s/'1[0-9]\{10\}'/'13900000000'/g" prod.sql
 
# mask email addresses: replace every domain with @example.
$ sed -i 's/@[^\.]*\./@example./g' prod.sql

sed can't use .*? for non-greedy matching, but you can take another tack and match [^terminator]* instead, for example:

# strip everything from " yy-track" up to the next ">"
ack 'yy-track' www/templates/ | sed 's/ yy-track[^>]*//'
  • substitute: s/$from/$to/[g]
    • $to needs its special characters escaped; echo $to first to preview
  • delete: /match/d
  • append at the end of the file: sed '$a text to append' (sketches below)
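
Minimal sketches of the commands above (notes.txt is a hypothetical file; the one-line append form is a GNU sed extension):

$ sed 's/foo/bar/g' notes.txt
# substitute
$ sed '/^#/d' notes.txt
# delete comment lines
$ sed '$a END OF FILE' notes.txt
# append a final line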

sed behaves differently on Mac and Linux: on Mac, the first argument after sed -i is the backup suffix

# If you have:
#
# File1.txt
# File2.cfg
#
# The command:
$ sed -i '.original' 's/old_link/new_link/g' *
 
# create 2 backup files like:
#
# File1.txt.original
# File2.cfg.original
 
# Use
$ sed -i '' 's/old_link/new_link/g' *
# to skip the backups.

find

-exec multiple commands

$ find . -name "*.txt" -exec echo {} \; -exec grep banana {} \;

bash - find -exec with multiple commands - Stack Overflow

when -exec needs a pipe (|)

# Problem: a batch of user data was lost, with no logs; the data rarely changes, so locate the loss date from the daily database backups
find backups/2013* -name db.gz | while read file
do
  echo $file `zgrep table_name $file | wc -c`
done

exclude (some) dir

# count lines only for files with certain extensions
wc -l `find . -type f \( -name "*.css" -o -name "*.js" -o -name "*.php" -o -name "*.phtml" \)`
 
# exclude .git (to ignore a directory and the files under it, use -prune with a -path pattern)
$ wc -l `find . -path ./.git -prune -o -type f -print` | tail
 
# exclude multiple directories
$ wc -l `find . \( -path ./.git -o -path ./deploy \) -prune -o -type f -print`


Text Processing Tools

awk, banner, basename, comm, csplit, cut, dirname, ed, ex, fmt, head, iconv, join, less, more, paste, sed, sort, spell, strings, tail, tr, uniq, vi, wc, xargs

When a text processing tool reads a file and the result is redirected back to the same file, the file will (very likely) end up empty!

Don't use >; use tee!

$ sort foo -o foo
# ok
$ uniq foo foo
# the file ends up empty!
$ uniq foo | cat > foo
# the file ends up empty!
$ uniq foo | tee foo
# ok (though this can still race on large files)
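
A more robust alternative, assuming moreutils is installed, is sponge, which soaks up all of stdin before opening the output file:

$ uniq foo | sponge foo
# ok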

comm

Compare sorted files FILE1 and FILE2 line by line.

$ comm [OPTION]... FILE1 FILE2

With no options, produce three-column output. Column one contains lines unique to FILE1, column two contains lines unique to FILE2, and column three contains lines common to both files.

  • -1
    suppress column 1 (lines unique to FILE1)
  • -2
    suppress column 2 (lines unique to FILE2)
  • -3
    suppress column 3 (lines that appear in both files)
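
A minimal example (a.txt and b.txt are hypothetical; both inputs must be sorted):

$ printf 'apple\nbanana\ncherry\n' > a.txt
$ printf 'banana\ncherry\ndate\n' > b.txt
# lines common to both
$ comm -12 a.txt b.txt
banana
cherry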

cat

The simplest editor

$ cat > foo
# type the content
C-d (end input with EOF; C-c also works)

cut

To select one or more fields or groups of characters from an input file, presumably for further processing within a pipeline.

Usage

cut -c list [ file ... ] 
cut -f list [ -d delim ] [ file ... ] 

Major options

  • -c list
    Cut based on characters. list is a comma-separated list of character numbers or ranges, such as 1,3,5-12,42.
  • -d delim
    Use delim as the delimiter with the -f option. The default delimiter is the tab character.
  • -f list
    Cut based on fields. list is a comma-separated list of field numbers or ranges.
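
A quick sketch on /etc/passwd-style input:

$ echo 'root:x:0:0' | cut -d: -f1,3
root:0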

join

To merge records in sorted files based on a common key.

Usage

join [ options ... ] file1 file2

Major options

  • -1 field1 -2 field2
    Specifies the fields on which to join. -1 field1 specifies field1 from file1, and -2 field2 specifies field2 from file2. Fields are numbered from one, not from zero.
  • -o file.field
    Make the output consist of field field from file file. The common field is not printed unless requested explicitly. Use multiple -o options to print multiple output fields.
  • -t separator
    Use separator as the input field separator instead of whitespace. This character becomes the output field separator as well.
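
A minimal sketch (users.txt and roles.txt are hypothetical, both sorted on the join field):

$ printf '1 alice\n2 bob\n' > users.txt
$ printf '1 admin\n2 guest\n' > roles.txt
$ join users.txt roles.txt
1 alice admin
2 bob guest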

tr

translate or delete characters

Synopsis

tr [OPTION]... SET1 [SET2]

Examples

# list php PIDs (which can then be fed to kill)
$ ps ax | grep php | awk '{print $1}' | tr "\\n" " "
 
# Convert lower case to upper case
$ tr a-z A-Z
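
tr -d deletes characters, e.g. stripping \r as a lightweight dos2unix (the file names are placeholders):

$ tr -d '\r' < dos.txt > unix.txt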

awk - you write the body of the implicit per-line loop

Formatting with printf: The GNU Awk User's Guide: Printf Examples

# input (field 4 of each |-separated line):
# 235411
# 235657
# 49
# 51
# 3443
$ awk 'BEGIN{FS="|"}{printf "%06d\n", $4}'
235411
235657
000049
000051
003443
# sample input (logs.txt) for the examples below
07.46.199.184 [28/Sep/2010:04:08:20] "GET /robots.txt HTTP/1.1" 200 0 "msnbot"
123.125.71.19 [28/Sep/2010:04:20:11] "GET / HTTP/1.1" 304 - "Baiduspider"

Print the whole line

$ awk '{print $0}'

Print the first whitespace-separated (space/tab) field

$ awk '{print $1}' logs.txt

Print the first field and the third-from-last field

$ awk '{print $1, $(NF-2) }' logs.txt

Formatted output; NR is the number of the input line currently being processed

$ awk '{print NR ") " $1 " -> " $(NF-2)}' logs.txt

Split on ":"

# FS for Field Separator
$ awk '{print $2}' logs.txt | awk 'BEGIN{FS=":"}{print $1}' | sed 's/\[//'
# output
28/Sep/2010
28/Sep/2010

Conditionals: keep only the lines whose status is 200

$ awk '{if ($(NF-2)=="200") {print $0}}' logs.txt

Keeping state across lines: sum all the HTTP status fields in the file

$ awk '{a+=$(NF-2); print "Total so far:", a}' logs.txt
# output
Total so far: 200
Total so far: 504

…calling print only once, at the end

$ awk '{a+=$(NF-2)}END{print "Total:", a}' logs.txt
# output
Total: 504

Print all columns except some

$ awk '{$1=$2=$3="";print}' file

lighttpd: requests per minute from access.log

awk '{print $4}' access.log | awk 'BEGIN{FS=":"}{print $1,$2":"$3}' |  uniq -c | sort -n

Convert a timestamp to a readable time

$ awk '{print strftime("%c",$1)}'

strftime is a gawk function. gawk (GNU awk) is an extended awk; Ubuntu's default is mawk, so gawk must be installed.

A brief introduction to awk, nawk, mawk and gawk

more examples

Quickly analyze IBM HTTP Server access logs with the awk command line

  • Find and display all requests with status code 404
    awk '($9 ~ /404/)' access.log
  • Track down who is hotlinking the site's images
    awk -F\" '($2 ~ /\.(jpg|gif|png)/ && $4 !~ /^http:\/\/www\.example\.com/) {print $4}' access.log | sort | uniq -c | sort
    # Note: change www.example.com to your own domain before use.
    #  - split each line on double quotes;
    #  - the request line must contain .jpg, .gif or .png;
    #  - the referer must not start with your site's domain (here, www.example.com);
    #  - print every referer and count the occurrences.
  • Count how many pages each IP requested
    awk '{++S[$1]} END {for (a in S) print a,S[a]}' log_file
  • All log entries with a response time over 3 seconds
    awk '($NF > 3){print $0}' access.log
    # Note: NF is the number of fields in the current record; $NF is the last field.

Convert a log split by variable-width runs of spaces or tabs into a tab-separated one

$ echo -e  "haha hahaha     hahahahah\thaha haha" | awk '{
for (i = 1; i <= NF; i++) {
  printf $i;
  if (i <= 2)
    printf "\t";
  else
    printf " ";
}
print ""
}' | cat -t
haha^Ihahaha^Ihahahahah haha haha

Explanation:

  1. The first two columns are the log timestamp; the rest is the log message
  2. awk splits columns on any whitespace by default
  3. awk can't directly print "all remaining columns", hence the for loop
  4. awk's print appends a newline; printf doesn't
  5. echo -e interprets special characters such as "\t" when printing
  6. cat -t displays "\t" as "^I" in the output

Periodically check a class of processes with ps, adding a date stamp to each line (awk date):

ps aux | grep gs| grep -v grep | awk -v date="$(date +"%H:%M")" '{ print date, $0}'

awk's print shows large numbers in scientific notation; printf avoids this:

awk '{sum += $1} END {printf "%.2f\n", sum}'
# swap two columns
$ awk ' { t = $1; $1 = $2; $2 = t; print; } ' input_file

# custom output separators: OFS (fields) and ORS (records)
$ awk 'BEGIN { OFS = ";"; ORS = "\n\n" }
>            { print $1, $2 }' mail-list

IF / ELSE

$ awk '{if (a!=$1) {a=$1; s+=$2} }END{print s}'
# dedupe on column 1, summing column 2

Which column number is which?

$ awk -F',' ' { for (i = 1; i <= NF; ++i) print i, $i; exit } ' file
 
# or wrap it up as a function
awkcol() {
  if [ "$#" -eq 1 ]; then
    awk '{ for (i = 1; i <= NF; ++i) print i, $i; exit }' $1
  else
    awk -F "$1" '{ for (i = 1; i <= NF; ++i) print i, $i; exit }' $2
  fi
 
}

AWK built-in variables

8 Powerful Awk Built-in Variables – FS, OFS, RS, ORS, NR, NF, FILENAME, FNR

  • FS: input field separator
  • OFS: output field separator
  • RS: input record separator
  • ORS: output record separator
  • NR: Number of Records Variable
  • NF: Number of Fields in a record
  • FILENAME: Name of the current input file
  • FNR: Number of Records relative to the current input file
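
A quick demo of FS, OFS, NR and NF:

$ printf 'a,b,c\nd,e\n' | awk 'BEGIN{FS=","; OFS="|"}{print NR, NF, $1}'
1|3|a
2|2|d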

head/tail/multitail

$ head -n 10
# first 10 lines
 
$ head -n -10
# everything except the last 10 lines
 
$ tail -n 10
# last 10 lines
 
$ tail -n +10
# from line 10 onward (i.e. all but the first 9 lines)
 
 
# multitail showing ANSI colors
$ multitail -cT ansi log/development.log -cT ANSI log/test.log

Scheduled jobs

at - run a job at a set time

  • at, add a one-off job
 $ echo 'sudo reboot' | at 0000 # reboot at the next 00:00
 $ at 0000 jan 31
 at> sudo reboot
 at> ^D (press Control-D while at the beginning of a line)
  • atq, list the job queue
 $ atq
1 2011-01-30 00:00 sudo reboot user
2 2011-01-31 00:00 sudo reboot user

If atq doesn't show the actual command, view it with at -c id (the last part of the output is the script you typed)

  • atrm, remove a scheduled job
 $ atrm 1


cron - periodic jobs

crontab uses sh by default, so bash-only syntax such as [[ ... ]] won't work

Entry format

*    *    *    *    *  [username] command to be executed
┬    ┬    ┬    ┬    ┬
│    │    │    │    │
│    │    │    │    │
│    │    │    │    └───── day of week (0 - 6) (0 is Sunday, or use names)
│    │    │    └────────── month (1 - 12)
│    │    └─────────────── day of month (1 - 31)
│    └──────────────────── hour (0 - 23)
└───────────────────────── min (0 - 59)
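
For example, in a user crontab (crontab -e, so no username field), run a backup at 03:30 every Monday (the script path is hypothetical):

30 3 * * 1 /usr/local/bin/backup.sh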

How to set it up

Via the crontab command
# list
$ crontab -l
# edit
$ crontab -e 
# remove all
$ crontab -r
Via a config file

Since a cron.d config file can carry any user's crontab entries, the file must be owned by root:root (otherwise cron complains WRONG FILE OWNER)

/etc/cron.d/my_conf
# m h dom mon dow user  command
* * * * * www-data echo "`whoami` `date`" >> /tmp/date.log
*/5 * * * * www-data echo "`whoami` `date`" >> /tmp/date.log

After adding/changing a config file, it takes effect without restarting cron.

Percent signs (%) in commands

# if a crontab command contains a percent sign
17 * * * * mysqldump foo | gzip > /backups/`date +%s`.sql.gz

# it fails, and syslog shows:
Nov 26 10:17:01 mall CRON[22600]: (root) CMD (mysqldump foo  | gzip > /backups/`date +)

# the percent sign must be escaped
37 * * * * mysqldump foo | gzip > /backups/`date +\%s`.sql.gz

# now it's OK
Nov 26 10:37:01 mall CRON[22856]: (root) CMD (mysqldump foo  | gzip > /backups/`date +%s`.sql.gz)

Zombie sendmail / crond processes

If you see a pile of sendmail / crond processes that are defunct / zombie, failing jobs are probably queuing up mail

To kill the zombies, find them with ps -ef | grep defunct, then kill the PID and PPID together

$ ps -ef | grep defunct
UID          PID     PPID       C    STIME      TTY          TIME              CMD
1000       637      27872      0   Oct12      ?        00:00:04 [chrome] <defunct>
 
$ kill -9 637 27872

Disable The Mail Alert By Crontab Command

  • To suppress mail from jobs in /etc/cron.d/* and crontab -e, add MAILTO="" to each config file
  • cron.{hourly,daily,…} are driven by /etc/crontab; to suppress mail from those jobs, add MAILTO="" to /etc/crontab

See also

More cron recipes: see Cron

anacron

anacron detects crontab jobs that should have run while the machine was off but didn't, and runs them; anacron is a program, not a daemon.
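
A sample /etc/anacrontab entry (the script path is hypothetical); the fields are period in days, delay in minutes, a job identifier, and the command:

7  10  weekly.backup  /usr/local/bin/backup.sh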


WWW

curl & wget

curl

Super debug mode:

$ curl --trace-ascii /dev/stdout  -F  "file=@/tmp/foo.xlsx"  'https://www1.example.local:8084/api/mg/upload-'

== Info:   Trying 127.0.0.1...
== Info: Connected to www1.example.local (127.0.0.1) port 8084 (#0)
== Info: TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
== Info: Server certificate: server.example.com
=> Send header, 602 bytes (0x25a)
0000: POST /api/mg/upload HTTP/1.1
0032: Host: www1.example.local:8084
0051: User-Agent: curl/7.49.1
006a: Accept: */*
01d6: Content-Length: 7105
01ec: Expect: 100-continue
0202: Content-Type: multipart/form-data; boundary=--------------------
0242: ----018d8eef64a5d67b
0258:
<= Recv header, 23 bytes (0x17)
0000: HTTP/1.1 100 Continue
=> Send data, 375 bytes (0x177)
0000: --------------------------018d8eef64a5d67b
002c: Content-Disposition: form-data; name="cur_broker_id"
0062:
0064: 06603195
006e: --------------------------018d8eef64a5d67b
009a: Content-Disposition: form-data; name="cur_dealer_id"
00d0:
00d2: 01720374
00dc: --------------------------018d8eef64a5d67b
0108: Content-Disposition: form-data; name="file"; filename="180331.xl
0148: sx"
014d: Content-Type: application/octet-stream
0175:
=> Send data, 6682 bytes (0x1a1a)
0000: PK.........[L........K......._rels/.rels.......................
0040: J.1...>E.{7.."....z+.>........d...7.....z.?..o`..y.....9.XW5(...
0080: .....iu.*...#.2p......F.....Y.F..z...u.=M.+.....4..2u:...#...;.~
00c0: 2`.T.g ...Ts..7..H.....h.SI'....`.H.8......U!..,..........4Q.K^4
0100: ..G....x...?....2...9......^.....PK.........[L..I.....+.......d
0140: ocProps/core.xml....................m..j.@.E.....-;.RL...J!....4
0180: .3..`....wb...R.s.W...O..gq1..5-T.(Z.......'.DM.f....e..~G.....c
...
1900: PK..>........[L..!n..........................xl/worksheets/shee
1940: t1.xmlPK..>........[L..............................xl/sharedStr
1980: ings.xmlPK..>........[Ll..f].........................xl/styles.
19c0: xmlPK..>........[L...*..........................[Content_Types]
1a00: .xmlPK....................
=> Send data, 48 bytes (0x30)
0000:
0002: --------------------------018d8eef64a5d67b--
<= Recv header, 24 bytes (0x18)
0000: HTTP/1.1 403 Forbidden
<= Recv header, 21 bytes (0x15)
0000: Server: nginx/1.6.2
<= Recv header, 37 bytes (0x25)
0000: Date: Sat, 31 Mar 2018 04:01:57 GMT
<= Recv header, 47 bytes (0x2f)
0000: Content-Type: application/json; charset=utf-8
<= Recv header, 20 bytes (0x14)
0000: Content-Length: 89
<= Recv header, 24 bytes (0x18)
0000: Connection: keep-alive
== Info: HTTP error before end of send, stop sending
<= Recv header, 2 bytes (0x2)
0000:
<= Recv data, 89 bytes (0x59)
0000: {"code":"2021","message":"...........................","request_
0040: id":"129208998280003968"}
== Info: Closing connection 0
{"code":"2021","message":"已上传过该批次流水","request_id":"129208998280003968"}

View a page's source

curl www.sina.com

…and save it

curl -o [filename] www.sina.com

Follow redirects automatically

curl -L www.sina.com
# with the command above, the result automatically follows the redirect to www.sina.com.cn

Show the headers along with the page body

curl -i www.sina.com
# -I shows only the HTTP response headers

Show the whole conversation

curl -v www.sina.com
# output
* About to connect() to www.sina.com port 80 (#0)
* Trying 61.172.201.195... connected
* Connected to www.sina.com (61.172.201.195) port 80 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.21.3 (i686-pc-linux-gnu) libcurl/7.21.3 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: www.sina.com
> Accept: */*
>
* HTTP 1.0, assume close after body
< HTTP/1.0 301 Moved Permanently
< Date: Sun, 04 Sep 2011 00:42:39 GMT
< Server: Apache/2.0.54 (Unix)
< Location: http://www.sina.com.cn/
< Cache-Control: max-age=3600
< Expires: Sun, 04 Sep 2011 01:42:39 GMT
< Vary: Accept-Encoding
< Content-Length: 231
< Content-Type: text/html; charset=iso-8859-1
< X-Cache: MISS from sh201-19.sina.com.cn
< Connection: close
<
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="http://www.sina.com.cn/">here</a>.</p>
</body></html>
* Closing connection #0
# if that's still not enough information, the commands below record the conversation in more detail
curl --trace output.txt www.sina.com
# 或者
curl --trace-ascii output.txt www.sina.com

Send form data

# GET
curl example.com/form.cgi?data=xxx
# POST
curl --data "data=xxx" example.com/form.cgi
# URL-encoded POST
curl --data-urlencode "date=April 1" example.com/form.cgi

File upload. Suppose the upload form looks like this:

<form method="POST" enctype='multipart/form-data' action="upload.cgi">
  <input type=file name=upload>
  <input type=submit name=press value="OK">
</form>

Then the file can be uploaded with curl like this:

curl --form upload=@localfilename --form press=OK [URL]

Referer

curl --referer http://www.example.com http://www.example.com

User Agent

curl --user-agent "[User Agent]" [URL]

cookie

curl --cookie "name=xxx" www.example.com

增加头信息

curl --header "xxx: xxxxxx" http://example.com

Specify the METHOD

curl -X DELETE http://foo.com/photo/1