
為生存而奔跑

------------------------- How to Use AWK --------------------------

Awk is a powerful command language that allows the user to manipulate files containing columns of data and strings. Awk is extremely useful, both for general operation of Unix commands and for data reduction (e.g. IRAF). You might also want to learn the stream editor sed. Many applications of awk resemble those done on PC spreadsheets.

This file contains a number of examples of how to use awk. I have compiled this table gradually over a couple of years as I've learned to do new things. Everyone who reduces data with IRAF should learn the fundamentals of AWK. Learning to do even simple things will save you a lot of time in the long run. It should take you less than an hour to read through this file and learn the basics.

There are two ways to run awk. A simple awk command can be run from a single command line. More complex awk scripts should be written to a command file. I present examples of both types of input below.

Awk takes each line of input and tries to match the 'pattern' (see below), and if it succeeds it will do whatever you tell it to do within the {} (called the action). Awk works best on files that have columns of numbers or strings that are separated by whitespace (tabs or spaces), though on most machines you can use the -F option if your columns are set apart by another character. Awk refers to the first column as $1, the second column as $2, etc., and the whole line as $0. If you have a file (such as a catalog) that always has numbers in specific columns, you may also want to run the command 'colrm' and combine it with awk. There is a manual page on colrm. There is also a very incomplete man page on awk.
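As a quick taste of the field variables before the worked examples (using echo to stand in for a file):

```shell
# $1 and $2 are the first two whitespace-separated fields; $0 is the whole line
echo "alpha beta" | awk '{print $2, $1, $0}'
# prints: beta alpha alpha beta

# -F changes the field separator, e.g. for colon-delimited data
echo "root:x:0" | awk -F: '{print $1}'
# prints: root
```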

I'll lead you through two examples. First, suppose you have a file called 'file1' that has 2 columns of numbers, and you want to make a new file called 'file2' that has columns 1 and 2 as before, but also adds a third column which is the ratio of the numbers in columns 1 and 2. Suppose you want the new 3-column file (file2) to contain only those lines with column 1 smaller than column 2. Either of the following two commands does what you want:

awk '$1 < $2 {print $0, $1/$2}' file1 > file2

-- or --

cat file1 | awk '$1 < $2 {print $0, $1/$2}' > file2

Let's look at the second one. You all know that 'cat file1' prints the contents of file1 to your screen. The | (called a pipe) directs the output of 'cat file1', which normally goes to your screen, to the command awk. Awk considers the input from 'cat file1' one line at a time, and tries to match the 'pattern'. The pattern is whatever is between the first ' and the {; in this case the pattern is $1 < $2. If the pattern is false, awk goes on to the next line. If the pattern is true, awk does whatever is in the {}. In this case we have asked awk to check whether the first column is less than the second. If there is no pattern, awk assumes the pattern is true, and goes on to the action contained in the {}.

What is the action? Almost always it is a print statement of some sort. In this case we want awk to print the entire line, i.e. $0, and then print the ratio of columns 1 and 2, i.e. $1/$2. We close the action with a }, and close the awk command with a '. Finally, to store the final 3-column output into file2 (otherwise it prints to the screen), we add a '> file2'.

As a second example, suppose you have several thousand files you want to move into a new directory and rename by appending a .dat to the filenames. You could do this one by one (several hours), or use vi to make a decent command file to do it (several minutes), or use awk (several seconds). Suppose the files are named junk* (* is wildcard for any sequence of characters), and need to be moved to ../iraf and have a '.dat' appended to the name. To do this type

ls junk* | awk '{print "mv "$0" ../iraf/"$0".dat"}' | csh

ls junk* lists the filenames, and this output is piped into awk instead of going to your screen. There is no pattern (nothing between the ' and the {), so awk proceeds to print something for each line. For example, if the first two lines from 'ls junk*' produced junk1 and junk2, respectively, then awk would print:

mv junk1 ../iraf/junk1.dat
mv junk2 ../iraf/junk2.dat

At this point the mv commands are simply printed to the screen. To execute the command we take the output of awk and pipe it back into the operating system (the C-shell). Hence, to finish the statement we add a ' | csh'.
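The same trick works with any shell. Here is a self-contained sketch using a scratch directory and /bin/sh instead of csh (the directory and file names are made up for the demo; note this approach breaks on filenames containing spaces):

```shell
# demo setup: a scratch directory with two files to rename
mkdir -p demo/iraf
touch demo/junk1 demo/junk2
cd demo
# build the mv commands with awk, then hand them to sh to execute
ls junk* | awk '{print "mv " $0 " iraf/" $0 ".dat"}' | sh
ls iraf
# iraf now contains junk1.dat and junk2.dat
```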

More complex awk scripts need to be run from a file. The syntax for such cases is:

cat file1 | awk -f a.awk > file2

where file1 is the input file, file2 is the output file, and a.awk is a file containing awk commands. Examples below that contain more than one line of awk need to be run from files.
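For instance, saving the two-line sum-and-average program (the same one listed in the examples further down) as a.awk and running it over a small file:

```shell
# a.awk: sum column 1 of the input, then print the sum and average
cat > a.awk <<'EOF'
{ s += $1 }
END { print "sum is", s, "average is", s/NR }
EOF
printf '1 a\n2 b\n3 c\n' > file1
cat file1 | awk -f a.awk > file2
cat file2
# prints: sum is 6 average is 2
```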

Some useful built-in awk variables are NF (the number of fields on the current line), NR (the number of the line awk is currently working on), and length (the number of characters in a line or a string). BEGIN and END are special patterns: BEGIN is true before awk reads any input, and END is true after awk reaches the end of the file. There is also looping capability, a search (/pattern/) command, a substring command (extremely useful), and formatted printing available. The logical operators || (or) and && (and) can be used in a 'pattern'. You can define and manipulate your own user-defined variables. Examples are outlined below. The only bug I know of is that Sun's version of awk won't do trig functions, though it does do logs. There is something called gawk (a GNU product), which does a few more things than Sun's awk, but they are basically the same. Note the use of the 'yes' command below; coupled with 'head' and 'awk' it can save you an hour of typing if you have a lot of files to analyze or rename.
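A one-liner showing NR, NF, and length side by side:

```shell
# NR = current line number, NF = field count, length($0) = characters on the line
printf 'a b c\nd e\n' | awk '{print NR, NF, length($0)}'
# prints:
# 1 3 5
# 2 2 3
```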

Good luck!
EXAMPLES      # is the comment character for awk.  'field' means 'column'

# Print first two fields in opposite order:
awk '{ print $2, $1 }' file


# Print lines longer than 72 characters:
awk 'length > 72' file


# Print length of string in 2nd column
awk '{print length($2)}' file


# Add up first column, print sum and average:
{ s += $1 }
END { print "sum is", s, " average is", s/NR }


# Print fields in reverse order:
awk '{ for (i = NF; i > 0; --i) print $i }' file


# Print the last line
{line = $0}
END {print line}


# Print the total number of lines that contain the word Pat
/Pat/ {nlines = nlines + 1}
END {print nlines+0}


# Print all lines between start/stop pairs:
awk '/start/, /stop/' file


# Print all lines whose first field is different from previous one:
awk '$1 != prev { print; prev = $1 }' file


# Print column 3 if column 1 > column 2:
awk '$1 > $2 {print $3}' file


# Print line if column 3 > column 2:
awk '$3 > $2' file


# Count number of lines where col 3 > col 1
awk '$3 > $1 {i++} END {print i+0}' file


# Print sequence number and then column 1 of file:
awk '{print NR, $1}' file


# Print every line after erasing the 2nd field
awk '{$2 = ""; print}' file


# Print hi 28 times
yes | head -28 | awk '{ print "hi" }'


# Print hi.0010 to hi.0099 (NOTE IRAF USERS!)
yes | head -90 | awk '{printf("hi.00%2.0f\n", NR+9)}'

# Print out 4 random numbers between 0 and 1
yes | head -4 | awk '{print rand()}'

# Print out 40 random integers modulo 5
yes | head -40 | awk '{print int(100*rand()) % 5}'


# Replace every field by its absolute value
{ for (i = 1; i <= NF; i++) if ($i < 0) $i = -$i; print }

# If you have another character that delimits fields, use the -F option
# For example, to print out the phone number for Jones in the following file,
# 000902|Beavis|Theodore|333-242-2222|149092
# 000901|Jones|Bill|532-382-0342|234023
# ...
# type
awk -F"|" '$2=="Jones"{print $4}' filename



# Some looping commands
# Remove a bunch of print jobs from the queue
BEGIN{
for (i=875;i>833;i--){
printf "lprm -Plw %d\n", i
} exit
}


Formatted printouts are of the form printf("format\n", value1, value2, ..., valueN)
e.g. printf("howdy %-8s What it is bro. %.2f\n", $1, $2*$3)
%s    = string
%-8s  = 8 character string, left justified
%.2f  = number with 2 places after the decimal point
%6.2f = field 6 characters wide, with 2 places after the decimal point
\n is a newline
\t is a tab
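A runnable instance of the formats above:

```shell
# %-8s pads 'pi' to 8 characters, left justified;
# %6.2f prints 3.14159*2 = 6.28318 in a 6-character field with 2 decimals
echo "pi 3.14159 2" | awk '{printf("%-8s|%6.2f\n", $1, $2*$3)}'
# prints: pi      |  6.28
```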


# Print frequency histogram of column of numbers
$2 <= 0.1 {na=na+1}
($2 > 0.1) && ($2 <= 0.2) {nb = nb+1}
($2 > 0.2) && ($2 <= 0.3) {nc = nc+1}
($2 > 0.3) && ($2 <= 0.4) {nd = nd+1}
($2 > 0.4) && ($2 <= 0.5) {ne = ne+1}
($2 > 0.5) && ($2 <= 0.6) {nf = nf+1}
($2 > 0.6) && ($2 <= 0.7) {ng = ng+1}
($2 > 0.7) && ($2 <= 0.8) {nh = nh+1}
($2 > 0.8) && ($2 <= 0.9) {ni = ni+1}
($2 > 0.9) {nj = nj+1}
END {print na, nb, nc, nd, ne, nf, ng, nh, ni, nj, NR}
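The same histogram can be written with an awk array instead of ten named counters. This is a sketch, and its bin edges differ slightly from the version above at exact boundaries (0.1 lands in the second bin here rather than the first):

```shell
# bin values of column 2 into tenths; anything outside [0,1) is clamped
printf 'a 0.05\nb 0.15\nc 0.95\n' | awk '
  { b = int($2 * 10); if (b > 9) b = 9; if (b < 0) b = 0; n[b]++ }
  END { for (i = 0; i <= 9; i++) printf "%d ", n[i]; print NR }'
# prints: 1 1 0 0 0 0 0 0 0 1 3
```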


# Find maximum and minimum values present in column 1
NR == 1 {m=$1 ; p=$1}
$1 >= m {m = $1}
$1 <= p {p = $1}
END { print "Max = " m, " Min = " p }

# Example of defining variables, multiple commands on one line
NR == 1 {prev=$4; preva = $1; prevb = $2; n=0; sum=0}
$4 != prev {print preva, prevb, prev, sum/n; n=0; sum=0; prev = $4; preva = $1; prevb = $2}
$4 == prev {n++; sum=sum+$5/$6}
END {print preva, prevb, prev, sum/n}

# Example of defining and using a function, inserting values into an array
# and doing integer arithmetic mod(n). This script finds the number of days
# elapsed since Jan 1, 1901. (from http://www.netlib.org/research/awkbookcode/ch3)
function daynum(y, m, d, days, i, n)
{ # 1 == Jan 1, 1901
split("31 28 31 30 31 30 31 31 30 31 30 31", days)
# 365 days a year, plus one for each leap year
n = (y-1901) * 365 + int((y-1901)/4)
if (y % 4 == 0) # leap year from 1901 to 2099
days[2]++
for (i = 1; i < m; i++)
n += days[i]
return n + d
}
{ print daynum($1, $2, $3) }
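To try it, save the function and the driver line as a file (here called daynum.awk) and feed it year month day triples:

```shell
# same program as above, written to a file so awk -f can run it
cat > daynum.awk <<'EOF'
function daynum(y, m, d,    days, i, n)
{ # 1 == Jan 1, 1901
  split("31 28 31 30 31 30 31 31 30 31 30 31", days)
  n = (y-1901) * 365 + int((y-1901)/4)
  if (y % 4 == 0)  # leap year from 1901 to 2099
    days[2]++
  for (i = 1; i < m; i++)
    n += days[i]
  return n + d
}
{ print daynum($1, $2, $3) }
EOF
echo "1901 1 1" | awk -f daynum.awk   # prints 1
echo "1902 1 1" | awk -f daynum.awk   # prints 366 (1901 had 365 days)
```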

# Example of using substrings
# substr($2,9,7) picks out characters 9 thru 15 of column 2
{print "imarith", substr($2,1,7) " - " $3, "out."substr($2,5,3)}
{print "imarith", substr($2,9,7) " - " $3, "out."substr($2,13,3)}
{print "imarith", substr($2,17,7) " - " $3, "out."substr($2,21,3)}
{print "imarith", substr($2,25,7) " - " $3, "out."substr($2,29,3)}
posted on 2010-05-18 19:07 by baby-fly · read (384) · comments (0) · category: Ubuntu&Linux