Linux 6: basic operations 3


1, Content view

Forward and reverse output: cat / tac

Command                      Effect
cat                          Display text content (forward order), similar to type in Windows
tac                          Display text content in reverse line order (cat backwards)
cat file1 file2 > file3      Merge files
cat -b                       Display content with line numbers

1. cat: display text content

2. tac: display text content (reverse order output of cat)

[root@hadoop60 ~]# ll
total 8
drwxrwxrwx. 2 root root   19 Oct  7 13:13 aa
drwxr-xr-x. 2 root root   19 Oct  7 13:12 ab
-rwxr-xr-x. 1 root root    0 Oct  7 08:30 abc.txt
lrwxrwxrwx. 1 root root    7 Oct  7 09:48 abx.txt -> abc.txt
-rw-------. 1 root root 1232 Oct  5 08:58 anaconda-ks.cfg
-rwxrwxrwx. 1 root root    6 Oct  7 21:55 xyz.txt
[root@hadoop60 ~]# cat xyz.txt
1
2
3
[root@hadoop60 ~]# tac xyz.txt
3
2
1

3. cat file1 file2 > file3: merge files

(1) For example: merge abc.txt and xyz.txt into the file b.txt
(2) Execution:

[root@hadoop60 ~]# ll
total 20
drwxrwxrwx. 2 root root   19 Oct  7 13:13 aa
drwxr-xr-x. 2 root root   19 Oct  7 13:12 ab
-rwxr-xr-x. 1 root root    6 Oct  7 22:04 abc.txt
lrwxrwxrwx. 1 root root    7 Oct  7 09:48 abx.txt -> abc.txt
-rw-------. 1 root root 1232 Oct  5 08:58 anaconda-ks.cfg
-rw-r--r--. 1 root root    6 Oct  7 21:59 bac.txt
-rw-r--r--. 1 root root   12 Oct  7 22:05 b.txt
-rwxrwxrwx. 1 root root    6 Oct  7 21:55 xyz.txt
[root@hadoop60 ~]# cat abc.txt
7
7
8
[root@hadoop60 ~]# cat xyz.txt
1
2
3
[root@hadoop60 ~]# cat abc.txt xyz.txt > b.txt
[root@hadoop60 ~]# cat b.txt
7
7
8
1
2
3

4. cat -b: display the content with line numbers

Example:

[root@hadoop60 ~]# cat abc.txt
7
7
8
[root@hadoop60 ~]# cat -b abc.txt
     1  7
     2  7
     3  8

Note: the difference between cat and more:
cat loads the whole file at once, while more displays one screen at a time; if there is more content, the last line shows the reading progress. Press Enter to show the next line, b to go back one page, Space to show the next page, and q to quit.
For large files, more is therefore recommended.
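For example, a longer file could be paged through like this (a minimal sketch; the file name is only an illustration, any large file behaves the same way):

more anaconda-ks.cfg     # shows one screen at a time; the last line shows progress such as --More--(45%)
# Inside more: Enter = next line, Space = next page, b = previous page, q = quit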

2, Compression and decompression

Common usage:

Command           Effect
tar -zcvf         Package and compress (gzip format)
tar -zxvf         Decompress (gzip archive)
tar -zxvf -C      Decompress into a specified directory (gzip archive)

1. tar -zcvf <archive name> <files to compress>: package and compress (less commonly used)

For example: compress all files ending in .txt into the package abc.tar.gz, then run ll and confirm that abc.tar.gz exists.

[root@hadoop60 ~]# ll
total 20
drwxrwxrwx. 2 root root   19 Oct  7 13:13 aa
drwxr-xr-x. 2 root root   19 Oct  7 13:12 ab
-rwxr-xr-x. 1 root root    6 Oct  7 22:04 abc.txt
lrwxrwxrwx. 1 root root    7 Oct  7 09:48 abx.txt -> abc.txt
-rw-------. 1 root root 1232 Oct  5 08:58 anaconda-ks.cfg
-rw-r--r--. 1 root root    6 Oct  7 21:59 bac.txt
-rw-r--r--. 1 root root   12 Oct  7 22:06 b.txt
-rwxrwxrwx. 1 root root    6 Oct  7 21:55 xyz.txt
[root@hadoop60 ~]# tar -zcvf abc.tar.gz *.txt
abc.txt
abx.txt
bac.txt
b.txt
xyz.txt
[root@hadoop60 ~]# ll
total 24
drwxrwxrwx. 2 root root   19 Oct  7 13:13 aa
drwxr-xr-x. 2 root root   19 Oct  7 13:12 ab
-rw-r--r--. 1 root root  222 Oct  8 08:33 abc.tar.gz
-rwxr-xr-x. 1 root root    6 Oct  7 22:04 abc.txt
lrwxrwxrwx. 1 root root    7 Oct  7 09:48 abx.txt -> abc.txt
-rw-------. 1 root root 1232 Oct  5 08:58 anaconda-ks.cfg
-rw-r--r--. 1 root root    6 Oct  7 21:59 bac.txt
-rw-r--r--. 1 root root   12 Oct  7 22:06 b.txt
-rwxrwxrwx. 1 root root    6 Oct  7 21:55 xyz.txt

2. tar -zxvf <archive name>: decompress (gzip archive)

For example: copy the archive abc.tar.gz into the ab directory, list ab and note that there is one other file in it. To make the result easier to see, delete a.txt so that only abc.tar.gz remains in ab. Then run the command to extract abc.tar.gz and list ab again to confirm the extracted files are there.
Execution:

[root@hadoop60 ~]# cp abc.tar.gz ab
[root@hadoop60 ~]# ll ab
total 4
-rw-r--r--. 1 root root 222 Oct  8 08:34 abc.tar.gz
-rw-r--r--. 1 root root   0 Oct  7 13:12 a.txt
[root@hadoop60 ab]# rm -rf a.txt
[root@hadoop60 ab]# ll
total 4
-rw-r--r--. 1 root root 222 Oct  8 08:34 abc.tar.gz
[root@hadoop60 ab]# tar -zxvf abc.tar.gz
abc.txt
abx.txt
bac.txt
b.txt
xyz.txt
[root@hadoop60 ab]# ll
total 20
-rw-r--r--. 1 root root 222 Oct  8 08:34 abc.tar.gz
-rwxr-xr-x. 1 root root   6 Oct  7 22:04 abc.txt
lrwxrwxrwx. 1 root root   7 Oct  7 09:48 abx.txt -> abc.txt
-rw-r--r--. 1 root root   6 Oct  7 21:59 bac.txt
-rw-r--r--. 1 root root  12 Oct  7 22:06 b.txt
-rwxrwxrwx. 1 root root   6 Oct  7 21:55 xyz.txt
[root@hadoop60 ab]#

3. tar -zxvf <archive name> -C <target directory>: decompress into the specified directory (gzip archive)

For example: extract abc.tar.gz into the aa directory, then check the contents of aa.
Execution:

[root@hadoop60 ab]# cd ~
[root@hadoop60 ~]# ll
total 24
drwxrwxrwx. 2 root root   19 Oct  7 13:13 aa
drwxr-xr-x. 2 root root   97 Oct  8 08:36 ab
-rw-r--r--. 1 root root  222 Oct  8 08:33 abc.tar.gz
-rwxr-xr-x. 1 root root    6 Oct  7 22:04 abc.txt
lrwxrwxrwx. 1 root root    7 Oct  7 09:48 abx.txt -> abc.txt
-rw-------. 1 root root 1232 Oct  5 08:58 anaconda-ks.cfg
-rw-r--r--. 1 root root    6 Oct  7 21:59 bac.txt
-rw-r--r--. 1 root root   12 Oct  7 22:06 b.txt
-rwxrwxrwx. 1 root root    6 Oct  7 21:55 xyz.txt
[root@hadoop60 ~]# tar -zxvf abc.tar.gz -C aa
abc.txt
abx.txt
bac.txt
b.txt
xyz.txt
[root@hadoop60 ~]# ll aa
total 16
-rwxr-xr-x. 1 root root  6 Oct  7 22:04 abc.txt
lrwxrwxrwx. 1 root root  7 Oct  7 09:48 abx.txt -> abc.txt
-rwxrwxrwx. 1 root root  0 Oct  7 13:13 a.txt
-rw-r--r--. 1 root root  6 Oct  7 21:59 bac.txt
-rw-r--r--. 1 root root 12 Oct  7 22:06 b.txt
-rwxrwxrwx. 1 root root  6 Oct  7 21:55 xyz.txt
[root@hadoop60 ~]#

Note: meaning of the tar parameters:

Parameter    Meaning
-z           Use gzip, i.e. compress or decompress the archive with gzip
-c           Create an archive
-x           Extract an archive
-v           Show the files being processed
-f           Specify the archive file name; this option comes last and must be followed by the file name

2, File size view

Introduction: you can use the du command to count the disk space occupied by files and directories.

Common operations:

Command    Effect
du -a      Show the disk space used by every file in all directories and their subdirectories (count everything)
du -h      Show the disk space used by directories and their subdirectories in human-readable units (directories only)
du -ch     Show the space used by the given directory and its subdirectories, plus a grand total
du -sh     Show only the total size

1. du -a: count the disk space occupied by each file in all directories and their subdirectories

[root@hadoop60 ~]# du -a
4    ./.bash_logout
4    ./.bash_profile
4    ./.bashrc
4    ./.cshrc
4    ./.tcshrc
4    ./anaconda-ks.cfg
4    ./.bash_history
0    ./abx.txt
4    ./ab/abc.tar.gz
4    ./ab/abc.txt
0    ./ab/abx.txt
4    ./ab/bac.txt
4    ./ab/b.txt
4    ./ab/xyz.txt
20   ./ab
0    ./aa/a.txt
4    ./aa/abc.txt
0    ./aa/abx.txt
4    ./aa/bac.txt
4    ./aa/b.txt
4    ./aa/xyz.txt
16   ./aa
4    ./xyz.txt
4    ./.ssh/known_hosts
4    ./.ssh
4    ./bac.txt
4    ./b.txt
4    ./abc.txt
4    ./abc.tar.gz
88   .
[root@hadoop60 ~]#

2. du -h: count the disk space occupied by all directories and their subdirectories

Execution:
(1) du -h shows three directories:

[root@hadoop60 ~]# du -h
20K    ./ab
16K    ./aa
4.0K   ./.ssh
88K    .

(2) ll -a shows hidden directories and files, and these three directories can be found there:

[root@hadoop60 ~]# ll -a
total 48
dr-xr-x---.  5 root root  258 Oct  8 08:33 .
dr-xr-xr-x. 17 root root  224 Oct  5 08:58 ..
drwxrwxrwx.  2 root root   92 Oct  8 08:39 aa
drwxr-xr-x.  2 root root   97 Oct  8 08:36 ab
-rw-r--r--.  1 root root  222 Oct  8 08:33 abc.tar.gz
-rwxr-xr-x.  1 root root    6 Oct  7 22:04 abc.txt
lrwxrwxrwx.  1 root root    7 Oct  7 09:48 abx.txt -> abc.txt
-rw-------.  1 root root 1232 Oct  5 08:58 anaconda-ks.cfg
-rw-r--r--.  1 root root    6 Oct  7 21:59 bac.txt
-rw-------.  1 root root  835 Oct  7 20:57 .bash_history
-rw-r--r--.  1 root root   18 Dec 29  2013 .bash_logout
-rw-r--r--.  1 root root  176 Dec 29  2013 .bash_profile
-rw-r--r--.  1 root root  176 Dec 29  2013 .bashrc
-rw-r--r--.  1 root root   12 Oct  7 22:06 b.txt
-rw-r--r--.  1 root root  100 Dec 29  2013 .cshrc
drwx------.  2 root root   25 Oct  7 15:06 .ssh
-rw-r--r--.  1 root root  129 Dec 29  2013 .tcshrc
-rwxrwxrwx.  1 root root    6 Oct  7 21:55 xyz.txt

Note: ll -a lists all files (including hidden files) in the current directory

3. du -ch: count the space occupied by the corresponding directory and subdirectory and add total

(1) View the space occupied by the aa directory

[root@hadoop60 ~]# du -ch aa
16K    aa
16K    total

(2) View the space occupied by the current directory and its total usage

[root@hadoop60 ~]# du -ch .
20K    ./ab
16K    ./aa
4.0K   ./.ssh
88K    .
88K    total

4. du -sh: directly count the total size

View the total usage of the current directory

[root@hadoop60 ~]# du -sh .
88K    .
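The same idea works for a single directory; based on the listing above, the aa directory would be reported like this (a sketch, not taken from the original session):

[root@hadoop60 ~]# du -sh aa
16K    aa
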
3, vi editor

Introduction:
(1) An installed Linux system usually comes with the vi editor.
(2) vi has three modes: command mode, edit mode and last-line mode.

Command mode:

Introduction: command mode is mainly used for operations such as delete, replace and undo.

Command    Effect
dd         Delete the current line
yyp        Copy the current line and paste it below
999dd      Delete the current line and everything below it

1. dd: delete the current line

(1) Run vi <file> and press Enter to open it; vi starts in command mode, where pressing d twice (dd) deletes the current line.

[root@hadoop60 ~]# ll
total 24
drwxrwxrwx. 2 root root   92 Oct  8 08:39 aa
drwxr-xr-x. 2 root root   97 Oct  8 08:36 ab
-rw-r--r--. 1 root root  222 Oct  8 08:33 abc.tar.gz
-rwxr-xr-x. 1 root root    6 Oct  7 22:04 abc.txt
lrwxrwxrwx. 1 root root    7 Oct  7 09:48 abx.txt -> abc.txt
-rw-------. 1 root root 1232 Oct  5 08:58 anaconda-ks.cfg
-rw-r--r--. 1 root root    6 Oct  7 21:59 bac.txt
-rw-r--r--. 1 root root   12 Oct  7 22:06 b.txt
-rwxrwxrwx. 1 root root    6 Oct  7 21:55 xyz.txt
[root@hadoop60 ~]# vi xyz.txt

(2) vi opens in command mode and shows the file contents:

1
2
3

(3) Press dd to delete the line under the cursor:

2
3

2. yyp: copy and paste

Introduction: yy copies the line under the cursor and p pastes it on the next line, so with the cursor on the line containing 2, pressing yyp gives:

2
2
3

3. 999dd: delete the current line and everything below it

Introduction: 999dd deletes the line under the cursor and up to 999 lines below it. If the cursor is on the first line, 999dd clears the whole file (for files of fewer than 999 lines); for example, with the three-line buffer above and the cursor on the first line, 999dd leaves an empty file.

Edit mode:

Command    Effect
i          Enter edit (insert) mode

Last line mode:

Command    Effect
:wq        Save the file and exit vi
:w         Save the file without exiting vi
:q         Exit vi without saving the file
:q!        Force exit vi without saving the file
/abc       Press Enter to find the string abc; press n to find the next match

1. /abc: press Enter to find the specified string, then press n to find the next match

Introduction:
(1) Open the file with vi (press Esc if needed to make sure you are in command mode), then type / followed by the string; vi jumps to the first occurrence, searching from the beginning.
Execution:

1
2
3
3
8
43
~
~
~
/3

(2) Press n and vi moves to the next occurrence of the same string, searching from top to bottom.
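Putting the three modes together, a minimal editing session might look like this (a sketch; xyz.txt is just the example file used above):

vi xyz.txt    - opens the file in command mode
i             - switch to edit mode and type or change text
Esc           - return to command mode
/abc          - last-line search: jump to the string abc, press n for the next match
:wq           - last-line mode: save and exit (or :q! to exit without saving)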

4, Pipeline

Introduction: a pipeline uses | as the delimiter; the pipe must be combined with other commands.

Common usage:

1. cat test.txt | grep abc
Output the lines of test.txt that contain abc (case sensitive; a pattern containing spaces must be quoted)
2. cat test.txt | grep -i abc
Output the lines of test.txt that contain abc (case ignored)
3. cat test.txt | grep -v abc
Output the lines of test.txt that do not contain abc
Note: in pipelines, prefer cat over more, because more's paging makes the output hard to read.

1. cat test.txt | grep abc: output the lines of test.txt that contain abc

(1) Introduction: read the contents of the file and filter for the lines containing the given string.
Example: cat xyz.txt | grep 3 selects the lines of xyz.txt that contain 3

[root@hadoop60 ~]# cat xyz.txt
1
2
3
[root@hadoop60 ~]# cat xyz.txt | grep 3
3

(2) Note: if the string to search for contains spaces, it must be quoted

[root@hadoop60 ~]# cat xyz.txt | grep 3 4
grep: 4: No such file or directory
[root@hadoop60 ~]# cat xyz.txt | grep "3 4"
3 4

2. cat test.txt | grep -i abc: output the lines of test.txt that contain abc, ignoring case

The -i option makes the match case-insensitive:

[root@hadoop60 ~]# cat xyz.txt | grep -i "d F"
d f

3. cat test.txt | grep -v abc: output lines that do not contain abc in the test.txt file

[root@hadoop60 ~]# cat xyz.txt
12
d f
1
17
3 4
2
3
[root@hadoop60 ~]# cat xyz.txt | grep -v "d f"
12
1
17
3 4
2
3

5, Statistics

Introduction: wc is mainly used to calculate the number of words, bytes and lines of a file.

Command    Effect
wc -l      Count the number of lines in the file
wc -c      Count the number of bytes (characters) in the file
wc -w      Count the number of words in the file
wc         Show lines, words and characters together (in that order: lines, words, characters)

1. wc -l: calculate the number of lines in the file

(1) Execute the command and find that xyz.txt has 7 lines

[root@hadoop60 ~]# wc -l xyz.txt
7 xyz.txt

(2) Check that xyz.txt really has 7 lines:

[root@hadoop60 ~]# cat -b xyz.txt
     1  12
     2  d f
     3  1
     4  17
     5  3 4
     6  2
     7  3

Note: the command can also be written as cat xyz.txt | wc -l
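As a small illustration of combining a pipeline with wc (a sketch based on the xyz.txt contents shown above, so the expected result is 2), the lines containing 3 can be counted directly:

[root@hadoop60 ~]# cat xyz.txt | grep 3 | wc -l
2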

2. wc -c: calculate the number of characters in the file

[root@hadoop60 ~]# wc -c xyz.txt
20 xyz.txt

3. wc -w: count words

[root@hadoop60 ~]# wc -w xyz.txt
9 xyz.txt

4. wc: the number of lines, words and characters are displayed together

[root@hadoop60 ~]# wc xyz.txt
 7  9 20 xyz.txt

Note: the command can also be written as cat xyz.txt | wc

6, Search

which command

Meaning: which searches the directories set in $PATH for a command and shows where its executable file is installed (i.e. it shows the location of a command).

[root@hadoop60 ~]# which pwd
/usr/bin/pwd
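Any other command can be looked up the same way; for example (a sketch, the exact path may differ between systems):

[root@hadoop60 ~]# which tar
/usr/bin/tar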

