Classic shell thirteen questions


[Reposted] A comparative classic --- the UNIX/Linux shell thirteen questions

Code:

1) Why is it called shell?

Before introducing the shell, let us first look at the relationship between the user and the computer system:

Figure (FixMe)

We know that a computer cannot operate without its hardware, but the user cannot drive the hardware directly; the hardware can only be driven through a piece of software called the operating system (OS).

In fact, the Linux we talk about every day is, strictly speaking, just the core of such an operating system, which we call the kernel.

However, from the user's point of view, the user cannot operate the kernel directly either, but communicates with the kernel through a program called the shell.

This is where the figurative naming of "kernel" versus "shell" comes from. As shown in the figure:

Figure (FixMe)

From a technical point of view, the shell is an interactive interface between the user and the system, mainly letting users complete their work on the system through the command line.

Therefore, the simplest definition of the shell is --- command interpreter:

* it translates the user's commands for the kernel to process;

* at the same time, it translates the kernel's processing results back to the user.

Every time we complete a system login, we get an interactive-mode shell, also called the login shell or primary shell.

From a process point of view, the commands we enter at the shell all run as child processes spawned by that shell. For now we can call this phenomenon "fork".

If a shell script is run, the commands in the script are executed by another, non-interactive child shell (sub-shell). That is, the primary shell spawns the sub-shell's process, and the sub-shell in turn spawns processes for all the commands in the script.

(We will have more to say about processes later.)

Here, what we must know is that the kernel and the shell are two different pieces of software, and both can be replaced:

* different operating systems use different kernels;

* on the same kernel, different shells can be used.

In a default Linux installation, several different shells can usually be found, and they are normally listed in the following file:

/etc/shells
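For example (a rough sketch; the exact list varies from distribution to distribution):

Code:

$ cat /etc/shells
/bin/sh
/bin/bash
/bin/csh
/bin/tcsh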

Different shells have their own features, and differ from one another, yet are largely alike with only minor differences. Common shells fall mainly into two streams:

sh:

Bourne shell (sh)

Bourne again shell (bash)

Korn shell (ksh)

csh:

C shell (csh)

TC shell (tcsh)

(The Korn shell is listed under the sh family here because its syntax follows the Bourne shell.)

The default shell on most Linux systems is bash, for two reasons:

* it is free software;

* it is powerful.

Bash is one of the most successful products of the GNU project. Since its release it has been loved by the majority of UNIX users, and has gradually become the standard shell on many organizations' systems.

2) What is the relationship between the shell prompt (PS1) and Carriage Return (CR)?

When you successfully log in to a text interface, in most cases you will see a constantly blinking block or underline (depending on the version), which we call the cursor.

The role of the cursor is to tell you where the next character you type on the keyboard will appear. Each character entered moves the cursor one cell to the right; if too many are entered in a row, it wraps automatically to the next line.

If you have just logged in and have not pressed any key yet, you can see some text to the left of the cursor on the same line, which we call the prompt.

The prompt varies between systems and versions; on Linux, just pay attention to the visible prompt character closest to the cursor, usually one of the following:

$ : the account in use is an ordinary user

# : the account in use is root

In fact, the meaning of the shell prompt is simple:

* it is the shell telling the user: you may now enter a command line.

We can say that the user can only enter a command line at a shell prompt. The cursor indicates where keyboard input will go within the command line: with every key the user presses, the cursor moves along, until the command line reads a CR (Carriage Return, produced by the Enter key) character.

The meaning of CR is also very simple:

* it is the user telling the shell: buddy, you can go ahead and execute my command line now.

Strictly speaking:

* the so-called command line is the text entered between the shell prompt and the CR character.

(Think about it: why do we insist on saying "CR character" rather than "the Enter key"? The answer will be revealed later.)

Different commands accept different command line formats; in general, a standard command line format is as follows:

command_name options arguments

In terms of technical detail, the shell splits the text of the command line into "words" according to the IFS (Internal Field Separator), then processes the special (meta) characters, and finally reassembles the command line.

(Note: be sure to understand the meaning of these two sentences; we will come back to them often in later chapters.)

The IFS is the field separator the shell uses by default; it can be made up of one or more of the following keys:

* space (white space)

* tab

* Enter (newline)

A command name acceptable to the system can come from the following sources:

* a command given with an explicit path

* a command alias

* a user-defined function

* a shell built-in command

* an external command found under $PATH

Every command line must contain a command name; it cannot be omitted.
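(A small sketch, assuming bash: the built-in type command reports which of these sources a given name resolves to; the exact output depends on your system.)

Code:

$ type cd
cd is a shell builtin
$ type ls
ls is aliased to `ls --color=auto'
$ type /bin/ls
/bin/ls is /bin/ls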

3) Others echo, you echo too, but how much do you know about echo?

Continuing from the command line introduced in the previous chapter, here we use the echo command to explain further.

Review --- a standard command line contains three parts:

* command_name options arguments

echo is a very simple and straightforward Linux command:

* it sends its arguments to standard output (stdout), which is normally displayed on the monitor.

(Note: stdout will be explained later, or you can first refer to the following discussion: http://www.chinaunix.net/forum/viewtopic.php?t=191375 )

To get a feel for it, let us first run echo on its own:

Code:

$ echo

$

You will find that only a blank line is displayed, and then the shell prompt returns. This is because, by default, echo sends a newline character after displaying its arguments; since the command above has no arguments, only that newline remains ...

If you want to cancel this newline, you can use echo's -n option:

Code:

$ echo -n

$

Returning to the command line concept, let us analyze the echo command in the example above:

* the command line contains only the command_name (echo) and an option (-n), with no arguments.

If you want to see echo's arguments, that's easy! Next, try the following input:

Code:

$ echo first line
first line
$ echo -n first line
first line$

In the two echo commands above, you will see the argument part displayed on your screen, while whether a newline follows depends entirely on the -n option.

Obviously, because the second echo cancels the newline, the next shell prompt ends up on the same line as the output ... ^_^

In fact, besides -n, echo's common options also include:

-e : enable interpretation of backslash escape characters (see the table below)

-E : disable interpretation of backslash escape characters (the default)

-n : cancel the trailing newline (same as the \c escape under the -e option)

The backslash escape characters supported by echo are as follows:

\a : alert/bell (a beep from the system speaker)

\b : backspace, i.e. move one character to the left

\c : cancel the trailing newline

\e : escape, the Escape character

\f : formfeed, the form-feed character

\n : newline character

\r : carriage return, as produced by the Enter key

\t : tab, horizontal tab

\v : vertical tab

\nnn : the character whose ASCII code is the octal value nnn (use \xnn for hexadecimal)

\\ : the backslash itself

(Table information taken from O'Reilly's Learning the Bash Shell, 2nd Edition.)

Perhaps we can get to know echo's options and escape characters through a few examples:

Example 1:

Code:

$ echo -e "a\tb\tc\nd\te\tf"
a       b       c
d       e       f

The example above uses \t to separate a, b, c and d, e, f, and uses \n to move d, e, f to the next line.

Example 2:

Code:

$ echo -e "\141\011\142\011\143\012\144\011\145\011\146"
a       b       c
d       e       f

The result is the same as in example 1; this time ASCII octal codes are used instead.

Example 3:

Code:

$ echo -e "\x61\x09\x62\x09\x63\x0a\x64\x09\x65\x09\x66"
a       b       c
d       e       f

Not much different from example 2, except this time ASCII hexadecimal codes are used.

Example 4:

Code:

$ echo -ne "a\tb\tc\nd\te\bf\a"
a       b       c
d       f$

Because the letter e is followed by a backspace (\b), there is no e in the output. You should also have heard a beep at the end; that is the work of \a!

Since the -n option is used as well, the shell prompt follows directly at the end of the second line. If you do not use -n, adding \c after \a achieves the same effect.

In fact, in everyday shell operation and shell script writing, echo is one of the most frequently used commands. For example, using echo to check the value of a variable:

Code:

$ A=b
$ echo $A
b
$ echo $?
0

(Note: the concept of variables will be explained in a later chapter.)

OK, for more about the command line format and the options of echo, please practice and apply them yourself ...

4) What is the difference between "" (double quotes) and '' (single quotes)?

Let us get back to our command line ...

After the previous chapters you should be very clear that the text you type after the shell prompt, up until you press Enter, is the command line, and that the shell then executes the commands you hand over to it by way of processes.

But do you know this: how does the shell categorize each character of text you enter on the command line?

Simply put (I dare not claim this is a precise definition), each character on the command line falls into one of two kinds:

* literal: ordinary plain text, which has no special meaning to the shell.

* meta: reserved characters with special functions for the shell.

(Note: for the order in which the bash shell processes the command line, please refer to O'Reilly's Learning the Bash Shell, 2nd Edition, pages 177-180, especially the flowchart Figure 7-1 on page 178 ...)

There is not much to say about literals: ordinary "text" such as abcd or 123456 is literal ... (easy?)

But meta characters often confuse us ... (confused?)

In fact, in the previous two chapters we already met two meta characters that appear on practically every command line:

* IFS: made up of <space>, <tab>, or <enter> (we usually use the space).

* CR: produced by <enter>.

The IFS is used by the shell to split the command line into words, because the shell processes the command line word by word. The CR is used to end the command line; this is why pressing Enter makes the command run. Besides IFS and CR, common meta characters include:

= : sets a variable.

$ : performs variable or parameter substitution (do not confuse it with the shell prompt).

> : redirects stdout.

< : redirects stdin.

| : pipes commands.

& : holds a file descriptor, or runs the preceding command in the background.

( ) : runs the enclosed commands in a nested sub-shell, or is used for arithmetic or command substitution.

{ } : runs the enclosed commands as a non-named command group, or delimits the range of a variable name.

; : ends the preceding command, ignores its return value, and continues with the next command.

&& : ends the preceding command and continues with the next command only if the return value was true.

|| : ends the preceding command and continues with the next command only if the return value was false.

! : executes a command from the history list.

....

If we need to turn off the special meaning of these reserved characters on the command line, we need quoting to handle them.

In bash, there are three commonly used quoting methods:

* hard quote: '' (single quotes); all meta characters inside a hard quote are turned off.

* soft quote: "" (double quotes); most meta characters inside a soft quote are turned off, but some are kept (such as $). (Note)

* escape: \ (backslash); only the single meta character immediately following the escape is turned off.

(Note: I am not certain exactly which meta characters remain active inside a soft quote; readers are welcome to add to this, or to discover and understand it through practice.)

The following examples will help us understand quoting:

Code:

$ A=B C    # the space is not turned off; it is treated as IFS
bash: C: command not found
$ echo $A

$ A="B C"  # the space is turned off; it is treated as an ordinary space
$ echo $A
B C

In the first assignment, the command line is interpreted as:

* A=B, then an <IFS> is met, and then the command C is executed.

In the second assignment to A, since the space is placed inside the soft quote, it is turned off and is no longer treated as IFS:

* A=B C

In fact, the space is turned off whether it is placed inside soft quotes or hard quotes. The Enter key is no exception:

Code:

$ A='B
> C
> '
$ echo $A
B C

In the example above, since the <enter> is placed inside the hard quote, it is no longer processed as a CR character; here it is merely a newline character. Because the command line has not received a CR, the second shell prompt appears (PS2, shown as the > symbol) and the command line does not end, until, on the third line, the <enter> we type is no longer inside the hard quote and so is not turned off. At that point the command line receives the CR, ends, and is handed over to the shell for execution.

If the <enter> is placed inside a soft quote, the CR is likewise turned off; the escape character works too:

Code:

$ A=B\
> C\
>
$ echo $A
BC

In the example above, the first and second <enter> are turned off by the escape character, so they are not processed as CR; the third <enter>, however, is not escaped, and the command line therefore ends there with a CR.

As for how a soft quote differs from a hard quote, the difference lies mainly in which meta characters get turned off. Let us illustrate:

Code:

$ A=B\ C
$ echo "$A"
B C
$ echo '$A'
$A

In the first echo command line, the $ placed inside the soft quote is not turned off, so variable substitution still takes place; echo therefore outputs the value of A, giving the result B C.

In the second echo command line, the $ placed inside the hard quote is turned off, so it is just a literal $ character and is not used for variable substitution; the result is simply the $ followed by the letter A: $A.

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Exercise and thinking: why do we get the following results?

Code:

$ A=B\ C
$ echo '"$A"'    # the outermost quotes are single quotes
"$A"
$ echo "'$A'"    # the outermost quotes are double quotes
'B C'

(Hint: which pair of quotes is doing the quoting, and which is merely text inside it?)

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

On the CU shell board, I find that many beginners' problems are related to their understanding of quoting. For example, when a shell variable has to be referenced inside the arguments of awk or sed, people often ask why it does not work.

To solve such problems, the key is:

* distinguish between shell meta characters and command meta characters.

The meta characters we mentioned earlier have their special meanings on the shell command line; for example, { } groups a series of command lines into a non-named function (you can simply think of it as a command block). However, awk also uses { } to delimit its own command blocks (BEGIN, main, END).

If you enter the following on the command line:

Code:

$ awk {print $0} 1.txt

Since the { } are not turned off for the shell, the shell treats {print $0} as a command block; but because there is no ";" inside it to separate commands, the net result is simply an awk syntax error.

To solve this, we use a hard quote:

Code:

$ awk '{print $0}' 1.txt

The hard quote above should now be easy to understand: the shell meta characters {, <space>, $ (note 3), and } are all turned off, so they are not touched by the shell and are passed intact to awk, where they serve as awk's own (command) meta characters.

(Note 3: the $0 here is a field number built into awk, not an awk variable; awk's own variables are not referenced with $.)

Once you understand how the hard quote works here, it is not hard to understand soft quote and escape as well:

Code:

AWK "{Print / $ 0}" 1.txt

AWK / {Print / / / / $ 0 /} 1.txt

However, if you want to change the 0 value of the $ 0 in AWK is read from another shell variable?

For example: The value of the variable $ A is 0, how do you solve the AWK $$ in Command Line?

You can directly deny the Solution of Hard QuoE:

Code:

$ awk '{print $$A}' 1.txt

because the $A inside the hard quote will not be substituted by the shell.

Smart readers (like you!), having studied this chapter, should be able to explain why all of the following work:

Code:

A=0

awk "{print \$$A}" 1.txt

awk \{print\ \$$A\} 1.txt

awk '{print $'$A'}' 1.txt

awk '{print $'"$A"'}' 1.txt    # note: "$A" is wrapped in a soft quote

Perhaps you can come up with even more solutions ... ^_^

5) var=value? What differs before and after export?

Let us set the command line aside for the moment and look at bash variables ...

A so-called variable uses a fixed "name" to store a "value" that may change.

* Variable assignment (set) *

In bash, you use "=" to set or redefine the content of a variable:

name=value

When assigning a variable, the following rules must be observed:

* no separator (IFS) may appear on either side of the = sign; also avoid the shell's reserved (meta) characters.

* the variable name must not contain the $ character.

* the first character of the variable name must not be a digit.

* the variable name should not exceed 256 characters.

* variable names and variable values are case sensitive.

The following are common errors when assigning variables:

A= B : there must be no IFS

1A=B : the name cannot start with a digit

$A=B : the name cannot contain $

a=B : this is not the same as A=B (not an error, just a reminder, especially for Windows users)

The following assignments are acceptable:

A=" B" : the IFS is turned off (see the earlier chapter on quoting)

A1=B : does not start with a digit

A=$B : $ may be used within the variable value

This_Is_A_Long_Name=B : underscores may be used to join longer names or values, and note that case matters.

* Variable substitution *

One of the reasons the shell is powerful is that it can perform variable substitution on the command line. On the command line, the user can put a $ in front of a variable name (except on the left side of an = used to define the variable), and the shell will substitute the variable's value and then rebuild the command line.

For example:

Code:

$ A=ls
$ B=la
$ C=/tmp
$ $A -$B $C

(Note: the first $ on the last line above is the shell prompt and is not part of the command line.)

It must be stressed that the variable substitution we are talking about happens only on the command line. (Yes, back to the command line again!)

Carefully analyzing that last command line, it is not hard to see that before it is executed (before the CR is entered), the $ causes each variable to be substituted (its value replaces it and the command line is rebuilt), finally producing the following command line:

Code:

ls -la /tmp

Remember the sentences in chapter 2 that I asked you to "be sure to understand"? If you have forgotten, let me quote them again:

Quote:

In terms of technical detail, the shell splits the text of the command line into "words" according to the IFS (Internal Field Separator), then processes the special (meta) characters, and finally reassembles the command line.

The $ here is one of the most classic meta characters on the command line: its job is variable substitution!

In everyday shell operation, we often use the echo command to check the value of a particular variable, for example:

Code:

$ echo $A -$B $C

We already know that echo simply sends its arguments to standard output (stdout, usually our screen), so the command above produces the following result on the screen:

Code:

ls -la /tmp

This is because, before echo is executed, $A (ls), $B (la), and $C (/tmp) are substituted first.

Using variable substitution, we can be more flexible when assigning variables:

A=B

B=$A

In this way, the value of B inherits the value the variable A holds "at that moment".

However, do not look at variable assignment with "mathematical logic"; for example:

A=B

B=C

does not cause the value of A to become C. Likewise:

A=B

B=$A

A=C

does not make the value of B become C either.

The lines above simply define two variables with different names, A and B, whose final values are C and B respectively. If a variable is defined repeatedly, the old value is replaced by the new one. (Isn't that exactly what makes it a "variable"? ^_^)

When we assign variables, just remember this:

* a name is used to store a value.

That is all.

In addition, we can also use the command line's variable substitution to "extend" (append to) a variable's value:

A=B:C:D

A=$A:E

Here the first line sets the value of A to "B:C:D", and the second line then extends that value to "B:C:D:E".

In the extension above we used the separator character (:) to achieve our purpose; without a separator, the following runs into trouble:

A=BCD

A=$AE

because on the second line what gets substituted is the value of $AE, not the value of $A followed by E!

To solve this problem, we can use a more precise form of substitution:

A=BCD

A=${A}E

In the example above, we use { } to delimit the range of the variable name exactly, and can thus extend the value of A from BCD to BCDE.

(Tip: ${name} can in fact do far more variable processing than this; those are advanced features that will not be introduced for now. Please look them up yourself, for example in this CU post:

http://www.chinaunix.net/forum/viewtopic.php?t=201843

)

* export *

Strictly speaking, the variables we define in the current shell are merely "local variables"; only after being "exported" with the export command do they become environment variables:

Code:

$ A=B
$ export A

or:

Code:

$ export A=B

After being exported, the variable A becomes an environment variable that subsequent commands can make use of.

When using export, do not forget the substitution the shell performs on the command line. For example:

Code:

$ A=B
$ B=C
$ export $A

The last command above does not export A; it exports B instead, because on that command line $A is first substituted with B, and the B is then "stuffed in" as export's argument.

To really understand export, you actually need to look at it from the perspective of processes, which I will explain when we come to that concept; please stay tuned.
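(A minimal sketch of the difference in the meantime, using a throw-away variable A: a child shell only sees the variable once it has been exported.)

Code:

$ A=B
$ bash -c 'echo child sees: $A'    # a local variable is not inherited
child sees:
$ export A
$ bash -c 'echo child sees: $A'    # after export, the child process inherits it
child sees: B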

* Cancelling a variable (unset) *

To cancel a variable, bash provides the unset command:

Code:

unset A

Like export, the unset command line also undergoes variable substitution (this is really a feature of the shell itself), therefore:

Code:

$ A=B
$ B=C
$ unset $A

actually cancels the variable B, not A.

Furthermore, a variable cancelled with unset is removed entirely, not merely emptied of its value. The following two lines are in fact very different:

Code:

$ A=
$ unset A

The first line merely sets the variable A to a "null value", while the second line makes the variable A cease to exist.

Although to the naked eye the two states look identical in the following commands:

Code:

$ A=
$ echo $A

$ unset A
$ echo $A

please make sure you can tell the essential difference between a null value and unset; some advanced variable handling is very strict about it. For example:

Code:

$ str=              # set to a null value
$ var=${str=expr}   # define var
$ echo $var

$ echo $str

$ unset str         # cancel str
$ var=${str=expr}   # define var
$ echo $var
expr
$ echo $str
expr

Smart readers (yes, you!), after a little thought, should not find it hard to see why the same var=${str=expr} behaves differently when str is null versus unset.

If you cannot see it, it may be for one of the following reasons:

a. you are too dumb;

b. you do not understand the advanced handling in var=${str=expr};

c. you have not yet digested the explanations earlier in this article;

d. my explanation is not good enough.

I wonder which one you would pick? ... ^_^
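(As an aside, a small sketch of one way to tell the two states apart in bash, using a related expansion form that is covered in question 8:)

Code:

$ str=
$ echo ${str+isset}    # str is set (to a null value), so the word is returned
isset
$ unset str
$ echo ${str+isset}    # str is unset, so nothing is returned

$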

6) What is the difference between exec and source?

Let us start from an example post on the CU shell board:

(http://www.chinaunix.net/forum/Viewtopic.php?t=194191)

The question in that post is:

Quote:

cd /etc/aa/bb/cc works on the command line,

but when the command is written into a shell script, it does not take effect!

Why is that?

Let us leave aside for the moment how I answered, and first get to know the concept of a process.

First of all, any program we execute runs as a child process spawned by a parent process; when the child process ends, control returns to the parent. In Linux this mechanism is called fork.

(Why is it called fork? Well, drawing a picture might make it easier to understand ... ^_^)

When the child process is created, it receives a certain allocation of resources from the parent and, more importantly, inherits the parent process's environment!

Let us return to the "environment variables" mentioned in the previous chapter:

* environment variables are exactly those variables that are passed on to child processes.

Simply put, "inheritance" is the deciding difference between local variables and environment variables. And from this inheritance point of view, it is not hard to see another important property of environment variables:

* environment variables are only passed from parent to child; in other words, no matter how they are changed in the child, the parent's environment is never affected.

Next, let us look at the concept of a shell script.

A shell script is, quite simply, the many command lines you would normally type at the shell prompt, written into a file and executed in order. You may also add conditional tests, interactive interfaces, parameters, function calls and so on to make the script "smarter", but leaving those techniques aside, we can simply regard a script as a batch of pre-written command lines.

Combining the two concepts above (process and script), it should not be hard to understand this sentence:

* normally, when we run a shell script, a child process running a sub-shell is created first, and that sub-shell then creates child processes for the commands in the script.
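(A small sketch of this, assuming a throw-away script called show_pid.sh; the actual PID numbers will of course differ on your system.)

Code:

$ echo $$            # PID of the current (login) shell
3215
$ cat show_pid.sh
#!/bin/bash
echo $$              # this PID belongs to the sub-shell running the script
$ ./show_pid.sh
3268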

With that, let us come back to the example at the beginning of this chapter and look at the question again:

Quote:

cd /etc/aa/bb/cc works on the command line, but when the command is written into a shell script it does not take effect!

Why is that?

My answer at the time was:

Quote:

Because a shell script is normally executed in a sub-shell.

Looked at from the process point of view, the parent process spawns a child process to do the execution; when the child ends, control returns to the parent, but the parent's environment does not change because of anything the child did.

The environment contains many elements, such as the effective IDs, variables, the working directory, and so on ...

The working directory ($PWD) is exactly what the original poster was puzzled about: while the sub-shell runs the script, the sub-shell's $PWD changes because of the cd, but when control returns to the primary shell, the primary shell's $PWD has not changed.

Understanding the cause of the problem is all very well, but how to solve it is probably what interests us more, right? ^_^

Well then, next let us get to know the source command.

Once you have the concept of fork, source is not hard to understand:

* source makes the script execute within the current shell, instead of spawning a sub-shell to run it.

Since everything is done within the current shell, any environment changes the script makes will naturally change the current shell's environment!

Therefore, simply passing the script we originally ran as an argument to the source command easily solves the problem described above. For example, if the script was originally run like this:

Code:

./my.script

Now change to this:

Code:

source ./my.script

or:

. ./my.script

Speaking of which, if you care to browse the many configuration files under /etc, it should not be hard to see why they are written the way they are, and how other scripts read (source) and inherit them. If you get to write your own scripts some day, you will also know how to put "shared" settings into a configuration file that different scripts can source ... ^_^

OK, now that you can tell fork and source apart, here is one more challenge:

---- how does exec differ from source and fork?

Oh... you should know that exec is rather more complicated, especially when it is applied to file descriptors ... But simply put:

* exec also runs the script in the same process, except that the original process is terminated (replaced).

In short, whether the original process survives is the biggest difference between exec and source/fork.

Understanding this in theory alone may not really sink in; nothing beats trying it out for yourself. Let us write two simple scripts, named 1.sh and 2.sh:

1.sh

Code:

#!/bin/bash

A=B
echo "PID for 1.sh before exec/source/fork: $$"
export A
echo "1.sh: \$A is $A"

case $1 in
    exec)
        echo "using exec ..."
        exec ./2.sh ;;
    source)
        echo "using source ..."
        . ./2.sh ;;
    *)
        echo "using fork by default ..."
        ./2.sh ;;
esac

echo "PID for 1.sh after exec/source/fork: $$"
echo "1.sh: \$A is $A"

2.sh

Code:

#!/bin/bash

echo "PID for 2.sh: $$"
echo "2.sh get \$A=$A from 1.sh"
A=C
export A
echo "2.sh: \$A is $A"

Then run 1.sh with the following arguments and observe the results:

Code:

$ ./1.sh fork
$ ./1.sh source
$ ./1.sh exec

You can also refer to another post on CU:

http://www.chinaunix.net/forum/viewtopic.php?t=191051

OK, do not forget to compare the differences in the outputs, and to think about the reasons behind them ...

If there is anything you do not understand, you are welcome to bring it up for discussion ~~~

Happy scripting! ^_^

7) What is the difference between ( ) and { }?

Well, this one is easy, so I will not say too much ... ^_^

First, let me explain why we would use ( ) or { } at all.

Often, in shell operation, we need to run several commands in one go under a certain condition; that is, either run them all or none of them, rather than deciding each time whether the next command should run. Or we may need to change the execution precedence of some commands, as in the arithmetic expression 2*(3+4) ...

This is where we can introduce the concept of a "command group": collecting several commands together and treating them as one unit.

On the shell command line, most people may not pay much attention to the difference between ( ) and { }, and indeed both can group multiple commands for execution, but in technical detail they differ greatly:

( ) runs the command group in a sub-shell, known as a nested sub-shell;

{ } runs it within the same shell, known as a non-named command group.

If you still remember the fork concept from the previous chapter, the difference between the two is not hard to grasp. When the command group involves modifying variables or other parts of the environment, we can choose ( ) or { } according to our needs: usually, if the modifications are temporary and we do not want them to affect the original or subsequent environment, we use a nested sub-shell; conversely, we use a non-named command group. (See the small sketch below.)
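(A minimal sketch of the difference, using the working directory as the environment element being changed:)

Code:

$ cd /tmp
$ (cd /etc); pwd       # the cd runs in a nested sub-shell
/tmp
$ { cd /etc; }; pwd    # the cd runs in the current shell
/etc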

Yes, looking at the command line alone, that is all there is to the difference between ( ) and { }; simple enough ~~~ ^_^

However, when these two meta characters are combined with other meta characters, or used in other contexts, there are still many differences; I will not go through them one by one here, but leave them for the reader to discover gradually ...

There is just one more concept I want to add here: the function. A function simply gives a command group a name, so that the name can later be called to execute the group. From the term "non-named command group" you can probably guess that what I am about to say involves { }, right? (Yes, you are so smart! ^_^)

In bash, there are two ways to define a function:

Method 1:

Code:

function function_name {
    command1
    command2
    command3
    ...
}

Method 2:

Code:

function_name () {
    command1
    command2
    command3
    ...
}

It does not matter which method you use, except that if the name you define clashes with an existing command or alias, method 1 may fail. Method 2, on the other hand, saves you typing the word "function"; for a lazy person (like me), why not? ... ^_^

To some extent a function here can also be called a "function" in the programming sense, but please do not confuse it with the functions of traditional programming languages; the two are very different. The only thing they have in common is that both can be called at any time by their defined names ...

If, during shell operation, we need to run certain commands over and over, the first thing we think of may be to write them into a shell script. But we can just as well write them into a function, and then simply type function_name on the command line, or inside a script, to use it.

However, if the function is defined in the current shell, then besides being removable with unset function_name, it also disappears as soon as you exit the shell.

Writing functions inside a script, on the other hand, brings many benefits: besides improving the overall performance of the script (because the function is already loaded), it also saves a lot of repeated code ...

Simply put, if you know how to write several commands into a script to be called, then you can equally write them into a function and call that ... ^_^
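(A small sketch, defining a throw-away function named lsd at the prompt, purely as an illustration:)

Code:

$ lsd () {
>     cd /tmp
>     ls -la
> }
$ lsd          # runs both commands within the current shell
$ unset lsd    # removes the function again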

Moreover, with the source command introduced earlier, we can collect many well-written, useful functions in a particular file, and then use source in other scripts to load and call them.

If you are a Red Hat Linux user, perhaps you can already guess what the file /etc/rc.d/init.d/functions is for ~~~ ^_^

OK, since I said this chapter would be easy, let me stop here. Happy learning to you all! ^_^

8) What about $(( )) versus $( ) versus ${ }?

In the last chapter we introduced the difference between ( ) and { }; this time let us extend that and look at more variations: what do $( ) and ${ } do?

In the bash shell, $( ) and ` ` (backquotes) are both used for command substitution. Command substitution is similar to the variable substitution of chapter 5: it is used to rebuild the command line:

* the command inside $( ) (or the backquotes) is run first, its result is substituted back in, and the command line is then rebuilt. For example:

Code:

$ echo last sunday is $(date -d "last sunday" +%Y-%m-%d)

This conveniently prints the date of last Sunday ... ^_^

Functionally, $( ) and ` ` make no difference; I just "personally" prefer $( ), for these reasons:

1. ` ` is easily confused with '' (single quotes), especially for beginners. In some odd fonts the two symbols even look exactly alike (two straight vertical strokes). Of course, experienced friends can still tell them apart, but if confusion can be avoided, why not? ^_^

2. In nested command substitution, ` ` needs extra escaping (\`), whereas $( ) stays intuitive. For example:

This is wrong:

Code:

command1 `command2 `command3``

The intent here is for command3, inside command2 `command3`, to run first and pass its result to command2, and for command2's result then to be passed to command1. In reality, however, the command line is split into the two segments `command2 ` and ` `, so command3 is never run first at all.

The correct input should be:

Code:

command1 `command2 \`command3\``

With $( ), on the other hand, there is no such problem:

Code:

command1 $(command2 $(command3))

You can nest as many levels as you like without any trouble ~~~ ^_^

However, $( ) is not without drawbacks ... First, ` ` works in essentially every UNIX shell, so scripts written with it are more portable; $( ) is not available in every shell. I can only tell you that if you use bash2, it is certainly no problem ... ^_^

Next, let us look at ${ } ... it is really for variable substitution. In ordinary use, $var and ${var} make no difference; however, ${ } delimits the range of the variable name more precisely. For example:

Code:

$ A=B
$ echo $AB

The intent was to substitute the value of $A and then append the letter B to it; but on the command line what actually gets substituted is the value of a variable named AB ... With ${ }, there is no such problem:

Code:

$ echo ${A}B
BB

However, if you think ${ } can only delimit a variable name, then you are underestimating bash!

If you are interested, you can refer to this board's digest post on CU:

http://www.chinaunix.net/forum/viewtopic.php?t=201843

For completeness, I will use some examples here to explain ${ }'s special functions.

Suppose we define a variable as:

file=/dir1/dir2/dir3/my.file.txt

We can then use ${ } to substitute different values:

${file#*/} : removes the first / and everything to its left: dir1/dir2/dir3/my.file.txt

${file##*/} : removes the last / and everything to its left: my.file.txt

${file#*.} : removes the first . and everything to its left: file.txt

${file##*.} : removes the last . and everything to its left: txt

${file%/*} : removes the last / and everything to its right: /dir1/dir2/dir3

${file%%/*} : removes the first / and everything to its right: (null)

${file%.*} : removes the last . and everything to its right: /dir1/dir2/dir3/my.file

${file%%.*} : removes the first . and everything to its right: /dir1/dir2/dir3/my

The way to remember them:

# removes from the left (on the keyboard, # is to the left of $);

% removes from the right (on the keyboard, % is to the right of $);

a single symbol is the shortest match; a doubled symbol is the longest match.

${file:0:5} : extracts the leftmost 5 bytes: /dir1

${file:5:5} : extracts 5 bytes starting right after the 5th byte: /dir2

We can also substitute strings within the variable value:

${file/dir/path} : replaces the first dir with path: /path1/dir2/dir3/my.file.txt

${file//dir/path} : replaces every dir with path: /path1/path2/path3/my.file.txt

${ } can also supply values according to the state of the variable (unset, null, or non-null):

${file-my.file.txt} : if $file is unset, return my.file.txt. (A null or non-null value is left as it is.)

${file:-my.file.txt} : if $file is unset or null, return my.file.txt. (A non-null value is left as it is.)

${file+my.file.txt} : if $file is set, whether null or non-null, return my.file.txt. (Nothing happens when it is unset.)

${file:+my.file.txt} : if $file is non-null, return my.file.txt. (Nothing happens when it is unset or null.)

${file=my.file.txt} : if $file is unset, return my.file.txt and also assign my.file.txt to $file. (A null or non-null value is left as it is.)

${file:=my.file.txt} : if $file is unset or null, return my.file.txt and also assign my.file.txt to $file. (A non-null value is left as it is.)

${file?my.file.txt} : if $file is unset, print my.file.txt to stderr. (A null or non-null value is left as it is.)

${file:?my.file.txt} : if $file is unset or null, print my.file.txt to stderr. (A non-null value is left as it is.)

In addition, ${#var} computes the length of the variable's value:

${#file} gives 27, because /dir1/dir2/dir3/my.file.txt is exactly 27 bytes long ...

Next, let me introduce array handling. Normally, an assignment such as A="a b c def" just stores a single string; but change it to A=(a b c def) and A is defined as an array ...

Bash array elements can be referenced with the following forms:

${A[@]} or ${A[*]} gives a b c def (all the elements of the array)

${A[0]} gives a (the first element); ${A[1]} is the second element, and so on ...

${#A[@]} or ${#A[*]} gives 4 (the number of elements in the array)

${#A[0]} gives 1 (the length of the first element, a); ${#A[3]} gives 3 (the length of the fourth element, def)

A[3]=xyz redefines the fourth element as xyz ...

And so on ...
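(A short sketch of these array operations in bash:)

Code:

$ A=(a b c def)
$ echo ${A[@]}
a b c def
$ echo ${#A[@]}
4
$ echo ${#A[3]}
3
$ A[3]=xyz
$ echo ${A[@]}
a b c xyz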

Making good use of bash's $( ) and ${ } can greatly improve and simplify the shell's handling of variables ~~~ ^_^

OK, finally let me introduce $(( )): it is used for integer arithmetic.

In bash, the integer arithmetic operators of $(( )) are roughly these:

+ - * / : addition, subtraction, multiplication, division.

% : remainder (modulo).

& | ^ ! : bitwise AND, OR, XOR, NOT.

For example:

Code:

$ a=5; b=7; c=2
$ echo $((a + b * c))
19
$ echo $(( (a + b) / c ))
6
$ echo $(( (a * b) % c ))
1

Inside $(( )), variables may be written with or without the $ prefix, for example:

$(($a + $b * $c)) also gives the result 19.

In addition, $(( )) can handle numbers in other bases (binary, octal, hexadecimal), but the output is always decimal:

echo $((16#2a)) gives 42 (hexadecimal 2a converted to decimal).

Let us look at a practical example: if the current umask is 022, what are the permissions of a newly created file?

Code:

$ umask 022
$ echo "obase=8; $(( 8#666 & (8#777 ^ 8#$(umask)) ))" | bc
644

In fact, plain (( )) (without the $) can also reassign variable values or do testing:

a=5; ((a++)) redefines $a as 6

a=5; ((a--)) makes a=4

a=5; b=7; ((a < b)) gives a return value of 0 (true)

Common test operators for (( )) are the following:

< : less than

> : greater than

<= : less than or equal to

>= : greater than or equal to

== : equal to

!= : not equal to

However, when using (( )) for tests, please do not confuse it with the integer tests of [ ]. (I will introduce more about tests in chapter 10.)
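(A tiny sketch of (( )) used as a test:)

Code:

$ a=5; b=7
$ ((a < b)); echo $?
0
$ ((a == b)); echo $?
1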

Well? Fun, isn't it? ^_^ OK, that is all for this time ...

The introduction above does not detail every available form; for more, please consult the manual pages ...

9) What is the difference between $@ and $*?

To talk about $@ and $*, we must first start with the positional parameters of shell scripts ...

We all know by now how variables are defined and substituted, so there is no need to repeat that. However, we should also know that some variables are reserved by the shell itself and their names cannot be changed; among these are the positional parameters.

In a shell script, we can use $0, $1, $2, $3 ... to extract the corresponding parts of a command line such as:

Code:

script_name parameter1 parameter2 parameter3 ...

It is easy to guess that $0 represents the shell script's own name (path), $1 the first parameter, and so on ...

What needs attention is the role of the IFS: if an IFS is turned off by quoting, the positional parameters change accordingly.

For example:

Code:

my.sh p1 "p2 p3" p4

Since the space between p2 and p3 is turned off by the soft quote, $2 inside my.sh is "p2 p3" and $3 is p4 ...

Remember the functions mentioned two chapters ago? Didn't I say they are like a script within a script? ^_^ Indeed, a function reads its own positional parameters (different from the script's), with one exception: $0.

For example, suppose my.sh contains a function called my_fun, and the script runs my_fun fp1 fp2 fp3; then inside the function $0 is still my.sh, but $1 is fp1 rather than p1 ...

It is best simply to write a quick my.sh and try it:

Code:

#!/bin/bash

my_fun () {
    echo '$0 inside function is '$0
    echo '$1 inside function is '$1
    echo '$2 inside function is '$2
}

echo '$0 outside function is '$0
echo '$1 outside function is '$1
echo '$2 outside function is '$2

my_fun fp1 "fp2 fp3"

Then run the script on the command line and you will see:

Code:

$ chmod +x my.sh
$ ./my.sh p1 "p2 p3"
$0 outside function is ./my.sh
$1 outside function is p1
$2 outside function is p2 p3
$0 inside function is ./my.sh
$1 inside function is fp1
$2 inside function is fp2 fp3

When using positional parameters, however, we must watch out for a trap:

* $10 does not give the 10th parameter; it substitutes the first parameter ($1) and then appends a 0 to it!

That is, with a command line like my.sh one two three four five six seven eight nine ten, $10 inside my.sh is not ten but one0 ... be careful!

To capture ten, there are two methods. Method one is to use the ${ } introduced in the previous chapter, that is, ${10}. Method two is to use shift.

In plain terms, shift discards the leftmost positional parameters ($0 is not affected); its default count is 1, so shift (or shift 1) discards $1, the original $2 becomes $1, $3 becomes $2, and so on ... With shift 3, three parameters are discarded, and the original $4 becomes $1 ...

So, dear reader: how many parameters must be shifted before the original ${10} can be read as $1? ^_^
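(A small sketch to play with; shift_demo.sh is just a made-up name for trying this out:)

Code:

#!/bin/bash
# run as: ./shift_demo.sh one two three four five six seven eight nine ten
echo "$1"     # prints the first parameter
shift 9       # discard the first nine parameters
echo "$1"     # now prints what was originally ${10}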

OK, now that we have the basic concepts of positional parameters, let us look at the other related variables.

First is $#: it holds the number of positional parameters.

Taking the earlier my.sh p1 "p2 p3" as an example: since the IFS between p2 and p3 is inside a soft quote, $# gets the value 2. Had p2 and p3 not been quoted, $# would get 3. The same reasoning applies inside functions ...

Therefore, in shell scripts we often use the following test to check whether the script received any parameters:

Code:

[ $# = 0 ]

If it is 0, the script was given no parameters; otherwise it was ...
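(A common sketch of this test at the top of a script; the usage text is only an example:)

Code:

#!/bin/bash
if [ $# = 0 ]; then
    echo "usage: $0 <argument> ..."
    exit 1
fi
echo "you passed $# argument(s)"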

Next come $@ and $*:

strictly speaking, the two differ only when placed inside soft quotes; otherwise, both mean "all the parameters" (except $0).

Let us illustrate with an example: if you run my.sh p1 "p2 p3" p4 on the command line, then both $@ and $* give p1 p2 p3 p4. However, once placed inside soft quotes:

"$@" gives the three separate words "p1", "p2 p3", "p4";

"$*" gives the single word "p1 p2 p3 p4".

We can modify the earlier my.sh to the following:

Code:

#!/bin/bash

my_fun () {
    echo "$#"
}

echo 'the number of parameters in "$@" is '$(my_fun "$@")
echo 'the number of parameters in "$*" is '$(my_fun "$*")

Then run ./my.sh p1 "p2 p3" p4 and you will know the difference between $@ and $* ... ^_^

10) What do && and || do?

At last we have reached the double-digit chapters ... It has been a long road; tired? And, I hope, happy too? ^_^ Before answering this chapter's question, let us first learn a concept: the return value!

Every command or function we run in the shell sends a value back to its parent process when it ends, called the return value. On the shell command line, the variable $? holds the "newest" return value, that is, the value returned by the process that has just finished.

The return value (RV) ranges from 0 to 255 and is decided by the author of the program (or script):

* in a script, use exit RV to specify it; if not specified, the RV of the last command executed in the script is used.

* in a function, use return RV instead of exit RV.

The purpose of the return value is to indicate the exit status, of which there are only two:

* 0 means "true";

* non-zero means "false".

An example will make it clear. Suppose the current directory contains the file my.file, while no.file does not exist:

Code:

$ touch my.file
$ ls my.file
my.file
$ echo $?      # first echo
0
$ ls no.file
ls: no.file: No such file or directory
$ echo $?      # second echo
1
$ echo $?      # third echo
0

The first echo above shows the RV of ls my.file; it gets 0, hence true. The second echo shows the RV of ls no.file; it gets a non-zero value, hence false. The third echo shows the RV of the second echo $? itself; it gets 0, hence true.

Keep in mind: every command sends back a return value when it ends, no matter what command it is ...

However, there is one command whose whole purpose is to test a given condition and send back a return value for judging true or false. That is the test command!

If you are using bash, please run man test on the command line and study its usage carefully. That is the most accurate document you can consult; whatever you hear from others is for reference only ...

Below I only give some supplementary notes; for the rest, please read the man page.

First, the thing test evaluates is called an expression, and the command comes in two formats:

Code:

test expression

or:

[ expression ]

(Pay attention to the spaces just inside the [ ] symbols!)

It does not matter which format you use; the effect is the same. (I personally prefer the latter ...)

Second, bash's test can test only three kinds of things:

* string: a string, i.e. plain text.

* integer: an integer (0 or a positive integer; no negative numbers or decimal points).

* file: a file.

You absolutely must be clear about the difference between these three, because the expressions test uses are different for each. Taking the variable A=123 as an example:

* [ "$A" = 123 ] : a string test, testing whether $A is the three consecutive characters 1, 2, 3.

* [ "$A" -eq 123 ] : an integer test, testing whether $A equals one hundred and twenty-three.

* [ -e "$A" ] : a file test, testing whether a file named 123 exists.

Third, when the expression is true, test returns 0 (true); otherwise it returns non-zero (false). If a "!" (exclamation mark) is placed before the expression, test returns 0 when the expression is false and non-zero otherwise.

test also allows combining several expressions:

* expression1 -a expression2 : returns 0 only if both expressions are true, otherwise non-zero.

* expression1 -o expression2 : returns 0 if either expression is true; only when both are false does it return non-zero.

For example:

Code:

[ -d "$file" -a -x "$file" ]

means the test is true only when $file is a directory and has the x (execute/search) permission.

Fourth, when using test on the command line, do not forget the command line "rebuilding" we keep mentioning; that is, the meta characters are processed first and the command line is reassembled. (I stressed this repeatedly in chapters 2 and 4.)

If a variable or command substitution inside a test leaves the expression in a form that does not fit the required format, a syntax error results. To illustrate: in the test format [ string1 = string2 ], there must be a string on each side of the = sign, and it may be a null string (produced with soft or hard quotes).

If $A is currently undefined, or is defined as a null string, then the following fails:

Code:

$ unset A
$ [ $A = abc ]
[: =: unary operator expected

This is because the command line substitutes the (empty) value of $A and rebuilds itself as:

[ = abc ]

so there is no string to the left of the = sign, which breaks test's syntax!

The following, however, works:

Code:

$ [ "$A" = abc ]
$ echo $?
1

That is because the rebuilt command line becomes:

[ "" = abc ]

Since the soft quote supplies a null string to the left of the =, test's syntax requirement is satisfied ...

Readers, please pay close attention to these details, because the slightest slip will affect the result of the test!

If you are not yet experienced with test, you can start with this rule of thumb:

* whenever variable substitution is involved in a test, wrapping the variable in soft quotes is the safest!

And if you are not yet at ease with quoting, please go back and revisit chapter 4 ... ^_^

OK, for more usages of test, the old refrain: please read the man page! ^_^

After all this rambling you may still be wondering: what on earth is the return value actually for?!

Good question! Let me tell you: the return value matters a great deal. If you want your shell to become "intelligent", it all depends on it:

* with return values, we can let the shell do different things under different conditions ...

Now, let me reveal the answer to this chapter's question ~~~ ^_^

&& and || are both used to "chain" multiple command lines:

* command1 && command2 : command2 is executed only if command1's RV is 0 (true).

* command1 || command2 : command2 is executed only if command1's RV is non-zero (false).

Come, an example makes it clear:

Code:

$ A=123
$ [ -n "$A" ] && echo "yes! it's true."
yes! it's true.
$ unset A
$ [ -n "$A" ] && echo "yes! it's true."
$ [ -n "$A" ] || echo "no, it's not true."
no, it's not true.

(Note: [ -n string ] tests whether the length of string is greater than 0.)

In the example above, the first && line executes the echo on its right because the test before it returned an RV of 0; the second time it does not, because the test returned a non-zero RV ... Likewise, the echo to the right of || is executed precisely because the test on its left returned non-zero.

In fact, we can chain several && and || on the same command line:

Code:

$ A=123
$ [ -n "$A" ] && echo "yes! it's true." || echo "no, it's not true."
yes! it's true.
$ unset A
$ [ -n "$A" ] && echo "yes! it's true." || echo "no, it's not true."
no, it's not true.

Well, from this moment on, don't you feel our shell is "pretty smart"? ^_^

OK, finally, here is an exercise for you to think about. The intended meaning of the following line is: once $A has been assigned a value, check whether it is less than 100; if not, print too big!:

Code:

$ A=123
$ [ -n "$A" ] && [ "$A" -lt 100 ] || echo 'too big!'
too big!

But if I unset A, by rights no text should be printed (because the first condition already fails) ...

Code:

$ unset A
$ [ -n "$A" ] && [ "$A" -lt 100 ] || echo 'too big!'
too big!

Why do we still get the result above? And how would you fix it?

(Hint: there are many ways to fix it; one of them uses the command group introduced in chapter 7.)

Quick! Tell me your answer! The rest will be explained next time ...

11) What is the difference between > and <?

I explained this topic on the CU shell board before:

http://bbs.chinaunix.net/forum/24/20031030/191375.html

Rather than rewrite it, this time I will simply copy the content of those posts below.

----------------

11.1

Before talking about I/O redirection, let us first get to know file descriptors (FD).

A running program is, in most cases, processing data: where is the data read from, and where are the results sent to? This is exactly what file descriptors (FD) are for.

In shell programs, the three most commonly used FDs are:

0: standard input (stdin)

1: standard output (stdout)

2: standard error output (stderr)

By default, these FDs are associated with the following devices:

stdin(0): keyboard

stdout(1): monitor

stderr(2): monitor

We can use the following command to test:

Code:

$ mail -s test root
this is a test mail.
please skip.
^D   (press Ctrl and D at the same time)

Clearly, the data (the mail body) read by the mail program comes from stdin, that is, the keyboard. However, that does not mean every program reads its stdin from the keyboard as mail does, because a program's author can have it read from a file argument instead, for example:

Code:

$ cat /etc/passwd

But what if cat is given no file argument at all?

Ah, please try it yourself and see ... ^_^

Code:

$ cat

(Pay attention to where the data you type ends up being output, and don't forget to quit with ^D ...)

As for stdout and stderr, hmm ... wait until I have time and we will carry on ... ^_^

Or would any of the seniors here like to carry on the thread?

----------------

11.2

Carrying on from the previous post ... ^_^

I believe that after the last exercise, stdin and stdout are no longer a mystery to you. Then let us carry on and look at stderr.

In fact, stderr is not hard to understand either: simply put, it is where "error messages" get sent ...

For example, if the file argument to be read does not exist, we see a familiar message on the monitor:

Code:

$ ls no.such.file
ls: no.such.file: No such file or directory

And what if a command produces both stdout and stderr? That's simple: both are sent to the monitor:

Code:

$ touch my.file
$ ls my.file no.such.file
ls: no.such.file: No such file or directory
my.file

OK, up to this point I trust you have no problem with the FD numbers, their names, and the devices they are associated with. Good; then let us see how to change the default data channels of these FDs.

We can use < to change the incoming data channel (stdin), making it read from a specified file; and we can use > to change the outgoing data channels (stdout, stderr), making them write to a specified file.

Code:

$ cat < my.file

This makes cat read its data from my.file.

Code:

$ mail -s test root < /etc/passwd

This reads the mail body from /etc/passwd ...

In this way, stdin is no longer read from the keyboard but from a file ...

Strictly speaking, an FD number should be given just before the < symbol (with no space in between), but since 0 is the default for <, writing < is the same as writing 0<.

OK, this part is easy to understand, right?

That, if if you use two << What?

This is the so-called Here Document, which allows us to enter a piece of text until you read << The string specified later.

for example:

Code:

$ cat << Finish

First Line Here

Second Line There

Third Line Nowhere

Finish

In this case, cat reads in those three lines of text, without us having to press ^D on the keyboard to end the input.
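As an aside, here documents are just as handy inside scripts. A minimal sketch, assuming the mail command shown earlier; the recipient, subject, and message text are made-up examples:

Code:

#!/bin/bash
# send a short notice; the here document supplies the message body,
# so no separate file or interactive typing is needed
mail -s "backup report" root << FINISH
The nightly backup has finished.
Please check the usual log file for details.
FINISH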

As for what > does?

Stay tuned for the next installment....

----------------

11.3

OK, let us pick up where we left off ~~~

Once you understand 0< and <<, the next two are not hard at all:

* 1>

* 2>

The former changes the output channel of stdout, the latter changes the output channel of stderr.

Both take data that would originally be sent to the monitor and write it to the specified file instead.

Since 1 is the default value of >, 1> and > are the same thing: both redirect stdout.

Let us use the earlier ls example to illustrate:

Code:

$ ls my.file no.such.file 1> file.out

ls: no.such.file: No such file or directory

Here the monitor shows only stderr, because stdout has been written to file.out.

Code:

$ ls my.file no.such.file 2> file.err

my.file

This time the monitor shows only stdout, because stderr has been written to file.err.

Code:

$ ls my.file no.such.file 1> file.out 2> file.err

This time the monitor shows nothing at all, because both stdout and stderr have been redirected to files...

Oh ~~~ so > is not that hard after all! Right? I didn't mislead you, did I? ^ _ ^

However, a few things still deserve attention.

First, there is the problem of file locking. Take the following example:

Code:

$ ls my.file no.such.file 1> file.both 2> file.both

From the file system's point of view, a single file can only be written to by one FD at any given moment.

If stdout (1) and stderr (2) both write into file.both at the same time,

then it comes down to which one gets there first: basically a "first come, first served" affair.

Suppose stdout and stderr are both writing to file.both:

* stdout writes during seconds 1, 2, and 3

* stderr writes during seconds 3, 4, and 5

Then the data stderr writes during the third second gets lost!

If we could force stderr to wait until stdout has finished writing, or the other way around, the problem would be solved.

Technically, however, that is hard to control, especially when the FDs keep writing for a "long time"...

So how do we solve it? As the saying goes, if the mountain won't move, make the road go around it; if the road won't turn, we turn our thinking:

we can redirect stderr into stdout, or redirect stdout into stderr, instead of letting the two of them fight over one file!

Bingo! That's it:

* 2>&1 redirects stderr into stdout

* 1>&2 (or >&2) redirects stdout into stderr

Thus, the previous error operation can be changed to:

Code:

$ ls my.file no.such.file 1> file.both 2>&1

or

$ ls my.file no.such.file 2> file.both >&2
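One caveat worth noting here (a general fact about how the shell processes redirections left to right, not something spelled out in the original post): the position of 2>&1 matters.

Code:

$ ls my.file no.such.file > file.both 2>&1    # both stdout and stderr end up in file.both
$ ls my.file no.such.file 2>&1 > file.both    # stderr is copied to the old stdout (the monitor); only stdout goes to file.both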

Isn't that a great relief? Oh ~~~ ^ _ ^

However, merely solving the locking problem is not enough; there are other tricks we still need to understand.

The story is not over, don't go away! We'll be right back after the break...!

----------------

11.4

OK, this time let us not talk about I/O redirection; let us talk about Buddhism instead...

(Wait, what?! Has the author gone over to Buddhism? ...) hee ~~~ ^ _ ^

The highest realm in Buddhism is said to be that "the four elements are all empty". Which four? I honestly don't know, since I have not reached that realm myself...

But this word "empty" is well worth savouring:

--- Form is emptiness, emptiness is form!

Well, the main thing is to grasp the Zen of "emptiness"; do that and enlightenment is not far away ~~~

In the Linux file system there is a device file located at /dev/null.

Many people ask me what that thing is. I tell them: it is "emptiness"!

That's right! Empty means null.... Has the penny dropped, donor? If so, congratulations ~~~ ^ _ ^ This null is very useful in I/O redirection:

* If FD1 and FD2 are redirected to /dev/null, stdout and stderr are simply thrown away.

* If FD0 is connected to /dev/null, stdin reads in nothing.

For example, when we run a program, the screen normally shows both stdout and stderr at the same time.

If you don't want to see stderr on the screen (and don't want to save it to a file either), you can do this:

Code:

$ ls my.file no.such.file 2> /dev/null

my.file

And the other way round: what if you only want to see stderr? Simple! Just send stdout to null:

Code:

$ ls my.file no.such.file > /dev/null

ls: no.such.file: No such file or directory

Next question: what if you just want to run the command quietly and don't want to see any output at all?

Ah, here is a trick I held back last time, now offered up especially for you! ... ^ _ ^

Besides > /dev/null 2>&1, you can also do this:

Code:

$ ls my.file no.such.file &> /dev/null

(Tip: &> can also be written as >& ~~!)
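A common place this pattern shows up (a typical usage sketch, assuming bash, not taken from the original post) is when a script only cares about a command's return value and not its output:

Code:

#!/bin/bash
# run ls quietly: we only care whether the directory exists (the return value),
# not the listing itself or any error message
if ls /no/such/dir &> /dev/null; then
    echo "it is there"
else
    echo "nothing to see"
fi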

OK? Now that we are done talking Buddhism, let us look at the following situation:

Code:

$ Echo "1"> file.out

$ cat file.out

1

$ ECHO "2"> file.out

$ cat file.out

2

It seems that whenever we redirect stdout or stderr into a file, we only keep the result of the most recent redirection.

So what happened to the earlier content?

Oh ~~~ solving that is easy: just replace > with >>:

Code:

$ ECHO "3" >> file.out

$ cat file.out

2

3

This way, the existing content of the redirected target file is not lost, and the new content keeps being appended at the end.

Easy, isn't it? Oh ... ^ _ ^

However, the moment you go back to a single > for the redirection, the old content gets "washed away" again!

How do you guard against that?

---- Back it up! Yes, I heard you! But.... is there anything better?

Since you and I share such a fate, donor, let me pass on a little trick:

Code:

$ set -o noclobber

$ echo "4" > file.out

-bash: file.out: cannot overwrite existing file

Then how do you cancel this "restriction"?

Oh, just change set -o to set +o:

Code:

$ set +o noclobber

$ echo "5" > file.out

$ cat file.out

5

Question: that said... is there a way to keep noclobber but "temporarily" overwrite the target file anyway?

Oh, as the Buddha would say: that cannot be spoken!

Ah ~~~ just kidding, just kidding ~~~ ^ _ ^ Here, don't say I never taught you anything:

Code:

$ set -o noclobber

$ echo "6" >| file.out

$ cat file.out

6

Did you notice? A "|" is added right after the >! Note: there must be no whitespace between > and |.

Phew ... (take a deep breath, we made it) ~~~ ^ _ ^
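(A small side note, not from the original post: you can check whether noclobber is currently switched on with set -o, which lists the shell options and their states; the exact output layout may differ slightly between shells.)

Code:

$ set -o | grep noclobber
noclobber       on
$ set +o noclobber
$ set -o | grep noclobber
noclobber       off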

Before another day goes by, here is a puzzle for you to ponder:

Code:

$ Echo "Some Text Here"> File

$ Cat

Some text here

$ Cat> File.bak

$ cat

Some text here

$ Cat> File

$ Cat

Huh?! Did you notice?!!

---- Why is the file that the last cat reads empty?!

Why? Why? Why?

Class: don't be late next time ~~~!

----------------

11.5

Ding ding ~~~ class is in session ~~~ ^ _ ^

Last time we saw that after $ cat < file > file, the original content of the file gets washed away!

Understanding why is not hard; it is simply a matter of priority:

* In I/O redirection, the stdout and stderr pipes are set up before stdin starts reading any data.

That is to say, in the example above, > file empties the file first, and only then does < file read from it.

But by that time the file has already been emptied, so the read brings in no data at all...

Oh ~~~ so that's how it is ~~~~ ^ _ ^
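(A practical aside, not from the original post: if you really do want to filter a file "in place", the usual workaround is to write to a temporary file first and move it over afterwards. The grep filter and file names below are just made-up examples.)

Code:

$ grep -v '^#' file > file.tmp && mv file.tmp file    # strip comment lines, then replace the original safely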

So then... what about the following two?

Code:

$ cat <> file

$ cat < file >> file

Um... class, those two are your homework; please hand in your answers before the next lesson!

OK, that is about all I can say about I/O redirection; sorry, that really is all I know ~~~ hee ~~ ^ _ ^

However, there is one more thing that simply must be mentioned. Everyone (drumroll please ~! # @! $%):

---- the pipe line!

When it comes to the pipe line, I believe few people will find it unfamiliar:

the "|" symbol we see on so many command lines is the pipe line.

But what exactly is a pipe line?

Don't rush; look it up in an English-Chinese dictionary first and see what "pipe" means.

That's right! It means "water pipe"...

Now, can you picture how water pipes are joined together, one after another?

And what does the output of one pipe become for the next pipe?

Hmm? ?

A flash of insight: the I/O of a pipe line works exactly like water pipes joined end to end:

* the stdout of the previous command is fed into the stdin of the next command!

And this holds no matter how many pipes you string together on the command line:

the I/O of each pair of neighbouring commands is connected in turn! (Congratulations: you are enlightened at last! ^ _ ^)

But... but... but... what about stderr? Good question! And also easy to understand:

* what happens when a water pipe springs a leak?

That is to say: in a pipe line, the stderr of the previous command is NOT received by the stdin of the next command.

Unless you redirect it somewhere else with 2>, it is still sent to the monitor!

Please keep this in mind when using pipe lines.

Then perhaps you will ask:

* Is there a way to feed stderr into the stdin of the next command?

(My, aren't we eager!)

Of course there is a way, and in fact you have already learned it! ^ _ ^

Here is a hint, and that is all you get:

* how do you merge stderr into stdout?

If you can't answer that, come and ask me after class... (that is, if your skin is thick enough...)
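(For reference, a minimal sketch of the idea, reusing the 2>&1 merge from 11.3 and the same example files as before:)

Code:

$ ls my.file no.such.file 2>&1 | grep such
ls: no.such.file: No such file or directory

Because stderr is merged into stdout before the pipe, the error line can now be filtered by grep like any other output.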

Perhaps you are still not satisfied! Perhaps you have run into the following problem:

* In cm1 | cm2 | cm3 ..., how do I save the result of cm2 into a certain file?

If you write cm1 | cm2 > file | cm3,

then you will surely find that cm3's stdin is empty! (Of course: you have plumbed the water pipe into a different pool!)

Clever as you are, you may work around it like this:

Code:

cm1 | cm2 > file; cm3 < file

Yes, you can do that, but the biggest drawback is that the file I/O is doubled!

In the whole course of a command's execution, file I/O is the most common killer of efficiency.

Anyone with experience writing shell scripts will try to avoid or reduce the frequency of file I/O.

So, is there a better way to handle the problem above?

Yes there is, and it is the tee command.

* tee copies stdout into a file without disturbing the original I/O flow.

So the command line above can be written as:

Code:

cm1 | cm2 | tee file | cm3

By default, tee overwrites the target file; if you want to append to the existing content instead, the -a option does the trick.
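For instance, a small illustration (the file name listing.txt is just a made-up example):

Code:

$ ls /etc | tee -a listing.txt | wc -l

Here the directory listing is appended to listing.txt while still flowing on to wc -l, which counts the entries.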

Basically, pipe lines are used very widely in the shell, especially for text filtering.

Combine them with text processing tools such as cat, more, head, tail, wc, expand, tr, grep, sed, awk, and so on,

and you will be amazed at how lively the command line becomes!

You may well find yourself sighing, "I searched for her in the crowd a thousand times; then, turning my head, there she was, where the lamplight was dim!" ... ^ _ ^

....

OK, the introduction to I/O redirection ends here.

If there is time in the future, I will introduce some other shell topics to you! Bye ... ^ _ ^

13) for what? And what about while and until?

At long last, we arrive at the final question of the shell thirteen questions... let out a long breath ~~~~

To finish, let us describe the "loops" commonly seen in shell scripts.

A loop is a piece of code in a script that is executed over and over again under a certain condition.

The common loops in the bash shell are the following:

* for

* while

* until

A for loop reads variable values from a given list and, for each value "in turn", executes the commands between do and done.

example:

Code:

for var in one two three four five

do

echo -----------

echo '$var is '$var

echo

done

The result of the example above is as follows:

1) for defines a variable named var, whose values are one two three four five in turn.

2) Because there are 5 values, the commands between do and done are executed 5 times.

3) Each pass through the loop uses echo to produce three lines of output,

where the $var on the second line, being outside the hard quotes, is replaced with one, two, three, four, five in turn.

4) When the last value has been processed, the loop ends.

It is not hard to see that in a for loop, the number of values determines the number of iterations.

Whether the variable itself is actually used inside the loop, however, is up to the design at hand.

If the for loop does not use the in keyword to specify a list of values, its values are inherited from $@ (or $*):

Code:

for var; do

....

done

(If you have forgotten what positional parameters are, please go back and review Chapter 9...)

The for loop is very convenient for processing a "list" of items;

besides being spelled out explicitly or taken from the positional parameters,

the list can also come from variable substitution or command substitution... (once again: don't forget the "reorganisation" feature of the command line!)
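For example, a minimal sketch using command substitution to build the list (/etc/passwd and the cut fields are just the usual illustration):

Code:

for user in $(cut -d: -f1 /etc/passwd)
do
    echo "found account: $user"
done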

However, for items that change "cumulatively" (such as an integer counter), for can also cope:

Code:

for (( i=1; i<=10; i++ ))

do

echo "num is $i"

done

Besides the for loop, we can also do the same with a while loop:

Code:

num=1

while [ "$num" -le 10 ]; do

echo "num is $num"

num=$(($num + 1))

done

The principle of the while loop is slightly different from that of the for loop:

it does not take values from a list, but depends on the return value of the command line following while:

* If the return value is true, the commands between do and done are executed, and then the return value of the command line after while is tested again.

* If the return value is false, the commands between do and done are no longer executed, and the loop ends.

Analysing the example above:

1) Before the while, the variable num=1 is defined.

2) Then we test whether $num is less than or equal to 10.

3) The result is true, so echo is executed and the value of num is increased by 1.

4) On the next pass, the value of num is 1+1=2, which is still less than or equal to 10, so the test is true and the loop continues.

5) Only when num reaches 10+1=11 does the test finally fail... and then the loop ends.

It is not hard to see that:

* if the test after while is always true, the loop will run forever:

Code:

while :; do

echo looping ...

done

In the example above, ":" is bash's null command: it does nothing except return a true return value.

Therefore this loop never ends; we call it an infinite loop.

Infinite loops are sometimes created deliberately (for example, to run a daemon) and are sometimes simply a design error.

To end an infinite loop, you can terminate it with a signal (for example, by pressing Ctrl-C).

(Processes and signals will be covered when there is a chance; the thirteen questions skip them for now.)

Once you understand the while loop, the until loop follows naturally:

* In contrast to while, until enters the loop when the return value is false, and ends it otherwise.

The previous example can therefore easily be rewritten with until:

Code:

num=1

until [ ! "$num" -le 10 ]; do

echo "num is $num"

num=$(($num + 1))

done

Or:

Code:

num=1

until [ "$num" -gt 10 ]; do

echo "num is $num"

num=$(($num + 1))

done

OK, that is it for now on bash's three common loops.

Before ending this chapter, two commands related to loops deserve a mention:

* break

* continue

These two commands are usually used inside nested loops, that is, where there is another loop between do ... done,

though of course nobody forbids you from using them in a single loop... ^ _ ^

break is used to interrupt a loop, that is, to "forcibly end" it.

If break is given a number n, it breaks out of the n-th enclosing loop, counting outwards from the innermost one.

The default is break 1, which breaks out of the current loop.

When using break, take care to distinguish it from return and exit:

* break ends a loop

* return ends a function

* exit ends the script / shell

continue is the opposite of break: it forces the loop to move on to its next iteration.

If that is hard to picture, put simply: the statements between continue and done are skipped and control returns to the top of the loop...

Like break, continue can also be given a number n to decide which enclosing loop (counting outwards from the innermost one) to continue;

the default is continue 1, which continues the current loop.
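A minimal sketch of both in a nested loop (the values are arbitrary, purely for illustration):

Code:

for i in 1 2 3
do
    for j in a b c
    do
        [ "$j" = b ] && continue      # skip the rest of this inner iteration
        [ "$i" = 3 ] && break 2       # leave both loops entirely
        echo "$i $j"
    done
done

Running this prints "1 a", "1 c", "2 a", "2 c" and then stops as soon as i reaches 3.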

In shell script design, making good use of loops will greatly improve a script's ability to handle complex conditions.

Please practise them well....

-----------

OK, it is time to wrap up.

I have rambled on about a whole pile of basic concepts concerning the shell;

the purpose was never to hand everyone "answers", but to bring you some "inspiration"...

In future discussions about the shell, I may often use links to point back to the contents of these thirteen questions,

so that we share some common ground when holding technical discussions and need not explain everything from scratch each time.

Even more, I hope the thirteen questions bring you more thinking and more fun; what matters most is deepening your understanding through hands-on practice.

Yes, I attach great importance to "hands-on practice" and "independent thinking" as the two essential elements of learning. If you can grasp their true meaning, then allow me to say:

--- Congratulations! You have not read the thirteen questions in vain! ^ _ ^

P.s.

As for the supplementary questions, I will not write them up for the time being. But I hope that:

1) Everyone keeps extending the topics.

2) We write them together.

Good luck and happy studying!
