The UNIX-HATERS Handbook

Chapter 1: Unix — The World's First Computer Virus

"Berkeley's two most famous products are UNIX and LSD. I don't think that is a coincidence."

A virus survives by being small and adaptable. Viruses are not complicated: they provide nothing for respiration, metabolism, or locomotion — only enough DNA or RNA to reproduce. A pneumonia virus, for example, is far smaller than the cells it invades, yet each epidemic season it produces new variants that cause countless deaths. A good virus is characterized by:

* Small size. A virus doesn't do much, so it doesn't need much. Some people argue that viruses are not even organisms, just destructive nucleic acids and proteins.
* Portability. A virus mutates so it can attack different cells in different ways. AIDS is said to have come from a monkey virus.
* Depleting the host's resources.
* Rapid mutation.

UNIX has all of these virtues. At its birth it was small, did little, and lacked the features a real operating system needs (memory-mapped files, fast IO, a robust file system, record locking, sensible interprocess communication) — but its portability was excellent. Unix depletes its host's resources: without constant attention from a system administrator, UNIX panics, dumps core, and hangs. Unix mutates constantly: a patch that works on one version fails on another. UNIX is a computer virus with a user interface.

Standardizing Inconsistency
================

"The wonderful thing about standards is that there are so many of them to choose from."
— Grace Murray Hopper

Ever since UNIX became popular in the 1980s, UNIX vendors have supposedly been working hard on UNIX standardization. Sun, IBM, HP, and DEC have each poured millions of dollars into this difficult problem — a problem of their own making. Why don't UNIX vendors actually want UNIX standardized?
Many users, worn down by the complexity and variety of UNIX, eventually switched to Windows because their UNIX could not run an application written for some other UNIX. If UNIX were standardized, who would buy Sun's machines?

Chapter 2: Welcome, New User!
Like Russian roulette with six bullets loaded

Ken Thompson once designed a car. Unlike other cars, it had no speedometer, no gas gauge, and none of those silly idiot lights that pester the driver. If the driver made a mistake, a giant "?" would light up in the middle of the dashboard. "Experienced drivers," Thompson explained, "will usually know what went wrong."

Newcomers need a friendly system. At the very least, a decent system would introduce itself with:

* consistent command names and behavior
* consistent command-line argument parsing
* safeguards around dangerous commands
* clear, readable online documentation
* accurate, useful error feedback when a command fails

UNIX was never built with residents in mind. Its visitors were construction workers in hard hats, who stuffed their tools into the corners of a ramshackle wooden house. Human-factors engineers never took part, and the needs of the occupants were never considered; conveniences like flush toilets, central heating, and windows are now nearly impossible to retrofit. Yet its architects remain proud of UNIX — the way one might sleep soundly in a house with no smoke detector.

For most of its history, UNIX was a research tool for universities and industrial labs. With the arrival of inexpensive workstations in large numbers, UNIX entered a new era: that of a platform for delivered software.
That change came around 1990, and its marker was workstation vendors removing the C compiler from their UNIX releases to cut costs for non-developing users. Only in recent years, it seems, have UNIX vendors begun to consider the needs of non-programmers at all, offering them graphical interfaces outside the shell.

Cryptic Command Names

UNIX novices are always surprised by UNIX's choice of command names. No amount of education prepares them for the simplicity and beauty of two-letter wonders like cp, rm, and ls.

Anyone who used early-1970s IO equipment understands the speed, the reliability, and above all the keyboard of the ASR-33 Teletype. Unlike today's keyboards, where closing a microswitch registers a keystroke, a Teletype key had to be pushed a full half inch to drive a small bicycle-style generator — at genuine risk to the bones doing the pushing. If Dennis and Ken had had a Selectric instead of a Teletype, "cp" and "rm" might well have been "copy" and "remove." (Ken Thompson was once asked what he would change if he could redesign UNIX from scratch. His answer: "I'd spell creat with an e.") Technology can broaden our choices; it can also limit them, as it did here.

More than twenty years have passed. What reason is there to continue the tradition? The irresistible force of history — history being existing code and textbooks. If a vendor replaced rm with remove, every UNIX textbook would stop applying to that system, and every shell script that uses rm would have to be changed. Besides, remove isn't in the POSIX standard. A century ago, typebars kept jamming together, so engineers designed the QWERTY keyboard, and the problem was solved — because nobody could type fast on such a layout. Computer keyboards no longer have mechanical typebars, but the QWERTY layout is still with us.
Likewise, in the next century, we will still be typing rm.

Accidents Will Happen

People care about their data and documents. They use computers to create, analyze, and store important information, and they trust the computer to protect their property. Without that trust, their relationship with the computer is poisoned. UNIX betrays that trust by refusing to protect users from dangerous commands. rm, for example, deletes files — dangerously. Every UNIX novice has irretrievably deleted an important file; so have experts and system administrators. The annual cost in lost time, effort, and data is easily worth millions of dollars. And this is the puzzling part: we cannot understand why UNIX still refuses to address the problem. Aren't the consequences tragic enough?

UNIX needs recoverable deletion more than other operating systems do, because:

* The UNIX file system has no file versioning. Automatic version maintenance would retain older versions of a file and keep a new version from clobbering the old one.
* UNIX programmers have a cavalier attitude toward error handling. Many programs don't check whether all of their output actually reached the disk, or whether the output file even exists. Some programs cheerfully delete their input file.
* The UNIX shell, not the command, expands "*", so a command cannot inspect "*" for danger. Even DOS asks for confirmation on "del *.*". Under UNIX, "rm *" and "rm file1 file2 ..." look exactly alike to rm.
* Deletion is forever. UNIX has no undelete command. Many safer systems merely mark the blocks of a deleted file as "available," or move the file to a special directory; only when the disk fills up are the blocks actually reclaimed.
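One of the points above — that the shell, not the command, expands "*" — is easy to see for yourself. A minimal sketch (scratch directory and file names are made up for illustration):

```shell
# The shell expands "*" before rm ever runs, so a stray space turns
# "rm *.o" into "rm * .o": every file in the directory, plus a literal
# ".o", is handed to rm as arguments. We show the argument list with
# printf instead of actually running rm.
demo=$(mktemp -d)
cd "$demo"
touch main.c main.o util.c util.o
# What rm would actually receive if you typed "rm * .o":
set -- * .o
printf '%s\n' "$@"
# -> main.c main.o util.c util.o .o   (everything, plus the literal .o)
cd / && rm -rf "$demo"
```

rm itself never sees the "*"; by the time it runs, the damage is already encoded in its argument list.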
None of this is rocket science. The Macintosh introduced the "Trash can" in 1984, and Tenex had the technique as early as 1974. Even DOS provides a simple undelete, albeit not always an effective one. These four flaws conspire to make important files unrecoverable. The solutions have long existed; "standard" UNIX has simply never offered them.

Welcome to the future world. Many a real-life horror story illustrating these principles ends in "rm". Here is a sampling that circulated on the alt.folklore.computers newsgroup:

Date: Wed, 10 Jan 90
X-Virus: 6
From: djones@megatest.uucp (Dave Jones)
Subject: rm *
Newsgroups: alt.folklore.computers

Someone meant to type:

% rm *.o

and instead typed:

% rm * >o

Now he had an empty file named o, and plenty of disk space to store it in! Actually, he might not even get the o, since the shell's documentation doesn't say whether o is created before or after the glob is expanded.

That story showed how to use rm to obtain an empty file and a lot of free disk space. Here is another use:

Date: Wed, 10 Jan 90
X-Virus: 6
From: ram@attcan.uucp
Subject: Re: rm *
Newsgroups: alt.folklore.computers

I've been bitten by rm too. Once I wanted to delete some things under /usr/foo, so I typed the following commands in /usr/foo:

% rm -r ./etc
% rm -r ./adm

When I went on to delete the ./bin directory, I forgot the dot. My system didn't seem to like that very much.

The moment that command was typed, UNIX was utterly finished. A smarter system would give the user a chance (or at least warn that the command would destroy the system). And Unix is just as casual about ambiguous deletion as about accidental deletion. Consider this item from the comp.unix.questions FAQ:

6) How do I undelete a file? Maybe one day you accidentally ran:

% rm * .foo

and then discovered that you had just deleted "*". Consider it a lesson of life.
Of course, a competent system administrator will have scheduled regular backups, so you'd best go and ask whether they happen to have a copy of your file.

"A lesson of life"? No other manufacturer treats a defective product with this attitude. "Ma'am, I know your gas tank exploded, but consider it a lesson of life." "Members of the jury, we will show that the failure of the chain saw's safety switch was merely a lesson of life for the user." Yeah, right.

Changing rm's Behavior Is Not the Answer

After being bitten by rm a few times, users often think of aliasing rm to "rm -i", or of replacing rm with something that moves deleted files into a ~/.deleted directory. Such tricks lull users into a false sense of security.

Date: Mon, 16 Apr 90 18:46:33 199
X-Virus: 6
From: Phil Agre
So now I have two concepts in my mind: one is "deleting" a file, the other is "rm'ing" a file. Whenever my hands are about to do what my brain means by "delete," I make them stop and check which of the two they are actually performing.

Some UNIX experts have drawn the absurd-sounding conclusion that it is best not to make rm any friendlier at all. They argue that attempts to make UNIX friendlier are often counterproductive. Unfortunately, they are right.

Date: Thu, 11 Jan 90 17:17 CST
X-Virus: 6
From: merlyn@iwarp.intel.com (Randal L. Schwartz)
Subject: Don't overload commands! (was Re: rm *)
Newsgroups: alt.folklore.computers

Please don't encourage people to replace standard commands with "safe" ones.

(1) Many shell scripts will be surprised by a chatty rm, and won't expect deleted files to still be taking up disk space.
(2) Not every deletion can be made safe, and users acquire the illusion that everything is recoverable.
(3) Nonstandard commands are especially hateful to system administrators. If you want an rm that asks for confirmation, use:

% alias del rm -i

But don't replace rm itself!

Recently comp.unix.questions ran a survey asking system administrators for their most horrifying system-management stories. Within 72 hours there were more than 300 responses, many involving exactly the kinds of file deletion described above. The ridiculous part: these were UNIX masters. Yet the apologists insist that UNIX isn't unfriendly — it's just particular about its friends. Is UNIX at least friendly to system administrators, then? See for yourself:

Date: Wed, 14 Sep 88 01:39 EDT
X-Virus: 6
From: Matthew P. Wiener
Unfortunately, UNIX programs never bother to check for errors themselves. Instead, UNIX happily compounds small mistakes until it has produced a fatal result. In the previous section we showed how easily rm deletes files. But you may not realize how easily files can be deleted without rm ever being involved.

Want to delete your files? Try the compiler.

Some versions of cc never anticipate the typos users might make, and cheerfully delete source files. Undergraduates rediscover this regularly:

Date: Thu, 26 Nov 1992 16:01:55 GMT
X-Virus: 6
From: tk@dcs.ed.ac.uk (Tommy Kelly)
Subject: HELP!
Newsgroups: cs.questions
Organization: Lab for the Foundations of Computer Science, Edinburgh UK

I meant to compile my program:

% cc -o doit doit.c

but accidentally typed:

% cc -o doit.c doit

Needless to say, my doit.c was clobbered. Is there any way to get my program back? (It was a whole morning's work.)

Other programs exhibit the same behavior:

Date: Thu, 1 Jul 1993 09:10:50 -0700
X-Virus: 6
From: Daniel Weise
My trouble is that my fingers run on autopilot: they completed to the word "indent" rather than the command I actually meant:

% indent foo

indent is UNIX's brain-damaged C source reformatter. Did the bastard who wrote indent bother to check whether the input file was actually a C program (hell, he could at least have looked for a .c suffix)? I think you know the answer. Moreover, this SB (said bastard) decided that if you supply only one argument, you must want in-place reformatting. But don't worry — the SB anticipated the trouble that might cause, so he saves a backup in foo.BAK. Did he make it by renaming foo? No, he made a copy (no doubt because indent still had foo open when making the backup, and the rename call came later). By now you may be able to guess what happened... I was running foo at the time, and when the pager went to page it back in, the original executable was no longer there. Not a good thing. So my foo crashed, and I lost 20 hours of accumulated state information. Naturally, the bastards (ahem) who designed UNIX had no interest in real file versioning — the very feature that would have saved my life. And of course it never occurred to them to lock a file that is being paged out, did it?

With so many bastards to choose from, why not kill them all?

Pavel imagines a paint that gives off chlorine gas. Used outdoors, per the instructions, it's perfectly fine; but paint your bedroom wall with it and you're in serious trouble. How long would such a paint survive on the market? Certainly not twenty years.

Error Jokes

You laugh when the sitcom waiter dumps soup on a customer's head. UNIX's jokes work the other way around: when a helpless user is left to puzzle out an error message, the user is the last one laughing. People have collected UNIX's most ridiculous error messages and posted them on Usenet.
These use the C shell:

% rm meese-ethics
rm: meese-ethics nonexistent
% ar m God
ar: God does not exist
% "How would you rate Dan Quayle's incompetence?
Unmatched ".
% ^How did the sex change^ operation go?
Modifier failed.
% If I had a ( for every $ the Congress spent, what would I have?
Too many ('s.
% make love
Make: Don't know how to make love. Stop.
% sleep with me
bad character
% got a light?
No match.
% man: why did you get a divorce?
man:: Too many arguments.
% ^What is saccharine?
Bad substitution.
% %blow
%blow: No such job.

These gems are from the Bourne shell:

$ PATH=pretending! /usr/ucb/which sense
no sense in pretending!
$ drink <bottle; opener
bottle: cannot open
opener: not found
$ mkdir matter; cat >matter
matter: cannot create

The Unix Attitude

We have painted a bleak picture: cryptic command names, inconsistent and unpredictable behavior, dangerous commands left unprotected, unacceptable online documentation, and a crippled approach to error reporting and fault tolerance. Those who come to visit UNIX receive no warm hospitality; they are less like tourists at Disneyland and more like UN peacekeepers on deployment. How did UNIX get this way? Some of it, as we pointed out, is history. But there is another reason: the UNIX culture that formed over many years — the culture known as the "UNIX philosophy." The Unix philosophy is not a manual handed down by Bell Labs or UNIX System Laboratories. It grew naturally, out of many people's contributions. Don Libes and Sandy Ressler summarized it well in "Life with UNIX":

* small is beautiful
* 10% of the work solves 90% of the task
* if you must make a choice, choose the simplest thing

Judging by the actual behavior of UNIX programs and tools, a more accurate summary of the UNIX philosophy would be:

* small programs are better than correct programs
* sloppy is acceptable
* if you must make a choice, take the least responsibility

Unix doesn't have a philosophy; Unix has only an attitude.
That attitude says a simple, half-done job is better than a complex, complete one. That attitude says the programmer's time is more precious than the user's, even though users vastly outnumber programmers. That attitude says the bare minimum is good enough.

Date: Sun, 24 Dec 89 19:01:36 EST
X-Virus: 6
From: David Chapman
Some of you, like me, may often run a number of TeX jobs, so learning to kill a job is especially important. The "kill" command embodies UNIX's design principles, so several of the lessons below apply quite generally.

In UNIX you can suspend a job with ^Z, or terminate one with ^C. But LaTeX intercepts ^C. As a result, I tend to accumulate a pile of LaTeX jobs. I don't particularly care about them, but I still feel I ought to get rid of them.

Many operating systems have a command like "kill", and UNIX is no exception. On most operating systems, "kill" is used only to kill processes. UNIX is more general: "kill" sends a signal to a process. This reflects a UNIX design principle:

    Make operations general and give them power.

The "kill" command is indeed very powerful; you can send all kinds of signals to a process. For example, 9 is the signal that kills a process. Notice that 9 is the biggest one-digit number. This reflects another UNIX design principle:

    Choose the simplest name that reflects the functionality.

On every other operating system I know, "kill" with no arguments kills the current job. UNIX's "kill" always demands arguments. This reflects a wise UNIX design principle:

    Use long arguments or prompts to keep users from accidentally screwing themselves.

This principle shows up throughout UNIX's applications. I won't list them all, but let me just mention how logout and file deletion are implemented on UNIX; I trust you see what I mean.

On the operating systems I know, "kill" takes a job name as its argument. That wouldn't work well here, because you may have many LaTeX jobs running at once, all with the same job name, "latex" — so "kill -9 latex" would be ambiguous. Like other operating systems, UNIX provides a "jobs" command that lists your jobs.
Here is an example:

zvona@rice-chex> jobs
[1]  - Stopped          latex
[2]  - Stopped          latex
[3]  + Stopped          latex

As you can see, jobs are identified by a job number (shown in brackets). If you have been corrupted by other operating systems, you will want to type "kill -9 1" to kill the first LaTeX job. You will be met with this error:

zvona@rice-chex> kill -9 1
1: not owner

The correct approach is to use the process number — something like 18517 — which you obtain from the "ps" command. Having found the right process number, you simply:

zvona@rice-chex> kill -9 18517
zvona@rice-chex>
[1]    Killed          latex

Notice that UNIX gives you your prompt back before telling you that the job has actually been killed. This reflects yet another UNIX design principle:

    Tell the user as little as possible, as late as possible. Too much information might overload the user.

I hope these lessons prove useful to everyone. In the course of this study I have been deeply drawn to the UNIX design philosophy. We should all learn from the elegance, power, and conciseness of the UNIX kill command.

Chapter 2 is over. Having struggled through this much, you are no longer a novice. The next installment introduces UNIX's documentation — or rather, UNIX's lack of it.

Chapter 3: Documentation?
What Documentation?

OK — no longer a novice, you may now want to get to know UNIX in more depth. Excellent: UNIX's documentation is exactly what you need.

"One of the advantages of using UNIX to teach an operating systems course is that a student's book bag can hold the complete UNIX source code and documentation.
"- John Lion, New South Wales University, 1976 discussed a paragraph of UNIX version 6. Many years, there have been three simple ways to get UNIX knowledge: read the source code writes a procedure for writing a UNIX Well call (or email) and Homa epic, UNIX is jealous. If you do not become a hacker, you can't be a serious U NIX user - or at least you should have a tentacle. The kernel hacker. That does exist document - MA N manual - but some memo I already know what people do in doing what they are doing. UNIX's document is so simple, you can read it in an afternoon. The online documentation Man Tool is the foundation of the UNIX document system. MAN accepts the parameters you entered, find the corresponding document file, output it to nroff (some text format macro) that is not used in some places, the last result is sent To PG or more. First, these fragmented documents are called "MAN Pages" because these documents are more than one page (most cases are less than one page). MAN is a good thing for that era, but that The times have already been returned. Over the years, the MAN system has developed mature. It is worthy of praise. It does not make the code chaotic program like UNIX, but it has not become more useful. Fact On the past 15 years, UNIX's documentation system has only two improvements: Catman. Programmer "surprise" found that in addition to NROFF format, they can store handled document files, so that the document is called up It is faster. For today's fast processor, Catman does not seem to have it. However, many NROFF processed document files still occupy a few megaby disk space of the user. Makewhatis, Apropos and Key (eventually constitute Man -k function ) Is a system that indexs to the Man manual, so that even if you don't know the exact name of the program, it can be inquired. At the same time, the momentum of the electronic publishing has already exceeded the Man manual. 
With today's hypertext systems you can jump from one article to another with the mouse; the man page, by comparison, offers a "SEE ALSO" section at the end and leaves the man-ing to you. And online indexing? Today you can buy the Oxford English Dictionary on CD-ROM, indexed on every single word; man pages are still indexed only on command names and description lines. Today even DOS provides indexed hypertext documentation; man pages still use a format of 80 columns by 66 lines, perfectly suited to a DEC printing terminal. To be fair, some vendors couldn't stand it any longer and shipped hypertext online documentation systems of their own. On those systems the man page has reached its evolutionary dead end: frequently it isn't merely outdated — it doesn't exist at all.

"I Know It's Here Somewhere"

For those still using man pages today, the biggest problem is telling man which page you want. Finding man pages used to be easy: they were all in /usr/man. Then the pages were split by chapter into separate directories: /usr/man/man1, /usr/man/man2, /usr/man/man3, and so on. Some systems even put "local" pages under /usr/man/manl. Then AT&T released System V, and things got confusing: /usr/man/man1 became /usr/man/c_man, as if letters were somehow better than numbers. On some systems, /usr/man/man1 became /usr/local/man. Companies selling UNIX applications began creating man directories of their own.
Finally, Berkeley modified man to search a series of directories listed in the environment variable $MANPATH. A wonderful idea, with just one small problem: it doesn't work.

(The rest of this section is omitted — I'm too lazy, and it's dated anyway; man on Linux is actually not bad, except for the man pages of shell built-in commands, where of course man bash is always an option. — translator)

How's This for Internal Documentation?

Some of the larger UNIX utilities supply their own documentation. For many programs, the online documentation amounts to a "usage" line, produced when you err. Here is awk's:

% awk
awk: usage: awk [-f source | 'cmds'] [files]

Helpful, isn't it? More complex programs offer more in-depth online documentation. Unfortunately, it doesn't always appear to describe the program you are actually running.

Date: 3 Jan 89 16:26:25 EST (Tuesday)
X-Virus: 6
From: Reverend Heiny
You can extract the file names, environment variables, undocumented options, weird error messages, and so on from almost any program. For example, if you want to know where cpp looks for header files, you are far better off asking strings than man:

next% man cpp
No manual entry for cpp.
next% strings /lib/cpp | grep /
/lib/cpp
/lib/
/usr/local/lib/
/cpp
next%

Hmm... don't give up yet:

next% ls /lib
cpp*            gcrt0.o         libsys_s.a
cpp-precomp*    i386/           m68k/
crt0.o          libsys_p.a      posixcrt0.o
next% strings /lib/cpp-precomp | grep /
/*%s*/
//%s
/usr/local/include
/NextDeveloper/Headers
/NextDeveloper/Headers/ansi
/NextDeveloper/Headers/bsd
/LocalDeveloper/Headers
/LocalDeveloper/Headers/ansi
/LocalDeveloper/Headers/bsd
/NextDeveloper/2.0CompatibleHeaders
%s/%s
/lib/%s/specs
next%

Silly me. NEXTSTEP's cpp uses /lib/cpp-precomp. You won't find that in the man pages:

next% man cpp-precomp
No manual entry for cpp-precomp.

OK. How did things get this way? To be continued.

For Programmers, Not Users

The previous installment said that the source code is the best — and the only — documentation. The root cause: UNIX is for programmers, not users.

Don't blame Ken and Dennis for the state of UNIX documentation. When UNIX's documentation first took shape there were no documentation standards, and it recorded bugs and potential traps rather than program features — because the people reading it were mostly UNIX's own developers. For many developers, the man page was simply a place to collect bug reports. The notion of providing documentation for novice users, programmers, and system administrators came later; and sadly, with a documentation system built in the 1970s, it has not been very successful. The UNIX world acknowledges the state of its documentation but doesn't consider it a big deal. "Life with UNIX" states the UNIX attitude toward documentation quite frankly: the UNIX source code is the best documentation.
After all, the source is what the system consults when deciding what to do. The documentation merely explains the code; it is usually written at a different time, often by people who did not write the code. You should treat the documentation as guidelines — sometimes as mere wishful thinking. The more conventional approach, though, is to go to the source for undocumented options and behavior. Sometimes you will even find features documented that were never implemented.

And that is just for user programs. For the kernel, the situation is worse still. Until quite recently, there was no vendor documentation at all for device drivers and kernel calls. People joked: "If you feel you need to read the documentation for a kernel function, chances are you shouldn't be using it."

The truth is more sinister. The real reason there was no kernel documentation is that AT&T regarded its code as a trade secret. Write a book describing the UNIX kernel and you could expect to be sued. Source-code-as-the-documentation was, conveniently, exactly the arrangement AT&T had constructed.

With no documentation, the only way to understand the kernel and the applications was to read the source. And so, for its first 20 years, UNIX source code was pirated madly.
Consultants, programmers, and system administrators didn't lift UNIX source code in order to recompile it or spin their own UNIX variants; they needed documentation, and source was the only kind on offer. UNIX source flowed out of the universities into the surrounding high-tech companies. Illegal, of course, but there was a justification: the vendors' documentation wasn't enough. Nor was there any trade secret in the source worth real money. Everyone who has read UNIX code has been struck by the same pious confession:

/* You are not expected to understand this. */

Although this comment first appeared in the UNIX V6 kernel, it could stand over almost all of the original AT&T code, which was stuffed with hand-optimized macros and strange constructs, its register variables crowned with names like p, pp, and ppp. "This function is recursive" apparently sufficed to flag something hard to understand. In truth, AT&T's condescending attitude toward documentation merely mirrored its horrifying attitude toward writing code. Spotting a slapdash workman is easy: paint over the cracks, patch upon patch, everything barely held together with tape and chewing gum. Admit it: building — or rebuilding — from scratch takes more thought, and more effort.

Date: Thu, 17 May 90 14:43:28 -0700
X-Virus: 6
From: David Chapman
Contents:

Introduction: The "No Documentation" Philosophy
    Why the man pages are the work of the devil
    Why you should read this documentation anyway
    "This will be the last documentation you ever read!"
Chapter 1: How to Guess Which Programs Might Exist
Chapter 2: How to Guess What a Program Does from Its Name
    Case study: grep
Chapter 3: How to Guess Command Options
    How to crack cryptic syntax: tar
    How to tell when order matters
    Case study: find
Chapter 4: How to Tell Whether It Worked: No News Is Good News
    Recovering from errors
Chapter 5: The Oral Tradition: Your Friends
Chapter 6: How to Obtain and Maintain a Live UNIX Guru
    How to feed your guru
    How to keep your guru happy
    The importance of newsgroup access
    Why your guru needs the fastest computer
    Free cola: the secret of guru longevity
    How to keep your guru healthy
Chapter 7: Common Problems
    Why your guru doesn't care about your stupid problem
    How to ask stupid questions safely
Chapter 8: How to Cope with Failure
    (Perhaps only Chapters 6 and 7 are really necessary.)

Yes, that's the right road: call it the "UNIX Guru Domestication Guide."

OK — so there is no documentation. The next installment takes you into the wonderful world of sendmail. Why is it that "using sendmail feels exactly like catching a venereal disease"? To be continued.

Chapter 8: csh, pipes, and find (Part 1)

(Translator: the UNIX romance was supposed to continue with the installment comparing sendmail to venereal disease, but sendmail is a big subject and I'd rather not pick a fight with her; anyone interested in the history of prostitution and sexually transmitted diseases can correspond with me privately. As a programmer you probably care more about the UNIX programming environment anyway, so this installment covers the history of the UNIX shell. My translation is GPL'd and you didn't pay for it, so eat what I serve and don't complain.)
The beauty of the GPL is that you needn't be responsible for your work, nor to your users — which is why SourceForge is littered with stillborn free projects. I hope those who read my stuff understand this. None of it was begun for the sake of value or responsibility, of the past or the future, or even of the present; all of it is merely to pass the time. In every bubble the sea throws up, in every past pain, in July's weather forecasts and news broadcasts, in July's long days — in my heart, there is no meaning to be sought.

Chapter 8: csh, pipes, and find
Powerful Tools for Powerful Fools

"Some operating system commands were apparently never planned at all; I can only assume their names — awk, grep, fsck, nroff — were chosen to sound like ruminant digestive noises, the better to nauseate." — Anonymous

UNIX's so-called "powerful tools" are a scam: nothing more than a fig leaf for UNIX's grab-bag of commands and utilities. A genuinely powerful tool delivers powerful capability without demanding much effort from its user. Anyone who can work a screwdriver or a hand drill can use an electric screwdriver or an electric drill. They need to understand nothing about electricity, motors, torque, electromagnetism, heat dissipation, or maintenance. They just plug it in, put on safety glasses, and pull the trigger. Most people skip the glasses. You won't find a fatally defective tool in a hardware store: either it never made it to market, or its maker has been sued off the shelves.

UNIX's designers set out to provide simple tools; what we have now is overdesigned and bloated.
The command for listing files, ls, has 18 options, covering everything from sort order to the number of display columns — all of it functionality better provided elsewhere (and once provided elsewhere). The find command, besides finding files, now also deletes them (a feature that, given Unix's reputation for reliability, works wonders). If Unix sold electric drills, today's model would have 20 knobs and switches, come with a nonstandard power cord, and accept neither 3/8-inch nor 7/8-inch bits (though this would be documented in the BUGS section of its manual). And unlike the tools in the hardware store, many Unix power tools are defective (sometimes fatally so, where your files are concerned): tar, for one, cannot accept file names longer than 100 characters; the Unix debugger, for another, dumps core at the least provocation, its core file overwriting your own core file and leaving you to debug the debugger's core next to the remains of your own.

The Shell Game

Unix's inventors had a great idea: implement the command parser as an ordinary user program. If users didn't like the default command parser, they could write their own. Better still, shells would evolve, so shells would steadily become more advanced, more powerful, more flexible, and easier to use — in theory, anyway. It really was a great idea, but it backfired. The gradual accretion of features made a mess, because the features were never designed; they merely evolved. As with the curse that hangs over every programming language, the people who used these features became the shell's own worst enemy: as soon as a new feature went into the shell, someone used it in a script, and from then on the feature could never be retired. Bad ideas, and the features that stink of them, never die. So you end up with an incomplete, mutually incompatible hodgepodge of shells (each described here by its own man page):

sh — A command programming language that executes commands read from a terminal or a file.
jsh — Identical to sh, but with csh-flavored job control
csh — A shell with C-like syntax
tcsh — csh with Emacs-flavored editing
ksh — The Korn shell, your other command and programming language
zsh — The Z shell
bash — The GNU Bourne-Again Shell

The screwdrivers and saws in a hardware store may come from three or four different makers, yet they all operate pretty much alike. A typical Unix, by contrast, stocks /bin or /usr/bin with hundreds of programs written by droves of self-assured programmers, each with its own syntax, its own operating paradigm, its own rules of use (this one works as a pipe, that one needs temporary files), its own command-line arguments, and its own set of limitations. Take grep and its variants fgrep and egrep: which one is fastest? Why do they accept different arguments, and even disagree about what a regular expression means? Why can't a single program provide all the features? Who is in charge here? And even once you've branded every command's quirks into your memory, it will still find ways to trip you up.

Shell Crash

The following message comes from the bulletin board of the compilers course at Columbia University.
Subject: Relevant Unix bug
October 11, 1991

Fellow W4115x students —
While we're on the subject of activation records, argument passing, and calling conventions: did you know that typing

    !xxx%s%s%s%s%s%s%s%s

to any C shell will make it crash immediately? Do you know why?

Questions to think about:
- What does the shell do when it encounters "!xxx"?
- What must it be doing with "!xxx%s%s%s%s%s%s%s%s"?
- Why does this crash the shell?
- How would you fix the code?
- Most important: when you (yes, you) are someday responsible for an operating system of the future, will you put up with this sort of thing?

Try it yourself. By Unix's design, when the shell dies, all your processes are killed with it and you are logged off the system. Other operating systems pop up a debugger on an illegal memory access — not Unix. Perhaps this is why Unix shells never let you load modules into the shell's own address space, or call functions in other programs directly: that would be far too dangerous. One misstep and — whoops — you're out the door. Stupid users deserve punishment, and programmer error must never be tolerated.

Next we enter the colorful world of Unix syntax.

--- I said there is sorrow and there is joy; why say, sadly, that friends flow away like water; why not say the laughter of youth reads the future; and I will cry.

http://research.microsoft.com/~daniel/unix-haters.html

Chapter 8: csh, pipes, and find (Part 2)

Half a year ago I showed you how to play the shell game; you've probably recovered by now, but don't relax — the shell isn't done playing with you.

Welcome to the Metasyntactic Zoo

The C shell's metasyntactic operators are a rich source of quoting trouble and confusion. These operators transform a command before it is executed.
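That transformation is easy to watch happen. A tiny sketch in an ordinary POSIX-style shell (the scratch directory and file names are invented for the demonstration):

```shell
# The shell rewrites the command line before the program ever runs.
dir=$(mktemp -d)          # scratch directory, so the glob result is predictable
cd "$dir"
touch a.c b.c

echo *.c     # the shell expands the glob first: echo is handed "a.c b.c"
echo '*.c'   # quoting suppresses the expansion: echo is handed the literal "*.c"
```

echo never sees the `*` in the first command; by the time the program runs, the metacharacter is gone. That is exactly why quoting mistakes are invisible to the program being invoked — it has no idea what you actually typed.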
We call these operators metasyntactic because they are not part of the syntax of the command; they act on the command itself. Metasyntactic operators (sometimes called escape operators) are familiar to most programmers. For example, the backslash (\) in a C string is metasyntactic: it does not stand for itself, but modifies the meaning of the character that follows it. When you want a metasyntactic character to stand for itself, you must use a quoting mechanism that tells the system to treat it as an ordinary character. Back in the C string example, to get an actual backslash you must write \\. A simple quoting mechanism like that is hard to make work in the C shell, because the shell and the programs it runs cannot agree on where the division of labor lies. Take a simple command:

    grep string filename

The string argument may contain characters that grep itself defines as special — ?, [, and ] — but those same characters are metasyntactic to the shell. That means they have to be quoted. Then again, sometimes they don't, depending on which shell you use and how your environment variables happen to be set. Searching for a period (.), or for any pattern that begins with a dash (-), is more complicated still. Be sure to remember the correct quoting for every metasyntactic character. Unfortunately, just as with pattern matching, incompatible quoting conventions are scattered through every corner of the operating system.

The C shell's metasyntactic zoo houses seven different families of operators. The zoo was stocked on the cheap and keeps one eye shut: the bars between the cages are tin, not steel.
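A concrete instance of the grep problem just described (the file name and its contents are fabricated): the pattern characters must survive the shell to reach grep at all, and a leading dash needs care of its own.

```shell
tmp=$(mktemp)
printf 'is it foo?\n-dashed line\nplain line\n' > "$tmp"

grep 'foo?' "$tmp"        # quoted: the shell passes ? through; grep (basic
                          # regular expressions) treats it as a literal ?
grep -e '-dashed' "$tmp"  # -e keeps the leading dash from being read as an option
rm -f "$tmp"
```

Unquoted, `grep foo? file` would behave differently again: csh would try to glob `foo?` against the file names in the current directory and, finding no match, refuse to run the command at all with "No match" — which is arguably worse than either alternative.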
Small frictions between the animals never cease. The seven families of command-line transformation are:

Aliasing — alias and unalias
Command-output substitution — `
Filename substitution — *, ?, []
History substitution — !, ^
Variable substitution — $, set, unset
Process substitution — %
Quoting — ', "

One result of this "design" is that the question mark (?) is forever reserved for single-character matching and can never be passed to a program as a command-line argument — so don't even think about using a question mark as your help option. If these seven kinds of metasyntax had any clear logical ordering and definition, things wouldn't be so bad. They don't:

Date: Mon, 7 May 90 18:00:27 -0700
From: Andy Beals
If $* appears inside double quotes, the parameters are joined with spaces into a single word ("$1 $2 ..."), whereas $@ inside double quotes keeps each parameter individually quoted ("$1" "$2" ...). I thought ${1+"$@"} was the idiom that stayed compatible all the way back to the Version 7 shell.

Ah, the old days — back when things were still compatible with Version 7. And speaking of things you can't trust: does cd even tell you the truth? Unix mutates easily, and over its long evolution its developers have pulled it off in different directions, with nobody ever pausing to ask whether their own changes might collide with everybody else's.

Date: Mon, 7 May 90 22:58:58 EDT
From: Alan Bawden
Let's put that theory to the test: write a script that lists all the files in a directory and runs the file command on each to display its type:

Date: Fri, 24 Apr 92 14:45:48 EDT
From: Stephen Gildea
But then some wise guy ran it on an empty directory, where * produces a useless "No such file" in the output. Of course, we can handle that case too.... At this point UUCP seems about to give up on this message, so I can only take it this far; please debug the remaining bugs yourselves.

Stephen had an even bigger problem, though — one we have been deliberately hiding from the start: the Unix file command does not work.

Date: Sat, 25 Apr 92 17:33:12 EDT
From: Alan Bawden
Date: Thu, 14 Nov 1991 11:46:21 PST
From: Stanley's Tool Works
Run this script:

    tar cf temp.tar no.such.file
    if ($status == 0) echo "Good news! No error."

and you will get this output:

    tar: no.such.file: no such file or directory
    Good news! No error.

I get it — I should never have expected consistency, or usefulness, or helpfulness, or speed, or even correct results, from the very start... — Bjorn

Well, did you enjoy being knocked around by the shell? Not hooked yet? Don't worry: next we leave the shell behind, crawl down into the Unix sewer (the pipe), and experience boundless pain and pleasure.

Chapter 8: csh, pipes, and find (Part 3)

Pipes

Unix masochists, welcome to the Unix sewer.

"In our century the sewers of Paris remain a mysterious place. If Paris knew what a terrible pit lay beneath it, the city would be uneasy." — Hugo, Les Misérables

What follows is just my own view of Unix. About six years ago, when I got my first workstation, I spent an enormous amount of time learning Unix. I should have spent it learning something worthwhile. Fortunately that garbage has slowly biodegraded out of my brain over the years. But ever since this discussion began, Unix supporters have been sending me examples to "prove" Unix's power. These examples certainly bring back fond memories: every one of them does something simple and useless. One fellow described the feeling of "success" a shell script gave him — a script that uses four cryptic commands to rename every file ending in ".pas" so that it ends in ".p". But I prefer to save my religious fervor for things more important than renaming a few files. Yes, that is exactly my memory of the Unix tools: you spend a great deal of time learning arcane tricks, and in the end it's all hollow.
I'd rather go learn something genuinely useful. — Jim Giles, Los Alamos National Laboratory

Unix worshippers prostrate themselves before the gospel of the pipe. They sing the pipe's praises: without pipes there is no Unix. They chant in chorus: "Pipes let you build complex programs out of simple ones. Pipes let you use commands in novel ways. Pipes make life easier." Unfortunately, the chanting does about as much for Unix as waving the great banner ever did for anything. The pipe is not the promised land. Modularity and abstraction are essential in complex systems; that much is a founding principle of computer science. The more excellent the basic tools, the more successful the complex system built from them, and the more maintainable it will be. As one structuring tool among many, the pipe does have value.
Here is an example of a pipe:

    egrep '^To:|^Cc:' /var/spool/mail/$USER | \
        cut -c5- | \
        awk '{for (i = 1; i <= NF; i++) print $i}' | \
        sed 's/,//g' | grep -v $USER | sort | uniq

Got that? This program reads the user's mailbox and figures out which mailing lists the mail is coming from. (More or less.) Like the water pipes in your house, this Unix pipeline will also crack mysteriously under particular conditions. Pipes are sometimes very useful, but the mechanism — inter-process communication by wiring standard output to standard input — limits what they can do. First, information can only flow one way: processes cannot use a pipeline for two-way communication. Second, pipes support no abstraction whatsoever. Sender and receiver share nothing but an undifferentiated character stream. Any object even slightly more complex than a character cannot be sent through a pipe directly: it must be serialized into a character stream, and of course the receiver must reassemble the resulting stream at the other end. This means you cannot send an object together with the code that defines it. You cannot send pointers into another process's address space. You cannot send file handles, or socket handles, or file-permission capabilities. At the risk of sounding self-righteous, we hold that the correct model is the procedure call (local or remote) passing first-class structures — something the C language has had from the very beginning, in function composition. Pipes are good for simple tasks, such as text streams, but for building robust software they are out of their depth. In one early showcase, for instance, pipes were used to string some small programs together into a spelling checker: a classic demonstration of simplicity — and a dreadful tool for actually catching spelling errors. Pipes turn up most often as sleight of hand in shell scripts, where programmers use them for solutions that are quick, simple, and fragile.
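To see both the charm and the fragility at once, here is a hedged reconstruction of that mailing-list pipeline, run on a fabricated two-address mailbox (the temp file stands in for /var/spool/mail/$USER, and the "strip 4 characters" prefix assumption is mine):

```shell
mbox=$(mktemp)               # stand-in for /var/spool/mail/$USER
me='alice@example.com'
cat > "$mbox" <<'EOF'
To: alice@example.com, bob@example.com
Cc: carol@example.com
Subject: lunch
EOF

# keep To:/Cc: lines, strip the 4-character "To: "/"Cc: " prefix, put one
# address per line, drop commas, remove ourselves, then de-duplicate
grep -Ei '^(to|cc):' "$mbox" |
  cut -c5- |
  awk '{ for (i = 1; i <= NF; i++) print $i }' |
  sed 's/,//g' |
  grep -v "$me" |
  sort -u
```

`sort -u` stands in for the original's `sort | uniq`. Notice that nearly every stage exists only to undo the formatting produced by the stage before it — and that the whole thing quietly assumes header prefixes are exactly four characters wide.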
This is because a pipe welds two programs into a dependency: change the output format of one and you must change the input parsing of the other. Most programs grow in stages: first the program's requirements take shape, then its internals, and the output routines get written last of all. Pipes hold none of this in regard: the moment someone drops a half-finished program into a pipeline, its output format is set in stone, no matter how inconsistent, nonstandard, or inefficient it may be — you just have to live with it. The pipe is not the only choice for inter-program communication. The Macintosh has no pipes at all, and our favorite piece of Unix boosterism puts it this way:

"The Macintosh, however, uses the completely opposite model. The system does not deal in character streams. Data files exist at a higher level, always bound to a particular program. When did you ever pipe the output of one Mac program into another? (Good luck even finding the pipe.) Programs are monolithic: to do anything, you must understand everything. You cannot take MacFoo and MacBar and hook them together." — from Life with UNIX, by Libes and Ressler

Those poor Mac users. Unable to pipe character streams between programs, how will they ever insert a drawing from a paint program into a document? Or a table from a spreadsheet? Or mail the whole thing across the country? Or seamlessly browse and edit it at the far end and send it back? Without pipes, we can hardly imagine how the Macintosh has been doing all of this for the past decade.
When was the last time your Unix workstation was as useful as a Macintosh? When was the last time it ran software from different vendors — software that could actually communicate? If Unix manages this at all nowadays, it is because Mac software houses ported their products to Unix and did their best to make Unix look like a Mac. The fundamental difference between Unix and the Macintosh operating system is that Unix was designed to please programmers, while the Mac was designed to please users. (Windows was designed to please accountants, but that's another story.) Research has shown that pipes and redirection are hard to use — not because of the underlying concept, but because of their arbitrary and unintuitive restrictions. Unix's own documentation has long conceded that only the Unix die-hard can ever appreciate the finer points of pipes.

Date: Thu, 31 Jan 91 14:29:42 EST
From: Jim Davis

Find

The Unix file-search program find was written not for users but for cpio, a Unix backup utility. find could foresee neither networks nor newer file-system features such as symbolic links, and even after repeated overhauls it still does not work well. So, critical as it is to users hunting for lost files, find remains unstable and unreliable. Unix's authors have labored to keep find abreast of the rest of the system, but it is a losing race. Today's find sports special options for NFS file systems, for symbolic links, for executing programs, for executing programs interactively, and even for archiving the files it finds directly in cpio or cpio -c format. Sun went further, modifying find and adding a background daemon that builds an index database of every file on the system; for some strange reason, running "find filename" with no other arguments searches that database instead. (Secure, isn't it?) And after all this patching and mending, find still doesn't work right.
For example, csh follows symbolic links, but find does not: csh was written by the crowd at Berkeley (birthplace of the symbolic link), while find descends from the primeval AT&T era. East met West, cultures collided, and vast confusion ensued:

Date: Thu, 28 Jun 1990 18:14 EDT
From: pgs@crl.dec.com
Subject: more things to hate about Unix

This one is my favorite. I'm working in one directory and want to use find to search another, so I do this:

    po> pwd
    /ath/u1/pgs
    po> find ~halstead -name "*.trace" -print
    po>

Looks like nothing was found. But hold on, look at this:

    po> cd ~halstead
    po> find . -name "*.trace" -print
    ./learnx/fib-3.trace
    ./learnx/p20xp20.trace
    ./learnx/fib-3i.trace
    ./learnx/fib-5.trace
    ./learnx/p10xp10.trace
    po>

Hey, the files were there all along! Next time you want to find a file, remember to cd around to every directory at random first — it could be hiding anywhere. What a piece of garbage Unix is.

Poor comrade Halstead's /etc/passwd entry must reach his real home directory through a symbolic link, so some commands work on it and some don't. Why not change find to follow symbolic links? Because any symlink pointing to a directory higher up would drag find into an endless loop. Handling that case properly would take careful design and careful implementation, to guarantee the system never traverses the same directory twice. Unix took the simplest way out: don't handle symlinks at all, and let the users fend for themselves. As systems become networked, they grow more complicated, and the problem gets ever harder to solve:

Date: Wed, 2 Jan 1991 16:14:27 PST
From: Ken Harrenstien

Disgusting as find's syntax is, I could always coax it into small jobs, so as not to have to write my own tree-walking scripts to get a list of files. But in this brave new world of NFS and symbolic links, find is useless.
The so-called file system here is now a rat's nest of file servers and symbolic links that find has no intention of coping with — it doesn't even offer an option for it — and the upshot is that vast stretches of the search space get silently ignored. I only noticed when a search over a big directory kept coming up empty; it finally turned out that the directory was a symbolic link. I do not want to personally double-check every directory I hand to find — that is find's damn job. I do not want to go hack the system software every time this happens. I do not want to waste my time fighting Sun, or the whole Unix horde. I do not want to use Unix. Hate, hate, hate, hate, hate, hate, hate.
--Ken (feeling better, but still annoyed)

And if you try to write a shell script of any complexity to process the files that find finds, the results are often stranger still — the sorry consequence of how the shell passes arguments.

Date: Sat, 12 Dec 92 01:15:52 PST
From: Jamie Zawinski

So what now? I thought, hmm, maybe I'll try sed... But I had forgotten the deep philosophical truth: "Some people, when confronted with a Unix problem, think 'I know, I'll try sed.' Now they have two problems." Five attempts and two readings of the sed manual later, I had this:

    % echo foo.el | sed 's/$/c/'
    foo.elc

which got me this:

    % find . -name '*.el' -exec echo test -f `echo '{}' | sed 's/$/c/'` \;
    test -fc
    test -fc
    test -fc
    ....

OK, fine — I'll just try every combination of shell quoting in existence; surely one of them will work for me?

    % find . -name '*.el' -exec echo test -f "`echo '{}' | sed 's/$/c/'`" \;
    Variable syntax.
    % find . -name '*.el' -exec echo test -f '`echo "{}" | sed "s/$/c/"`' \;
    test -f `echo "{}" | sed "s/$/c/"`
    test -f `echo "{}" | sed "s/$/c/"`
    test -f `echo "{}" | sed "s/$/c/"`
    ....

Hey, that last one almost has promise. I just need to do this:

    % find .
-name '*.el' -exec echo test -f '`echo {} | sed "s/$/c/"`' \;
    test -f `echo {} | sed "s/$/c/"`
    test -f `echo {} | sed "s/$/c/"`
    test -f `echo {} | sed "s/$/c/"`
    ....

Wait, no. That's not what I want at all — why isn't it substituting the file name for the {}? Look carefully: doesn't the {} have whitespace on both sides? What do you mean, a backquoted string counts as a single token? Maybe I can use sed to strip the backquotes off. Nope, no play there. So I flailed at "-exec sh -c ..." for another half minute, finally surfaced, and wrote a piece of Emacs Lisp to do the job. It wasn't hard, it was fast, and it worked. I was so happy. I thought it was all behind me. Then this morning in the shower another approach occurred to me. I tried and tried, sinking ever deeper into her net of love, entangled and unable to pull free, intoxicated. It took me only twelve tries to get it right. It spawns a mere two processes per file in the directory tree being traversed. This is the Unix way!

    % find . -name '*.el' -print \
        | sed 's/^/FOO=/' \
        | sed 's/$/; if [ ! -f ${FOO}c ]; then echo $FOO; fi/' \
        | sh

BWAAAAHH HAAAHH HAAAAHH HAAAHH HAAAHH HAAAHH HAAAHH HAAAHH!!!!

-Jamie

OK, enjoyed the game of hide-and-seek down in the sewer? Chapter 8 ends amid the laughter. Next we have to start programming — and do remember what the lovely, charming nurse told you when you were little.

Chapter 9: Programming

"Do not meddle in the affairs of Unix, for it is subtle and quick to core dump."

If you learned to program by writing C on Unix, you may find this chapter a little jarring at first. Unfortunately, Unix has taken such universal root in research and in education that few students ever get the chance to realize how many of Unix's choices are not actually reasonable.
For example, after listening to us go on about the many languages and environments better than C and Unix, one Unix aficionado mounted this defense of Unix and C:

Date: Nov 9, 1991
From: tmb@ai.mit.edu (Thomas M. Breuel)

Scheme, Smalltalk, and Common Lisp do indeed provide powerful programming environments. But the Unix kernel, shell, and C language together address a wider problem space — one those languages handle badly (and in part cannot handle at all). That problem space includes memory management and locality (achieved through process creation and termination), persistence (data structures stored in files), parallelism (achieved through pipes, processes, and the inter-process communication mechanisms), protection and recovery (achieved through separate address spaces), and visually readable data representation (achieved through text files). From a practical standpoint, Unix deals with these issues rather well.

Thomas Breuel credits Unix with solving hard problems of computer science. Luckily, this is not how other sciences go about solving theirs:

Date: Tue, 12 Nov 91 11:36:04 -0500
From: markf@altdorf.ai.mit.edu
To: UNIX-HATERS
Subject: Random Unix similes

Managing memory through process creation and termination is like treating illness by controlling birth and death — it ignores the actual problem. Getting persistence out of Unix files is like throwing all your clothes into one giant pile and trusting that you'll be able to pick an outfit out of it later (unfortunately, that is exactly what I do). Parallelism through pipes, processes, and IPC? Unix processes cost so much that parallelism never pays for itself — rather like encouraging your employees to have lots of babies to address the company's staffing shortage. Ah, but Unix can certainly handle text. It handles text. And, um — did I mention that Unix handles text really well?
— Mark

The Spectacular Unix Programming Environment

Unix zealots never tire of promoting Unix's so-called "programming environment." Unix, they say, supplies a rich set of tools that make programming work easier. Here are Kernighan and Mashey, in "The Unix Programming Environment":

"One reason the Unix environment improves programming productivity is its wealth of small, useful programs — tools — that help with everyday programming work. The programs listed below are considered among the most useful. We will use them to illustrate other points in what follows.

wc files — count the lines, words, and characters in files
pr files — print files, with headings and multiple columns
lpr files — spool files to the printer
grep pattern files — print the lines in files that match a pattern

Much of any programmer's daily work is done with these and a few other related programs. For example, wc *.c measures the volume of all one's C source code, and grep goto *.c finds all the gotos."

These are "the most useful"?!?! I suppose so. That certainly is the programmer's daily round. In fact, I spent so much time today counting my C code that I hardly had time for anything else. Hold on — I think I'd better go count it a few more times.

The same issue of IEEE Computer carried another article, "The Interlisp Programming Environment" by Warren Teitelman and Larry Masinter. Interlisp was an extraordinarily sophisticated programming environment; in 1981, Interlisp already had things Unix programmers were still only dreaming about in 1984. Interlisp's designers took a completely different approach. They set out to build a complex tool that takes real time to master, on the theory that once it is learned, it repays the effort with an enormous gain in programming power. Listen to some sense for once. Sadly, very few programmers today ever get to feel what such an environment is like.
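For the record, the entire "productivity" workflow those celebrated tools support boils down to something like this (the sample source file is fabricated):

```shell
# A stand-in C file, so the "most useful tools" have something to chew on.
cat > demo.c <<'EOF'
int main(void) {
    goto done;
done:
    return 0;
}
EOF

wc -l demo.c        # the state of the art in productivity measurement
grep -n goto demo.c # the state of the art in cross-referencing
rm -f demo.c
```

Neither tool knows the first thing about C; one counts characters and the other matches them, and that — plus a printer spooler — is the "environment."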
Programming in Plato's Cave

"I have always had the feeling that the goal of computer-language design and tool development ought to be to raise programmer productivity, not lower it." — a posting to comp.lang.c

"Other industries learned the lessons of automation long ago. When people walk into a fast-food joint, they want consistency, not haute cuisine. Providing ordinary food on a massive, consistent scale earns far more money than cultivating delicacies in small batches." — a netnews reply

Unix is not the best software environment in the world — it is not even a good one. Unix programming tools are scanty and hard to use; Unix debuggers cannot even match a PC's; interpreters remain toys for the rich; and change logs and audit trails are forever an afterthought. Yet Unix is still worshipped as the programmer's dream. Perhaps what it really does is let programmers dream of being productive, rather than actually be productive.

Unix programmers are a bit like mathematicians. In both you can observe a mystical phenomenon that we call Programming by Implication. Once, chatting with a Unix programmer, we remarked how useful it would be to have a tool that could answer questions like "which functions call function foo?" or "which functions modify the global variable bar?" He agreed such a tool would be useful. "You could write one yourself," he said. He said "you could write one" rather than actually writing one because certain properties of the C language and of the mighty Unix "programming environment" conspire to make such a program devilishly hard to write.

Parsing with yacc

"Yacc" is the noise you make after you've used yacc(1). — Anonymous

Yacc stands for Yet Another Compiler Compiler. It accepts a context-free grammar and constructs a pushdown automaton for parsing it. Run that automaton and you get a parser for your particular language. The theory is thoroughly worked out, because how to shorten the time it takes to write compilers was once an important question in computer science.
This scheme has one small problem: most programming-language grammars are not context-free. Yacc users must therefore bolt context-handling code (type checking, generally) onto every relevant state transition. Many C compilers do use yacc-generated parsers; the yacc grammar for GCC 2.1 (an otherwise admirable effort of the Free Software Foundation) runs over 1,650 lines, and the code yacc generates from it is far larger still. Some programming languages are easier to parse. Lisp, for example, can be parsed with a recursive-descent parser. "Recursive descent" is computer jargon for "simple enough to write while drinking a liter of Coke." As an experiment, we wrote a recursive-descent Lisp parser in just 250 lines of C. Written in Lisp itself, it wouldn't have filled a page. The computer-science era mentioned above predates the birth of this book's editors. The machine room was dinosaur country, and Real Men programmed at the front panel. Today's sociologists and industrial historians rack their brains over why otherwise sensible programmers designed, implemented, and propagated languages that are so hard to parse. Perhaps they simply needed a challenging research project in those days, and designing a language you can't parse nicely seemed like a fine topic. One does wonder what they were taking back then.

The tool mentioned earlier would amount to the front end of a C compiler, and the front end of a C compiler is an extremely complex beast — a consequence of C's convoluted syntax and of yacc. Is it any wonder that nobody has really built such a tool? Die-hard Unix types will tell you that you don't need one, because grep is plenty good enough — and besides, you can use grep in shell pipelines. Well, one day we wanted to find every call to the min function in the BSD kernel source. Here is a sample of what we got:

    % grep min netinet/ip_icmp.c
    icmplen = oiplen + min(8, oip->ip_len);
     * that not corrupted and of at least minimum length.
     * If the incoming packet was addressed directly to us,
     * to the incoming interface.
     * Retrieve any source routing from the incoming packet;
    %

Pretty good, eh? grep found all the calls to min — and then some. "I don't know how to make love. I'll take my leave."

Make

The ideal programming tool should keep simple problems simple while making complex problems solvable. Unfortunately, many Unix tools chase generality at the expense of everything else, and make is the classic case. In the abstract, make's input is a description of a dependency graph. Each node of the graph carries a set of commands to be run when the node is out of date, which is determined from the nodes it depends on. Nodes correspond to files, and the files' modification times decide whether a node is out of date. Here is a small dependency graph — that is, a Makefile:

    program: source1.o source2.o
            cc -o program source1.o source2.o

    source1.o: source1.c
            cc -c source1.c

    source2.o: source2.c
            cc -c source2.c

Here program, source1.o, source2.o, source1.c, and source2.c are the nodes of the graph. The node program depends on source1.o and source2.o.
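The freshness rule make applies at each node — run the commands if the target is missing or older than any dependency — can be sketched in a few lines of shell. This is a toy illustration of the rule, not make's actual implementation; note that the -nt file test is a common extension (bash, dash, ksh) rather than strict POSIX test(1):

```shell
# out_of_date TARGET DEP...: succeed when TARGET must be rebuilt
out_of_date() {
    target=$1; shift
    [ -e "$target" ] || return 0               # missing target: rebuild
    for dep; do
        [ "$dep" -nt "$target" ] && return 0   # newer dependency: rebuild
    done
    return 1                                   # target is fresh
}

# Fabricated timestamps (POSIX touch -t) make the comparison deterministic.
touch -t 202001010000 source1.c    # source written first
touch -t 202006010000 program      # target built afterward
out_of_date program source1.c || echo "program is up to date"

touch -t 202012010000 source1.c    # source modified after the build
out_of_date program source1.c && echo "program needs rebuilding"
```

That is the entire conceptual core; everything else make adds is the recursive walk of the graph — and, of course, the requirement that every command line begin with a tab.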