The Simple UNIX: Opening Up History

It is no exaggeration to say that the UNIX model is the prototype of the modern operating system. Whether it is the original UNIX lineage, such as AIX, Solaris, HP-UX, FreeBSD, NetBSD, ..., or UNIX-like systems such as Linux, or the various Microsoft operating systems built on the Windows NT architecture, their basic ideas all derive from UNIX. However complex each of these systems has become, remember one sentence: the basic ideas are, and must be, simple.

Perhaps many people will be a little dismissive when they read this. After all, they consider themselves technophiles; they believe that only by fiddling with complicated things can they prove their knowledge and skill, and they think the UNIX V6 of the 1970s is an outdated system whose contents have long since been mercilessly abandoned by the years. In fact, if you actually read the UNIX source code of that era, or read Lions' celebrated commentary, and then think it over carefully, you will find that those simple ideas are not out of date at all, and that much of our technology rests on ideas that had already been realized, in simple form, as early as 1975. Just as the three-year-old child foreshadows the grown man, to ignore history is to despise the future.

If you want to know what we are going to do next, listening to others is useless; you must open up history yourself. It is like being shown part of a curve and asked where it goes next: you can make a good guess, because everything continues along its history. Reading history makes people wise. A few days ago I had dinner with a friend and was introduced to a chip expert who was dismissive of everything except chips, as if in the so-called technology industry only chips were high technology and everything else were child's play. How exclusive! But did that make you think of anything? This friend is a typical pedant, the so-called professional pedant, and it is precisely this kind of specialization that explains why we in China have never actually led an era. The agricultural revolution originated in Mesopotamia and the middle and lower reaches of the Nile; the Bronze Age was led by the nomads of West Asia and Eastern Europe; the concept of democracy comes from Greece; imperialism comes from Rome; the industrial revolution and the electrical revolution come from Western Europe; and the computer revolution comes from the United States (I will elaborate on this last point). I deliberately did not mention the Renaissance, because if I had, many people would point to similar things in China: the Gonghe regency of the Zhou, the King of Qin sweeping the six realms, Emperor Wu of Han, the Guangwu restoration, the Zhenguan era of good governance, the Kangxi-Qianlong golden age. But you should know that, like the Renaissance, these were all local events. In any case, we now live in high-rise buildings, use mobile phones and computers, drive cars and ride high-speed rail, all of which are imported; none of it is our own. The reason is that our thinking is not divergent enough. We do software without knowing that some of its ideas can also be used in hardware; we learn OO thoroughly and master Java, but do not know that OO comes from AI; we know of the Lévy flight but not its connection to the computer's memory-access model.

I have said a lot that may seem useless, but what I really want to say is: even though we are all in IT, do something beyond it. Many programmers may genuinely disdain the 1975 UNIX source code as mere history; such a master knows how to design classes and how to implement a sorting algorithm, but that is all, and in the end he is just a highly paid mercenary who will never become a general and may be dismissed at any time. I wrote this series of articles to make one thing clear: if you look at the development of the operating system, or of any technology, from a historical perspective, you will gain much more. Narrowly speaking, UNIX is a treasure house; any idea you find in modern software has its shadow in UNIX.

Before we officially begin, let us briefly talk about the background. The computer we face today is no longer the computer of forty years ago. Driven by the needs of industrialization and of lowering the entry threshold, the computer and communications industries have gone through many rounds of consolidation. In the end we strictly separate software from hardware, and strict boundaries are drawn between the various components of each. This favors the social division of labor, and people have set out on the path of specialization (in fact, many thinkers since Solon in ancient Greece have attacked specialization), but its disadvantage is also obvious: modern computers are no longer fun for the geek.

Modern computers are no longer fun

After reading Linus's autobiography, I agree that computers are no longer something to play with, mainly because they are no longer fun. Linus has always believed that technology exists for fun! To be clear, his autobiography "Just for Fun" does not set out to argue that computers are no longer fun; rather, Linus's point is that the computer has become "as complex as your car" and is no longer something you can casually tinker with.

In fact, because computers are no longer fun, it is impossible, or at least very difficult, to create another so-called "era" on the computer itself. The computer has become a profession, a means of making a living, and its content has been reduced to routines and rules. To create an era you must not be confined to one specialty; you must be a generalist.

Eras are played into existence, I believe. The people of that time did it without intending to, and once all the games were standardized, interests came to dominate everything. Things you can play with must be simple; things that are too complex have no beauty as art, and in engineering you have to spend most of your energy on managing complexity, so the essence gets buried under the complexity and is hard to dig out and appreciate. I have read several books recently and have many feelings about this. With overly complicated things you can show off to people who do not understand, but in fact your listeners have no idea what you are talking about; you are only talking to yourself. Have you ever wondered why masters always appear in clusters: the ancient Greek philosophers, the pre-Qin scholars of China, the Renaissance painters, the physicists of the early twentieth century, the computer geniuses of the 1950s and 1960s, the * * of the 1970s?

The computer industry also began in a time of popular ferment, much like Solon's reforms in Greece; but just as Pericles ended an era in Greece, industrialization and specialization ended the hippie era of computing. Today only Richard Stallman and Eric S. Raymond are still trying to carry on that geek era, and in fact today's specialization leaves many would-be geeks unable to cope with complex computing. So how did the geek era end? I will start from the place where software and hardware meet.

Where hardware and software meet

Machines were not originally designed for everyone to control; they belonged to the elite. But once someone discovers the huge business opportunity these machines contain, he promotes a mass movement. When that movement develops far enough and the technology becomes complex enough, specialization appears, the technology gradually drifts away from the masses, and it is once again monopolized by an elite, just as it was at the beginning.

As far as computer technology is concerned, I call the technology of the earlier elite era hardware and the technology of the later elite era software. In the hardware field there are silicon crystal technology, Verilog and other HDLs, and so on; in the software field there are terms such as Java, Python, PHP, OO, design patterns, agile development, and so on. Specialization makes all of these things exclusive by nature, and at the same time creates the local character of the computer age. Where software and hardware meet is also where assembly language evolved into the C language.

You should know that if you do not understand the characteristics of the machine you cannot use assembly language, and if you do not understand the characteristics of the machine you cannot write the most efficient C code; conversely, if you know only the C language, you cannot make the best use of the machine's capabilities. Assembly and C: this is where software and hardware meet. The era was created by them; history was written by them.

Programming in assembly language forces people to follow machine instructions, and it is hard for the programmer to build up his own higher-level logic, because so much energy goes into the instructions themselves. After the advent of C, people's logical thinking was liberated: you can write a statement such as a = b + c without having to spell out "put the immediate into a register, then add the values of two registers, or add a register and an immediate, and store the result in ...". The greatest contribution of the C language lies in solving the "addressing problem", which in a sense solves everything, because computer programming can be classified as the art of addressing, and C sums up all addressing problems in one concept: the pointer. You should know that the only real problem in the operation of a modern stored-program computer is addressing. Both data and instructions live in memory, and you must have a way to find the right instruction or datum in the right place. The C pointer hides all the addressing details, making it easy to build application logic, including complex conditional statements, loops, goto statements, and even buffer overflows.
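
As a small illustration (my own sketch, not code from the original article), here is the kind of detail the C compiler hides behind a statement like a = b + c, and how the pointer makes the underlying address explicit:

    #include <stdio.h>

    int main(void) {
        int b = 2, c = 3;

        /* One C statement; the compiler typically lowers it to something like
           "load b into a register, load c into a register, add, store to a". */
        int a = b + c;

        /* The same addressing made explicit with pointers: a pointer is just a
           memory address, and *pb means "whatever lives at that address". */
        int *pb = &b, *pc = &c;
        int a2 = *pb + *pc;

        printf("%d %d\n", a, a2);   /* prints: 5 5 */
        return 0;
    }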

The relationship between hardware and software is the relationship between "what to do" and "how to do it". Once the programmer has told the hardware what to do, he can go off and do something else with an easy mind, because he trusts that the hardware knows how to do it. The interface between the two is the C language, while the hardware's internal circuit logic implements the "how".

Water does not know how cold the ice is, and ice does not know how tender the water is; once one thing is split apart, the halves are separated as if by a mountain. Hardware and software were originally one, but they have become two exclusive industries, and even within the software industry there are all kinds of exclusive walls: "low-level", "protocol stack", "writing classes", and so on. Being a full-stack programmer across hardware and software is now associated with being cool (and bitter). But thirty to forty years ago the computer age was pioneered precisely by such full-stack programmers. At the place where software and hardware meet, those people belonged to both the hardware elite and the software elite, and they led a hippie era of universal computing that carried on through the 1970s and 1980s. The well-known names of the industry almost all come from the era whose spirit was initiated by UNIX: Jobs' Apple, Bill Gates' Microsoft, Richard Stallman's GNU, Bill Joy's BSD, the IBM PC, Intel IA32, and so on, too numerous to enumerate. It was an era of simplicity and innocence, of simplicity and concentration.

PC Technology and UNIX Technology

The place where software and hardware meet is where UNIX was born. As a nutritious root, UNIX was destined to grow into a towering tree bearing so much fruit that it includes almost all the technologies we use today.

PC technology made the computer small, and then came a trend toward ever further miniaturization. As the technology advanced, hardware became smaller and cheaper, and the huge business opportunity drove PC technology forward, but the software did not keep pace. UNIX was the most popular multi-user, multi-tasking time-sharing system of the day, yet it was constrained by a variety of non-technical factors, which gave Microsoft, Apple, and other companies the upper hand. Another reason for this outcome is that UNIX was never grassroots (which is why there is GNU), whereas both Apple and Microsoft have something to do with a grassroots organization called the Homebrew Computer Club, where people showed off their machines and their technology in a relatively relaxed atmosphere. It can be said that UNIX did not catch the wave of the PC era, but that is UNIX: just as the Enlightenment thinkers themselves did not take part in the French Revolution, UNIX was an ideological pioneer, and some of its old ideas are still in use even in the Windows 8 system. The design of the SVR4 VM can be said to have overturned the traditional understanding of memory; the file abstraction makes the I/O interface simple; virtual memory based on page swapping and demand paging has influenced almost every operating system design; the idea of hierarchical storage is better still: in the CPU, the CPU cache is the cache of memory; in the VM, physical memory is the cache of virtual memory; in the MMU, physical memory is the cache of the disk or the network. Most important of all is the UNIX process model (the thread model and the process-group model are both extensions of it); I hardly know how to explain its importance.
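
As a minimal sketch of the process model the paragraph calls the most important idea (my own example using standard POSIX calls, not the author's code): a process creates a process with fork(), replaces itself with exec(), and is reaped with wait(); the thread and process-group models are elaborations of exactly this.

    #include <stdio.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        pid_t pid = fork();              /* duplicate the calling process */
        if (pid == 0) {
            /* child: replace this process image with another program */
            execlp("echo", "echo", "hello from the child", (char *)NULL);
            _exit(127);                  /* reached only if exec fails */
        } else if (pid > 0) {
            waitpid(pid, NULL, 0);       /* parent: wait for the child */
            puts("child done");
        }
        return 0;
    }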

I have to mention the relationship between Intel, Microsoft, and PC technology, but the topic is so big that I can only remind you that PC technology created two empires: Intel and Microsoft cooperate so tacitly that together they became something like the Roman Empire.

Mac OS X technology

It can be said that PC technology developed in parallel on Intel technology and Microsoft technology, and along the way there were too many fancy features; many things were even baked directly into the hardware. Today Apple also uses Intel chips, but instead of Microsoft technology it built its own operating system, Mac OS X, on the UNIX system Mach/BSD. It can be said that it is Apple that pulled PC technology back to UNIX. Meanwhile Android competes with iOS for the market, but whether it is Android or iOS, the underlying system is based directly on UNIX ideas: one is Linux, the other Mach/BSD. Going back to basics, UNIX technology has been running for forty years and its basic concepts and core have hardly changed, which is enough to show how excellent its design is.

The idea behind UNIX's success is that it never fixates on implementation details, because details pull you away from your goals. UNIX provides only basic ideas, so although the implementations of AIX, Solaris, Mach, and the various BSDs are completely different, they are all called UNIX. Even Windows NT could be called UNIX if Microsoft wished, because the NT family implements the basic concepts of UNIX to a certain extent.

Besides UNIX there is another path of evolution: the evolution of complexity. It strictly follows the latest features of the hardware; the software caters to the hardware, the hardware indulges the software, and the two flatter each other, forgetting that they are of the same family. Sometimes, for some narrow commercial interest, complex but non-universal mechanisms are solidified into the hardware. The counterexample is the RISC architecture, which is the direct embodiment of UNIX ideas in the hardware field.

Examples of complexity evolution

It is an unwritten programming convention that the address of a datum is naturally aligned to a multiple of the size of its type. But why make such a convention? Does it not work without alignment? On some processor architectures it simply does not work, for example on many RISC processors. On CISC processors, however, such as Intel/AMD x86 processors, the core will do this laborious work for you for the sake of programming convenience. What is the root of all this?
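
A quick way to see the convention at work (a hedged sketch; the exact numbers depend on the ABI, though 4-byte int alignment is typical) is to let a C11 compiler report the alignment and the padding it inserts:

    #include <stdio.h>
    #include <stdalign.h>   /* C11: alignof */
    #include <stddef.h>     /* offsetof */

    struct s {
        char c;   /* 1 byte, then (typically) 3 bytes of padding ...            */
        int  i;   /* ... so that i starts at an address that is a multiple of 4 */
    };

    int main(void) {
        printf("alignof(int)         = %zu\n", alignof(int));           /* commonly 4 */
        printf("offsetof(struct s,i) = %zu\n", offsetof(struct s, i));  /* commonly 4, not 1 */
        printf("sizeof(struct s)     = %zu\n", sizeof(struct s));       /* commonly 8, not 5 */
        return 0;
    }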

Before analyzing the whole thing, it is important to understand that the modern microprocessor is the crystallization of very-large-scale integration: the circuits are etched onto a semiconductor such as silicon with extremely exquisite craftsmanship, so simplicity of wiring is the foundation of everything. How admirable it is to carve circuits on a silicon crystal (remember the miniature boat carved from a fruit pit that we read about in school). Unlike programming in a high-level language, carving silicon does not even allow you to make things too complicated (mainly because of the difficulty of routing the wires), so many things have to be fixed in advance, such as the natural alignment of data. The ideal processor architecture needs to implement only a minimal instruction set to complete any operation; the architecture is simple, and the silicon area saved can be spent on an efficient pipeline rather than on complex instructions (a complex instruction often shares a common part with other complex instructions, which wastes space). Such a processor is called a RISC processor. This lesson was learned only after many detours in the early days of integrated-circuit technology; unfortunately, Intel had been the first to eat the crab, had then succeeded spectacularly, and for compatibility its instruction set pulled AMD down the same path.

We know that, apart from the computational logic that must be implemented inside the microprocessor, the microprocessor's interface to the outside world consists of addressing and I/O.

When Intel first implemented a 16-bit microprocessor, it wanted to provide a hardware instruction for every addressing pattern one could think of, and it worked hard at doing so. Open any assembly-language book and the first thing you see is a large number of addressing instructions; if you look at the Intel manuals you will find that many registers do not appear as operands but are built into the instruction itself. Thus mov eax, 1 and mov ebx, 1 may be two completely different instructions, not merely the same instruction with different operands. Implemented this way, the complex instructions occupy a large amount of chip area, and because the work done by individual instructions is still divisible and of varying length, it was difficult for Intel to build an efficient, long pipeline. The machine instruction is the place where hardware and software meet; it is the only interface between what the software requests and what the hardware implements. To stay compatible with existing software, the interface cannot be changed at will, yet by the time Intel realized that instructions must be simple enough, and each do little enough, to allow a long pipeline, software using its instruction set had already blossomed everywhere. So the strategy of Intel and AMD was to change the implementation behind the instruction: after fetching an instruction, split a single complex instruction into several simple ones, using that ever-useful device: add a layer of indirection.
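
To make the "register built into the instruction" point concrete, here is a small illustration of my own (the encodings are the classic 32-bit mov r32, imm32 family, where the destination register is folded into the opcode byte as 0xB8 + register number): mov eax, 1 and mov ebx, 1 differ in their very opcode, not merely in an operand field.

    #include <stdio.h>

    /* Raw machine code for the two instructions discussed above (32-bit x86). */
    static const unsigned char mov_eax_1[] = {0xB8, 0x01, 0x00, 0x00, 0x00}; /* mov eax, 1 */
    static const unsigned char mov_ebx_1[] = {0xBB, 0x01, 0x00, 0x00, 0x00}; /* mov ebx, 1 */

    static void dump(const char *name, const unsigned char *p, size_t n) {
        printf("%-12s", name);
        for (size_t i = 0; i < n; i++)
            printf(" %02X", p[i]);
        putchar('\n');
    }

    int main(void) {
        dump("mov eax, 1:", mov_eax_1, sizeof mov_eax_1);  /* B8 01 00 00 00 */
        dump("mov ebx, 1:", mov_ebx_1, sizeof mov_ebx_1);  /* BB 01 00 00 00 */
        return 0;
    }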

Therefore Intel's instructions are in fact carried out by simple operations, called micro-operations; the micro-operations that implement one instruction form a micro-program, and all the micro-programs together are called microcode. As a result, the x86 architecture is CISC at its interface, while its core has long been RISC.

To explain the memory-alignment problem we must look at memory's physical structure. The key is not how the memory chip itself is designed, but how the pins of the memory chips correspond to the address bus coming out of the CPU; establishing that correspondence is what matters most. Do not forget that the only problem the computer has to solve outside the CPU is the addressing problem. If data may be aligned arbitrarily, then when the CPU issues an addressing operation the datum may straddle two chips, and, for the sake of simple wiring, the access must be completed in two bus operations. If the bus is locked between the two operations, efficiency drops sharply; if it is not locked, there are atomicity problems. The easiest way out is to ask the compiler or the programmer not to generate such accesses, so the task falls on them. That is why data must be naturally aligned on some machines.
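
A hedged sketch of what this means for the programmer (my example, not the author's): copying through memcpy lets the compiler do whatever the target requires, while the commented-out direct cast is the misaligned access that x86 quietly absorbs and strict-alignment RISC machines reject.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        unsigned char buf[8] = {0};
        uint32_t v = 0x11223344;

        /* Portable path: memcpy lets the compiler emit whatever byte-level
           accesses the target allows, so it is safe on strict-alignment CPUs. */
        memcpy(buf + 1, &v, sizeof v);

        uint32_t out;
        memcpy(&out, buf + 1, sizeof out);
        printf("%08X\n", out);

        /* The shortcut would be a misaligned load:
         *     uint32_t *p = (uint32_t *)(buf + 1);
         *     out = *p;
         * On x86 the core performs it anyway (possibly as two bus accesses);
         * on many RISC machines it traps (e.g. SIGBUS), and in ISO C it is
         * undefined behavior in any case. */
        return 0;
    }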

The beginning of UNIX

Today, with applications flourishing, it is hard to recall what the UNIX time-sharing system was originally built for. The purpose of a time-sharing system is to replace batch processing in a pipelined way so that users do not have to wait for long stretches. In reality, the time-sharing system presents its users with an illusion built from time slices: the system's throughput and total turnaround time do not improve; only the user's perception does. That was the original time-sharing system. And UNIX? UNIX went further in many ways. For AT&T, the telephone business was the big business, and it naturally wanted to provide users with this illusion-based service at a more cost-effective investment, with the users feeling no difference at all. Bell Labs' original time-sharing system was intended not to run applications but to control voice calls, although that effort eventually came to an end.

Telephone users, printers, and terminal users can be connected easily because they are all remote users sharing the same host's resources. Before the time-sharing system appeared, you either waited or built parallel systems; neither was satisfactory for the users or for the return on investment, and the time-sharing system solved all of it. The file abstraction likewise exists to make I/O more convenient, and from it you can see what role I/O played in early UNIX. Much like today's routers and switches, early UNIX provided a control plane at the level of the process-model abstraction and a data plane at the level of the file abstraction: the kernel exists for the control plane, while the data plane should be handled, as far as possible, by user-mode I/O. Those who follow this principle are rewarded, and that is still true today.
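
As a minimal sketch of the file abstraction as "data plane" (my own example using standard POSIX calls; the path /etc/hostname is merely an assumed example file): a regular file, a device, and standard output are all driven through the same open/read/write interface.

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        int fd = open("/etc/hostname", O_RDONLY);   /* a regular file (assumed to exist) */
        if (fd < 0) { perror("open"); return 1; }

        char buf[256];
        ssize_t n = read(fd, buf, sizeof buf);      /* data plane: just move bytes */
        close(fd);

        if (n > 0)
            write(STDOUT_FILENO, buf, (size_t)n);   /* stdout is "just a file" too */
        return 0;
    }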

Fortunately, many vendors have gone down this road today, and there are many such technologies, PF_RING for example. All of this was already decided back in UNIX's simple era.
