A Brief History of World Computing (Part Two)

2025-02-05 Update From: SLTechnology News&Howtos

Shulou(Shulou.com)11/24 Report--

Previous installments: A Brief History of World Computing (Part I) · A Brief History of World Computing (Part II)

This series is finally drawing to a close.

█ 1980-1990: The PC Era

IBM-PC and compatible computers

In the last article, we said that the rise of microprocessors in the 1970s led to the emergence of a large number of personal computers.

This made the traditional giant IBM feel threatened. For a long time, the company had focused on mainframes and neglected the market for smaller machines.

To make up lost ground, IBM decided to launch a personal computer development program of its own.

In March 1980, IBM held a high-level secret meeting and set up the "Chess" project to develop a personal computer (the term "Personal Computer" was popularized by IBM at this time).

The person in charge of the project was Don Estridge. He led a team of 13 people holed up in a warehouse in Boca Raton, Florida, conducting development in secret.

At the outset, Don Estridge's team planned to use IBM's own processor (the IBM 801) and its own operating system. But given the tight schedule (management wanted the product done within a year), they decided to work with third parties instead.

On August 12, 1981, their work bore fruit: IBM officially launched the IBM PC (IBM 5150), equipped with Intel's 8088 processor (16-bit, 4.77 MHz) and Microsoft's PC-DOS operating system.

The IBM PC cost $1,565 and came with 16 KB of memory (expandable to 256 KB as needed) and a 5.25-inch floppy drive. Its bus-based expansion-card design let users install graphics cards and choose monochrome or color displays.

The IBM PC was a great success after launch, selling more than 200,000 units in its first year and more than 1 million units in 1985.

It not only appeared on the cover of Time magazine as "Machine of the Year," but was also hailed as one of the greatest products of the 20th century. (Sadly, Don Estridge, the father of the IBM PC, died in a plane crash in 1985.)

The success of the IBM PC attracted many manufacturers to "clone" it. Following the IBM PC's standards, they built products compatible with its software, expansion cards, and peripherals, called "compatibles" (the ancestors of the DIY PC).

In June 1982, Columbia Data Products launched the first IBM PC compatible, the MPC 1600. In November, Compaq followed with the Compaq Portable, an IBM PC-compatible portable computer (shipped in March 1983).

The Compaq Portable. With flexible configurations and low prices, the compatibles quickly ate into the IBM PC's market share. In 1983, IBM held about 76% of the PC market; by 1986, that had fallen to 26%. This left IBM deflated.

Intel's rise

The real beneficiaries of the overall rise of PC compatibles were Intel and Microsoft.

The 8088 used in the IBM PC had been launched by Intel in 1979.

In February 1982, Intel introduced the 80286, a second-generation PC processor fully compatible with the 8088; it was used in the IBM PC/AT.

The 8088 and 80286 were both 16-bit processors and were not technologically cutting-edge at the time. In 1979, Motorola had already launched the 32-bit MC68000, at least half a generation ahead of Intel.

The MC68000. Apple's Lisa and Macintosh (the Macintosh, released in January 1984, was among the first personal computers with a graphical operating system) both used the MC68000.

It wasn't until July 1985 that Intel finally launched its belated 32-bit processor, the 80386.

This processor caters to the needs of compatible computers and has been a great success.

It is worth mentioning that IBM held the stronger hand in the early days. When developing the IBM PC, it chose Intel's chip but forced Intel to open its designs and code to AMD, making AMD a second source.

Later, as more and more compatibles used Intel chips, Intel gained the upper hand. Starting with the 80386, Intel stopped sharing any technical data with AMD.

In 1987, AMD sued Intel for breach of contract, and Intel countersued. Their monopoly and infringement lawsuits dragged on, one after another, for eight years.

Although AMD eventually won the case, it missed the golden age of CPU development and was left far behind by Intel.

In the mid-1980s, the rise of Japanese semiconductors also posed a great threat to Intel and other American companies.

Intel was later rescued by its legendary CEO Andy Grove, who took the helm, cut the memory-chip business, and focused the company on microprocessors.

Andy Grove. In 1989, Intel launched the 80486 processor, which was warmly received by the market.

On the strength of the 80486, Intel surpassed all the Japanese semiconductor companies to become the world's largest semiconductor manufacturer.

Microsoft's DOS / Windows

Now let's look at Microsoft.

The DOS that Microsoft supplied for the IBM PC was actually a "second-hand" product that Microsoft had bought in.

MS-DOS. In 1976, the American company Digital Research (DR) developed an operating system called CP/M (Control Program/Monitor), designed for microcomputers built around Intel's 8080 chip (including the Altair 8800).

IBM wanted to license the system, but the negotiations fell through.

Later, Tim Paterson, a programmer at Seattle Computer Products (SCP), wrote QDOS (Quick and Dirty Operating System, later renamed 86-DOS), a variant of CP/M.

Tim Paterson, the "father of DOS." Bill Gates had a sharp eye: he bought the system outright (and hired Tim Paterson away), reworked it into PC-DOS, and licensed it to IBM.

After the IBM PC took off, Microsoft's DOS became famous, and Microsoft kept updating it with a stream of new versions.

Bill Gates was shocked by Apple's Macintosh and its graphical operating system. So Microsoft did some "borrowing" of its own and launched Windows 1.0 in November 1985.

The Windows 1.0 interface. Early Windows was just a "shell" over DOS and not very useful, drawing plenty of user complaints. Microsoft therefore began developing a new kernel, which later became Windows NT.

Microsoft also built the OS/2 operating system together with IBM, but later walked away from the partnership, leaving IBM in the lurch.

In the 1980s, the popularity of PC compatibles created a huge IT market. Many new companies were founded and many new products launched.

For example, in September 1982, 3Com launched the world's first network card. In 1987, the Canadian company AdLib launched the first widely adopted sound card, the AdLib Music Synthesizer Card. In 1985, Philips and Sony jointly launched the CD-ROM drive.

These hardware products made the PC more powerful and gave users a better experience.

█ 1990-2000: The Internet Era

The Wintel alliance

Entering the 1990s, Intel and Microsoft had become true giants, each with a market capitalization of more than $100 billion.

Intel's Pentium x86 processors and Microsoft's Windows operating system became standard on virtually every PC. The "Wintel" alliance they formed firmly held the initiative in the PC market.

In the world of workstations and servers, however, Intel and Microsoft faced a more complicated situation.

On the processor side, competition was fierce, with two main camps at the time.

One was the RISC camp, represented by Sun, SGI, IBM, DEC, HP, Motorola, and others, who advocated the RISC architecture (Reduced Instruction Set Computer).

The other was the CISC camp, represented by Intel and AMD, who advocated the CISC architecture (Complex Instruction Set Computer).

Although RISC was faster and more favored by the industry at the time, Intel, led by Andy Grove, insisted on keeping CISC as its main direction.

In the end, Intel successfully consolidated its position through huge R&D investment, backward compatibility, and mass-production speed. (Intel could not have guessed that, years later, it would still stumble over RISC.)
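The RISC-versus-CISC split described above comes down to how much work one instruction does. As a rough illustration (a toy model, not any real instruction set), here is the same memory-to-memory addition expressed CISC-style as one complex operation and RISC-style as separate fixed-format load/compute/store steps:

```python
# Toy illustration of the RISC vs CISC trade-off: CISC packs work into few
# complex instructions; RISC uses more, simpler ones that are easier to
# pipeline. Neither "ISA" here corresponds to a real processor.

memory = {"a": 2, "b": 3, "c": 0}
regs = {}

def run_cisc():
    # One complex instruction: add two memory operands, write result to memory.
    memory["c"] = memory["a"] + memory["b"]     # ADD [c], [a], [b]

def run_risc():
    # The same work as explicit load/op/store steps; arithmetic touches
    # registers only, never memory directly.
    regs["r1"] = memory["a"]                    # LOAD  r1, [a]
    regs["r2"] = memory["b"]                    # LOAD  r2, [b]
    regs["r3"] = regs["r1"] + regs["r2"]        # ADD   r3, r1, r2
    memory["c"] = regs["r3"]                    # STORE [c], r3

run_risc()
print(memory["c"])  # 5
```

Both routes compute the same result; the difference that mattered to the industry was that the simple, uniform RISC steps were easier to pipeline and clock faster, while CISC's density preserved compatibility with existing software.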

On the operating system side, Microsoft competed with the powerful UNIX/Linux camp.

UNIX, and later Linux and its distributions (such as Ubuntu, Debian, CentOS, Fedora, and Red Hat Linux), were the mainstream choices for server operating systems.

Linus Torvalds, author of the Linux kernel (1991). Microsoft launched Windows NT for this market too, but it gained no advantage in market share because it was not as stable as UNIX/Linux.

With the Pentium processor, PC performance improved dramatically. And as Windows kept improving, ordinary people gained the ability to operate computers.

In the 1990s, thanks to the rapid iteration of semiconductor technology, storage technology matured. Memory and hard disk capacities grew larger and larger, and flash memory and various memory cards appeared, making it easier to copy and save media.

If the PC of the 1980s was merely a novelty for users, then by the 1990s the PC had become a real productivity tool.

People not only used the PC to listen to music, watch videos, and play games, but also to edit documents, build spreadsheets, and process data.

With the help of the PC, people fully felt the improvements in quality of life and productivity that IT computing brought.

The informatization of human society began to accelerate.

The explosion of the Internet

What added further fuel to informatization was, of course, the Internet.

After continuous expansion through the 1980s, ARPANET finally evolved into the global Internet.

On August 6, 1991, British physicist Tim Berners-Lee publicly announced the World Wide Web project, the "www" we are all now familiar with.

Tim Berners-Lee also proposed HTTP (HyperText Transfer Protocol) and HTML (HyperText Markup Language), designed the first web browser, and built the world's first website.
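HTTP, the protocol Berners-Lee proposed, is a plain-text request/response exchange. As a minimal sketch (no network access; the response below is a canned string, and both helper functions are illustrative, not any standard library API), here is what building a request and parsing a response looks like:

```python
# A minimal sketch of HTTP's text-based request/response format.
# build_request and parse_response are illustrative helpers written for
# this example; the canned response stands in for a real server reply.

def build_request(host: str, path: str = "/") -> str:
    """Compose a raw HTTP/1.1 GET request (CRLF line endings per the spec)."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

def parse_response(raw: str):
    """Split a raw HTTP response into (status code, headers dict, body)."""
    head, _, body = raw.partition("\r\n\r\n")
    lines = head.split("\r\n")
    status_code = int(lines[0].split(" ")[1])   # e.g. "HTTP/1.1 200 OK"
    headers = dict(line.split(": ", 1) for line in lines[1:])
    return status_code, headers, body

canned = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html><body><h1>Hello, Web</h1></body></html>"
)
code, headers, body = parse_response(canned)
print(code, headers["Content-Type"])  # 200 text/html
```

The body here is HTML, Berners-Lee's other invention: the browser parses those tags to render the page, completing the loop between protocol and markup.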

The emergence of the Internet opened the door to a new world. The Internet was a treasure house of unlimited resources; a dazzling variety of websites and forums appeared, and powerful instant-messaging tools met people's communication and social needs.

The Internet went beyond the scope of technology: it built an online virtual world, spawned many new business models, and thoroughly changed human society.

The vigorous development of the Internet gave birth to a great many Internet companies.

These companies bought large numbers of servers and built machine rooms to serve their users, offering email, audio and video downloads, web access, and so on.

The direction of information technology began to shift. A new model of computing services was gradually approaching.

█ 2000-now: The Cloud Computing Era

The rise of cloud computing

After the Internet boom, rapidly growing user numbers and tidal traffic patterns (crowded at some hours, quiet at others) put great pressure on service providers.

How to meet users' needs at lower cost and with more flexibility became a difficult problem for many enterprises.

In the mid-1990s, someone put forward the idea of "cloud computing".

In 1996, a group of technical executives at Compaq first used the term "Cloud Computing" while discussing the future of the computing business. They believed business computing would shift to Cloud Computing.

Compaq's internal document on cloud computing. After the turn of the century, the idea gradually became reality.

In 2006, the e-commerce giant Amazon launched two blockbuster products, S3 (Simple Storage Service) and EC2 (Elastic Compute Cloud), laying the foundation for its cloud computing business.

Another company doing big things in cloud computing was Google.

The young company, founded in 1998, published four landmark papers between 2003 and 2006, on its distributed file system (GFS), parallel computation (MapReduce), data management (BigTable), and distributed lock and resource management (Chubby).

These papers not only laid the foundation for Google's own cloud computing services, but also pointed the way for the development of cloud computing and big data worldwide.

In 2006, Google engineer Christophe Bisciglia first pitched the idea of "cloud computing" to chairman and CEO Eric Schmidt.

On August 9, 2006, Schmidt formally proposed "Cloud Computing" at a search engine conference.

Eric Schmidt. The essence of cloud computing is to turn scattered physical computing resources into elastic virtual computing resources and, with a distributed architecture, provide theoretically unlimited computing services.
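The pooling-and-elasticity idea just described can be sketched in a few lines. This is a toy model under stated assumptions: the `ResourcePool` class and its `scale`/`release` methods are invented for illustration and do not correspond to any real cloud provider's API.

```python
# Toy sketch of cloud elasticity: aggregate scattered physical machines
# into one virtual capacity pool, then grant and reclaim resources on
# demand. All names here are illustrative, not a real cloud API.

class ResourcePool:
    def __init__(self, physical_host_cores):
        # Aggregate per-host CPU cores into one virtual capacity figure.
        self.capacity = sum(physical_host_cores)
        self.allocated = 0

    def scale(self, cores_needed):
        """Grant cores if the pool has room; scaling out means adding hosts."""
        if self.allocated + cores_needed > self.capacity:
            raise RuntimeError("pool exhausted: add physical hosts to scale out")
        self.allocated += cores_needed
        return cores_needed

    def release(self, cores):
        """Return cores to the pool when demand subsides."""
        self.allocated = max(0, self.allocated - cores)

pool = ResourcePool([16, 16, 32])   # three physical servers -> 64 virtual cores
pool.scale(40)                      # peak traffic: claim 40 cores
pool.release(30)                    # quiet period: hand 30 back
print(pool.allocated)               # 10
```

The point of the sketch is the tidal-traffic problem from earlier: instead of each service owning peak-sized hardware that idles off-peak, everyone draws from and returns to a shared pool.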

Trends in computing power

Since 2010, two significant trends have emerged in the development of computing power.

First, ubiquity.

In the 1990s, as 2G mobile communications spread, many users got mobile phones. Devices such as PDAs also became popular.

Apple's Newton handheld (1992). Such relatively simple devices used chips for which raw performance mattered less than power consumption.

This created an opportunity for a company called ARM (Advanced RISC Machines). Holding high the banner of RISC, ARM specialized in low-power, low-cost designs, exactly matching the chip needs of mobile terminals.

As Xiaozaojun mentioned earlier, Intel's CISC strategy had beaten several big RISC players in the server market. At the time, Intel did not take ARM seriously at all and thought RISC had no future. It was, as the saying goes, rearing a tiger that would come back to bite it.

In 2007, Apple under Steve Jobs launched the iPhone, ushering mobile phones into the smart era.

Steve Jobs. Smartphones, tablets, and other mobile terminals exploded in popularity, and ARM and its architecture exploded with them, becoming the big winners of the mobile Internet era.

Mobile chips grew stronger and stronger, eventually rivaling desktop chips, and public attention shifted from PC chips to mobile chips.

The development of 3G/4G/5G mobile communications and fiber broadband built powerful networks, creating the conditions for computing power to become mobile.

Today, computing power no longer stays only in the cloud; it can sink to the edge, producing a three-tier "cloud computing - edge computing - end computing" architecture. Operators have also proposed the "computing power network," aiming to make computing power fully ubiquitous.
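The three-tier layout above can be pictured as a placement decision: latency-sensitive work runs close to the user, heavy work runs in the central cloud. The function below is a toy dispatcher; the tier names mirror the text, but the latency thresholds and workload labels are illustrative assumptions, not any standard.

```python
# Toy placement rule for the "cloud-edge-end" three-tier architecture:
# the tighter the latency budget, the closer to the user the work runs.
# Thresholds (10 ms, 50 ms) are illustrative, not from any specification.

def place_workload(latency_budget_ms: float, compute_need: str) -> str:
    if latency_budget_ms < 10:
        return "end"      # on-device: tightest latency, least compute
    if latency_budget_ms < 50 and compute_need != "heavy":
        return "edge"     # nearby edge site: low latency, moderate compute
    return "cloud"        # central cloud: abundant compute, highest latency

print(place_workload(5, "light"))     # end
print(place_workload(30, "medium"))   # edge
print(place_workload(200, "heavy"))   # cloud
```

Note the middle case: even a low-latency task goes to the cloud if it is too heavy for an edge site, which is exactly the trade-off that makes the three tiers complementary rather than interchangeable.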

Second, specialization.

Informatization and networking gave people a taste of the benefits. Today, as technology keeps upgrading, we talk of the digital economy and digital transformation; put bluntly, every industry is to be digitized.

Different industries have different computing needs. As a result, computing power has gradually split into general-purpose computing, supercomputing, and intelligent (AI) computing.

Different computing demands also give computing chips different forms. From the original CPU-only model, the landscape has evolved into a pattern of "general-purpose chips plus special-purpose chips."

Beyond the traditional CPU and GPU, computing units such as NPUs and DPUs have appeared and become the focus of public attention.
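The "general-purpose chip plus special-purpose chip" pattern amounts to routing each kind of task to the unit built for it. A minimal sketch, assuming a made-up routing table (the unit names mirror the text, but the task-to-unit mapping is purely illustrative):

```python
# Toy dispatcher for heterogeneous computing: route each task to the
# accelerator suited to it, falling back to the general-purpose CPU.
# The routing table is an illustrative assumption, not a real scheduler.

ACCELERATORS = {
    "matrix_math": "GPU",   # massively parallel numeric kernels
    "inference":   "NPU",   # neural-network workloads
    "packet_io":   "DPU",   # data-path / network offload
}

def dispatch(task_kind: str) -> str:
    # Anything without a dedicated accelerator runs on the CPU.
    return ACCELERATORS.get(task_kind, "CPU")

print(dispatch("inference"))   # NPU
print(dispatch("scheduling"))  # CPU
```

The CPU's role as the fallback is the design point: special-purpose units win on their niche workloads, but only the general-purpose chip can run everything.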

In high-performance computing, computing clusters have become the new favorites of supercomputing and AI computing. The AIGC (generative AI) boom of 2023 was a shot in the arm for the development of computing power.

In artificial-intelligence computing in particular, chips like the GPU outperform the CPU, and today high-end GPUs are hard to come by.

Twenty years ago, no one would have imagined that Nvidia, a GPU maker, would reach a market capitalization eight times that of CPU maker Intel.

█ Conclusion

Here, this brief-history-of-computing-power series finally comes to an end.

The development of human computing power can truly be called a magnificent technological epic.

From the earliest knot-tying records, to the abacus, to mechanical computers, the exploration lasted thousands of years.

After the advent of the electronic computer, it took less than a hundred years for computing power to grow a hundred-trillion-fold.

Over the past forty years, the wave of the information technology revolution has swept into every corner of our lives. Driven by computing power, human society has undergone earth-shaking changes.

In the future, digitization and intelligence will continue to advance, and our demand for computing power will keep growing at a frantic pace.

With Moore's Law gradually approaching its limits, how will we keep multiplying computing power? Will new forms of computing, represented by quantum computing, rise across the board?

Let time tell us the answer.

This article comes from the WeChat official account Fresh Jujube Classroom (ID: xzclasscom), author: Xiaozaojun.
