Intel was founded on July 18, 1968 by two engineers, Gordon Moore and Robert Noyce, who had worked at Fairchild Semiconductor. A little later they were joined by a third co-founder, Andrew Grove. From the start the engineers had a definite goal in mind: to make semiconductor memory practical and affordable. The idea was ambitious, because at the time semiconductor memory cost more than a hundred times as much as memory based on magnetic technologies - over one dollar per bit.

Intel 4004 processor

By 1970 Intel was already well known as a successful supplier of memory chips, having been the first to produce a 1-kilobit memory chip, the most capacious of its time. The chip, a dynamic random-access memory (DRAM), became the best-selling semiconductor chip of the year. By then the company had more than a hundred employees.

Intel's products attracted the attention of Busicom, a Japanese calculator maker, which soon commissioned Intel to develop chips for a whole family of programmable calculators. At that time, control chips were custom-designed for each specific task, which prevented them from being used widely.

Initially, the Busicom project called for at least twelve different custom chips. Ted Hoff, an Intel engineer, rejected this approach and proposed a single universal chip that would load its instructions from semiconductor memory. With just four chips (the 4004 processor, an I/O controller, RAM, and ROM), a program could change the system's function and perform different tasks. The new chip was universal, so it could be built into other devices, not just calculators. All previously developed control chips executed a fixed, "hardwired" instruction set, whereas the new invention could execute arbitrary instructions loaded from memory.
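
To make the contrast between "hardwired" chips and a stored-program processor concrete, here is a deliberately simplified sketch in Python (not actual 4004 code; the opcodes and memory layout are invented for illustration). The same "hardware" - the interpreter loop - performs different tasks depending only on the program placed in memory:

```python
# Toy stored-program processor: the fetch-decode-execute loop is fixed,
# while the task it performs is defined entirely by the program in memory.
# The instruction set here is invented for illustration; it is not the 4004's.

def run(program, data):
    acc = 0                       # accumulator
    pc = 0                        # program counter
    while True:
        op, arg = program[pc]     # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":          # acc <- data[arg]
            acc = data[arg]
        elif op == "ADD":         # acc <- acc + data[arg]
            acc += data[arg]
        elif op == "STORE":       # data[arg] <- acc
            data[arg] = acc
        elif op == "HALT":
            return data

# Two different programs run on the same unchanged "hardware":
adder   = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", 0)]
doubler = [("LOAD", 0), ("ADD", 0), ("STORE", 2), ("HALT", 0)]
print(run(adder,   [3, 4, 0]))    # [3, 4, 7]
print(run(doubler, [3, 4, 0]))    # [3, 4, 6]
```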

In April 1970, Intel hired engineer Federico Faggin to design the 4004 chip in accordance with Hoff's architecture. Faggin, like Intel's founders, had previously worked at Fairchild Semiconductor, where he was directly involved in developing the silicon-gate technology that played a key role in making microprocessors possible. Work on the 4004 family of chips was completed in March 1971, and mass production began in June of the same year.

The 4004 processor was originally intended for programmable calculators but later found other applications. For example, the chip was used in blood analyzers, in traffic-light controllers, and even, it is said, aboard NASA's Pioneer 10 spacecraft.

The next version of the processor, the 8080, was introduced in April 1974. The chip contained about 6,000 transistors and could address 64 kilobytes of memory. It was the 8080 that powered the Altair 8800, one of the first personal computers, which ran the CP/M operating system as well as a BASIC interpreter developed by Microsoft.

HISTORY OF THE CREATION AND DEVELOPMENT OF MICROPROCESSOR AUTOMATION TOOLS

Modern solutions in automation, robotics, and electric drives cannot be imagined without microprocessor tools and systems. A significant contribution to the development of semiconductor microelectronics was made by the well-known American company Intel, founded in 1968. This was the time when new technologies emerged that made it possible to create miniature semiconductor devices - integrated circuits. Their application opened up new prospects in all areas of technology, including automation, and the era of digital computer processing of information began. The first computer, ENIAC, created in 1946, weighed about 30 tons and occupied a large room. By 1968 there were already 30,000 computers in the world, mainly large mainframes and cabinet-sized "minicomputers". An unpleasant feature of these machines was frequent failures caused by overheating vacuum tubes and the large number of connectors. The emergence of integrated electronics was therefore driven by objective needs.


Fig. 1. The first general-purpose electronic digital computer, ENIAC (Electronic Numerical Integrator and Computer)


The founders of Intel were the talented scientists and inventors Robert Noyce, Gordon Moore, and Andrew Grove. It was Robert Noyce who invented the integrated circuit in 1959. In the mid-1960s Noyce was a manager at the American company Fairchild Semiconductor, known for its developments in electronic technology. Gordon Moore led research and development at Fairchild Semiconductor and was one of its eight founders. Andy Grove, a native of Hungary, was a process engineer; he joined Fairchild Semiconductor after receiving a Ph.D. in chemical engineering from the University of California, Berkeley.

In the late 1960s many talented engineers left Fairchild Semiconductor to start their own firms. Robert Noyce and Gordon Moore founded Intel and became its first employees; Andy Grove joined them shortly afterward. The start-up capital of $2.5 million was provided by the San Francisco financier Arthur Rock.

Intel initially specialized in semiconductor memory devices. Its first production device was the 3101, a 64-bit Schottky bipolar static random-access memory chip. The special place Intel came to occupy in the world of electronics, however, is associated with a different kind of device - the microprocessor, which became the technical foundation of the current computing revolution.

The impetus for the creation of the microprocessor was a contract with the Japanese company Busicom, which specialized in calculators. Busicom commissioned Intel to develop twelve custom chips, but Intel lacked the human, financial, and manufacturing resources to fulfil such a large order. The talented engineer Ted Hoff then proposed creating, instead of twelve specialized chips, a single universal one that could replace them all. R. Noyce and G. Moore appreciated the elegance of Hoff's solution, and the idea also satisfied Busicom, which financed the work. Thus Intel began developing a universal chip that could be programmed to execute particular commands. For the first time there was no need for a hardware implementation of a device's operating algorithm: all operations on numerical data were now carried out according to a program, which promised savings in both money and time. A group of Intel engineers and designers headed by Federico Faggin worked on implementing Hoff's plan, and after nine months of hard work the world's first microprocessor, the 4004, appeared. It consisted of 2,300 transistors, yet it fit easily in the palm of a hand. In performance the new processor was not inferior to the ENIAC computer, which occupied 85 cubic meters and contained 18,000 vacuum tubes. Ted Hoff designed the architecture of the first processor, Stan Mazor designed its instruction set, and Federico Faggin designed the chip itself.

Having appreciated the advantages of microprocessors, Intel's management entered into negotiations with Busicom, as a result of which Intel acquired all rights to the 4004 processor for 60 thousand dollars (Busicom, it should be noted, soon went bankrupt). A broad advertising campaign followed, aimed at conveying to the engineering community the great potential of programmable devices in fields ranging from road-traffic control to the automation of complex manufacturing processes. Intel held seminars for engineers and published promotional materials and reference manuals on the use of microprocessors. In some weeks the firm sold more reference documentation than microprocessors. Before long, microprocessors were in widespread use.

Thus, the "4004" chip became the first microprocessor. Approximately six months later, several other companies announced the appearance of such devices. These p-MOS microprocessors were four-bit, i.e., they could process only 4 bits of information at a time. The length of the program and the set of instructions were limited, the first processors did not have many of the functions required for modern microprocessors. In 1972, Intel released the "8008" processor, which inherited the main features of the "4004". It was the first 8-bit processor, which today is referred to as the first generation of processors. It already had an accumulator, six general purpose registers, a stack pointer, eight address registers, and special instructions for data input/output, but this processor did not become widely used in commercial developments.

At the end of 1973 Intel developed a new 8-bit microprocessor, the 8080. Its architecture and instruction set proved so successful that it is considered a classic even today.

The widespread use of microprocessors in technology began precisely with the advent of the 8080 chip, which belonged to the third generation of processors, but it was not the only successful 8-bit processor. Six months later the 6800 microprocessor from the American company Motorola appeared and became a tough competitor to the Intel processor. Like the 8080, the 6800 was made with n-MOS technology, required an external clock generator, and had a three-bus structure with a 16-bit address bus, a well-developed architecture, and a rich instruction set. Its main advantages over the 8080 were a more powerful interrupt system and a single supply voltage (instead of three). The internal architecture of the 6800 also differed significantly from the 8080, above all in the absence of general-purpose registers, which in the 8080 could hold either address information or numerical data depending on the task. Instead, the 6800 had a second, equivalent accumulator for data processing and specialized 16-bit registers that held only address information. Data to be processed was fetched from external memory and returned there after processing. Memory instructions were simpler and shorter, but transferring a byte to memory took longer than an exchange between the 8080's internal registers. Neither architecture had a decisive advantage, and each became the ancestor of a large family of microprocessors - Intel's and Motorola's - whose descendants compete to this day.

In 1978 Intel manufactured the first 16-bit microprocessor, the 8086, later used by International Business Machines (IBM) in its personal computers, while Motorola's 16-bit 68000 chip was used in the well-known Atari and Apple computers. As for "home" computers, they became widespread with the advent of the ZX Spectrum (based on the Z80 processor) from the English company Sinclair Research Ltd, founded by the talented engineer Sir Clive Sinclair. The idea of using a television set instead of an expensive monitor and a household tape recorder for storing programs and data significantly reduced the cost of a home computer and made it affordable for the average buyer.

Intel 4004 - a 4-bit microprocessor developed by Intel Corporation and released on November 15, 1971.

This chip is considered the world's first commercially available single-chip microprocessor.


Intel 8080 - an 8-bit microprocessor released in 1974. It provided roughly a tenfold increase in computing performance over its predecessor.

This is the device that brought the idea of microprocessors to the engineering community and sparked the personal-computer boom.


Intel 8048 - one of the world's first microcontrollers, released in the late 1970s.

The device became widespread thanks to its use in personal-computer keyboards and in game consoles.


Intel 8051 - a second-generation microcontroller, released in 1980.
Thanks to its successful architecture and instruction set, it became a de facto industry standard and is still produced by well-known corporations in America, Korea, and Japan.

Modern multi-core processor

According to various benchmarks, the computational performance of modern microprocessors is roughly tens of thousands of times higher than that of the first processor.

Fig. 2. Line of key models of microprocessors and microcontrollers


A year after the 8080 appeared, several Intel engineers moved to Zilog and began work on a new processor, building on their previous designs. The result, in 1977, was the Z80 microprocessor, which became the finest representative of the 8-bit processors. Compared with the 8080, it required only a single supply voltage and had a more powerful and flexible interrupt system, a higher clock speed, two accumulators, and a duplicate set of general-purpose registers. The Z80 instruction set contained all 78 instructions of the 8080 plus almost as many additional ones, so programs written for the 8080 ran on the Z80 without any changes.

Later, in the mid-1970s, another trend emerged in microprocessor development, directly related to automation: processors for embedded applications. It began with the Intel 8085. At first the 8085 was conceived simply as a successor to the 8080, but soon the Z80 and the new Motorola 6809 appeared, both of which significantly outperformed it and prompted Intel to take on the development of its first 16-bit microprocessor, the 8086. With the development of the 8156 and 8755 peripheral chips, however, the 8085 gained a new role. The first chip contained 256 bytes of static RAM (random-access memory), two 8-bit I/O ports configurable bit by bit, and a programmable timer-counter. The second contained three multi-bit input/output ports and 2 KB of ultraviolet-erasable ROM (read-only memory). By connecting these three chips appropriately, equipment designers obtained a functionally complete module - a microcontroller that could be built into almost any device: a voltmeter, a frequency meter, various amplifiers or converters. Several companies produced power-saving CMOS versions of this family, which made it possible to build microprocessor devices running on battery power. Finally, in the late 1970s Intel "combined" these three chips into one and created the single-chip microcomputer (microcontroller) 8048, which included RAM and ROM, an arithmetic logic unit, a built-in clock generator, a timer-counter, and input/output ports. Later, the 8035 and 8748 microcontrollers, similar to the 8048, were developed. The instruction set of these single-chip microcontrollers was much weaker than that of the 8085, and the amounts of RAM and ROM and the number of I/O ports were smaller than in the three-package module described above, but everything fit on a single chip, which greatly simplified the development and production of new devices. The idea of universal hardware configured by software for a specific task - the idea that gave rise to microprocessors in the first place - found its fullest expression in single-chip microcontrollers.

In the early 1980s Intel released the more powerful 8051 microcontroller, soon followed by its modifications, the 8031 and 8751. The microcomputer core of this series became a classic for microcontrollers. For its time the 8051 was a technologically very complex device, and the MCS-51 family is the undisputed leader in the number of variants and of companies producing them. To date there are more than 200 modifications of MCS-51 microcontrollers, produced by nearly 20 leading manufacturers of electronic components (Atmel, Infineon Technologies, Philips, Hyundai, Dallas Semiconductor, Temic, TDK, Oki, AMD, MHS, LG, Winbond, Silicon Labs, etc.). Microcontrollers with original architectures from Motorola, Zilog, Analog Devices, Microchip, Scenix, and Holtek have also found their niches.

Bob Noyce

He is known for his innovative views on the development of semiconductor technology. It was Robert Noyce who invented the integrated circuit in 1959. In the mid-1960s Noyce was a manager at the influential firm Fairchild Semiconductor; he later became one of the founders of Intel.

Gordon Moore

A talented and hardworking engineer who enjoyed great prestige among his colleagues. One of the founders of Intel.
“We are real revolutionaries. After all, these latest advances in electronics are changing the world much faster than any political events.”

Andy Grove

Energetic and enterprising, Andrew Grove worked at Fairchild Semiconductor as a process engineer, joining the company after receiving a Ph.D. in chemical engineering from the University of California, Berkeley. One of the founders of Intel.

Ted Hoff

Ted Hoff is one of the inventors of the microprocessor. It was he who proposed the concept of a universal chip and developed the architecture of the first processor.
“Most of all, I am personally impressed by the fact that, thanks to microprocessors, computers have become a mass-market, accessible product.”

Fig. 3. Outstanding scientists and inventors, revolutionaries in the field of microelectronics


The creation of the microprocessor is recognized as one of the outstanding achievements of the twentieth century. Hundreds of millions of microprocessors and billions of microcontrollers are sold worldwide every year. According to the magazine World of Computer Automation, the average American deals with microcontrollers about 300 times a day (!) - they are embedded literally everywhere, from washing machines, elevators, and telephones to traffic lights, cars, and industrial machinery.

The magazine Semiconductor Industry and Business Survey notes that if the automotive and aviation industries had grown at the same rate as the semiconductor industry for 30 years, a Rolls-Royce would cost 2 dollars 75 cents and travel almost fifteen hundred kilometers on a single liter of gasoline, while a Boeing 767 would cost 500 dollars and circle the globe in 20 minutes on a single can of kerosene. In 1996 the creators of the microprocessor - Dr. Ted Hoff, Dr. Federico Faggin, and Stan Mazor - were inducted into the US National Inventors Hall of Fame (Akron, Ohio), joining the ranks of Thomas Edison, the Wright brothers, and Alexander Bell.

Another direction in the development of microprocessor systems was born in 1969 out of the need to replace the complex, cumbersome, and unreliable relay-contactor control circuits used in industrial plants. In that year General Motors issued a tender request for the development of a universal programmable device for the needs of industrial production.

The tender was won by Bedford Associates of Massachusetts, headed at the time by Richard Morley. The company developed a processor-based device (controller) that could switch the signal wires connected to it in different combinations. These combinations were defined by a control program, written on a computer and then loaded into the controller's memory. Thus, with a single device and a loaded program it became possible to implement a control system that previously would have required wiring together tens or even hundreds of electromechanical components - relays, timers, counters, regulators, and so on. Moreover, the same controller could be used to control very different machines and mechanisms simply by changing the program loaded into it. This was the world's first programmable logic controller (PLC), which Bedford Associates called "Project 084".
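
As an illustration of this idea (a minimal Python sketch written for this article, not Modicon code; the signal names are invented), the classic start/stop "seal-in" circuit - which previously required a relay wired with an auxiliary holding contact - becomes a few lines of logic evaluated on every scan cycle of the controller:

```python
# Start/stop motor latch ("seal-in") - the textbook relay circuit that early
# PLCs replaced with software. Signal names and structure are illustrative only.

def scan(start_button, stop_button, motor_running):
    """One controller scan cycle: read inputs, evaluate logic, return the new output."""
    # The motor runs if START is pressed or it was already running (the seal-in),
    # as long as STOP is not pressed.
    return (start_button or motor_running) and not stop_button

motor = False
for start, stop in [(True, False), (False, False), (False, True), (False, False)]:
    motor = scan(start, stop, motor)
    print(f"start={start} stop={stop} -> motor={motor}")
# Prints True, True, False, False: pressing START latches the motor on and
# pressing STOP releases it - and changing this behavior needs no rewiring.
```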

The company began producing industrial controllers and was later renamed Modicon (short for Modular Digital Controller). In 1977 the Modicon brand was sold to Gould Electronics and later bought by the well-known German company AEG. Eventually the Modicon brand became the property of the French company Schneider Electric, which owns it to this day. It should be noted that Schneider Electric is one of the world leaders in the development, production, and implementation of equipment for power supply, electric drives, and automation.

Another company that took part in the General Motors tender still occupies a leading position among manufacturers of automation components: Allen-Bradley. Although it lost the tender, the company continued work in this direction. Allen-Bradley's management acquired a controlling interest in Information Instruments and in the Bunker-Ramo Corporation, which by that time had already developed the PDQ-II controller (short for Program Data Quantizer). This controller proved too bulky and too difficult to program. Allen-Bradley persisted, however, and in 1970 developed the PMC (Programmable Matrix Controller) on the basis of the PDQ-II. This model, too, failed to meet customers' requirements for controlling process units. After further refinement a new model was born, called the PLC-1 (Programmable Logic Controller). It is this name and the abbreviation PLC that have become established in the automation field and are used by specialists to designate this class of devices.


In the mid-1970s the market for programmable logic controllers began to grow rapidly, and Modicon and Allen-Bradley gained a number of competitors, among them General Electric, Siemens, Square D, and Industrial Solid State Controls.

A significant step toward simplifying the use of programmable logic controllers was the introduction of the international standard IEC 61131-3, which defines programming languages for PLCs. Thanks to it, an engineer of any profile (process engineer, electrician, chemist, etc.) can create control programs for technological installations without mastering the finer points of programming, and these languages are the same for PLCs from different manufacturers.

Processors for personal computers became widespread in the 1970s and were produced by a large number of manufacturers. Almost every company, then as now, wanted to use only the latest technologies in its production. Not all of them, however, managed to build positions as strong as Intel's or AMD's: some manufacturers disappeared from the market entirely, while others moved into other fields. But let us take things in order.

How the creation of the processor began

The world first heard about processors in the 1950s. The earliest machines were built on mechanical relays; later models used vacuum tubes and then transistors. The computers they powered were complex, very large, and very expensive.

All of a processor's components take part in computation, and the question was how they could be combined on a single chip. The idea arose almost as soon as semiconductor integrated circuits appeared, yet at first processor designers could not imagine that such circuits would be useful in their work, and for several more years they continued to build processors out of multiple chips.

In the late 1960s Busicom began developing a new desktop calculator. It needed twelve chips and ordered them from Intel. Intel's developers already had the idea of combining several chips into one, and the idea pleased the head of the company: it promised significant savings, since several separate chips no longer had to be produced. Moreover, by placing the processor's elements on a single chip it became possible to create a device suitable for a wide variety of computing equipment.

As a result of the work carried out by the corporation's specialists, the world's first microprocessor, the Intel 4004, appeared. It could perform sixty thousand operations per second and worked with binary numbers. However, this processor could not yet be used in computers, because no such machines had been designed for it.

The very first personal computer

One of the very first personal computers was created by an American student, Jonathan Titus. In the magazine Radio-Electronics it was named the Mark-8, and the magazine also published a description of the device. The invention did not earn the student much money. Titus had planned to profit by selling, for a modest price, printed circuit boards from which buyers could assemble their own computers, with the remaining parts purchased in stores. He did not earn much, but he made a great contribution to the development of computer technology.

The history of the development of Intel processors

Intel's first processor was the 4004. Later the company introduced the 8008. It differed from the previous model in its operating frequency, which ranged from 600 to 800 kilohertz, and it contained over 3,000 transistors. It was actively used in all kinds of computing devices.

At the same time the first personal computers began to appear, and Intel decided to make processors suitable for them. A short time later the company developed the 8080 processor, which was many times more powerful than its predecessor.

The cost of this processor was very high by the standards of the day, but the manufacturer believed the price was fully justified for a processor with such performance, one that could fit into practically any computing device. It was in great demand, and thanks to it the company's revenues kept growing.

A few years later the Altair 8800 computer appeared, manufactured by MITS. This personal computer ran on the Intel 8080 processor, and its success prompted numerous companies to begin producing their own microprocessors.

At the same time in the USSR

In the USSR the production of various kinds of computing machinery was also developing rapidly, peaking in the 1970s. In performance, Soviet computers were quite comparable to their foreign counterparts.

In 1970 the Soviet leadership issued a decree under which standards for the compatibility of computer software and hardware were developed. A new concept of computer technology took shape at this time, based on IBM's work: domestic specialists took the IBM System/360 as their model.

Domestic technologies developed in the Soviet era gradually lost their relevance and were replaced by imported ones. The domestic electronics industry began to lag significantly behind the West, and computers developed after the 1980s ran on Zilog or Intel processors. Russia fell almost a decade behind America in these technologies.

Processor evolution

In the mid-1970s Motorola presented its first processor, the MC6800. It offered a high level of performance and could work with 16-bit addresses, and it cost about the same as the Intel 8080. Consumers, however, were not eager to buy it, and it was never widely used in personal computers. Financial difficulties later forced the company to part with four thousand employees.

In 1975 former Motorola employees at a new company, MOS Technology, developed the MOS Technology 6501 processor. Its characteristics so closely resembled Motorola's design that Motorola accused the company of plagiarism. MOS engineers then substantially reworked their creation and released the 6502 chip. Its price was far more attractive and it came into great demand; it was even used in Apple computers. Its key difference from its predecessor was a much higher operating frequency.

Engineers who left Intel followed the same path as the former Motorola employees: they founded their own company and put the Zilog Z80 processor into production. It did not differ greatly from the Intel 8080: it ran the same programs, required only a single supply voltage, and was reasonably priced, while offering higher performance. As a result, the Z80 came into huge demand among consumers.

In Russia this processor was used mainly in military equipment, in various controllers, and in many other devices, including a variety of game consoles. In the 1980s and 1990s it enjoyed great popularity on the Russian market.

Processors in the movie "Terminator"

The movie "Terminator" is full of moments when the robot scans everything that happens in front of him. Strange codes for the audience are formed before his eyes. A few years later, it becomes obvious that the filmmakers owe the creation of such codes to the MOS company with its version 6502 processor. This makes the developers have fun, who find it funny that a movie about the distant future uses a processor from the seventies.

The evolution of Intel, Zilog, Motorola processors

In the late 1970s Intel introduced its next novelty, the Intel 8086. With this chip the company left its closest market pursuers far behind. The processor was powerful, but that alone did not make it popular: its 16-bit bus made systems expensive, since it required special support chips and a redesigned motherboard.

The company then released the more commercially successful Intel 8088, which contained about thirty thousand transistors.

At about the same time Motorola launched its MC68000, one of the most powerful processors of the day. Using it also required special support chips, yet it was in great demand among consumers because of the enormous possibilities it offered.

Zilog, too, introduced a new development at this time, the Z8000 processor. It still provokes debate: its technical parameters were acceptable and its cost was low, yet few users wanted it in their computing devices.

New generation processors from Intel

In early 1993 Intel introduced its P5 processor, known today as the Pentium. The company had improved the technologies used to create its products: the new chip could execute two instructions at once, and its bus bandwidth nearly doubled. At first, however, users could not take full advantage of the processor, because it required a special motherboard; the situation changed only with the release of the next Pentium model.

It is thanks to such advanced technology that Intel's chips became so popular with consumers and held first place in the world for a long time.

Inexpensive Intel developments

To compete with AMD in the field of affordable processors, Intel's developers decided not to cut the prices of their existing products but instead to create less powerful processors, which soon became known as Celeron. The first such low-cost Celeron appeared in 1998, built on the core of the second-generation Pentium (the Pentium II). Its performance was modest, but it handled the technological innovations of the time quite well.

Intel 4004 microprocessor

History

Why 4004?
Each category of Intel products was assigned its own digit. Intel's first products were memory chips (PMOS chips), numbered in the 1xxx series. NMOS chips were developed in the 2xxx series, and bipolar chips were assigned to the 3xxx series. Four-bit microprocessors were designated 4xxx, CMOS chips 5xxx, and magnetic bubble memory 7xxx, while 8-bit (and wider) microprocessors and microcontrollers belonged to the 8xxx series. The 6xxx and 9xxx series were not used.

The second digit indicated the type of product: 0 - processors, 1 - RAM chips, 2 - controllers, 3 - ROM chips, 4 - shift registers, 5 - EPLD chips, 6 - PROM chips, 7 - EPROM chips, 8 - supervisory and timing circuits (clock generators), 9 - telecommunications chips.

The third and fourth digits gave the product's serial number. Since the first processor required three more specialized chips (a ROM, a RAM, and an I/O expander), which were released before it, the microprocessor itself received the name 4004.
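
The scheme described above can be summarized in a short sketch (a hypothetical helper written for this article, not an Intel tool), which decodes an early part number into the family, product type, and serial number from the preceding paragraphs:

```python
# Decode an early Intel part number according to the numbering scheme above.
# The tables simply mirror the article's description; the helper is illustrative only.

FAMILY = {
    "1": "PMOS memory", "2": "NMOS chips", "3": "bipolar chips",
    "4": "4-bit microprocessors", "5": "CMOS chips",
    "7": "magnetic bubble memory",
    "8": "8-bit (and wider) microprocessors and microcontrollers",
}
PRODUCT_TYPE = {
    "0": "processor", "1": "RAM", "2": "controller", "3": "ROM",
    "4": "shift register", "5": "EPLD", "6": "PROM", "7": "EPROM",
    "8": "supervisory/timing circuit", "9": "telecommunications chip",
}

def decode(part_number: str) -> str:
    family = FAMILY.get(part_number[0], "unknown family")
    ptype = PRODUCT_TYPE.get(part_number[1], "unknown type")
    return f"{part_number}: {family}, {ptype}, serial number {part_number[2:]}"

print(decode("4004"))   # 4004: 4-bit microprocessors, processor, serial number 04
print(decode("8080"))   # 8080: 8-bit (and wider) ..., processor, serial number 80
```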

The 4004 microprocessor was produced in a 16-pin DIP package; the die measured less than 1 sq. cm. The processor could execute 60,000 instructions per second. (For comparison, one of the first fully electronic computers, the American ENIAC, executed only 5,000 instructions per second while occupying 278.7 square meters of floor space and weighing 30 tons.) Intel foresaw the decisive role microprocessors would play in the miniaturization of computers and therefore bought the rights to the 4004 and its improved versions back from Busicom for 60 thousand dollars.

In 1971, however, the processor did not become a bestseller. Intel's strategy was to market the 4004 in order to expand the market for its much more popular 1101/1103 memory chips. Well-deserved popularity came only to a later microprocessor, an electronic "great-grandson" of the 4004.

4xxx Series Specialized Chips

The 4004 chip shipped together with three specialized chips: a ROM, a RAM, and an I/O expander. Although these chips had their own designations (in the 1xxx, 2xxx, and 3xxx series), they also received second names in the 4xxx category, which came to be listed alongside their usual numbering.

  • Collecting

    The Intel 4004 is, of course, one of the most popular chips among collectors. The most highly valued are the white-and-gold Intel 4004 chips with visible gray traces on the white part (the original package type). So in 2004, such a microcircuit, on