History of computer chips

The central processing unit contains the electronic heart of the computer: the microprocessor chip. The chip is a piece of silicon on which a complex circuit has been printed, and that circuit processes the instructions given to the computer.
The idea of a single-chip computer was around as early as the 1950s, but at that time electronics and integrated-circuit technology were still in their infancy.
The invention of the transistor in 1947 and the development of the integrated circuit in 1958-59 provided the technology that formed the basis for the microprocessor.
In the years that followed, many companies were trying to develop a single-chip computer, and one of them was Intel. On November 15, 1971, Intel released the 4004, a 4-bit general-purpose chip and the world's first commercial microprocessor. It had been developed for the Japanese calculator company Busicom as an alternative to hardwired circuitry.
The development of the microprocessor that year made the personal computer possible: single-chip computers dramatically cut costs and expanded the effectiveness of computing systems.
In April 1974, Intel introduced the 8080 microprocessor, which was ten times faster than the earlier 8008 and could address 64 KB of memory.
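The 64 KB figure follows directly from the 8080's 16-bit address bus: each additional address line doubles the number of distinct addresses a chip can reach. A minimal sketch of that arithmetic in Python (the only assumption is the bus width, which matches the 8080's published specification):

    # Bytes addressable by an n-bit address bus: 2 ** n,
    # since each address line doubles the reachable addresses.
    def addressable_bytes(bus_width_bits: int) -> int:
        return 2 ** bus_width_bits

    # The Intel 8080 exposes a 16-bit address bus.
    print(addressable_bytes(16))          # 65536 bytes
    print(addressable_bytes(16) // 1024)  # 64 KB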
By the late 1990s, the technology had produced portable, affordable personal computers, known as laptops, that could be carried anywhere.