July 16, 2020, ainerd

History of Computers

The history and development of the computer is a remarkable story, and it reaches back to the 19th century. The first computers were designed by Charles Babbage in the mid-19th century and are sometimes known as the “Babbage Engines.” Many later computing innovations, however, were kept secret from the public because they were tied to defense work.

In that century, Charles Babbage (1791 – 1871) developed the Difference Engine, which calculated and printed simple mathematical tables. He also designed a more ambitious machine that performed complex calculations using punched cards, although he never had the means to build it himself. His ideas foreshadowed later punched-card machines in the United States and the development of computers in other countries of the world.

Thus, by the end of the 19th century, the basic principles behind modern computing were already in place. Much later, the second generation of computers introduced improved circuit technology and the first programming languages used to write scientific applications.

The first machines built with this technology included computers for the US Census Bureau and the United States Postal Service. Like their mechanical predecessors, these machines were supplied to the Census Bureau, in this case in the early 1950s, and they were among the first modern computers used in business. The electronic switches of the era performed the same logical functions as the mechanical switches of the 19th and early 20th centuries.

The invention of the transistor in 1947 allowed computers to process information in far less space than before. Transistor-based designs led to machines such as the IBM System/360, which dominated the mainframe computer market from the mid-to-late 1960s. Already in the 1950s, computers were increasingly used to process large amounts of data and were being adopted for design and manufacturing work that required complex calculations.

In the late 1950s and early 1960s, microchips, or integrated circuits (ICs), were invented independently by Jack Kilby and Robert Noyce. An integrated circuit packs miniaturized circuitry onto a single silicon chip, and ICs form the basis of virtually all modern computers, from early machines such as the Apple II to systems built around processors like the Intel Core 2. The invention of the integrated circuit allowed computers to become far smaller: circuitry that once filled a room could eventually fit on a single board.

Although Alan Turing’s universal machine remained an abstract concept, a German engineer named Konrad Zuse went on to build the world’s first programmable computer. His Z1, completed in Berlin in the late 1930s, was a binary mechanical calculator that read its instructions from punched 35-millimeter film.

This technology, like similar devices built from electromechanical relay circuits, was unreliable and could operate only in a limited range of situations.

In the end, Babbage, who had made the most progress with physical prototypes, completed only a small demonstration piece, roughly one-seventh of his first design. Zuse, by contrast, kept refining his machines, and his third model, the Z3, improved so much on the first that it is the machine usually credited as the first working programmable computer.

Some notable achievements of this early era came from analog machines. The tide-predicting machine, invented by William Thomson in 1872, is considered the first modern analog computer. Four years later, his older brother James Thomson developed a device capable of solving mathematical problems known as differential equations. The device was called an integrating machine, and it later served as the basis for a system known as the differential analyzer.

Humans had used calculating devices for thousands of years, but never in ways as sophisticated as these.

Then, in 1822, Charles Babbage, often called the father of the computer, began developing the first mechanical computer, the Difference Engine. In the 1830s he began designing the Analytical Engine, a general-purpose machine with a much wider range of computing capabilities. Its design included a memory (the “store”), a processing unit (the “mill”), and punched-card input, among other features.

But did the mechanical calculators designed by Pascal and Leibniz really qualify as computers? A calculator was a device that made it faster and easier for people to do sums, but it still needed a human operator.

A computer, on the other hand, is a machine that works by following a set of stored instructions called a program. Well into the 1930s, the word “computer” still usually referred to a person who performed arithmetic calculations, not to a machine. Once people found a way to build a fully automatic, programmable calculator, the calculator became a computer.

Between roughly 1935 and 1945, the meaning of the word shifted from a person to a machine, and from 1945 onward “computer” referred almost exclusively to machines.
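To make that distinction concrete, here is a minimal sketch in Python, not taken from any historical machine: the instruction names (LOAD, ADD, HALT) and the helper functions are invented purely for illustration. It contrasts a calculator, where a human operator drives each step, with a stored-program machine that executes a list of instructions held in its own memory.

```python
# A toy "stored-program machine" vs. a calculator.
# All names and the three-instruction set here are invented for illustration.

def calculator_add(a, b):
    # A calculator: the human operator chooses every operation and supplies
    # every input, one step at a time.
    return a + b

def run_program(program, memory):
    # A stored-program machine: the instructions live in the machine as data,
    # and it executes them one after another without an operator.
    pc = 0  # program counter
    while True:
        op, *args = program[pc]
        if op == "LOAD":      # LOAD dest, value -> memory[dest] = value
            memory[args[0]] = args[1]
        elif op == "ADD":     # ADD dest, a, b   -> memory[dest] = memory[a] + memory[b]
            memory[args[0]] = memory[args[1]] + memory[args[2]]
        elif op == "HALT":
            return memory
        pc += 1

program = [
    ("LOAD", "x", 2),
    ("LOAD", "y", 3),
    ("ADD", "sum", "x", "y"),
    ("HALT",),
]
print(run_program(program, {}))  # -> {'x': 2, 'y': 3, 'sum': 5}
```

The difference is exactly the one described above: in the second function the program is data stored inside the machine, so no human has to intervene between steps.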

 
