

What is the history of computing?

Updated: 11/10/2022

Nathaniel Lebsack

Lvl 10
4y ago

Best Answer
History of computers

The first electronic computers were built around 1945. They were nothing like the machines we have today; they were room-sized installations operated through typewriter-like consoles.


The first computers were people! That is, electronic computers (and the earlier mechanical computers) were given this name because they performed the work that had previously been assigned to people. "Computer" was originally a job title: it was used to describe those human beings (predominantly women) whose job it was to perform the repetitive calculations required to compute such things as navigational tables, tide charts, and planetary positions for astronomical almanacs. Imagine you had a job where hour after hour, day after day, you were to do nothing but compute multiplications. Boredom would quickly set in, leading to carelessness, leading to mistakes. And even on your best days you wouldn't be producing answers very fast. Inventors therefore searched for hundreds of years for a way to mechanize this task, that is, to find a mechanism that could perform it.

"The Father of Computers" Charles Babbage who invented the 1st mechanical Computer.. in 1834 he dreamt of designing mechanical calculating machines."... I was sitting in the rooms of the Analytical Society, at Cambridge, my head leaning forward on the table in a kind of dreamy mood, with a table of logarithms lying open before me. Another member, coming into the room, and seeing me half asleep, called out, "Well, Babbage, what are you dreaming about?" to which I replied "I am thinking that all these tables" (pointing to the logarithms) "might be calculated by machinery. "

This machine used the decimal number system and was powered by cranking a handle. The British government initially financed the project but later cut off support. Babbage went on to design his much more general Analytical Engine, and later returned to produce an improved design (his "Difference Engine No. 2") between 1834 and 1869. Meanwhile, Ada Lovelace is credited as the first computer programmer, since she wrote programs (that is, manipulated symbols according to rules) for Babbage's machine.

The rest followed.

The history of computers starts about 2,000 years ago with the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. By moving the beads around according to rules memorized by the user, all ordinary arithmetic problems can be worked. Another important invention from around the same time was the astrolabe, used for navigation.

Blaise Pascal is usually credited with building the first digital calculating machine in 1642. It added numbers entered with dials and was made to help his father, a tax collector. In 1671, Gottfried Wilhelm von Leibniz designed a machine, built in 1694, that could add and, after some reconfiguration, multiply. Leibniz invented a special stepped-gear mechanism for introducing the addend digits, and this mechanism is still used.

The prototypes made by Pascal and Leibniz saw little use and were regarded as curiosities until, a little more than a century later, Thomas of Colmar (Charles Xavier Thomas) created the first successful mechanical calculator that could add, subtract, multiply, and divide. Many inventors followed with improved desktop calculators, so that by about 1890 the range of improvements included:

* accumulation of partial results
* storage and automatic reentry of past results (a memory function)
* printing of the results

Each of these still required manual intervention. These improvements were made mainly for commercial users, not for the needs of science.

While Thomas of Colmar was developing the desktop calculator, a series of very interesting developments in computing began in Cambridge, England, with Charles Babbage (after whom the computer store Babbage's, now GameStop, was named), a mathematics professor. In 1812, Babbage realized that many long calculations, especially those needed to make mathematical tables, were really a series of predictable actions that were constantly repeated. From this he suspected that it should be possible to perform them automatically. He began to design an automatic mechanical calculating machine, which he called a difference engine (a short sketch of the method of differences it mechanized follows below). By 1822 he had a working model to demonstrate. With financial help from the British government, Babbage started fabrication of a difference engine in 1823. It was intended to be steam powered and fully automatic, including the printing of the resulting tables, and commanded by a fixed instruction program.

The difference engine, although of limited adaptability and applicability, was a great advance. Babbage continued to work on it for the next ten years, but in 1833 he lost interest because he had what he thought was a better idea: the construction of what would now be called a general-purpose, fully program-controlled, automatic mechanical digital computer. Babbage called this idea an Analytical Engine. The design showed great foresight, although it could not be fully appreciated until a century later. The plans called for a decimal machine operating on numbers of 50 decimal digits (or words) and having a storage capacity (memory) of 1,000 such numbers. The built-in operations were to include everything a modern general-purpose computer would need, even the all-important conditional control transfer capability that would allow commands to be executed in any order, not just the order in which they were programmed.
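To make the "series of predictable actions" that Babbage noticed a little more concrete: his difference engine tabulated polynomials by the method of finite differences, which needs nothing but repeated addition once a starting value and its differences have been set up. The short Python sketch below is purely illustrative, not Babbage's design; the function name and the example polynomial are made up for this answer.

    # Illustrative sketch of the method of finite differences: once the value
    # and difference columns are seeded, every further table entry comes from
    # additions alone, which is exactly what made the idea easy to mechanize.

    def difference_table(coeffs, steps):
        """Tabulate the polynomial sum(c * x**i) at x = 0, 1, 2, ...
        using only repeated addition after the initial setup."""
        degree = len(coeffs) - 1

        def p(x):
            return sum(c * x**i for i, c in enumerate(coeffs))

        # Seed the columns: the value column plus `degree` columns of
        # successive differences, taken from directly computed values.
        cols = [[p(x) for x in range(degree + 1)]]
        for _ in range(degree):
            prev = cols[-1]
            cols.append([b - a for a, b in zip(prev, prev[1:])])
        registers = [col[0] for col in cols]  # current value and differences

        table = []
        for _ in range(steps):
            table.append(registers[0])
            # One "turn of the crank": each register adds in the next one.
            for i in range(degree):
                registers[i] += registers[i + 1]
        return table

    # Example: tabulate 2x^2 + 3x + 5 for x = 0..5 -> [5, 10, 19, 32, 49, 70]
    print(difference_table([5, 3, 2], 6))

Each pass through the loop corresponds, loosely, to one turn of the engine's crank: a new table entry appears without any multiplication being performed.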
The Analytical Engine was to use punched cards (similar to those used in a Jacquard loom), which would be read into the machine from several different reading stations. The machine was supposed to operate automatically, by steam power, and to require only one attendant. Babbage's computers were never finished. Various reasons are given for his failure, the most common being the lack of precision machining techniques at the time. Another speculation is that Babbage was working on the solution of a problem that few people in 1840 really needed to solve.

After Babbage there was a temporary loss of interest in automatic digital computers. Between 1850 and 1900, great advances were made in mathematical physics, and it became understood that most observable dynamic phenomena can be described by differential equations (meaning that most events occurring in nature can be measured or described in one equation or another), so that easy means of calculating them would be helpful. Moreover, from a practical point of view, the availability of steam power caused manufacturing (boilers), transportation (steam engines and boats), and commerce to prosper, leading to a period of great engineering achievement. The design of railroads and the construction of steamships, textile mills, and bridges required differential calculus to determine such things as:

* center of gravity
* center of buoyancy
* moment of inertia
* stress distributions

Even the assessment of the power output of a steam engine required mathematical integration. A strong need thus developed for a machine that could rapidly perform many repetitive calculations.

A step toward automated computing was the development of punched cards, which were first successfully used with computing machinery in 1890 by Herman Hollerith and James Powers, working for the U.S. Census Bureau. They developed devices that could read the information punched into the cards automatically, without human help. Because of this, reading errors were reduced dramatically, work flow increased, and, most importantly, stacks of punched cards could be used as an easily accessible memory of almost unlimited size. Furthermore, different problems could be stored on different stacks of cards and accessed when needed.

These advantages were recognized by commercial companies and soon led to improved punch-card machines from International Business Machines (IBM), Remington (yes, the same people that make shavers), Burroughs, and other corporations. These machines used electromechanical devices in which electrical power provided mechanical motion, such as turning the wheels of an adding machine. Such systems included features to:

* feed in a specified number of cards automatically
* add, multiply, and sort
* feed out cards with punched results

Compared with today's machines, these computers were slow, usually processing 50 to 220 cards per minute, each card holding about 80 decimal digits (characters). At the time, however, punched cards were a huge step forward: they provided a means of input/output and memory storage on a large scale. For more than 50 years after their first use, punched-card machines did most of the world's business computing and a considerable amount of the computing work in science.

Electronic Digital Computers

The start of World War II produced a large need for computing capacity, especially for the military. New weapons were being made for which trajectory tables and other essential data were needed. In 1942, John P. Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering of the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC (Electronic Numerical Integrator and Computer). The size of ENIAC's numerical "word" was 10 decimal digits, and it could multiply two such numbers at a rate of about 300 products per second by finding the value of each product in a multiplication table stored in its memory. ENIAC was therefore about 1,000 times faster than the previous generation of relay computers.

ENIAC used 18,000 vacuum tubes, occupied about 1,800 square feet of floor space, and consumed about 180,000 watts of electrical power. It had punched-card input and output, one multiplier, one divider/square-rooter, and 20 adders employing decimal ring counters, which served both as adders and as quick-access (0.0002-second) read-write register storage. The executable instructions making up a program were embodied in the separate "units" of ENIAC, which were plugged together to form a "route" for the flow of information. These connections had to be redone after each computation, together with the presetting of function tables and switches. This "wire your own" technique was inconvenient (for obvious reasons), and ENIAC could be considered programmable only with some latitude. It was, however, efficient in handling the particular programs for which it had been designed.

ENIAC is commonly accepted as the first successful high-speed electronic digital computer (EDC) and was used from 1946 to 1955. A controversy developed in 1971, however, over the patentability of ENIAC's basic digital concepts, the claim being made that another physicist, John V. Atanasoff, had already used essentially the same ideas in a simpler vacuum-tube device he built in the 1930s while at Iowa State College. In 1973 the courts found in favor of the company making the Atanasoff claim.
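A side note on the decimal ring counters mentioned above: an accumulator built from ring counters is both a storage register and an adder, because feeding a digit in as a train of pulses simultaneously changes the stored value and performs the addition. The Python sketch below only illustrates that idea and is not ENIAC's actual circuitry; the class names and the ten-digit width are assumptions made for this answer.

    # Illustrative model of a decimal ring counter and an accumulator column.
    class RingCounter:
        def __init__(self):
            self.position = 0  # which of the ten stages is currently "on"

        def pulse(self):
            """Advance one step; return True on the 9 -> 0 wrap (a carry)."""
            self.position = (self.position + 1) % 10
            return self.position == 0

    class Accumulator:
        """A column of decimal ring counters acting as register and adder."""
        def __init__(self, digits=10):
            # Index 0 is the least significant digit.
            self.counters = [RingCounter() for _ in range(digits)]

        def add(self, n):
            # Feed each decimal digit of n into its counter as a train of
            # pulses, propagating carries into the next column.
            for place, counter in enumerate(self.counters):
                digit = (n // 10**place) % 10
                for _ in range(digit):
                    if counter.pulse():
                        self._carry(place + 1)

        def _carry(self, place):
            if place < len(self.counters) and self.counters[place].pulse():
                self._carry(place + 1)

        def value(self):
            return sum(c.position * 10**p for p, c in enumerate(self.counters))

    acc = Accumulator()
    acc.add(4806)
    acc.add(7395)
    print(acc.value())  # 12201

Ten such digit positions side by side give the 10-digit decimal "word" described above; storing a number and adding to it are the same physical operation.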

More answers


Wiki User

11y ago

The first computers were as big as a living room and could handle only a couple of simple jobs, such as typing and printing documents. Over time they got smaller and smaller, and more and more files could be stored and printed.

As the need for rapid calculations increased, commercial organizations introduced the electronic calculator to the market. Then the computer appeared.

The price of computers decreased, and they started appearing in more businesses, and then also in schools, libraries, and wealthier governments.

The internet connected computers together and provided more information to users, but there were also problems such as fraudulent websites, overabundant commercial emails, copyright violations, server outages, and lack of connectivity to poorer countries.

The dawn of the computer age brought about some disadvantages, such as incompatible file formats, fears about privacy, and error messages such as the Blue Screen of Death.

A complete history of computing would require writing several hundred or even several thousand volumes, which is entirely outside the scope of a website like this.


Related questions

When was The Centre for Computing History created?

The Centre for Computing History was created on 20 August 2007.


Where can I get the history of Cloud Computing?

You can find the history of cloud computing on Wikipedia. The article gives you everything you need to know about it, including its models, architecture, and even the issues it may have.


How was HCC important in computing history?

HCC, the Homebrew Computer Club, was the hobbyist group where the Apple I was first demonstrated.


Figuring Out Information on Cloud Computing Wiki?

Someone looking up cloud computing on a wiki usually wants to find out what cloud computing actually is from the Wikipedia page on the subject. There they can learn the history of cloud computing, as well as how it works, in the most basic sense, as a computer networking strategy.


What function does the ATCA provide?

ATCA stands for the Advanced Telecommunications Computing Architecture. It sets specifications for communications equipment and is one of the largest specification efforts in computing history.


Which website offers the best cloud computing history?

Wikipedia offers a good overview of the history. In short, cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a metered service over a network (typically the Internet).


What is the history and development of cloud computing?

Cloud computing evolved in response to the need for a more dynamic hosting environment for all sorts of computer applications. The idea was first voiced at MIT when a scientist spoke of a future where computing services would be more like a utility.


How was Homebrew Computer club important in computing history?

The Homebrew Computer Club was an early personal-computer hobbyist group in Silicon Valley. Members such as Steve Wozniak demonstrated early machines there, including the Apple I, and the club helped spark the personal computer industry.


What is the history of Cloud computing?

The history of cloud computing goes back to the early 1960s, when computer scientist John McCarthy delivered a speech about the possibility of someday having computing power and related applications that could be accessed remotely, much like electricity, water, and other utilities. The idea was discussed further in Douglas Parkhill's 1966 book, "The Challenge of the Computer Utility." The idea of cloud computing became popular in the late 1960s, but the technology did not become available until the early 2000s.


American history words beginning with the letter i?

IBM: incorporated as the Computing-Tabulating-Recording Company on June 15, 1911.


How does the Analytical engine help us today?

It doesn't, except as part of the history of thought on possible computing-machine designs.


What kind of computing relates to emotion or deliberately tries to influence emotion?

Affective computing.