We are pretty sure that the first word that crosses your mind when it comes to computing is "revolution". This is mainly because, over the years, computer technology has never stopped evolving.
This has led to a scenario where you have at your disposal inventions and devices that, if you look closely, may not even have existed a few months ago.
This evolution of computing and technology is easy for any user to verify; you do not need to be an expert to notice it.
Just look at the specifications of each new cell phone, tablet or laptop released every few weeks. Each one offers greater processing capacity in the same space, or some feature intended to place it above the competition.
What you will find here: the history of computing, the importance of integrated circuits, the origin of the personal computer, the generations of computers, and much more.
The first beneficiary of the evolution of computing is of course the user.
However, the constant change in technology can also be confusing. It can push many people to think that a device bought perhaps only a few months ago is already old and useless.
Despite these rough edges, it cannot be denied that computing has forever transformed the way we do things.
In this post we are going to talk about computing and how it relates to the modern world and its people, in a clear, understandable way and without technicalities.
You can also learn more about computing in this post: What is computing?
The evolution of computing
Computing has reached a level that until recently was truly unthinkable. We can talk about 30, 40 or 50 years, which may seem like a long time; in terms of technological evolution, however, it is just the blink of an eye.
This means that in a relatively short time, computing was born and expanded to unsuspected limits.
The capacity and performance of a simple tablet today could, a few decades ago, only be obtained with huge printed circuit boards and many electronic components. Before that, it required enormous structures full of vacuum tubes, coils and kilometers of cable, housed in specially equipped facilities.
At present, this evolution in computing has allowed us to build a world to our measure, where everything can be measured, cataloged and put in its place.
And although it seems that we are talking about some dystopian novel, the truth is that in these times technology is used much more for good than for evil.
Among the advantages that computing offers us today are air traffic control, medical and scientific research, communications and a long etcetera.
The truth is that without the advances in computing in recent years, in both hardware and software, none of this would be possible. Computing advances, research is streamlined, new devices are created, and with them computing advances again.
Fortunately, this is the cycle of evolution that is giving so many good results to Humanity.
This is demonstrated by the enormous number of devices that you have in use at home, at work, in the car or even as part of your clothing.
All of this is evolving at a dizzying rate, and these devices are becoming ever more powerful and capable. They allow you to do things you could not do before, and can even benefit your health.
For a long time, in every corner of the world, Humanity searched for a method that would allow it to symbolize and manage information.
In this way, information could be standardized, and then analyzed anywhere and at any time by people with the necessary knowledge.
With this, the information would be available to consult or modify according to need.
This is something that started with knots in ropes, passed through clay tablets, and reached the first calculating machines. Humanity has always found a way to improve itself and its inventions.
All of this led, years later, to our being able to land a space probe on the surface of an asteroid to confirm what it is composed of.
This path began when Humanity became industrialized and its needs became more difficult to solve. From there to having one or more computers, smartphones, tablets and other amazing electronic devices in your own home, there was only one step.
To try to know how the history of computing developed, you have to understand a fundamental point:
It all started with Humanity’s need to safely manipulate and recall information in a symbolic way.
It could be said that computing was born in the mid-1940s.
However, to be a little more rigorous, we should mention that computing has at least four stages of development.
The first of the acts in this kind of historical play takes place in the ancient world. It begins when the wise men of the time started performing calculations with the help of common objects in their environment, such as stones.
At this point it should be noted that the word "calculus" has its origin in the Latin term "calculus", meaning "small stone" or "pebble".
From there it evolved into abaci and calculation boards, elements that can still be used without problems for calculations of all kinds.
The second act of the play is perhaps the most important of all, since it should be considered the point where computing truly began its journey: the creation of ENIAC.
This vacuum-tube computer was a military-grade device capable of calculating the trajectory of a projectile without having to perform the corresponding field tests.
ENIAC, an acronym for Electronic Numerical Integrator and Computer, was presented to the public in February 1946.
It should be noted that its electronics contained approximately 18,000 vacuum tubes.
At this point there are certain discrepancies, since some researchers argue that before ENIAC, the English already had "Colossus" in service.
Several units of the latter were manufactured during World War II to help decipher German codes.
It should also be noted that there was a predecessor to both machines, although it never came into regular operation: a device capable of performing calculations at high speed by means of vacuum tubes.
This computer was developed by John V. Atanasoff and Clifford Berry at Iowa State College in 1942.
The third act of this play takes place at the end of the 1970s, precisely in 1977, when Steve Jobs and Steve Wozniak presented the Apple II to the world.
Why should we consider this fact as a turning point in this story?
Although the Altair 8800 came first, having been released earlier, it was really impractical to use.
This unfortunately made it a device for people "in the know". Even so, it managed to give the initial push that allowed the computing world to spread rapidly.
What really changed the story was the Apple II. This computer, together with the later IBM PC, moved computing out of universities and large companies and into homes across the globe.
And these machines could be used by any type of user, regardless of knowledge or ability!
Addendum: The First Personal Computer
The third act in this story is really important: it is the moment when computing went from being something academic to a device that could coexist with the television and the radio.
In other words, the computer became one more household appliance, alongside the oven or the refrigerator.
This story begins in 1975, when a company known as MITS released a computer kit that could be put together by anyone with the necessary knowledge.
The main feature of this kit was that it cost about $400, much less than the tens of thousands of dollars that computers typically cost at the time.
In addition to the price, it offered other advantages, such as an optional floppy disk drive to store data and programs. It also had a version of BASIC so that users could create their own programs on their new computer.
This BASIC was developed by a team led by Bill Gates and Paul Allen, who founded Microsoft to sell it to MITS.
It also offered some other very important advantages for the time.
Despite all these favorable features, the Altair's biggest problem was that it was not designed to be used by just any user.
To use it, knowledge of programming, electronics, mathematics and other sciences was necessary.
Fortunately, this changed in the late 1970s, precisely in 1977, when the Apple II appeared on the market: the first computer that was really easy for anyone to set up and use.
The real boost that brought personal computers to the preferred place they occupy today, however, was an application called VisiCalc.
This program let you work with rows and columns at incredible speed, just as in a paper spreadsheet, a concept familiar to accountants around the world.
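The idea that made this kind of program revolutionary, a grid of cells holding either plain values or formulas that are recomputed on demand, can be sketched in a few lines of Python. The cell names and figures below are invented purely for illustration:

```python
def get(sheet: dict, ref: str) -> float:
    """Evaluate a cell: plain values pass through, formulas are recomputed."""
    cell = sheet[ref]
    return cell(sheet) if callable(cell) else cell

# Cells hold either numbers or formulas (functions of the whole sheet).
sheet = {
    "A1": 1200.0,                                  # sales, quarter 1
    "A2": 950.0,                                   # sales, quarter 2
    "A3": lambda s: get(s, "A1") + get(s, "A2"),   # total: =A1+A2
}

print(get(sheet, "A3"))  # 2150.0
sheet["A2"] = 1100.0     # change an input...
print(get(sheet, "A3"))  # ...and the total follows: 2300.0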
It could be said that this combination, the Apple II and VisiCalc, was the true origin of personal computing. It also turned the known scenario on its head, since from then on software developers came to the fore.
This left behind the dominance of large hardware manufacturers such as IBM.
The best example of this can be found in Microsoft and Bill Gates, who have been leading the course of computing around the world for years.
Of course, the aforementioned IBM was not going to be left out of this fabulous business, and soon released its own computer, the PC or "Personal Computer", in 1981.
This new computer offered a number of advantages that made it interesting for other hardware manufacturers and developers.
The Personal Computer was designed around an open architecture. This allowed other companies in the IT sector to offer peripherals, expansion boards and software that were perfectly compatible with the IBM PC.
It also used an advanced processor, which allowed it to address much more memory than any of its rivals on the market. It was soon joined by the Lotus 1-2-3 spreadsheet, and it ran an operating system developed by Microsoft.
In this type of competition, whoever is left behind always loses. For this reason, Apple invested a great deal of resources and effort to launch the Apple Macintosh in 1984.
This was the first commercially successful personal computer to popularize a graphical interface and a pointing device called a mouse.
These concepts were taken from research conducted during the 1960s and 1970s, notably at Xerox's laboratories in Palo Alto, California.
Apple took this concept a little further and presented it to the general public, achieving great commercial success. Microsoft later applied the same concept to its new operating system, called Windows.
At this point the split between architectures and operating systems was already clear, and it would carry over for decades to come.
The fourth, and for now last, act in the history of computing could be considered the emergence of the Internet as a method of communication.
The Internet was born within the United States Department of Defense, specifically in the Pentagon's Advanced Research Projects Agency (ARPA).
From the tests and protocols commissioned by this agency, as far back as the 1960s, the Internet was born. From that incipient communication between two nodes in 1969 extends the network of networks we all know today.
Personal computers became popular with everyone in the 1980s. Over time, large computers and mainframes were relegated to managing large volumes of data.
This was basically because they could not compete with the price, applications and ease of use of PCs.
But the decisive factor was the possibility of connecting two computers and letting them share information regardless of distance.
All of these elements led to the reign of the personal computer to this day.
Addendum: The Birth of Ethernet and the Internet
Over time, business activities that required managing large amounts of data had already begun to incorporate computers.
However, to handle this flow of data, those computers were huge, required custom-designed software, and were incredibly expensive.
This was when people began to think of incorporating smaller but interconnected computers.
This resulted in the development, again by the Xerox corporation, of a system for interconnecting computers through cables called “Ethernet.”
As a curious fact, Ethernet was named after the luminiferous "ether", the medium through which 19th-century physicists believed light was transmitted.
This interconnection system allowed multiple devices to connect to each other regardless of their location within a building. With this, it was possible to share resources such as storage, printers and other peripherals.
In addition to providing one of the most useful tools of all time: email.
Also, years later, it motivated the birth of a much larger network, the Internet.
Simultaneously with the birth of Ethernet, other things were also developing.
In this sense, thanks to research carried out by ARPA and other organizations, it became possible to do the same with computers that were geographically much farther apart.
Originally, this research aimed to develop a method by which, in the event of war, communications could continue to operate even after a nuclear disaster.
It should be noted that until then, communications networks were centralized. No matter how well the buildings housing the nodes and control electronics were secured, underground and in concrete structures, they could always be hit and destroyed.
This obviously would completely dismantle any attempt at communication between these computers and devices.
That is why a decentralized communication system was necessary, and why the aforementioned ARPA began to finance research into this type of model.
This led to the implementation of a communications system in which information was divided into packets, each carrying the address of the receiving computer.
This meant that if one or more nodes of the network were down, the system could always find another route to deliver the corresponding packet.
Once all the packets were received, the receiving computer reassembled them into the original document.
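The scheme just described, splitting a message into addressed, numbered packets and reassembling them at the other end, can be sketched in a few lines of Python. The packet format and host name below are invented for illustration and do not correspond to any real protocol:

```python
import random

def make_packets(message: str, size: int, dest: str) -> list:
    """Split a message into small, individually addressed and numbered packets."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"dest": dest, "seq": n, "data": chunk} for n, chunk in enumerate(chunks)]

def reassemble(packets: list) -> str:
    """Rebuild the original message, whatever order the packets arrived in."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = make_packets("ATTACK AT DAWN", size=4, dest="host-b")
random.shuffle(packets)     # packets may take different routes and arrive out of order
print(reassemble(packets))  # ATTACK AT DAWN
```

The sequence numbers are what let the receiver restore the original order; routing each packet around dead nodes is the network's job and is not modeled here.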
Although it had been designed for other purposes, specialists and scientists soon began using the system to send and receive short messages.
Of course, this was quite cumbersome, as the system had not been designed to do it easily.
This lasted until 1971, when Ray Tomlinson changed history forever. In order to send messages between computers in a more efficient and orderly manner, he placed an at sign "@" between the name of the recipient and the name of the computer hosting the recipient's mailbox.
The reason the "@" symbol was chosen is quite simple: it was one of the few non-letter symbols available on the teletype keyboards used on the ARPANET at the time.
This was the birth of email as we know it today.
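Tomlinson's user@host convention survives essentially unchanged in today's email addresses. A minimal sketch in Python (the address itself is made up for the example):

```python
def parse_address(address: str) -> tuple:
    """Split an address at the "@" into the user part and the host part."""
    user, _, host = address.partition("@")
    return user, host

user, host = parse_address("ada@acme-pdp10")
print(user)  # ada
print(host)  # acme-pdp10
```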
Going back to the beginnings of the Internet, some years later the National Science Foundation (NSF), a civilian government agency, was authorized to expand the network.
This made it possible for other networks to be interconnected with ARPANET.
Can you guess the name the researchers gave to this union of networks? The Internet!
Finally, in 1983, a protocol called TCP/IP was adopted to standardize all data transmission. This laid the foundations of the network of networks, and it is still the protocol used today.
With these changes to the incipient Internet, new models of computers, called workstations, also appeared on the market.
These computers were much better suited to networking than a standard computer.
They were perfect for the job, as they had specific hardware and operating systems designed for network management, such as UNIX.
But the panorama of that Internet really changed in 1991, when the World Wide Web was presented to society by Tim Berners-Lee.
The World Wide Web was basically a set of protocols operating on top of the Internet's own protocols.
These provided a simple way to access the various contents of the network, regardless of the computer, device, operating system or connection protocol used.
With this set of guidelines, the first web browser also appeared.
Of course, this was the spark that allowed the Internet to unfold into what it is today, with all the services it is capable of providing, such as online storage, cloud applications, and more.
Tim Berners-Lee, in addition to being the developer of the World Wide Web, was also responsible for the development of a small application called a web browser.
This program made it possible to access Internet content easily from any personal computer.
Although Berners-Lee's creation was surpassed in a short time, it laid the foundations for the development of another browser with a bit more history: Mosaic, created by engineers at the University of Illinois in 1993.
This rudimentary browser gave way to Netscape Navigator, the decisive leap that got computer users exploring the web.
Microsoft, for its part, licensed code descended from Mosaic to build Internet Explorer, while Netscape was eventually acquired by AOL. Then came the other browsers and Internet services.
The story continues and keeps moving forward.
For you to be reading this on your computer or any of your mobile devices, multiple events had to take place.
First came the abacus, then Babbage's Analytical Engine in the 19th century.
These were followed by calculating machines for arithmetic operations, and then the first computers, with their punched tapes and magnetic-core memories.
Today, gigantic databases move the world.
All this is thanks to the fact that we have always sought better ways to solve complex problems ever more quickly.
Although the history of the computer as you know it starts as mentioned above, the truth is that there was much more in between.
For this reason , the history of computers is separated into so-called generations.
These generations mark a turning point, where technology changes and gives way to something new, and much more powerful.
Computer generations are basically arranged like this:
First generation (1940–1956): vacuum-tube computers prevail.
These could only be acquired and used by research centers and military institutions. Their manufacturing and maintenance costs, their programming difficulty and their large size made it impossible for even large companies to own one.
Second generation (1956–1963): solid-state computers with transistors appear.
Owning one of these computers became feasible for many companies, because they were simpler to produce than tube computers.
Third generation (1964–1971): solid-state computers with integrated circuits appear.
It should be noted that we owe the integrated circuit to Robert Noyce and Jack Kilby. These computers, far more powerful, smaller and cheaper than their predecessors, reached the desks of many companies, allowing them to expand their activities and accelerate their processes.
Fourth generation (1971–present): solid-state computers with microprocessors.
This is computing as you know it and as you usually use it. These machines can process and transmit large amounts of data simultaneously, which partly favored the growth of the Internet.
At this point, computers start becoming more powerful at regular intervals. Approximately every two years, a new, much more efficient and powerful microprocessor comes on the market, replacing the previous one.
This is mainly due to the miniaturization of the internal components of these processors.
The funny thing is that this was predicted by an Intel engineer named Gordon Moore in 1965. Moore observed that the number of transistors in integrated circuits doubled every year, a pace he later revised to roughly every two years. This became known as Moore's Law, and it held for decades.
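The power of that doubling is easy to put into numbers. Here is a rough sketch, assuming a clean doubling every two years and starting from the roughly 2,300 transistors of Intel's 4004 microprocessor of 1971 (real chips deviate from this idealized curve):

```python
def transistors(start: int, years: int, doubling_period: float = 2.0) -> int:
    """Project a transistor count under a fixed doubling period (idealized Moore's Law)."""
    return int(start * 2 ** (years / doubling_period))

# 30 years of doubling every two years turns thousands into tens of millions:
print(f"{transistors(2_300, 30):,}")  # 75,366,400
```

That projected figure lands in the same order of magnitude as actual chips of around 2001, which is why the "law" held up so well for so long.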
If you want to know more about processors, in this post you will find everything you need: The computer’s processor.
The fifth stage is still in progress. At this point, technology is delivering extremely important advances in quantum computing and artificial intelligence.
6th, 7th and 8th generation
Although, as mentioned, the fifth generation is still unfolding, it is possible to speculate about what will come next.
Far from those first calculating machines, who knows what the technologies developed in these coming generations will bring to your life!
This was just a very basic glimpse of generations of computers.
If you want to know more, we recommend that you go through this post: The generations of computers: 1, 2, 3, 4, 5, 6, 7 and … the eighth generation.
There is no doubt that computing is today one of the most interesting fields in which to pursue a career.
Computer science now extends to many areas where installing a computer was once not even feasible, such as computers inside a vehicle or serving as the control unit of a refrigerator.
This means that the application of computer engineering is not limited to the field of computers or their hardware or software.
It can be implemented in practically any scenario where data is required to be processed efficiently, quickly and accurately.
Basically, computer engineering, or informatics engineering as it is called in other parts of the world, is a branch of engineering. Through it, the foundations of many other sciences can be applied to a specific area.
Examples include telecommunications engineering, electronic engineering, software engineering and, of course, computer science, among others.
The goal is to provide data-processing solutions autonomously and comprehensively, from the standpoint of computing and communications.
To apply computer engineering to any process, this branch of engineering must draw on certain theoretical foundations. These are what ultimately allow the problem to be studied from the root.
From there, one can find the implementation that best suits the needs of that particular problem.
In order to successfully address any scenario, a computer engineer must understand how data and information processing are implemented in telecommunications.
This allows them to design and build, for example, communication networks that meet security and legal requirements.
The computer engineer also needs deep knowledge of robotics, artificial intelligence, algorithms, operating systems and, of course, as many programming languages as possible.
This is essential to successfully carry out any development that requires these kinds of implementations.
In addition, they must understand software development and maintenance, as well as the aspects derived from its possible commercialization. The computer engineer must also have extensive knowledge of industrial and business organization.
This is useful for planning, directing and controlling IT projects in IT departments.
Likewise, the computer engineer must have practical and theoretical knowledge of electronics. This is necessary to design and build control interfaces and other mechanisms that complement the software they develop.
There is no doubt that computer engineering is today one of the most challenging careers a student can take on, since pursuing it successfully requires studying and understanding all the sciences it comprises.
This makes it a genuinely daunting task. But without a doubt, computer engineering is one of the careers that will be essential within a few years, surrounded as we are by sophisticated electronic inventions.
All of this knowledge is necessary to thrive in today's demanding environment. Without preparation in fields such as artificial intelligence or quantum computing, it is hard to break in successfully.
Artificial intelligence now has an importance that goes beyond the field of research. It is among us in biometric recognition devices, virtual assistants and many other applications.
Therefore, learning about the scope of artificial intelligence is essential, and it does not hurt to know the fundamentals of quantum computing. These are the most important fields of the so-called fifth generation of computers.
An excellent place to start is this post: The fifth generation of computers.
There is a great discussion about the definition of educational informatics , and about the role that the computer should play in educational institutions such as schools or colleges.
Depending on the educational vision and the technical/pedagogical conditions, this term can take on different meanings.
It could be said that educational informatics means "the insertion of the computer into the teaching-learning process of the curricular contents at all levels and modalities of education". In other words, the subjects of a given curricular discipline are developed with the help of a computer.
Finally, educational informatics is also in charge of teaching students computing skills, training them in commercial applications.
We can say that it is not enough to have technical knowledge, to know the components of the computer in depth, or to know how to program in different languages.
Other aspects must also be considered in this process, the most important being an awareness of the implications of computing for society.
Computing and the computer
As we have already mentioned, computing rests on two fundamental pillars. On one hand, the software: all those programs, operating systems and other elements that allow specific tasks to be carried out.
On the other hand, the hardware: the physical support with which the software interacts. Hardware should be understood as all the physical components of a computer.
In this sense, hardware without software, or vice versa, is of little use.
For this reason, computers since their inception have been the necessary tool for the existence of computing.
According to the historical record, the first computer was the Z3, designed by Konrad Zuse in 1941. It is considered the pioneer in the field of automatic programmable machines.
The truth is that from that moment we must make a qualitative leap in the conception of these machines and move to the beginning of the 1980s, perhaps the most important leap computing has made in its entire history.
At this time, the first IBM personal computer, also known by the acronym PC, was presented to the world. It was developed by IBM, with the operating system supplied by Microsoft, the company of two geniuses of the digital world: Bill Gates and Paul Allen.
That year should be taken as the starting point from which computing developed non-stop.
Thanks to research related to the miniaturization of electronics, the development and advancement of the Internet, added to the growing number of software, we have reached this fantastic present.
Today computing is an everyday element of your life, transcending all borders. It has made your life easier in many ways; you may even have become too dependent on it.
Like any new development, technology is not without its problems. Some of them are serious, such as electronic waste and the gap between the haves and the have-nots.
In this sense, if you want to know more about the subject, we recommend that you go through this post: Advantages and disadvantages of technology.
There are other problems that might seem easier to solve, or that do not raise as much alarm. One of the most important is undoubtedly computer viruses.
Ever since the game Core War ushered in the era of self-replicating programs, the history of computing has been inseparable from them.