The computer is today one of the most widely used tools in the world, and it performs tasks that go far beyond the purpose for which it was originally developed. Nowadays a single computer can be used for anything from simple entertainment and social networking to controlling something as complex as a space mission in a fully automatic way.
Such is the variety of jobs in which a computer can help humanity. However, few people know that these devices were not born with Windows; they have been with us for centuries, perhaps in forms that bear little resemblance to current ones, but they are no less computers for that.
So that we can all learn more about the history of computers, this article presents a complete chronology from its earliest years, beginning several centuries ago.
Introduction to the history of computers
From this point on, we will learn more about the history of perhaps the technological device that has most helped the human species evolve. Hand in hand with software, computers have made it possible to perform calculations that would otherwise have taken far longer to complete, ensuring that the development of every other technology could proceed much faster, more efficiently, and at a much lower monetary cost.
In the dictionary of the Royal Spanish Academy, the definition of the word “computer” tells us that it is a “device for mathematical calculations”, which is why we can so easily include under it the various calculation technologies that have existed since practically the origins of mankind.
The history of the computer, contrary to what many may imagine, began a very long time ago, when man discovered that he could count with his fingers, or with other objects such as stones or pieces of wood. These calculations became more and more complicated as humanity learned, and soon people realized that they would need some device that would allow them to perform more complex calculations at greater speed.
Origin of Computers: The First Calculating Machines
To meet these needs, around 4000 BC, a very simple apparatus was created consisting of a clay plate on which stones were moved to aid in calculations. That device was called the abacus, a word of Phoenician origin. By 200 BC the abacus had changed: it consisted of a rectangular wooden frame with parallel rods and perforated beads that slid along those rods. The concept and function of the abacus remain intact to this day; the device is still used, for example, in teaching the blind.
We could say that the abacus is the starting point of this story. From here, computers would lead us first to explore the nearby planets, and after that, who knows?
After the abacus, the next step in the history of computers came in 1642, when an 18-year-old Frenchman named Blaise Pascal invented the first adding machine: the Pascaline, which performed arithmetic operations by means of meshed gear wheels, making it the forerunner of mechanical calculators.
Around 1671 in Germany, Gottfried Leibniz invented a machine very similar to the Pascaline that could also multiply and divide, and which was the direct predecessor of manual calculators.
In 1801 in France, Joseph Marie Jacquard used punched cards to control and automate his looms. At the beginning of the 19th century, more specifically in 1822, a difference engine was designed by an English scientist named Charles Babbage that could compute tables such as trigonometric and logarithmic functions.
In 1834, Babbage designed the Analytical Engine, a machine capable of performing the four basic operations (addition, subtraction, multiplication, division), storing data in a memory (up to 1,000 numbers of 50 digits) and printing results, programmed by means of Jacquard's punched cards.
However, his machine could only be completed years after his death. It became the basis for the structure of today's computers, which is why Charles Babbage is considered the “Father of the Computer”.
The Beginning of the Computing Age
In 1890, at the time of the US census, Herman Hollerith realized that processing the census data by hand would not be finished before it was time to start the next census (1900). He therefore perfected the punched-card system (the one used by Jacquard) and invented machines to process the cards, obtaining the results in record time: just 3 years later.
Building on these results, Hollerith founded a company in 1896 called the Tabulating Machine Company (TMC), which merged in 1911 with two other small companies to form the Computing-Tabulating-Recording Company, which in 1924 became the well-known IBM, International Business Machines.
In the 1930s, scientists began to make progress in building complex machines, with Vannevar Bush's Differential Analyzer heralding the modern computer age. In 1936, Alan Turing published his paper on “Computable Numbers” and Claude Shannon wrote a thesis on the connection between symbolic logic and electrical circuits. In 1937, George Stibitz built the famous “Model K” on his kitchen table, a digital machine based on relays and cables.
Manchester Mark 1, Harvard Mark I, Z3 and Z4: Dawn of the Computing Age
With the arrival of the Second World War, the need arose to design machines capable of executing ballistic calculations with speed and precision so that they could be used in the war industry.
Thus, in 1944, the first electromechanical computer was born, built at Harvard University by Professor H. Aiken's team with financial help from IBM, which invested US$500,000 in the project. Named the MARK I, it was controlled by programs and used the decimal system.
It was about 15 meters long and 2.5 meters high, was enclosed in a case of glass and stainless steel, and among its most notable characteristics it had 760,000 parts, 800 km of cables and 420 control switches; it could perform an addition in 0.3 s, a multiplication in 0.4 s and a division in about 10 s.
The Harvard Mark I provided its mathematical services at Harvard University for a full 16 years, despite never having been very successful: it was already obsolete before it was built, because as early as 1941 Konrad Zuse, in Germany, was creating test models (the Z1 and Z2), and shortly afterwards completed an operational computer, the Z3, a program-controlled, binary-based device that was much smaller and much cheaper to build than the Mark I.
The Z3 and its successor, the Z4, were used to solve aircraft-engineering problems and in missile projects. Zuse also built several other special-purpose computers, but received little support from the German government: Hitler had ordered all scientific research stopped except for short-term projects, and since Zuse's project would take about two years to complete, it went unsupported.
Another of the first electronic computers was the Manchester Mark 1, developed at the University of Manchester from the Small-Scale Experimental Machine (SSEM), or “Baby”, the first electronic stored-program computer. It was also called the Manchester Automatic Digital Machine, or MADM. Work began in August 1948 and the first operational version was demonstrated in April 1949.
The proper functioning of the machine was widely touted by the British press, which used the expression “electronic brain” to describe it to its readers. The Mark 1 was initially developed to provide computing services within the university and to give researchers practical experience in using computers, but it quickly became the prototype for a commercial version.
Development ceased in late 1949 and the machine was dismantled in late 1950, replaced in February 1951 by the first installation of the Ferranti Mark 1, the first commercially available general-purpose computer.
The Mark 1 was a pioneer in including an index register, an innovation that made it easier for a program to read sequentially through a set of words in memory. Thirty-four patents arose from its development, and many of the ideas behind it were incorporated into later commercial products such as the IBM 701 and 702, as well as the Ferranti Mark 1.
Timeline of Computer History: 1614-1956
In 1614, John Napier (1550-1617) published a text on his discovery of the logarithm. Napier also invented a system of rods (referred to as Napier's Rods or Napier's Bones) that made it possible to multiply, divide, square and cube numbers by rotating the rods and placing them on special plates.
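Napier's central idea, the one that made such devices possible, is that logarithms turn an expensive multiplication into a cheap addition. A minimal sketch in Python (the numbers 37 and 59 are arbitrary examples, not from the source):

```python
import math

# Napier's key insight: log(a * b) = log(a) + log(b), so a costly
# multiplication can be replaced by an addition of logarithms
# followed by a table lookup (here, math.exp).
a, b = 37.0, 59.0
product_direct = a * b
product_via_logs = math.exp(math.log(a) + math.log(b))

print(product_direct)              # 2183.0
print(round(product_via_logs, 6))  # agrees up to floating-point rounding
```

This is exactly what slide rules and log tables mechanized for the next three centuries.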
In 1623, Wilhelm Schickard (1592-1635), in Tübingen, Württemberg (now Germany), created the “Calculating Clock”. This instrument could add and subtract 6-digit numbers, and rang a bell whenever a result exceeded six digits. Operations were carried out by means of a crank that rotated the number wheels, much like the tape counter on a cassette player.
In 1642, the French mathematician Blaise Pascal built his adding machine, the “Pascaline”. Despite being inferior to Schickard's “Calculating Clock” (see 1623), Pascal's machine became more famous. He sold dozens of copies of the machine in various forms, which could process numbers of up to 8 digits.
After many attempts, the first calculating machine capable of carrying out the four mathematical operations (addition, subtraction, multiplication and division) as well as the square root was finally invented in 1672. That great achievement is attributed to the mathematician Gottfried Wilhelm von Leibniz, who improved on Pascal's machine to obtain his universal calculator.
The automatic loom took its input from punched cards, which controlled the weaving of fabrics and their patterns. It was created in 1801 by Joseph Marie Jacquard and can be considered the first programmable mechanical machine in history.
The Difference Engine was conceived by the Cambridge University mathematician and professor Charles Babbage in 1822. It was a mechanical device based on gear wheels, capable of computing and printing extensive scientific tables. Despite its many advantages, the machine was never completed, owing to the technological limitations of the time.
George Scheutz, of Stockholm, produced a small machine in wood after reading a short description of Babbage's project.
In 1854, the English mathematician George Boole invented Boolean binary algebra, paving the way for the development of computers almost 100 years later.
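Boole's algebra is the same two-valued logic that digital circuits implement today. A small illustrative sketch (the function names `AND`, `OR`, `NOT` and `half_adder` are my own, chosen to mirror the names of logic gates):

```python
# George Boole's algebra has two values and three basic operations.
# Modern hardware implements exactly these as logic gates.
def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

# A half-adder, the building block of binary arithmetic, is just
# Boolean operations: XOR gives the sum bit, AND gives the carry bit.
def half_adder(a, b):
    sum_bit = OR(AND(a, NOT(b)), AND(NOT(a), b))  # a XOR b
    carry   = AND(a, b)
    return sum_bit, carry

print(half_adder(True, True))   # (False, True): 1 + 1 = binary 10
```

Chaining such adders bit by bit is, in essence, how every computer in the rest of this chronology performs arithmetic.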
Ramón Verea, living in New York, invented a calculator with an internal multiplication table: in other words, faster than turning gears or other methods. He was not interested in producing it; he only wanted to show that a Spaniard could invent as well as the Americans.
A more compact multiplying calculator went into mass production. Production was more or less simultaneous with the inventions of Frank S. Baldwin, of the United States, and W. T. Odhner, a Swede living in Russia.
The census conducted in the United States in 1880 took seven years to complete, since all calculations were done by hand on paper. Given the growth in population, it was estimated that the 1890 census would take more than 10 years, so a contest was held to find the best method of computing the results. The contest was won by a Census Bureau employee, Herman Hollerith, who would go on to found the Tabulating Machine Company, which later became IBM. Herman borrowed Babbage's idea of using punched cards (see 1801) for the memory system. With this method, the 1890 result (62,622,250 people) was ready in just 6 weeks. The punched-card system made analysis of the results far easier but, despite being more efficient, the 1890 census cost 198% more than that of 1880.
With the Second World War came the Z3, a computer built by the Germans and used mainly for aeronautical calculations. It was destroyed in the bombing of Berlin, leaving us very little information about it.
Like the Germans, the English also pursued technologies to decipher secret codes, building the Colossus for the British intelligence service. Of gigantic dimensions, Colossus worked by means of valves (vacuum tubes), processing about 5,000 characters per second. It was built at Bletchley Park, where the English mathematician Alan Turing led much of the codebreaking work.
The Mark I (Howard Aiken) was the first electromechanical computer built. Quite different from today's computers, the Mark I was 18 meters long, two meters wide and weighed 70 tons. It was made up of 7 million moving parts, and its wiring totalled 800 km. With the arrival of electronic computers, the Mark I was quickly superseded.
John von Neumann, a Hungarian-born, naturalized-American mathematician and engineer, developed a design for a computer based on logic, with electronic storage of both data and program instructions. The computer would process data according to the user's needs; that is, the instructions would not be fixed in advance. A computer was later built on this design and named EDVAC.
The first computer BUG was reported by naval officer and mathematician Grace Murray Hopper: the bug was a moth inside the computer, which caused a fault in its calculations.
John W. Mauchly and J. Presper Eckert Jr., together with scientists from the University of Pennsylvania, built the first large-scale electronic computer, known as ENIAC (Electronic Numerical Integrator and Calculator). It had approximately 18,000 valves, weighed 30 tons and consumed up to 150 kW. In return, it was a thousand times faster than the machines that preceded it, reaching 5,000 operations per second.
Presper Eckert and John Mauchly, pioneers in the history of the computer, founded the Eckert-Mauchly Computer Corporation, with the aim of manufacturing machines based on their experience with the ENIAC and the EDVAC.
The first commercial computer, called UNIVAC, was designed. John Bardeen, Walter Brattain and William Shockley of Bell Labs patented the first transistor.
Thomas Watson Jr., in a talk at an IBM sales meeting, predicted that all the moving parts in computers would be replaced by electronic components within a decade.
The UNIVAC was the first computer to be commercialized. Designed by J. Presper Eckert and John Mauchly, it executed 1,905 operations per second and its price reached US$1,000,000.
Heinz Nixdorf founded the Nixdorf Computer Corporation in Germany. It remained an independent corporation until its merger with Siemens in 1990.
International Business Machines (IBM) launched its first digital computer, the IBM 701. As the company's first commercialized computer, 19 machines were sold in three years.
The mathematical genius Alan Turing had published the paper “On Computable Numbers”, posing significant questions about programming and human intelligence, and used his applications of logic to develop the concept of the universal machine. Texas Instruments announced the start of transistor production.
Announced by AT&T Bell Labs, the TRADIC was the first transistorized computer, with approximately 800 transistors in place of the old vacuum tubes, which allowed it to run on less than 100 watts of power.
At MIT (Massachusetts Institute of Technology), researchers began testing data entry via computer keyboards. At the same institution, testing began on the first computer built with transistors, the TX-0 (Transistorized Experimental Computer).
Timeline of Computer History: 1957-1981
A group of engineers led by Ken Olsen left MIT's Lincoln Laboratory and founded the Digital Equipment Corporation (DEC). That same year a new language was created: Fortran, which allowed the computer to perform repetitive tasks from a single set of instructions.
Jack Kilby created the first integrated circuit at Texas Instruments, proving that resistors and capacitors could exist on the same piece of semiconductor material. His circuit consisted of a sliver of germanium with five components connected by wires. Japan's NEC built its first electronic computer, the NEAC.
The IBM 7000 series of mainframes was the company's first line of solid-state computers. At the top of the line was the 7030, also known as STRETCH. Seven of these computers, which used 64-bit words and other innovations, were sold to national laboratories and other scientific users. L. R. Johnson was the first to use the term “architecture” to describe the STRETCH.
AT&T's Dataphone, the first commercial modem, was designed specifically to convert digital computer signals into analog signals for transmission over its long-distance network.
A team drawn from several computer manufacturers and the Pentagon developed COBOL, the Common Business Oriented Language, one of the first high-level languages designed for business programming. IBM built its first large transistor factory, in New York.
The UNIMATE, the first industrial robot, was created and put into operation at GM. Its job was to stack pieces of hot metal, a task it carried out without problems.
MIT students Slug Russell, Shag Graetz and Alan Kotok wrote Spacewar!, considered the first interactive computer game. The game featured interactive graphics that inspired future video games.
The ASCII code (American Standard Code for Information Interchange) was developed, allowing machines from different manufacturers to exchange data with one another. Digital Equipment sold its first minicomputer. Douglas Engelbart received the patent for the first computer mouse.
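What ASCII standardized is simply a shared table mapping each character to a number from 0 to 127, so any two machines that agree on the table can exchange text as raw numbers. A small sketch (the `'HAL'`/`'IBM'` shift is a well-known curiosity of the table, not something from the source):

```python
# ASCII assigns every character a number 0-127; text becomes a list
# of numbers any ASCII-speaking machine can decode back into text.
text = "IBM"
codes = [ord(c) for c in text]
print(codes)                              # [73, 66, 77]

# The receiving machine reverses the mapping to recover the text.
print("".join(chr(n) for n in codes))     # IBM

# A quirk of the table: shifting each letter of "IBM" down by one
# yields "HAL", the computer from 2001: A Space Odyssey.
print("".join(chr(ord(c) - 1) for c in "IBM"))  # HAL
```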
Dartmouth College professors Thomas Kurtz and John Kemeny created BASIC, an easy-to-learn programming language. Around the same time, the CDC 6600 was built. Designed by Seymour Cray, it could execute up to 3 million operations per second, three times the processing speed of its nearest competitor, and remained the fastest computer until the arrival of its successor, the CDC 7600, in 1968.
Gordon Moore observed that integrated circuits were doubling in complexity every year.
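The power of that observation is easy to underestimate, because exponential doubling compounds quickly. A minimal sketch (the function name and the starting count of 64 components are illustrative assumptions, not Moore's actual figures):

```python
# Moore's 1965 observation: component counts on integrated circuits
# roughly double every year (he later revised the period to two years).
def transistors(start_count, years, doubling_period=1):
    # Exponential growth: one doubling per doubling_period years.
    return start_count * 2 ** (years / doubling_period)

# Starting from a hypothetical 64-component chip, ten yearly
# doublings already predict a chip a thousand times denser.
print(transistors(64, 10))   # 65536.0
```

It is this compounding that carries the rest of the timeline from room-sized mainframes to the smartphones of 2007 in a few decades.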
DEC introduced the PDP-8, the first minicomputer to be successfully marketed. It sold for US$18,000.
Hewlett-Packard entered the general-purpose computer business with its HP-2115, which offered processing power previously found only in large computers. It supported a wide variety of languages, including BASIC, ALGOL and FORTRAN.
IBM introduced the first storage disk system, the IBM RAMAC 305, which had a capacity of 5 MB.
Seymour Papert designed LOGO as a computing language for children. Initially a drawing program, LOGO controlled the actions of a mechanical “turtle”, which traced its trail on paper. IBM built the first floppy disk.
Data General Corporation, a company created by a group of engineers who had left DEC, introduced the NOVA computer. With 32 KB of memory, it sold for US$8,000. The simple architecture of its instruction set inspired Steve Wozniak's Apple I eight years later. Robert Noyce, Andy Grove and Gordon Moore founded Intel.
At AT&T's Bell Laboratories, programmers Ken Thompson and Dennis Ritchie developed UNIX, an operating system that could be adapted to almost any machine. That year, the US Department of Defense connected the first ARPANET machines, forming the network that would give rise to the Internet.
SRI's Shakey was the first mobile robot controlled by artificial intelligence. Protests against the Vietnam War reached university computer centers; a bombing at the University of Wisconsin injured one man and damaged four computers. The Citizens & Southern National Bank, in Valdosta, installed the first ATM for its customers.
The first computer-to-computer communication took place when the US Department of Defense established four communication nodes on the ARPANET: the University of California-Santa Barbara, UCLA, SRI International and the University of Utah.
The Kenbak-1, the first personal computer, was advertised for US$750. Intel announced the first commercial microprocessor, the 4004. An IBM team led by Alan Shugart invented the 8-inch floppy disk.
Intel launched the 8008 microprocessor. Hewlett-Packard announced the HP-35 as “the fastest and most accurate electronic calculator”, with solid-state memory similar to that of a computer.
Steve Wozniak built the “Blue Box”, a tone generator for manipulating the telephone network. Nolan Bushnell introduced Pong and his new video-game company, Atari.
Robert Metcalfe designed Ethernet, a method for interconnecting computers, at the Xerox research center in Palo Alto, California. The TV Typewriter, developed by Don Lancaster, produced the first alphanumeric display on an ordinary television set. The Micral was the first commercial computer based on a microprocessor, the Intel 8008.
Xerox researchers at the Palo Alto research center designed the ALTO, the first workstation with a built-in mouse for input. Intel and Zilog introduced new microprocessors. David Silver, of MIT, designed the Silver Arm, a mechanical arm for assembling small parts, guided by feedback from the robot's touch and pressure sensors.
Scelbi announced the 8H computer, the first commercial computer in the United States based on the Intel 8008 microprocessor.
The January issue of Popular Electronics announced the Altair 8800 computer, based on the Intel 8080 microprocessor. Telenet, the first commercial network equivalent to ARPANET, was installed. The prototype of the Video Display Module (VDM), designed by Lee Felsenstein, marked the first implementation of a memory-mapped alphanumeric video display for personal computers.
Tandem Computers launched the Tandem-16, the first fault-tolerant computer for online transaction processing. The IMSAI 8080, produced by IMS Associates and built around the same bus structure as the Altair 8800, was also launched.
Steve Wozniak designed the Apple I, the first single-board computer. Gary Kildall developed CP/M, an operating system for personal computers.
The Commodore PET (Personal Electronic Transactor) was the first of many personal computers to appear that year. The Apple II was a success at its launch in 1977 thanks to its characteristics: a printed-circuit motherboard, power supply, keyboard and game cartridges. Tandy Radio Shack's personal computer, the TRS-80, sold 10,000 units, far more than the 3,000 the company had projected.
The United States government adopted IBM's Data Encryption Standard, a cipher for protecting confidential data within its agencies. That same year the SOL, an easy-to-use computer that needed only a monitor, was launched and attracted many buyers.
Digital Equipment Corporation's VAX 11/780 was a machine capable of addressing up to 4.3 gigabytes of virtual memory, making it the fastest minicomputer of its time.
The 5¼-inch floppy disk became the standard for personal computer software shortly after Apple and Tandy Radio Shack introduced their software in this format.
Motorola's 68000 microprocessor was much faster than the other microprocessors of the time. Programmers Daniel Bricklin and Robert Frankston of Harvard University developed VisiCalc, the program that turned the personal computer into a business tool.
Carver Mead, a professor at the California Institute of Technology, and Lynn Conway, a scientist at Xerox Corporation, wrote “Introduction to VLSI Systems”, a manual on the design of integrated circuits.
Seagate Technology developed the first hard disk drive for microcomputers. The disk stored 5 megabytes of data, five times more than the most common disks of the time.
Developed by Philips, the first optical data-storage disc had a capacity 60 times greater than a 5¼-inch floppy disk. John Shoch of the Xerox research center in Palo Alto invented the computer “worm”, a program that propagated itself across networked machines in search of idle processors.
IBM introduced its PC, triggering rapid growth in the personal computer market. MS-DOS (Microsoft Disk Operating System) was the operating system supplied with the IBM PC, establishing a long association between IBM and Microsoft.
Adam Osborne developed the first portable computer, the Osborne 1. Apollo Computer developed the first workstation, the DN100, with capabilities beyond those of many similarly priced minicomputers.
Timeline of Computer History: 1982-2001
Mitch Kapor developed Lotus 1-2-3, software written for the IBM personal computer. Time magazine caused a stir in its traditional “Man of the Year” issue by selecting a computer as “Machine of the Year”.
Computer-generated graphics for movies took a big step forward with the making of the film “Tron”, released by Disney.
Apple developed the Lisa, the first personal computer with a graphical interface. Compaq Computer Corporation introduced its first personal computer (PC), which ran the same software as the IBM PC.
Microsoft announced the Word word processor, previously called Multi-Tool Word, and also announced the launch of its Windows operating system.
MIDI (Musical Instrument Digital Interface) was demonstrated at the first North American Music Manufacturers show in Los Angeles.
Apple Computer released the Macintosh, the first commercially successful computer with a mouse and a graphical interface. The 3½-inch floppy disk was widely accepted by the market, helped by Apple's decision to build it into the new Macintosh.
IBM released the PCjr and the PC-AT. The PCjr failed, but the PC-AT, several times faster than the original PC and based on the Intel 80286, was a success thanks to its performance and large storage capacity, all for approximately US$4,000. William Gibson, in his book Neuromancer, coined the term “cyberspace”.
The Internet took another big step when the National Science Foundation structured NSFNET, connecting five supercomputer centers at Princeton, Pittsburgh, California, Illinois and Cornell.
The CD-ROM was born. With a capacity of 550 MB, the new discs expanded the market beyond music CDs. Aldus released the PageMaker program for the Macintosh, spurring interest in desktop publishing; two years later, Aldus released a version for IBM PCs and compatibles. The C++ programming language emerged and spread through the computer industry when Bjarne Stroustrup published “The C++ Programming Language”.
David Miller of AT&T Bell Labs patented the SEED (Self-Electro-optic-Effect Device) optical transistor, a digital component for computers. Daniel Hillis of Thinking Machines Corporation advanced artificial intelligence with his design for massively parallel computing. IBM and MIPS released the first RISC-based workstations, the PC/RT and R2000.
Compaq beat IBM to market when it announced the Deskpro 386, the first computer to use Intel's new 386 processor.
Motorola developed the 68030 microprocessor. IBM introduced its PS/2 computers, fitted with 3½-inch disk drives. William Atkinson, an Apple engineer, designed HyperCard, software that simplified the development of home applications.
Apple co-founder Steve Jobs left Apple to found his own company, NeXT. Compaq and other PC manufacturers developed EISA (Enhanced Industry Standard Architecture), a standard bus architecture.
Pixar's “Tin Toy” became the first computer-made film to win an Academy Award, for Best Animated Short Film. Robert Morris released a worm onto the Internet, causing problems for about 10% of the 60,000 hosts on the network.
Intel released the 80486 microprocessor and the i860 RISC coprocessor chip, each containing more than 1 million transistors. Motorola announced the 68040 microprocessor, with approximately 1.2 million transistors.
Maxis released SimCity, a video game built on a series of simulators. The game was frequently used in educational settings. Virtual reality was the main theme at the SIGGRAPH convention, held in Boston, Massachusetts.
Microsoft announced Windows 3.0 on May 22. Compatible with DOS, this version of Windows finally offered PC users satisfying performance. The World Wide Web was born when Tim Berners-Lee, a CERN researcher, developed HTML (HyperText Markup Language).
The PowerPC alliance between IBM, Motorola and Apple was unveiled in July. Cray Research revealed the Cray Y-MP C90, with 16 processors and a speed of 16 gigaflops.
DEC introduced the first chip implementing its 64-bit RISC Alpha architecture. In March 1992, the first M-Bone multicast audio broadcast was carried over the Internet. After generating enormous concern among computer users, the Michelangelo virus ended up wreaking only minor havoc.
Apple introduced the Newton, the first PDA (personal digital assistant). Intel's Pentium was unveiled in March. The University of Illinois developed NCSA Mosaic, a graphical interface for navigating the Internet.
Leonard Adleman of the University of Southern California showed that DNA can serve as a computational medium. Jim Clark and Marc Andreessen founded Netscape Communications (originally Mosaic Communications).
Netscape's first browser was launched, spawning rapid growth in the number of Web surfers.
1995
Toy Story became the first entirely computer-generated feature film. Windows 95 was released on August 24 with a massive marketing campaign. The Java programming language, released in May, enabled platform-independent application development; “Duke” was the first applet.
1996
Intel's Pentium Pro was presented. The IEEE Computer Society celebrated its 50th anniversary.
IBM's Deep Blue became the first computer to beat the world chess champion, Garry Kasparov, in a game.
The 333 MHz Pentium II processor was launched, faster than its predecessor. Microsoft released Windows 98.
A new version of Linux was released; the number of Linux users was estimated at more than 10 million.
AMD launched the first 1 GHz processor, the Athlon. Intel released limited quantities of the 1 GHz Pentium III. The end of the TELEX service was decreed. A new version of the Linux kernel was released.
Apple launched the Mac OS X operating system, whose new features included a protected-memory architecture and preemptive multitasking. The same year, Microsoft launched the Windows XP operating system, without a doubt a true revolution for the PC market.
This year, AMD launched the first 64-bit processor for the consumer market, the Athlon 64.
The Mozilla Foundation launched version 1.0 of the Firefox browser, with which it set out to compete with Microsoft Internet Explorer, the standard at that time. The same year saw the appearance of Facebook, the social network that would mark a before and after in how people relate to one another.
This year YouTube appeared on the Internet, a video-hosting service that would later be bought by Google and developed into what we know today. That same year Google also bought Android, a well-advanced project for a smartphone operating system.
The first MacBook Pro appeared, Apple's first dual-core notebook. The same year also saw the appearance of the Nintendo Wii console.
The iPhone reached the market, a smartphone that would forever change the way we work and communicate thanks to the advanced features it offered.
This year saw an unprecedented technological milestone: the IBM Roadrunner supercomputer became the first in the world to exceed one petaFLOP of processing power.
The most popular and best-remembered version of Windows was released: Windows 7, which offered multiple features and advanced options strongly focused on productivity. It would also become one of the most stable versions of the system.
The Apple iPad went on sale to the consumer market, a tablet-format device that would establish a new way of consuming information on a mass scale. In the same year, Google launched its Chromebook, a notebook equipped with its own operating system.
Microsoft Windows 8.1 and Apple Mac OS X Mavericks are released.
That year Apple Inc. launched its first smartwatch, called the Apple Watch.
Microsoft launched Windows 10, which would in time become one of the best versions of the system, both for its options and features and for its stability and performance.
The first reprogrammable quantum computer was put into operation at the University of Maryland, College Park.
It was announced to the media that the Defense Advanced Research Projects Agency (DARPA), creator of the beginnings of the Internet, was developing a program called “Molecular Computing” that uses molecules as computers.
Apple launched the iPhone XS, iPhone XS Max and iPhone XR. At the beginning of that same year, the sets of critical security vulnerabilities “Meltdown” and “Spectre” were published, both affecting most of the processors on the market.
IBM introduced the first commercial quantum computer, the IBM Q System One. It also announced a 53-qubit quantum computer, a true technological milestone.
Despite the pandemic caused by SARS-CoV-2 (COVID-19), advances in technology did not stop. Honeywell announced to the specialized media the launch of a new quantum computer capable of reaching a quantum volume of 64.