
The 10 biggest moments in IT history

Despite its relatively short lifespan, IT has had some huge watershed moments. Jack Wallen followed the tech timeline to identify the most pivotal events.



It's unlikely that everyone will ever agree on the most important dates in the history of IT. I know my IT timeline has a personal and professional bias. But I've tried to be objective in examining the events that have served to shape the current landscape of the modern computing industry. Some of the milestones on my list are debatable (depending upon where you are looking from), but some of them most likely are not. Read on and see what you think.


1: The development of COBOL (1959)

There are many languages out there, but few have influenced as many others as COBOL has. What makes COBOL stand out is that there are still machines chugging along, running COBOL apps. Yes, those apps could (and possibly should) be rewritten to a modern standard. But for many IT administrators who don't have the time or resources to rewrite legacy apps, those programs can keep on keeping on.

2: The development of the ARPANET (1969)

It is an undeniable fact that the ARPANET was the predecessor of the modern Internet. The ARPANET began with a series of memos written by J.C.R. Licklider, who initially referred to the concept as the "Intergalactic Computer Network." Without the development of the ARPANET, the landscape of IT would be drastically different.

3: The creation of UNIX (1970)

Although many would argue that Windows is the most important operating system ever created, UNIX has a stronger claim to that title. UNIX grew out of AT&T Bell Labs' work on the Multics timesharing project with MIT. Its biggest initial difference (and most important distinction) was its practical, portable support for multiple users logged in at the same time. Earlier timesharing systems had pioneered the idea, but UNIX brought the multi-user environment into the mainstream. Note: 1970 marks the date the name "UNIX" was applied.

4: The first "clamshell" laptop (1979)

William Moggridge, working for GRiD Systems Corporation, designed the Compass Computer, which entered the market in 1982. Tandy later purchased GRiD (because of 20 significant patents it held) but then turned around and resold GRiD to AST, retaining the rights to the patents.

5: The beginning of Linus Torvalds' work on Linux (1991)

No matter where you stand on the Linux versus Windows debate, you can't deny the importance of the flagship open source operating system. Linux brought the GPL and open source to the forefront, forced many companies (and legal systems) to confront monopolistic practices, and raised the bar for competition. Linux was also the first operating system that allowed students and small companies to think in much bigger ways than their budgets had previously allowed.

6: The advent of Windows 95 (1995)

Without a doubt, Windows 95 reshaped the way the desktop looked and felt. When Windows 95 hit the market, the desktop metaphor became standardized around the taskbar, Start menu, desktop icons, and notification area. Many other operating systems would go on to mimic this new de facto standard desktop.

7: The 90s dot-com bubble (1990s)

The dot-com bubble of the 90s did one thing that nothing else had: It showed that a great idea could get legs and become a reality. Companies like Amazon and Google not only survived the dot-com bust but grew into megapowers that exert significant influence over how business is run in the modern world. But the dot-com bubble did more than bring us companies -- it showed us the significance of technology and how it can make daily life faster, better, and more powerful.

8: Steve Jobs rejoining Apple (1996)

Really, all I should need to say here is one word: iPod. Had Jobs not come back to Apple, the iPod most likely would never have been brought to life. Had the iPod not been brought to life, Apple would have withered away. Without Apple, OS X would never have seen the light of day. And without OS X, the operating system landscape would be limited to Windows and Linux.

9: The creation of Napster (1999)

File sharing. No matter where you stand on the legality of this issue, you can't deny the importance of P2P file sharing. Without Napster, file sharing would have taken a much different shape. Napster (and the original P2P protocols) heavily influenced the creation of the BitTorrent protocol. By some estimates, torrents account for a substantial share of all Internet traffic, and they make sharing large files easy. Napster also led to a rethinking of digital rights (which to some has negative implications).

10: The start of Wikipedia (2001)

Wikipedia has become one of the leading sources of information on the Internet, and with good reason: It's the single largest collaborative resource available to the public and one of the most often cited sources on the planet. Although many schools refuse to accept Wikipedia citations (questioning the legitimacy of the sources), it is, without a doubt, one of the largest and most accessible collections of information. It was even instrumental in the 2008 U.S. presidential election, when the candidates' Wikipedia pages became the top hits for voters seeking information. Those pages became as important to the 2008 election as any advertisement.

What's missing?

Were there other important events in the timeline of IT? Sure. But I think few, if any, had more to do with shaping modern computing than the above 10 entries. What's your take? If you had to list 10 of the most important events (or inventions) of modern computing, what would they be? Share your thoughts with fellow TechRepublic members.



About

Jack Wallen is an award-winning writer for TechRepublic and Linux.com. He’s an avid promoter of open source and the voice of The Android Expert. For more news about Jack Wallen, visit his website getjackd.net.

371 comments
willis0966

Wow - what about the microprocessor?

muadzir

The introduction of DirectX moved game programming to a different stage. Maybe the support in the earlier phases was not really satisfactory, but it eased the amount of time programmers had to spend on code. Rather than drilling down to machine code or assembly language, accessing the API classes and methods really cut down the hassle of coding a game. Of course, it also sped up development time.

digibecky

The objective was to open up markets to competition by removing unnecessary regulatory barriers to entry and "to provide for a pro-competitive, de-regulatory national policy framework designed to accelerate rapidly private sector deployment of advanced services and information technologies and services to all Americans by opening all telecommunications markets to competition..." (source: Wikipedia)

jwwatson01

Hey, what about the founding of the Google company?

jelarkin

I was using the Mac in 1984 for DiffEq solutions and word processing alike at Cornell. For programming, I was using PL1 on DOS computers. Windows GUI copied Apple's lead and Win95 was just another iteration of that.

sreid

From the Common Man - Windows 3.x & Google. The impact on the non-techie, the common man, must be a prime factor in measuring IT epochs. As such, the invention of the microprocessor and dependent technologies, Microsoft Windows 3.x, the Internet, and Google were truly great moments.

plibby73

IBM/XEROX giving Bill Gates rights to the OS... biggest corporate blunder...

sanyen

I think the biggest moments in IT history are: 1. The CPU, of course. 2. Without software, hardware is nothing, so I think the creation of the first compiler is the biggest moment in IT history. First-generation programming languages were machine code, but machine code is rarely written directly. Assembly programming is still used today. Not so much COBOL.

Dogcatcher

Reading through the comments has been an interesting trip down memory lane, but have you noticed most of the comments are off on a tangent from the original list? Interestingly, although not unexpectedly given the readership of Tech Republic, we've tended to focus mostly on hardware. However, only the clamshell laptop in the original list really falls into the hardware category. All the other items are programming, uses of technology, or business events affecting the computer industry. This is just an observation, not a criticism, so hold the flames.

spencer2

Where are wireless networks? Cell phone apps?

wlittlej

16K RAM - Wrote my first programs - in FORTRAN. wlittlej

oldbaritone

COBOL as the "first" biggest moment? How about the INVENTION of the modern computer? How about the INVENTION of the solid-state processor? How about the INVENTION of the floating-point coprocessor? How about the INVENTION of the graphics co-processor? This list "jumps into the middle" and ASSUMES the technology. How about the pioneers who INVENTED the technologies USED by your "10 moments"?

john.brennan

This list of 10 items excludes the IBM Mainframe which was actually the first OS to support multiple concurrent user logon. It also ignores the huge impact of the early Apple Mac and IBM PC. It was the development of these two 'standards' which set the foundation for all of the later software developments, especially Linux. The development of Windows 95 does not warrant being included in any top 10 list as it was still really DOS under the covers. I think that the 10 list posted reflects the relative youth of the writer.

subs

None without the invention of the PN junction.

soccerkingpilot

Google is probably the most memorable thing since it started, and I think it deserves at least an honorable mention.

Spitfire_Sysop

It's wrong to credit Win95 or Apple with the modern desktop. The whole organization of the modern desktop GUI is based on the ATARI ST. The trash icon, the pull-down menus and icons. Even windows themselves. Take a look: http://www.atari.st/ You will see these common threads today in OSX and Win7. They just made it all pretty.

futureking

Google is not in this list. Where is IBM? Where is Mozilla?

surajit_basu

and hardware? the first modem? wireless?

wbaltas

How about the Apple II - the first PC to gain wide acceptance in education and the home. What about the spreadsheet (VisiCalc)? Many say that this is the app that put PCs on the desktop; before this application, the PC was considered a toy. Rather than the Windows O/S, how about the GUI? Ethernet or ARCnet - the first really low-cost networks. Bulletin board systems. HTML - where would the Internet be without it?

WirelessEngineer

Wikipedia "one of the leading sources of information on the Internet"? Only for the Kool-Aid drinkers! More accurately, it's one of the leading sources of DISinformation on the Internet. The author of this piece is obviously too young to know much about the history of IT.

jcjr031064

"All other operating systems would begin to mimic this new de facto standard desktop." Really?

MadKaugh

Granting the distinction between (applied) IT and the broader scope of information sciences, the selections still seem odd. Database and information filing are key foundations of IT, yet are barely mentioned; reflecting on what the paradigm-changing innovation was, I think Hollerith cards deserve the number one spot on the list. Hollerith cards are more enduring than any other aspect of IT (60+ years of active use), being the foundation of the pre-machine computing era of IT and shaping the way we view data, even influencing the way many early languages (such as COBOL) are structured. Windows should not be on this list in any form. It is not foundational to IT, nor is it innovative. Ditto Linux. Work on a functional desktop had gone on long before their time. VisiCalc would make more sense; it changed how data was handled and made it more accessible. If you would put in the GRID, I'd put in RIM's infrastructure, visibly represented by the BlackBerry. RIM has demonstrably changed how we expect IT to be delivered. While they no longer have the field to themselves, they broke the ground. The 90s dot-com bubble, Steve Jobs rejoining Apple, the creation of Napster - these are not IT; these are footnotes. Did they CHANGE IT? No. DEC rolls out the PDP-8 and makes computing affordable for something less than a megacorporation; that's breaking ground. I agree that Wikipedia is an insightful entry on the list.

MagicTom

1. The capability of going from mechanical machines to computers. 2. Even more than COBOL, the multitasking of computers (including COBOL being run on those machines). 3. The application of word processing. If you want more basic than that, and a little bit more, just ask me. Magic Tom

bowenw

Reducing all the game-changing events in IT history down to a top 10 is NOT an easy task. I'd like to float 4 other events for your consideration (but I won't even attempt to say which of the present 10 should be eliminated): 1) The invention of ENIAC and Colossus, the first electronic computers. 2) The invention of the integrated circuit by Jack Kilby in 1958 AND/OR the invention of the microprocessor in 1971. Without these 2 inventions, computers as we know them would be only a dream. 3) The introduction of the IBM PC in 1981. With the IBM name on the box, microcomputers became acceptable in business. 4) The invention of Ethernet by Bob Metcalfe in 1973. Ethernet made the networking of computers that we take for granted today practical. I hate to break it to the Apple acolytes, but the introduction of OS X was NOT a big deal in the overall scheme - Windows 95 had a MUCH greater impact.

jck

When my parents bought me a Commodore 64 and said "Now, this is not going to be just for playing games." and I said "Okay.". Then 3 months later, they were buying me games and a 300 baud modem and I was gaming and pirating at age 12 :^0

r4f

- Original announcement of the GNU Project, written by Richard Stallman on September 27, 1983.
- GPL (General Public License) Version 1, February 1989.
- November 1981: Microsoft and IBM sign a formal contract for Microsoft to develop certain software products for IBM's new microcomputer.

psmith

Commercial sales of the PDP-11. The impact of the GPR processor design, instruction set, and OS approach of RSX-11 was fundamental to the direction of all systems development afterward.

njoy_d_ride

Hey folks, I'd like to suggest this be made into a contest. And to add some room, divide it into two categories: the 10 biggest moments in IT history before ENIAC and the ten biggest moments in IT history after ENIAC. (I picked ENIAC because it was the first general-purpose electronic computer.) Take one moment in IT history and tell us why it is one of the ten biggest moments in IT history. Do your homework, use your spell checker, and submit your moment. And to add that game-show quality, your submission must be in the form of an event in history. Example: "Windows 95 should be on the list because it led to widespread use of the Graphical User Interface" doesn't even get out of the door, because this is a business product, not an event. "The development of the Graphical User Interface" does; now I just have to convince you that a GUI is important and that today's IT environment would be TOTALLY DIFFERENT without it. Which I'm not going to try to do. Also, we all love PCs and Macs, so let's try to avoid making this a pissing contest.

t34.mod.85

Apple LISA ... the first commercial GUI OS on the market!

dpresley_50201

The ABC was the very first all electronic digital computer. All other computer designs can trace their design origins to the ABC. Without the ABC the Digital Age either doesn't happen, or is significantly delayed as some other machine is invented probably outside the US. BTW, the ABC was designed and built on the campus of Iowa State University in Ames, Iowa. Dig that, the first electronic computer built in a *farm* state at a "Moo U"! This event should be ranked #1 because all other IT events fall behind the ABC's invention.

jk2001

Unix was not the first multi-user OS. There were timesharing systems before Unix. Unix was the first OS to take root in universities because you could get the source code tapes for BSD, a derivative of Unix. Here are some more IT oldies:
- Doug Engelbart's demo
- Apple II
- The Kaypro
- The Web
- Free Software Foundation, gcc
- Relational databases
- Structured programming
- Object-oriented programming
- Functional programming
- IRC
- DOS boot sector viruses and HyperCard viruses
- Mapped memory
- Interrupt lines on CPUs
- VLSI
- RISC
- Programmable hardware, FPGAs

azbat

http://www.intergraph.com/about_us/history_70s.aspx Helped develop 'refresh' on individual monitors and also small LANs and the chipsets. Plus in 2002, they killed Intel in court for $1/2 Billion for patent infringement which let AMD have time to catch up to Intel in the CPU wars, as Intel couldn't release their Enhanced PIC (EPIC) architecture, since Intergraph owned the original PIC rights.

profdocg

Where has virtualisation gone? That is where the whole future is - GO GREEN

pjboyles

The Commodore 64 marks the first successful PC for the consumer. It moved computers from a business or hobbyist item to consumers. I do miss my old Commodore 128 sometimes. Not too often though. At the time, I used it much more than my IBM clone PC.

gbhouw

Haven't got time to read them all, but suffice to say there are as many opinions on this as there are top ten items...

williaa6

If I follow you correctly, you are saying that COBOL is no longer used. If so, then I'm sorry but you are very wrong. I read an interesting article recently that said that if you assign a value of "X" dollars to a line of COBOL code, where X was something like $5 (I've forgotten the exact figure they used), then COBOL code is the 3rd or 4th most valuable commodity on the planet. Planet Earth in 2009 runs on COBOL code. You may not be aware of it, but COBOL is keeping this modern world ticking over. Can it be replaced with point'n'click languages? Of course it can, but the job is a loooong way from complete.

mandrake64

But Dilbert has broader scope than IT, so does not belong in this list, however prophetic he might be at times for software projects, etc. Dilbert is more synonymous with engineering, quality management, patent violations, questionable product quality, not to mention Wally and slacking, Alice and workplace rage, Asok and treatment of underlings, the PHB and management incompetence, and Dogbert's accurate portrayal of CEOs in their natural habitat.

dpresley_50201

If I remember correctly, Atari's GEM (Graphical Environment Manager) was licensed from Digital Research, whose design drew on the original GUI developed at Xerox PARC, and Jack Tramiel's engineers added a few more bells and whistles to increase its functionality. Your assertion that all other GUIs were based on the ST is not entirely correct. The original Macintosh OS drew directly on PARC's work with a few changes, and Windows started as an imitation of the Mac OS.

tehboss

Any "Windows" version before Windows 95 was just a graphical shell with a DOS backbone. And other than operating systems, can you really group office productivity software into the top 10 of IT?

santeewelding

Right down to what appears to be multi-colored hair and, probably, rings here and there. Yet, when he speaks, I find myself listening. Your correspondent above my reply proves worth listening to, as well. You -- you prove not. ________________ gender assumption eliminated, although I have a testosterone familiarity to know which

MadKaugh

Wikipedia is two distinct things: the specific Internet information site that you are referring to, but also a way of doing collaborative information on a network, of which Wikipedia is only one example. Wikis are the future, or at least a stepping stone to it. Sure, there have been very public vandalizations of Wikipedia pages. But, you know, those have been fixed. Sure, there are factious articles; but those are noted as controversial, the text is driven toward a consensus, and the truth is available if you are willing to dig. The same cannot be said for sites that are under the control of a single point of view. Behind every wiki page is a forum page for discussion of that page. In essence, a wiki combines the best features of a forum and an encyclopedia. Wikipedia is getting better. There are stronger controls, more assertive administration, and clearer guidelines on how to handle controversial topics. Notably, though, if you have an axe to grind, Wikipedia may not be to your liking - it is specifically NOT for the Kool-Aid drinkers. But the point that you seem to have missed is that the model is becoming common as a resource for collaboration. Many work units are hosting their own wikis to document their area of interest. These tend to be of a less controversial, more technical and professional nature, and vandalism is not as much of a threat.

rcuriel

Yes, indeed. We know it wasn't the first multi-user system, because Unix is Multics without the balls!

techrepublic

Hardly. It's a FORTY year old technology, and more relevant now than ever. Old mainframers just get the job done instead of pounding our chests about how great "X" might be.

sanyen

COBOL is still used, of course. Many mainframes still run COBOL because of its fast batch processing. I was a COBOL developer in 1995 who was scared by the millennium bug issue. But assembly programming is the core of every programming language. Without assembly there's no operating system and no software. So I can say that assembler is behind every piece of software.

retropedia

... and the whole nature of collaborative web activities. Sure, the web was great at the beginning, but the flow of information was roughly one way: publishers (websites) to subscribers (readers). With blogs and forums and wikis, we have changed the way information is managed: it has become everyone's and no one's (although patent lawyers may disagree). The rise of blogs is also fundamental in that it allowed non-technical people to publish on their own, and thus reduced the sway that traditional media (such as newspapers) have had.