History of IBM Developments

1956 - FIRST MAGNETIC HARD DISK. IBM introduces the world's first magnetic hard disk for data storage. RAMAC (Random Access Method of Accounting and Control) offers unprecedented performance by permitting random access to any of the five million characters distributed over both sides of 50 two-foot-diameter disks. Produced in San Jose, California, IBM's first hard disk stored about 2,000 bits of data per square inch and carried a purchase price of about $10,000 per megabyte. By 1997, the cost of storing a megabyte had dropped to around ten cents.

1957 - FORTRAN. IBM revolutionizes programming with the introduction of FORTRAN (Formula Translation). Created by a team led by John Backus, it soon becomes the most widely used computer programming language for technical work. For the first time, engineers and scientists can write computer programs in more natural forms, such as C=A/B, rather than as strings of machine-language 1s and 0s.

1997 - DEEP BLUE. The 32-node IBM RS/6000 SP supercomputer Deep Blue defeated World Chess Champion Garry Kasparov in the first known instance of a computer vanquishing a reigning world champion in tournament-style competition. The same year, after years of collaboration between IBM's Research and Microelectronics divisions, IBM introduced the CMOS 7S process, which let manufacturers link transistors in computer chips with copper wires instead of traditional aluminum interconnects, a revolutionary advance in semiconductor technology.



Saturday, June 30, 2012


Future Computers


Computers of Tomorrow

Today's computers operate using transistors, wires and electricity. Future computers might use atoms, fibers and light. Personally, I don't give a byte what makes them tick, as long as they do the job. If I could accidentally spill my coffee on one and not have it cost $848, that would be a cool feature.
But let us assume that you are not still bitter from a recent laptop replacement. You might stop to consider what the world might be like if computers the size of molecules become a reality. These are the types of computers that could be everywhere, but never seen. Nano-sized bio-computers that could target specific areas inside your body. Giant networks of computers in your clothing, your house, your car. Entrenched in almost every aspect of our lives, and yet you may never give them a single thought.
Understanding the theories behind these future computer technologies is not for the meek. My research into quantum computers was made all the more difficult after I learned that in light of her constant interference, it is theoretically possible my mother-in-law could be in two places at once.
If you have the heart, take a gander at the most promising new computer technologies. If not, dare to imagine the ways that billions of tiny, powerful computers will change our society.

The Personal Computer Assistant

I must admit that in some ways I envy Donald Trump. Not because of all the real estate he owns, or even for his cool private helicopter. No, what I envy most about The Donald is his apprentice. Who wouldn't appreciate handing any chore that comes to mind to an eager and competent assistant? Over time, a good apprentice might even anticipate your needs. "Pink tie today, Mr. Trump?" Now apply this same kind of relationship model to the future of computing.
In the future, the number of tiny but powerful computers you encounter every day will number in the thousands, perhaps millions. You won't see them, but they will be all around you. Your personal interface to this powerful network of computers could come from a single computing device that is worn on or in the body.
Aside from providing one 24/7 interface to the myriad of computers and sensors that you will have access to, like a good apprentice, this computing device would come to know your personal preferences and sometimes make decisions on your behalf.

Moore's Law

Visit any site on the web writing about the future of computers and you will most likely find mention of Moore's Law. Moore's Law is not a strictly adhered-to mathematical formula, but a prediction made by Intel co-founder Gordon E. Moore in 1965.
Moore predicted that computing technology would increase in capability at the same time it decreased in cost. More specifically, he predicted that innovations in technology would allow the number of transistors in a given space to double every year (a figure he later revised to every two years), that the speed of those transistors would increase, and that manufacturing costs would drop.
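If you like playing with the numbers, the doubling claim is easy to sketch. Here is a rough back-of-the-envelope projection in Python, assuming a clean two-year doubling period (the revised version of the law); the starting figures are illustrative, not from this article:

```python
# Rough projection of transistor counts under Moore's Law,
# assuming one doubling every two years (the revised prediction).

def projected_transistors(base_count, base_year, target_year, period_years=2):
    """Project a transistor count forward by doubling base_count
    once every period_years between base_year and target_year."""
    doublings = (target_year - base_year) / period_years
    return base_count * 2 ** doublings

# Illustrative example: start from roughly 2,300 transistors
# (the Intel 4004 of 1971) and project forward 40 years.
estimate = projected_transistors(2300, 1971, 2011)
print(f"Projected 2011 count: {estimate:,.0f}")  # about 2.4 billion
```

Forty years is twenty doublings, so the toy numbers land in the billions, which is roughly the ballpark of real high-end chips from that era. Not bad for a one-line formula.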
A computer transistor acts like a small electronic switch. Just like the light switch on your wall, a transistor has only two states, On or Off. A computer interprets this on/off state as a 1 or a 0. Put a whole bunch of these transistors together and you have a computer chip. The central processing unit (CPU) inside your computer probably has around 500 million transistors.
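To make the on/off idea concrete, here is a tiny illustrative sketch (not any real chip's logic): treat a row of switch states as bits and read them off as a number, the way a computer does.

```python
# Illustrative only: a row of transistor-like on/off switches
# read as a binary number.

switches = [True, False, True, True]  # on, off, on, on -> binary 1011

value = 0
for state in switches:
    value = value * 2 + (1 if state else 0)  # shift left, append the bit

print(value)  # binary 1011 is decimal 11
```

Four switches give you sixteen possible values; 500 million of them give you a CPU.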
Shrinking transistor size not only makes chips smaller, but faster. One benefit of packing transistors closer together is that the electronic pulses take less time to travel between transistors. This can increase the overall speed of the chip.
Not everyone agrees that Moore's Law has held true throughout the years (the prediction has changed since its original version), or that it will hold true in the future. But does it really matter? Computers are doubling their smarts fast enough for me.
Thanks to the innovation and drive of Gordon Moore and others like him, computers will continue to get smaller, faster and more affordable.




Computer Articles, Blogs and Web Sites


I verified the links on this page on 09.26.11. - ffa
Future Computer Articles
Title - Source - Date
Moore's Law Lives Another Day - Technology Review - 04/12
Predicting the Future of Computing - NY Times - 12/11
Supercomputer to Propel Sciences Forward - IBM - 11/11
Next-Generation High-Performance Computing Platforms - Intel - 11/11
Moore's Law safe into future as 3D 'Ivy Bridge' arrives - Silicon Republic - 05/11
The Emotional Computer - University of Cambridge - 12/10
IBM's "Watson" Computing System to Challenge Jeopardy! Champions - IBM - 12/10
Breakthrough Chip Technology Lights the Path to Exascale Computing - KurzweilAI.net - 12/10
Supercomputers 'will fit in a sugar cube', IBM says - BBC News - 11/10
Preparing for the supercomputer war - KurzweilAI.net - 11/10
Chinese supercomputer is world's fastest at 2.5 petaflops - KurzweilAI.net - 10/10
Intel Wants to Be Inside Everything - Business Week - 09/10
'Smart dust' aims to monitor everything - CNN - 05/10
Quantum leap: World's smallest transistor built with just 7 atoms - PhysOrg - 05/10
Chinese Supercomputer Is Ranked World's Second-Fastest, Challenging U.S. Dominance - NY Times - 05/10
The $75 Future Computer - Forbes - 12/09
Computer Security In The Future - Computer Safety Tips - 09/09
World's smallest computer (humor) - YouTube - 12/08
The super-fast future of computing - BBC News - 06/04

Future Computer Web Sites and Blogs
Title - Description
15 Cool & Crazy Concept Computers - WebUrbanist
2020 – Future of Computing - Nature
Non-Techie Talk - Good blog on trending computers, devices and software
Shape of Computers in the Future - Blog with cool future computer images

Future Computer Video and Audio
Title - Source
CES 2009: Microsoft Future Products Demo - YouTube
Future World - Smart Dust Micro Computers - YouTube
Microsoft's Vision For 2019 - YouTube
The Future of Computers - CNN
The Future of Computers - Edinburgh University
The Future of Computing - MIT World

References
Article - Source
Computers of Tomorrow - Article by ffa
The Personal Computer Assistant - Article and image by ffa
Moore's Law - Article by ffa; image from Intel






Warning: 

Many of the articles found on this web site are from a blogger who couldn't tell you the difference between hydrochloric and high colonic. We try our very best to provide you with useful, accurate information, but we don't always get it right. Please read our full disclaimer before quoting us at work, school or world conferences.
