Babbage Difference Engine No. 2
Bill Detwiler has nothing to disclose. He doesn't hold investments in the technology companies he covers.
Bill Detwiler is Managing Editor of TechRepublic and Tech Pro Research and the host of Cracking Open, CNET and TechRepublic's popular online show. Prior to joining TechRepublic in 2000, Bill was an IT manager, database administrator, and desktop support specialist in the social research and energy industries. He has bachelor's and master's degrees from the University of Louisville, where he has also lectured on computer crime and crime prevention.
I really enjoyed looking at the older systems. I worked on the pure card systems (Burroughs 263) along with the respective peripherals (sorters, collators, and keypunch machines) in the late 60s, moved to the B3500 with the belt-driven hard drives, and then to the Honeywell 425/435 and IBM System/3 minis. Being a data processing machine operator was definitely special. The physical nature of the old systems laid the core structure for the logic of the hidden operations in the small servers and PCs of today. Thanks for the pictures.
You didn't list the Manchester University Atlas computer, commissioned around 1962. I was an undergraduate there at the time and was the first electrical engineering undergrad to run a program on the machine, written in an HLL called Atlas Autocode, to work out the Electrical Machines experiment results for my BSc. Maybe the first in the world? But I don't know about that...
Awesome! Something that needs to be displayed in every school and engineering college. Can you add more images of systems that were used in avionics and defence?
Ahhh! Be still my beating heart! I have a Honeywell DDP-516 (used for the first [ARPA]net transmission) and have taken it to a couple of schools - the kids love it! We use the analogy that if an iPod used the same core memory technology it would be the size of 500-odd buses! I have a load of other stuff but I would kill for some of these machines. !!SHAMELESS PLUG ALERT!! I have a Facebook page called "Want to Buy a Brain?" which shows an exhibition I did in 2007 - would love to get some anecdotes from people on there if you can spare five minutes.
This is a lot like looking through my high school yearbook - seeing old girlfriends and wondering if... I fondly remember my first CDC 6400 - in 1967. You never forget your first love! And my first Fortran compiler - sweet memories. Thanks for the trip back to an older and simpler time (no Java, .NET, Linux boxes).
The history of computers with images was great. I really enjoyed this computer timeline. In the future I'd like to see something about AI.
Desktops, laptops, perhaps it is time for the floortop to return. A few advantages: 1) Replaces the living room fold-out so no one sleeps over. 2) No need for a screen saver. 3) Display does not wash out in bright light. 4) Input method unlikely to cause Carpal Tunnel Syndrome. 5) Power source (hand crank) can be used to justify slavery.
Two points: I had remembered that Lord Byron's daughter Ada had been a "programmer" for Charles Babbage at one time. Interesting historical linkage. Also, I am ambivalent about several of the computers I have actually programmed being referred to as "historic". Back in the 1960s I programmed several CDC computers, including the powerful 6600 and 7600. We programmed in Fortran; the computers had a standard 60-bit word, with double and triple precision programmable. Both machines were liquid cooled. Earlier CDC machines were the 3000 series and the little 160-A. If the machines themselves are historic, I suppose the programmers of such machines are also historic. Strange feeling. We looked upon the 6000 and 7000 series as being almost godlike in their power. Seymour Cray, designer of all the most powerful CDC machines as well as the Cray computers, was considered among the immortals.
I remember learning how to program the LGP-30 in the fall of 1967. It took about a page and a half of assembly language to program square root using Newton's method of approximation. The next term we got to use an IBM 1130, if I recall correctly.
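For readers who never had to write it in assembly: the "Newtonian approximation" mentioned above is Newton's method, which refines a guess by averaging it with n divided by the guess. A minimal sketch in Python (obviously not the original LGP-30 assembly, and the tolerance is my own choice):

```python
def newton_sqrt(n, tolerance=1e-10):
    """Approximate sqrt(n) by Newton's method on f(x) = x^2 - n."""
    if n < 0:
        raise ValueError("square root of a negative number is not real")
    if n == 0:
        return 0.0
    x = n  # initial guess
    while abs(x * x - n) > tolerance * n:
        x = (x + n / x) / 2  # Newton step: average the guess with n/guess
    return x

print(newton_sqrt(2))  # ~1.41421356...
```

A page and a half of assembly collapses to a handful of lines today, which rather makes the commenter's point about how far we've come.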
The Zuse Z23 (1961) (one of the photos) is a descendant of the first working computer (a mechanical one) built by Konrad Zuse (google: konrad zuse computer).
Charles Babbage's inventions were not only difficult to make; he had enormous trouble finding a draughtsman who was able even to draw the parts he needed! He did find one eventually who was good enough - and who was paid handsomely for his outstanding work. It is still a matter of debate whether the technology of the time would have allowed its construction, but given that the Egyptians could build pyramids with rods and rope, I would guess that it would have been possible - but what a struggle!
one of the first to use transistors and be a "super" computer. However, my links didn't claim a first in any category; only speculation.
When I lived out in the country in Colorado, Seymour semi-retired and moved a mile down the road from me. I always wanted to meet him but never got up the nerve to drop in and introduce myself. Two reasons: I did not want to disturb his privacy, and secondly, I didn't want to realize how little I knew about computers if we got into a discussion of what he knew. I believe he would have been a fascinating person to study under.
would argue about whether he made the first electromechanical computer. Even US sources show such things as late as 1940. I seem to remember some advancements in the US in 1936. However, I am delighted to see how far the Germans really got with that science. I'd be willing to bet he had the first Turing-complete computer, even if it didn't have the memory for it.
When it comes down to it, we are all new to this, none of us being there at the time. Myself, I have read intensively into the history of computing. Every time these things come up, like now, all I can say is, "Wow!" Like you.
that the modern copy of this machine was made with old machine tools from 1836 or so; mass parts manufacture had been invented at Harpers Ferry by then, I believe. All other non machined parts were made to order using early 1800's techniques and tools. Without the fantastic drawings it would really have been a guess for that project. What a project!! No one would have seen an economic reason to spend that kind of money on such a development at the time. Using hundreds of clerks was still cheap labor in that day.
You mean, this isn't the restored version of: http://www.zuko.com/CrypticSphere/Antikythera_Unsolved_Mysteries.asp ...cleaned up and made to work?
It's a shame that you did not take advantage of the opportunity. There were lots of stories told during his CDC days about his quirky genius. Not only was he the architect of the CDC hardware he was also the designer of the Chippewa OS that powered most of the larger CDC boxes. As I understand it, he demanded a private laboratory in the Wisconsin woods (Chippewa Falls). It was there he worked. His support came to him, not vice-versa. He was definitely a giant.
Good to see the photos of our historical relics. Perhaps readers would be interested in the activities of the UK Computer Conservation Society http://www.computerconservationsociety.org/ which has been reconstructing pioneering computers for some years, including a working reconstruction of the Colossus computer used in World War II for breaking the German secret codes, and the Bombe designed by Alan Turing for the same purpose. These can be visited at the Bletchley Park Museum in the UK. Other early machines, such as the Ferranti Pegasus, are also being recommissioned. Excellent photos of the world's first business computer, the LEO I, can be seen on the LEO Society website at http://www.leo-computers.org.uk/
Wow, there are people out there who still live in WWII. Everybody knows about the German and Japanese scientists who worked in the USA after the war.
I entered the computer world in 1971, and that was a long time ago. I have seen some really old stuff, including vacuum tube technology, but when I started, ICs were starting to show up in most computers. Paper tape was still in use on some older systems. Replaced many bad transistors, and wire-wrapped boards to find good circuits when parts were not available. WOW! Does this make me old? I believe the proper term is 'over-exposed' :-)
Hence the lack of invention by the Romans - plenty of slaves to throw against any problem. Plus, there's the push for an invention when sheer manpower isn't enough. The 1880 census took so long (over seven years) that the Census Bureau contracted Herman Hollerith to design and build a tabulating machine to be used for the next census. There was a genuine fear that the 1890 census might run into 1900.
Steam-punk capital! ROTFLOL!! Maybe a Wild Wild BC, instead of the West! HA! =D This kind of imagination is very entertaining to me. Someone should start making movies of Jules Verne, or write something new, based on an alternate ancient past, either Roman or Egyptian; it would be a real hoot to me. But then I'm a weird geek anyway! Proud of it too! :)
I agree with you and AV about what could have been. There is also, for example, Anne Frank, and those horrors. The sublime and its sublimities are not all that we are about and, you may pay dearly should you ever forget it.
that legionnaire had followed orders and not disturbed the circles of Archimedes. I mean, a galley-lifting-and-crushing crane, a sun-powered burn-ray made with parabolic metal reflectors (if it is true)... it's maddening to think that the Romans had strict orders to take this valuable man alive, and what they could have done with his assistance. Rome could've been the steam-punk capital of the world in about 100 BC!!!
I first read about the project to recreate it in the nineties, and was amazed that they got a finished copy to work! I saw it on the History Channel being operated; a very impressive planetary clock! It just goes to show, there really isn't much new under the sun, and who knows what we would discover about the past if the Library of Alexandria hadn't been destroyed by war. Some historians say we would have reached the moon by the 1500s if that library hadn't been lost.
There was a very successful businessman in Japan, just as the war was progressing, who was close to solving the A-bomb problem all within his single corporation, but he was using so much electricity to process fuel that the Nippon war department confiscated his power for the war effort! I believe Jun Noguchi was his name, and this industrialist was miles ahead of everyone on heavy water production! Depending on which side you rooted for, that was either a godsend or a horrible miscalculation on Tojo's part!
My brother cut his computer teeth in college on systems very similar; and my teacher in college was running a system, in country, just like yours, and ducking incoming rockets in his hooch to boot! I just missed Vietnam, because the Marines wouldn't take me then, but my buddies all got to go, and helped in the pullout in the mid seventies. I did get to serve later up until the big manpower draw down after "Desert Storm", by then my health was already going down hill, but at least I made it through college.
First systems I supported in the Navy were UNIVAC 642s - no, not the Alpha or Bravo, but the first generation, serial numbers 2 and 4. More power in a watch than what they had, but we did a lot with them. Used them in Vietnam with great accuracy. Also created a safe zone between them when we were being shot at.
must have made me old, 'cause I sure feel it! I was still using paper punch cards in the Army and processing them with a digital computerized memory typewriter!!! Talk about weird overlapping technology!? It was like the individual was running over the big machine, and making it obsolete in one stroke!
The Army was gearing up for women by the time I got in (1979). They put out a valiant effort to accommodate women in every way possible including field facilities. I would have thought the Air Force would have been light years ahead of us! Otherwise it is good to hear those stats!! I fear the next generation may not meet the challenge!
I worked in a municipality from 1989 to 2009. From 1989 to 2000, the workforce increased by 10 people. They finally computerized. I finally got to play with PCs and be network admin (and everything else having to do with PCs, programs, and people) (computers being denied me by the AF in 1958, even with top aptitude scores - no facilities for women at the training base). From 2001 to 2009, there was a tenfold increase in productivity and 2 new part-time hires, while the municipal population grew from about 7,000 to 30,000.
was the hero of the day with his tabulating machine, the first commercially successful punch card system (not including musical instruments). This made it possible to complete the census of 1890 within one year! Eventually his company morphed into IBM. I thought it was amazing that this German-American included electrical processes in this tabulation system.

Ironically, by 1985, I was filling out punch card sheets for the Army with a digital computer-controlled memory typewriter that actually made that whole mess obsolete. I got tired of filling out punch card sheets for the state, as they were still in the dark ages, so I programmed my typewriter to do it faster. However, the on-board memory was only 35K and ran out very quickly. I integrated the only laptop I could find, the IBM Convertible, so that I could increase this memory capability; it was cheaper, or at least more practical, to split up memory duties between the two machines. The laptop was using the then-new plastic microfloppy at 720K each in two drives. Then the 1.44MB microfloppy was introduced to make this even more effective at memory storage.

After automating my office, I declared my position to be obsolete and toured out, leaving my Supply Sergeant job to the computers. And gladly so, as it was drudgery work in the extreme! I then went to school to get an engineering degree. In all my labors in the industry, I've never seen the workforce loss that is always predicted by naysayers. The resulting increase in accuracy and productivity has always resulted in more hiring, not less, and the resultant jobs are more rewarding and pay higher salaries.