'Modern' Computers Turn 60 Years Old

Christian Smith writes "Stored program computers are 60 years old on Saturday. The Small Scale Experimental Machine, or 'Baby,' first ran on the 21st of June, 1948, in Manchester. While not the first computer, nor even the first programmable computer, it was the first to store its program in its own memory. Luckily, transistors shrank the one tonne required for this computing power to something more manageable."
  • A tonne? (Score:5, Insightful)

    by Majik Sheff ( 930627 ) on Saturday June 21, 2008 @10:45AM (#23885587) Journal

    What's that in Volkswagen Beetles?

    • Re:A tonne? (Score:4, Informative)

      by Anonymous Coward on Saturday June 21, 2008 @10:57AM (#23885725)

      1.21 Old, 0.74 New.

      • Re: (Score:3, Funny)

        by Opportunist ( 166417 )

        That's proof it ain't a Japanese car. Otherwise the new one would be at least a factor of 2. And next gen it would be 5.

  • Evolution (Score:2, Interesting)

    by AkaKaryuu ( 1062882 )
    In 60 years we've gone from computers the size of a room to laptop computers thin enough to fit in an interoffice envelope. Where will we be in another 60 years, or even ten for that matter? It's somewhat scary that we've created a technology that advances much more quickly than we do. It's just a matter of time before we are the number two species (if you can call a computer a species) walking the planet.
    • Re: (Score:1, Insightful)

      by Anonymous Coward

      Lame.

    • by cayenne8 ( 626475 ) on Saturday June 21, 2008 @12:28PM (#23886491) Homepage Journal
      "In 60 years we've gone from computers the size of a room to a laptop computers thin enough to fit in an interoffice envelope."

      Yeah...but, those old tubes used to make the data 'feel' warmer.

      :-)

      • Obviously you haven't used a MacBook as a laptop...
        • That was supposed to be nested as a reply to the comment this one is replying to. Who kills the humor? I do, I do!
      • Yeah...but, those old tubes used to make the data 'feel' warmer.

        So that's what the Pentium's floating-point unit was trying to simulate digitally.

    • by kestasjk ( 933987 ) on Saturday June 21, 2008 @01:14PM (#23886903) Homepage
      1. Transistors get far smaller

      2. ???

      3. We are slaves to robotic overlords

      Maybe if you use Will Smith's humor, or a recursive time-travel paradox, to distract us from the "???" it could work as a plot.
    • Not so fast... (Score:4, Interesting)

      by DrYak ( 748999 ) on Saturday June 21, 2008 @03:19PM (#23888105) Homepage

      Luckily, transistors shrank the one tonne required for this computing power to something more manageable
      The poster apparently hasn't checked the specs needed to run Vista.

      More seriously:

      In 60 years we've gone from computers the size of a room to laptop computers thin enough to fit in an interoffice envelope. Where will we be in another 60 years, or even ten for that matter?
      You can bet that developers will definitely find uses for the additional power as machine performance increases.

      What has caused computers to shrink to envelope size isn't so much the increased performance/size ratio. It's the market.

      If Moore's law stated (roughly paraphrasing) that computer performance doubles every 2 years, one should expect computers to halve in size in that time frame. But that didn't happen, because most of the time people only want to use the additional performance to have the same box as before, but faster.

      Only from time to time does the users' interest shift.
      Desktops replaced minicomputers and mainframes not (only) because the circuits could suddenly be made smaller, but mainly because there was increased interest in having a computer in every house.

      Today's UMPCs appeared only because the public is starting to take an interest in something small and cheap. With the increase in circuit density, building pocketable devices with the same power as computers from a couple of years before has been possible for quite a long time. PDAs have been around for a few years and some have quite decent performance. But the demand is only arising now.

      So what will happen in 10 years?
      It all depends on the market then.
      The technology will be around to fit the processing power of today's big clusters into a chip as small as a pen.
      But then it all depends on the buyers' choice. If pen-sized computers are suddenly the latest trend, you'll see them around. Probably with geeks claiming that 2018 will finally be the year of the Linux PenComputer, because Windows 8.0 just can't run on them.

      But if UMPCs are still the trend, you'll only see the same form factor as before, only with 40x the processing power of today - three quarters of which will be taken up by a combination of the bloated operating system and the DRM lock mechanisms.

      • If Moore's law stated (roughly paraphrasing) that computer performance doubles every 2 years

        Except it didn't state that at all. Moore observed that the number of transistors that can be inexpensively placed on an integrated circuit doubles every two years. Equating number of transistors == performance is vastly oversimplifying things. For one thing, it doesn't even take into account changes in clock speeds.

        Also, increasingly the performance of a single piece of silicon is less important, since we are offl
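        For scale, here's the back-of-the-envelope arithmetic of what "doubling every two years" compounds to, whatever it is that's doubling - a rough Python sketch that ignores clock speed and architecture entirely:

        # "Doubles every two years", compounded over n years: 2^(n/2).
        for years in (10, 20, 60):
            print(years, "years ->", 2 ** (years / 2), "x")
        # 10 years ->         32 x
        # 20 years ->       1024 x
        # 60 years -> ~1.07e9 x   (roughly a billionfold)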

  • That's as many as six tens.
    And that's terrible.
  • That Depends (Score:4, Informative)

    by FurtiveGlancer ( 1274746 ) <.moc.loa. .ta. .yuGhceTcoHdA.> on Saturday June 21, 2008 @10:48AM (#23885607) Journal
    US or European? Safety equipment varies and so do curb weights.
    • Re: (Score:2, Funny)

      by Anonymous Coward
      I would imagine the weight he wanted was the weight of the Beetle filled to the maximum possible with live clowns. Sometimes the clowns are considered 'airbags' and so may qualify as safety equipment, but your mileage may vary. I don't think he was asking about how much a curb weighs, though. That would probably depend on the height, width, length, and composition of the curb piece in question.
    • If memory serves, my father's '64 Beetle weighed 1200 lbs, while my '74 Super Beetle was one and a half times that . . .

      hawk

  • Because of the limitations of the display the team tested the machine using prime numbers.

    "If you give it a prime number to try then the highest factor of that is one," said Mr Burton.

    "If what they saw when they ran the program was a one - in other words a dash when everything else was dots - then bingo they knew it was working."

    If only they'd had log4baby they could've tried factoring more interesting numbers.
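    For the curious: the Baby's actual first program hunted for the highest proper factor of a number by brute-force repeated subtraction, since the machine had no divide (or even multiply) instruction. A rough Python sketch of the idea - not the original SSEM code:

    def highest_proper_factor(n):
        # Try every candidate downward from n - 1, testing divisibility
        # by repeated subtraction, roughly as the Baby had to.
        for candidate in range(n - 1, 0, -1):
            remainder = n
            while remainder > 0:
                remainder -= candidate
            if remainder == 0:
                return candidate

    # A prime yields 1: a lone dash among the dots on the output CRT.
    print(highest_proper_factor(31))  # -> 1
    print(highest_proper_factor(24))  # -> 12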

  • by advocate_one ( 662832 ) on Saturday June 21, 2008 @10:54AM (#23885673)
    not a hope of backporting to this one...
  • Zuse did it first (Score:1, Informative)

    by Anonymous Coward

    Not to nitpick but...What about the machines built by Zuse?

    http://en.wikipedia.org/wiki/Konrad_Zuse

    • Re:Zuse did it first (Score:4, Informative)

      by jeiler ( 1106393 ) <go.bugger.offNO@SPAMgmail.com> on Saturday June 21, 2008 @11:03AM (#23885781) Journal
      Zuse's machine didn't have memory, which is part of how they're defining "modern computer."
      • Re: (Score:2, Informative)

        by Anonymous Coward

        Well, according to Wikipedia, it had memory:

        "Improving on the basic Z2 machine, he built the Z3 in 1941. It was a binary 64-bit floating point calculator featuring programmability with loops but without conditional jumps, with memory and a calculation unit based on telephone relays. The telephone relays used in his machines were largely collected from discarded stock."

      • by Lars T. ( 470328 )

        Zuse's machine didn't have memory, which is part of how they're defining "modern computer."

        Who is "they"? If you mean "the authors of the article" - well, the actual quote is:

        "It was the earliest machine that was a computer, in the sense of what everyone today understands a computer to be," explained Chris Burton of the Computer Conservation Society (CCS).

        "It was a single piece of hardware which could perform any application depending on what program you put in."

        The key to this ability was its memory, built from a cathode ray tube (CRT), which could be used to store a program.

        And, let's face it, if they hadn't put in that last sentence about memory, the Z3 would fit that description.

        • by jeiler ( 1106393 )

          And, let's face it, if they hadn't put in that last sentence about memory, the Z3 would fit that description.

          And if your aunt had testicles, she would fit the description of "uncle."

          Jeez, people, read the synopsis: the first three words are "Stored program computers." The Z3 was not a stored-program computer.

          • by Lars T. ( 470328 )

            And, let's face it, if they hadn't put in that last sentence about memory, the Z3 would fit that description.

            And if your aunt had testicles, she would fit the description of "uncle."

            Jeez, people, read the synopsis: the first three words are "Stored program computers." The Z3 was not a stored-program computer.

            So it made up its program? The program was stored on punched film - unlike the Colossus and the ENIAC, also mentioned in the article, which needed rewiring to be reprogrammed.

            And before you try to nitpick your way out of it, it says "stored", not "memorized". Blame the submitter for not picking conditions that qualify a "modern computer" as well as the article did: "the first to contain memory which could store a program"

            • by jeiler ( 1106393 )

              Um, Lars, just in case you didn't know--a "Stored-program computer" is a specific term, not a generic string of words.

              How much trouble is it to ask people to read the fucking article? *headdesk* Waitaminit--this is Slashdot. My bad.

    • Re:Zuse did it first (Score:5, Informative)

      by somethinsfishy ( 225774 ) on Saturday June 21, 2008 @12:16PM (#23886409)
      The Baby is set apart from other early machines by two major features.

      Memory was what we would call dynamic RAM. The storage elements were special CRTs called Williams tubes, which were the first all-electronic memory devices (flip-flops were not economically viable for storing data). Williams tubes were randomly accessible, used charges to store bits, and were therefore volatile. Volatility means that bits had to be refreshed by reading them back, or they would evaporate due to charge leakage. This is the same reason modern RAM chips have a periodic refresh cycle. This isn't a functional parallel, just a historically interesting one. FWIW, mercury delay lines are volatile too, but not because of charge leakage. Programs were read into RAM, from which they were executed.
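      To make the refresh idea concrete, here's a toy Python model - purely illustrative, with made-up leak rates; the real Williams tube was analog with its own timing constraints. A stored charge leaks away exponentially, and sensing-and-rewriting it before it crosses the threshold keeps the bit alive:

      class VolatileBit:
          # Toy charge-storage bit: charge leaks over time, and a
          # refresh (sense, then rewrite at full level) must happen
          # before it decays past the sensing threshold.
          THRESHOLD = 0.5

          def __init__(self, value):
              self.charge = 1.0 if value else 0.0

          def leak(self, ticks, rate=0.05):
              self.charge *= (1 - rate) ** ticks  # exponential leakage

          def read(self):
              return self.charge > self.THRESHOLD

          def refresh(self):
              self.charge = 1.0 if self.read() else 0.0

      bit = VolatileBit(1)
      for t in range(100):
          bit.leak(1)
          if t % 10 == 9:       # refresh every 10 ticks...
              bit.refresh()
      print(bit.read())         # ...and the bit survives: True.
                                # Skip the refresh and it reads False.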

      The other feature of the Baby that was adopted into subsequent designs was conditional jumps - sort of like gotos. The relative jump is a jump to a calculated address. Without the ability to hop around the program space based on whether statements evaluate as true or false, things like for loops and arrays are hard to implement. In 1998, the Z3 was mathematically proved to be capable of conditional jumps, but this was not an intent in its design and didn't lead anywhere.

      The Baby had only seven instructions (take that, Microchip PIC!):

      Jump (indirect), Jump Relative (indirect), Load Negative, Store Accumulator, Subtract, Skip if Accumulator < 0, Halt

      A very good and hard-to-find page with info on the Mark I: <URL:www4.wittenberg.edu/academics/mathcomp/bjsdir/madmmk1.shtml/>
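      Small enough, in fact, to interpret in a handful of lines. A toy Python sketch of those seven operations (the function name and dict-based program format are mine; the real machine's 32-bit encodings, and its quirk of incrementing CI before each fetch, are simplified away):

      def run_ssem(program, store, max_steps=100_000):
          acc, ci = 0, 0                 # accumulator, "control instruction"
          for _ in range(max_steps):
              op, s = program[ci]
              ci += 1
              if op == "JMP":            # jump: target held in store line s
                  ci = store[s]
              elif op == "JRP":          # relative jump
                  ci += store[s]
              elif op == "LDN":          # the only load: negate into acc
                  acc = -store[s]
              elif op == "STO":          # store accumulator
                  store[s] = acc
              elif op == "SUB":          # the only arithmetic: subtract
                  acc -= store[s]
              elif op == "CMP":          # skip next if acc < 0: the key
                  if acc < 0:            # to loops and conditionals
                      ci += 1
              elif op == "STP":          # halt
                  return acc
          raise RuntimeError("no STP reached")

      # Negate a number: acc = -store[20]; store[21] = acc; halt.
      store = [0] * 32
      store[20] = 42
      prog = {0: ("LDN", 20), 1: ("STO", 21), 2: ("STP", 0)}
      print(run_ssem(prog, store), store[21])  # -42 -42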

      • by nurb432 ( 527695 )

        The Baby had only seven instructions (take that, Microchip PIC!):

        Jump (indirect), Jump Relative (indirect), Load Negative, Store Accumulator, Subtract, Skip if Accumulator < 0, Halt

        Just shows you RISC is not such a bad thing after-all.
        • Actually, it's been proven that you can get by with only a single instruction [wikipedia.org], a subtract-and-branch with three operand addresses. However, having three memory references isn't really classic RISC, which tends to reduce the number of cycles an instruction takes to execute, and rarely includes read-modify-write instructions.
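          For the curious, one common form is "subleq": subtract, and branch if the result is less than or equal to zero. A tiny Python sketch of an interpreter plus a three-instruction ADD program (the function name and memory layout are made up for the demo):

          def run_subleq(mem, pc=0):
              # Each instruction is three cells (a, b, c):
              # mem[b] -= mem[a]; jump to c if the result is <= 0,
              # otherwise fall through. A negative c halts.
              while pc >= 0:
                  a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
                  mem[b] -= mem[a]
                  pc = c if mem[b] <= 0 else pc + 3
              return mem

          # ADD: mem[10] += mem[9], via double subtraction through the
          # scratch cell Z = mem[11]. Data lives right after the code.
          mem = [9, 11, 3,    # Z -= A       (Z becomes -A, jump on)
                 11, 10, 6,   # B -= Z       (B becomes B + A)
                 11, 11, -1,  # Z -= Z; halt (clear scratch, stop)
                 7, 5, 0]     # A = 7, B = 5, Z = 0
          run_subleq(mem)
          print(mem[10])      # -> 12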

      • by Tablizer ( 95088 )

        In 1998, the Z3 was mathematically proved to be capable of conditional jumps, but this was not an intent in its design and didn't lead anywhere.

        Let's see if I got this straight: the German Z3 *could* have been dynamic enough to qualify, but it wasn't used that way in practice. "Baby" was the first to demonstrate this ability, and thus gets the credit.

        Babbage designed a machine on paper that was capable of Turing-complete calculations. Thus, it seems there are 3 stages:

        1. First conceptualization of a machine capable of be

  • Way to go (Score:5, Funny)

    by Iwanowitch ( 993961 ) on Saturday June 21, 2008 @11:16AM (#23885891)

    "If you give it a prime number to try then the highest factor of that is one," said Mr Burton.
    First program EVER and it already had a bug in it.
    • At least we won't have to look far for the first computer program with a bug.

      Hell, MS didn't even invent that!

      • by cp.tar ( 871488 )

        At least we won't have to look far for the first computer program with a bug.

        Hell, MS didn't even invent that!

        In fact, this gives us a good first point for a proof by induction that all software has bugs.

    • Re: (Score:3, Informative)

      He meant the highest factor except for the number itself. Otherwise it doesn't matter what number you put in, the highest factor is always that number.

    • by neumayr ( 819083 )
      Yeah, they should have stopped right there.
      It might have done society some good not to have a technology that's supposed to be fit for its intended purpose, but very often isn't.
  • I'd sing the Happy Birthday song but, well, they prolly first used it to look up tubgirl.
  • by Anonymous Coward on Saturday June 21, 2008 @11:58AM (#23886259)

    They had a programming contest 10 years ago. A pot-noodle timer won and was loaded on the rebuilt machine in a big celebration.

    Read more:

    Manchester Celebrates the 50th Anniversary of the First Stored-Program Computer [computer50.org]

    The 1998 Programming Competition [computer50.org]

    Simulators [computer50.org] so you can try your hand at programming a 60-year-old computer.

  • by Anonymous Coward

    We invented the computer!!!

    ENIAC was the first proper computer. Everyone knows the computer is an American invention, like planes and automobiles and electricity (telephone, radio and electric motors). If it wasn't for America, no one would be able to go faster than a horse, or send messages thousands of miles. We'd all be stuck in the dark ages!

    • by Anonymous Coward

      In Communist days, the doctrine coming out of the Kremlin (seems familiar somehow) was that those imperialist dogs in the West invented nothing.
      - TV
      - Telephone
      - Radio
      - Internal Combustion Engine
      - etc.
      were all invented by Russian patriots.

      Back to reality.
      Those of you in the USA should just learn to accept some basic facts.
      Most things (apart from the likes of Edison's work) pre-WWII were invented either somewhere else or by an immigrant to the USA.
      Who had the world's first TV?

  • I'm surprised no one mentioned yet that this was the same computer that produced the oldest recorded computer music [slashdot.org] (found so far).

  • This is truly insensitive.

    Please phrase weights in "stone," or "oxenweight."

    Thanks!

    timothy

  • Electrical charges on the screen of the CRT were used to represent binary information. A positive charge represented a one and a negative charge a zero...A metal grid attached to the screen read the different charges. A graphical representation - dashes for a one and dots for a zero - was displayed on a second CRT wired in parallel to the memory device... "The operator peered at the monitor tube and he could see the same patterns as in the storage tube," said Mr Burton.

    It has something that modern computers
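    A trivial Python sketch of what the operator saw - one stored word rendered as dashes and dots (the Manchester machines famously wrote binary with the least significant bit on the left):

    def crt_line(word, width=32):
        # Dash for a 1, dot for a 0, least significant bit first.
        bits = [(word >> i) & 1 for i in range(width)]
        return "".join("-" if b else "." for b in bits)

    print(crt_line(1))  # a lone dash among dots: "bingo", it works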

  • > the first commercial general purpose computer, the Ferranti Mark I.

    I thought the first commercial general-purpose computer was the Leo [wikipedia.org].
