
Best Motherboards With Large RAM Capacity?

cortex writes "I routinely need to analyze large datasets (principally using Matlab). I recently 'upgraded' to 64-bit Vista so that I can access larger amounts of RAM. I know that various Linux distros have had 64-bit support for years. I also typically use Intel motherboards for their reliability, but currently Intel's desktop motherboards only support 8GB of RAM, and their server motherboards are too expensive. Can anyone relate their experiences working with Vista or Linux machines running large amounts of RAM (>8GB)? What is the best motherboard (Intel or AMD) and OS combination for workstation applications in terms of cost and reliability?"
This discussion has been archived. No new comments can be posted.

  • Chipsets (Score:5, Insightful)

    by niceone ( 992278 ) * on Tuesday January 01, 2008 @07:42AM (#21873404) Journal
    To narrow things down a bit: it's not about motherboards, it's about chipsets. I've only been looking at Intel (AMD doesn't have the performance right now for music stuff). Intel's current P35 and X38 chipsets both support 8GB of memory max. If you need more, you have to look at one of the Xeon chipsets: the 5000X workstation chipset is the one to look at if you want to be able to run two processors (not sure what the equivalent is for a single processor); it supports up to 32GB of memory.
  • by jacquesm ( 154384 ) <j@NoSpam.ww.com> on Tuesday January 01, 2008 @09:02AM (#21873628) Homepage
    I love that attitude...

    Some guy comes along and asks an honest question, and people tell him that can't be right, then go and offer all kinds of suggestions premised on the assumption that he's wrong.

    Let's just for a second assume that the OP has a dataset that large. I can easily imagine it:

    - a complicated physics model
    - a computational biology problem
    - data mining

    and any one of a thousand other not-so-trivial computational problems.

    If his 'luck' is that the problem is not trivially parallelizable, then he's got two choices:

    1) try to set up some kind of pipeline
    2) get a single machine that can handle all the data

    Apparently he has chosen door #2, because that seems to be just about feasible (a streaming take on door #1 is sketched below).

    There are some top-of-the-line Dell machines, the R900 series, that will hold up to 128GB of RAM.
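    For illustration, here is a minimal sketch of door #1 in MATLAB (the OP's tool); the file name 'bigdata.bin' and the block size are hypothetical placeholders. The idea is to stream the dataset through a fixed-size buffer and accumulate a reduction, so peak RAM use stays at the block size no matter how large the file is.

        % Hypothetical: stream a binary file of doubles through a fixed buffer.
        fid = fopen('bigdata.bin', 'r');
        acc = 0;
        while true
            x = fread(fid, 1e7, 'double');   % read ~80 MB per pass
            if isempty(x), break; end
            acc = acc + sum(x.^2);           % example reduction: sum of squares
        end
        fclose(fid);
        fprintf('sum of squares = %g\n', acc);

    This only helps when the computation decomposes into per-block passes; if every step needs random access to the whole dataset, door #2 (one big-memory machine) really is the only comfortable option.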

  • by 16384 ( 21672 ) on Tuesday January 01, 2008 @09:29AM (#21873726)

    Is your working set honestly over 8GB? Your dataset might be extremely large... but I would think that for the most part you'd get along just fine with swapping out to a decently fast device and your working set would be considerably below 8GB.

    When doing computer simulations it's really easy to need that much RAM. I currently have 4GB (2x quad Xeons on a Tyan motherboard; to the OP: get Opterons instead if you can), but could sometimes use much more. Swap is not an option: when memory spills into swap, performance simply drops to pathetic levels. (A quick way to check your actual working set is sketched below.)

    In computational physics you are in a constant struggle between the need for more accuracy and the limits of the machine.
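    As a sanity check before buying hardware, you can total up what the workspace actually holds; this snippet runs in both MATLAB and Octave. It only counts live variables, not the interpreter's own overhead or transient copies, so treat it as a lower bound on the working set.

        % Rough working-set estimate: sum the bytes of all workspace variables.
        s = whos;
        fprintf('workspace: %.2f GB in %d variables\n', ...
                sum([s.bytes]) / 2^30, numel(s));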

  • by mangu ( 126918 ) on Tuesday January 01, 2008 @09:57AM (#21873870)

    Is your working set honestly over 8GB? Your dataset might be extremely large... but I would think that for the most part you'd get along just fine with swapping out to a decently fast device and your working set would be considerably below 8GB.

    My thoughts exactly. When doing physics simulations, one often needs to manually optimize the code to use the cache correctly, so optimizing for swap shouldn't be such a problem.


    Personal computers do not support more than 8GB for a good reason: there isn't the I/O capacity to use that much memory. There's no use having memory if you cannot transfer data to and from it.


    However, the problem is that he uses Matlab. Perhaps he could get better performance using Octave [gnu.org] with ATLAS [sourceforge.net] optimization, but in the end only compiling in C with assembly-language optimization will guarantee the best results. I have heard from several people that Matlab has problems when the data sets become large. (A quick way to gauge how much the underlying BLAS matters is sketched below.)
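    To see how much the linear-algebra back end matters, here is a quick timing sketch that runs unchanged in MATLAB or Octave; the matrix size is an arbitrary choice. A dense multiply like this is dominated by the underlying BLAS (ATLAS, in Octave's case, if it was built against it), not by the interpreter.

        % Time a dense multiply; throughput reflects the BLAS, not the language.
        n = 2000;
        A = rand(n);  B = rand(n);
        tic;  C = A * B;  t = toc;
        fprintf('%dx%d multiply: %.2f s (~%.2f GFLOPS)\n', n, n, t, 2*n^3/t/1e9);

    With an optimized BLAS loaded, both environments should post similar numbers here; the gap (and the case for dropping to C) opens up where interpreted loops over large arrays dominate instead.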


  • by EsbenMoseHansen ( 731150 ) on Tuesday January 01, 2008 @10:25AM (#21874002) Homepage

    However, the problem is that he uses Matlab. Perhaps he could get better performance using Octave [gnu.org] with Atlas [sourceforge.net] optimization, but in the end, only compiling in C with assembly language optimization will guarantee the best results. I have heard from several people that Matlab has problems when the data sets become large.
    Well, looking at the price list [mathworks.com], switching to Octave should buy him a good deal more hardware, even if the performance is the same :)
  • by Jeff DeMaagd ( 2015 ) on Tuesday January 01, 2008 @11:51AM (#21874460) Homepage Journal
    If it's that kind of data, then it's really worth paying more for a solid workstation-class board, which almost assures you of ECC compatibility. ECC isn't necessary for home use and gaming, but if you need 8GB+ of memory, then you probably should protect that data. It's not terribly expensive either, in my opinion; last year's FB-DIMM pricing was the exception, and even that's very affordable now.
  • by Jeff DeMaagd ( 2015 ) on Tuesday January 01, 2008 @01:48PM (#21875222) Homepage Journal
    ECC might not be that important for you. ECC memory only helps resist bit flipping while the data is in memory. It won't make your backups much more reliable, since that mostly comes down to the reliability of the medium; when backing up, the data spends very little time in memory during the transfer. If you keep gigabytes of data in RAM for days at a time, or if the data is valuable, then ECC would be one step, in conjunction with mirrored or RAID-5 storage and off-line backups.
  • by jacquesm ( 154384 ) <j@NoSpam.ww.com> on Tuesday January 01, 2008 @03:13PM (#21875882) Homepage
    Even the not-so-wealthy can have skills that allow them to ask difficult questions, but may not have a corresponding budget.

    I used to be in that position. Now I run my more interesting software on a 5-node dual-Opteron cluster (small for a cluster, I know... see, that's those budget constraints again); each node has 8GB of RAM and 3TB of storage. Before that it was 10 Pentium machines at 600MHz (see http://clustercompute.com/ [clustercompute.com], which has inspired numerous people to build copies), and before that it was 10 Pentium 225s (overclocked 200s :) ). What used to take weeks now takes at most days. My applications are mostly in datamining, but I find computational biology to be very interesting.

    You have to love it when people overcome their financial limitations with cleverness. Why not give the guy a break and simply help him solve his problem, starting out from the assumption that his problems and limitations are real?

    It would have been nice to have a few more bits of information about the kind of data and the nature of the calculations. I'm pretty sure that 'cheap' is also relative, but it seems that cheaper is better for this guy. How many people are in the most brilliant periods of their lives while they're also poor is not easy to figure out, but I would not be surprised if it was the majority.

  • Re:Tyan? (Score:3, Insightful)

    by Erpo ( 237853 ) on Tuesday January 01, 2008 @11:48PM (#21879108)
    Oh me too. But if I'm forced to choose between threads and Matlab, I'll take my Matlab. Especially if Matlab is the whole reason the computer is there in the first place.
