Best configuration for this high-performance application

Hi, my first post. First, I'm not sure this is the right place for this question, so if it isn't, please pardon me. Any pointers to an online forum for HPC would be much appreciated.

I have a Dell tower with 64 GB of RAM, 12 processor cores, a 250 GB SSD, and two 2 TB hard drives in RAID 0.

I run MATLAB to analyze EEG (brainwave) datasets of about 4 GB each. When I run an independent component analysis (ICA), all the cores are engaged and RAM is about half full, and it takes about a day to complete for one subject. I also do other analyses that are not multi-threaded, so they don't use all 12 cores, but they operate on fifteen 4 GB datasets (all 15 subjects in an experiment) at once. The toolbox I use is designed to pull data from the hard drive as needed, since few machines have enough RAM for fifteen 4 GB datasets. As any of these analyses run, equally large output files are created.
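For context, the read/process/write pattern per run is roughly like the loose sketch below. The directory, file names, and the run_analysis function are just placeholders, not the actual toolbox calls:

subjects = 1:15;
dataDir  = 'D:\eeg_data';              % wherever the datasets live (example path)

for s = subjects
    inFile  = fullfile(dataDir, sprintf('subject%02d.mat', s));
    outFile = fullfile(dataDir, sprintf('subject%02d_result.mat', s));

    S = load(inFile);                   % ~4 GB read from the drive
    result = run_analysis(S);           % placeholder for the toolbox's analysis
    save(outFile, 'result', '-v7.3');   % equally large output written back
end

So each pass moves several gigabytes per subject between the drive and memory, which is why I'm focused on where the data lives.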


I have Windows 7 and MATLAB installed on the SSD. I am thinking of reconfiguring the system with Windows 7 and MATLAB on the RAID array, leaving the SSD free for the EEG data. My thinking is that the ICA and the 15-subject analyses involve a lot of back and forth between the drive and the CPUs, and to optimize performance I need to move the OS and applications off of the SSD. At the moment the SSD is pretty full with the OS and applications, leaving little room for it to be used in analyses. I hope that keeping the data on the SSD will make it the main place data is read from and written to during the analyses.
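To check whether the drive layout actually matters for my workload, I figure I could compare sequential throughput on the two drives with something like the following (the paths are only examples standing in for the SSD and the RAID array):

testFiles = {'C:\temp\io_test.bin', 'D:\temp\io_test.bin'};  % SSD and RAID paths (examples)
sizeMB = 1024;                                % write and read 1 GB per drive
data = zeros(sizeMB*1024*1024/8, 1);          % 1 GB of doubles

for k = 1:numel(testFiles)
    f = testFiles{k};

    tic;                                      % sequential write test
    fid = fopen(f, 'w');
    fwrite(fid, data, 'double');
    fclose(fid);
    tWrite = toc;

    tic;                                      % sequential read test
    fid = fopen(f, 'r');
    fread(fid, Inf, 'double');
    fclose(fid);
    tRead = toc;

    fprintf('%s  write %.0f MB/s, read %.0f MB/s\n', f, sizeMB/tWrite, sizeMB/tRead);
    delete(f);                                % clean up the test file
end

I realize the read numbers may be inflated by OS caching, but it should give a rough comparison of the SSD against the RAID 0 pair.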

Thanks much,

Jim Kroger

NMSU