Home PC or workstation for MATLAB?

I am due to specify a new PC to use for my MATLAB work. The most demanding tasks are typically repeatedly running long (a couple of hours of simulated time) Simulink simulations at small sample times (a couple of microseconds) and plotting and otherwise processing (fairly simply, e.g. averaging, root mean square) the data. The simulations are usually time-varying and often of stiff systems. Usually I run the simulations in series (tuning model and control parameters, etc.), although sometimes they could run in parallel for Monte-Carlo-type simulations, should MATLAB allow it (Parallel Computing Toolbox?).
Does anyone have information/ideas about whether the 'workstations' offered by companies such as Dell and Lenovo would perform significantly 'better' (mainly faster) than 'home PCs' at a similar price, say £1200/US$2000/EUR 1400 for guidance? I am assuming the same RAM, say 8 GB (trying to get 16 GB, but that may exceed the budget).
Looking at Intel CPUs only, I have noticed that (mid-to-higher-spec) 'home PCs' (which is all we use in the office at the moment) usually come with i5 or i7 CPUs, whereas 'workstations' tend to come with a variety of Xeon CPUs. Is there any benchmarking data on whether (current, lower-end) Xeon CPUs offer an advantage over an i7, or vice versa, for the type of Simulink work described above?

 Accepted Answer

Jason Ross
Jason Ross on 2 Aug 2012
Edited: Jason Ross on 2 Aug 2012
The way I look at such questions is to look at what's the bottleneck on the system.
  • If you are using the swap file heavily, I'd look at more RAM.
  • If one processor is pegged when you are doing your run, I favor clock speed.
  • If you can benefit from parallel computation, I go for cores.
  • If you do a lot of disk I/O, I look at SSD and a fast SATA interface.
  • It sounds like everything is local, so I'm not addressing network.
  • And, of course the budget!
If you are on Windows 7, use the Resource Monitor (start Task Manager, then click the "Performance" tab and "Resource Monitor") to get a nice overview of all these systems. Other operating systems have similar tools.
As for the processor question, Intel has a nice comparison tool:
Generally (there are exceptions):
  • i5 is a 4 core processor without hyperthreading
  • i7 is a 4 core processor with hyperthreading
  • Xeon is a 6 or 8 core processor with hyperthreading.
Clock speeds vary throughout each of the lines, with cost generally increasing towards the fastest chip.
In terms of workstation versus home PC, generally the workstation offers things like
  • Multiple CPUs (single, dual or quad)
  • Space and interfaces for additional stuff (GPUs, hard drives)
  • The ability to put in more RAM (I've seen up to 1TB offered)
  • Multiple network interfaces
  • Higher capacity power supply
If the above matter to you, then the workstation is "better". But if they don't, then they offer little value.
For specific comparisons there are many hardware sites. http://www.cpubenchmark.net/ concentrates on CPU performance benchmarks, while Tom's Hardware, AnandTech, etc. also post in-depth reviews of hardware.

3 Comments

Gerrit
Gerrit on 3 Aug 2012
Edited: Gerrit on 3 Aug 2012
Hello Jason, you give some very good pointers there:
Resource Monitor: I have tried to use that previously, but for example I just do not know what "heavy swap file use" is: how do I know it is heavier than it could/should be? Do you know of any fairly simple explanation or user guide for this tool, i.e. how to interpret the information it gives?
Bottleneck and processors: Even if I knew exactly what the bottleneck on my current PC was, it does not tell me what it would be on a new PC. That is why I am trying to get some general guidance on the potential advantages of a 'workstation' over a 'home PC'. The key difference that would potentially affect me would be to have multiple CPUs - the others you list would not affect me much, assuming a high-spec PC has enough RAM. The question is then: would my use of MATLAB (say Simulink simulation of stiff systems with lots of scopes) benefit from having multiple CPUs, assuming the budget would allow that?
Cores and hyperthreading: This is a topic that confuses me greatly: what is the difference, in MATLAB use, of cores and hyperthreading, and is hyperthreading the same as multithreading? On the latter, I found a comment on the Mathworks website:
However, this does not mention Simulink. Does Simulink benefit indirectly from multithreading, or not at all? Or would multiple cores/CPUs help if I was multitasking by running two or possibly three (unlikely to be more) instances of MATLAB at the same time (one chugging away at number-crunching in the background, one to develop new software)?
Overall: Based on your reply, and my general research so far, I suspect that my 'budget' is insufficient to justify a 'workstation', and that I will be better off with a desktop/home PC. However, any comments that would confirm this, or otherwise, would be welcome.
I'll respond with my own opinions. Keep in mind these are not iron-clad rules.
For swap file usage, you should look at how much of your existing RAM is in use by the system when you are running your simulation. There is a dedicated Memory tab on the Resource Monitor that shows a nice graph of memory utilization as things are running. If this is regularly topping out on your current system, then you are going into swap, and this is most likely affecting your performance, as the swap file is going to be significantly slower. I'd look at sacrificing clock speed or cores to fit the higher RAM amount into the budget. If you can get the system operating with essentially no swap file at all, operations will take place only in memory once they are loaded from disk, and performance will be very good. And with 16 GB of RAM available for under $100 (USD), and dead simple to install, you might want to compare the cost the vendor charges for the upgrade versus DIY and go that route -- especially if you know your way around the inside of a PC (or work with someone who does).
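If you would rather check from inside MATLAB than via the Resource Monitor, the Windows-only `memory` function reports similar numbers. A minimal sketch (run it before and after a simulation to see roughly how much the run adds):

```matlab
% Windows-only: query MATLAB's own memory use and what the OS has left.
[userview, systemview] = memory;

fprintf('MATLAB is using        %.1f GB\n', userview.MemUsedMATLAB/2^30);
fprintf('Largest possible array %.1f GB\n', userview.MaxPossibleArrayBytes/2^30);
fprintf('Physical RAM available %.1f of %.1f GB\n', ...
    systemview.PhysicalMemory.Available/2^30, ...
    systemview.PhysicalMemory.Total/2^30);
```

If "physical RAM available" sits near zero during a run, that lines up with the swap-file symptom described above.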
For your processors question -- if your existing machine has multiple cores, do they get maxed out when you are doing a run? I'm not all that familiar with the intricacies of Simulink. As for hyper-threading versus not, I've personally found that for what I work with most, it doesn't offer much value. I know there are some applications where it does, but I've seen that processes essentially wait on the "real" cores for computation, and it doesn't really matter much. So I'd look more at the number of cores versus hyperthreading, and take the saved money and put it into RAM.
You might also want to look into whether what you are doing supports parallel computing, and see if you can trial the Parallel Computing Toolbox. In that case, having multiple compute cores and RAM could most definitely help you speed up your time.
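If you do trial the toolbox, a Monte-Carlo-style sweep can be sketched roughly as below. This is only a sketch: `myModel` and the tuned variable `gain` are placeholder names for your own model and parameter, and `Simulink.SimulationInput` requires a reasonably recent release (on newer releases, `parsim` can replace the `parfor` loop entirely).

```matlab
% Sketch of a Monte-Carlo parameter sweep with the Parallel Computing
% Toolbox. 'myModel' and 'gain' are placeholders for your own model and
% tuned parameter.
parpool;                        % start a pool of workers (one per core)
gains = linspace(0.1, 1.0, 8);  % parameter values to sweep

parfor k = 1:numel(gains)
    in = Simulink.SimulationInput('myModel');
    in = in.setVariable('gain', gains(k));
    out(k) = sim(in);           % each worker runs one simulation
end
```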
I would concur that your budget is likely going to land you in the "well-configured home desktop" range. Which is pretty much what a "well-configured workstation" was a few years ago, so you are likely to see much better performance.
Hope this helps,
Jason
Thank you: Looking at my "CPU Usage History" on the "Windows Task Manager" (this is a Windows Vista 32-bit OS), this sits at around 55% (80-90% on one of the two cores, 20-30% on the other), which is high but does not seem to be at its limit. The "Free" "Physical Memory (MB)" drops to 0 during simulation, but I cannot see any sudden increase in disk use or page faults on the "Reliability and Performance Monitor" when that happens, so I am not sure what actually happens or 'gives way'.
It would be nice if I could do a proper benchmarking of two alternative systems. However, that is not going to happen...
You have given me enough info to get going with the next spec. Thanks again and best regards,
Gerrit.


More Answers (3)

William
William on 3 Aug 2012
Look at the Lenovo W520. I use it all the time and was able to get one with 16 GB of RAM for around 2000. Worst case, get a desktop. You won't be paying for portability.

1 Comment

Thanks, it is a valid suggestion. However, I am not too convinced about laptops regarding value, or 'power', for money; we got a Sony Vaio with 64-bit OS and 6GB RAM and up-to-date CPU last year, and it was hardly faster in benchmarking than my five-year old Dell Dimension E520 with 32-bit OS and 'full' RAM (6GB installed, but of course the 32-bit Windows OS only uses approx. 3GB of that).


Bhuvnesh Rathor
Bhuvnesh Rathor on 17 Aug 2020
Minimum Specifications for MATLAB Software
Processor: i3 or i5
RAM: 4 GB
Hard disk: 500 GB
Graphics card: 2 GB
Windows: any version
If you want to work with a new version of MATLAB and it is within your budget, then I suggest an i7 processor, 8 GB of RAM, and a 1 TB HDD.

10 Comments

I would not buy as little as 8 gigabytes of RAM for a new installation unless I was pressed hard on current budget and I expected to be able to upgrade in a couple of years (e.g. expect to get a job in two years.) In less than 4 years, 8 gigabytes of RAM is going to be considered a serious limitation.
I moved from a 2012 MacBook Pro with 8 gigabytes, to a 2013 iMac with 32 gigabytes, and it made a significant difference for my work. The 2013 iMac is still fast enough for most things that I do, but I am not doing Deep Learning. Sure, faster CPUs would be nice, but the kinds of activities that I do that take a long time are 18+ hour symbolic computations and a very expensive highly cooled CPU that was 50% faster than what I have now would only take that down to 12+ hours: it isn't like having a 60 core supercomputer on my desk would solve the computation in 5 minutes. (Symbolic computing in parallel is more difficult than it sounds unless you put up with exponential memory usage. The x+y over here might not necessarily be the same as the x+y over there, and subtracting the two might not return 0.)
Bhuvnesh and Walter, I'm glad to see this thread is still alive and being read. My current key PC configuration is an i7-3770 CPU, 12 GB memory, 1.8 TB HD (+ 223 GB SSD), and an Intel HD Graphics 4000 display adapter. That is plenty of HD space, but the memory is insufficient (I am doing some Simulink simulations with large datasets as I/O) and cannot be expanded. So I am on the lookout for a new PC again (well, it's been eight years)... Given that 12 GB memory is insufficient now, I wonder if 16 GB would be enough (if expandable in future) or if I have a sufficient argument to take to my boss to get a budget for a 32 GB PC.
In addition, is there any known/public comparison of i9 vs. i7 CPUs for Simulink use?
I came across the website of a company "Workstation Specialists":
Their recommendations tend to be rather more high-spec, for example a minimum of 32 GB memory and an NVIDIA Quadro graphics card (for PCs). Of course, they are selling their high-spec (presumably with matching prices) workstations, whereas MathWorks doesn't want to put off potential customers with lower-spec systems by recommending higher-spec requirements. So are Workstation Specialists massively over-egging the pudding?
Walter Roberson
Walter Roberson on 17 Aug 2020
Edited: Walter Roberson on 17 Aug 2020
It is not uncommon for me to have two matlab sessions running at the same time. Sometimes it is because I have a running calculation that I need to leave running while doing something else. Sometimes it is because it is common for me to be working on multiple topics, and save/load is inconvenient as a "suspend" mechanism (and does not save entire session states with function workspaces and so on.) Sometimes it is because I need to experiment with an approach while not breaking what I have. Sometimes it is because I need to work with multiple releases to cross-check behaviour or bug test. Sometimes I need to run a virtual machine to test matlab under a different operating system.
I find that when I try to do any development work in Simulink, it is pretty likely that I will end up opening another matlab session for pure matlab work.
I also tend to have Maplesoft's Maple symbolic programming language running, and there are plenty of times memory use blows up with that. Symbolic simultaneous equations with just a couple of squared terms chew up memory quickly (the exact solution for quartic equations is long).
All of which is to say that 12 gigabytes is just not enough these days.
Walter: Exactly the same here; I may have my own project/Simulink model to work on, while a colleague asks me a question about a different MATLAB issue or Simulink model. Plus looking up some issues on a browser takes up a fair whack of memory as well. Anyway, being half cheeky and half serious here (not sure exactly which dominates): in your view, would 16 GB memory (normally, whatever that is...) be sufficient if 12 isn't?
If you read that page very carefully they don't actually recommend a Quadro.
I would suggest looking at your typical work. How much memory does it use? How many cores? Are you using GPU acceleration? Only for extreme parallel work would I suggest more than 6 or 8 cores (also with an eye to the limited upgrade path you have with Epyc/Threadripper compared to Ryzen). The claim that Intel hyperthreading doesn't actually improve performance seems strange to me.
It all depends on your workflow, but in general you should try to determine how much memory you need per thread, as that tends to be relatively stable over a wider range of cores. It makes intuitive sense that you shouldn't spend thousands on your CPU and only get 8 GB of RAM. I have heard a suggestion that you should get 2 GB per core you're planning to saturate, but I don't remember in what context that advice was given. I planned on getting 32 GB for an R5 3600X (6c12t), which was intended as future-proofing, seeing as I don't have very memory-intensive work. (Also: this machine is my gaming computer as well, which motivated a limited thread count to allow higher clocks, which improves performance there.)
Gerrit
Gerrit on 17 Aug 2020
Edited: Gerrit on 17 Aug 2020
Rik, thank you - good comments. Re. Quadro, it depends on what you mean exactly by recommendation. What they say is: "If however, you plan to take advantage of the GPU acceleration options within MATLAB then a more powerful graphics card or multiple cards will be required. The MathWorks MATLAB Parallel Computing Toolbox takes advantage of the huge performance acceleration GPUs provides to compute/analysis workflows. We recommend NVIDIA Quadro graphics cards for workstations and the NVIDIA Tesla range for rackmount servers supporting compute capability 3.0 or higher to guarantee compatibility and stability." It does start with a fairly big 'if', but I would still call that a recommendation - maybe a matter of semantics.
Regarding the general 'how much X does my MATLAB/Simulink' use, I am struggling to answer that question. I can see on the Task Manager and Resource Monitor that my Physical Memory is maxed out at 100% with the current model and data set that I am working on, so that is a pretty clear constraint. However, it doesn't tell me how much memory MATLAB, and everything else I am running, would use if I had enough memory. Is there any way I could tell? With some other models I use, the data sets and memory requirements are much smaller, and other factors must form the constraint or bottleneck on the run time of a Simulink simulation. But how to know what that is, e.g. does Simulink use one core or more? I also read somewhere that, due to (or when) hyperthreading, the Task Manager only indicates 50% CPU usage, even when the actual/underlying usage is 100%. So any advice on how to determine the true bottleneck on performance when running a Simulink model would be very welcome.
Re. GPU acceleration: my current PC has only the basic graphics capabilities on the motherboard (Intel HD Graphics 4000), so I haven't been able to check if I could use GPU acceleration (unless someone can tell me otherwise).
Re. memory per core: Do you remember whether the 2GB per core you mention is a target value or a minimum? If it is a target, then say 32GB memory would correspond to 16 cores, which seems rather high to me. You said you are thinking of a CPU with 6 cores and 32GB, i.e. ~5GB per core, so I assume the 2GB would be a minimum. I realise that you are talking about advice and suggestions, not certainty, but anything that gives me a bit of clarity is welcome.
Re. future-proofing: Obviously, I am trying to create some specs that will meet my future as well as current requirements. As an example, to date I have not done much, or any, work on machine-learning. However, that is an area I will (need to) get into sooner rather than later. I would imagine that parallel processing and large data sets are relevant issues here. Does anyone have any comments on whether/how this would affect the requirements on a PC?
CPU: If you go to the performance tab of task manager, click the CPU section, right-click on the graph and select 'Logical processors' under the 'change graph to' menu. Then you can see the usage per thread.
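Complementing the Task Manager view, MATLAB itself can report how many physical cores it detects and how many computational threads it will use. A small sketch (note that `feature` is undocumented, so treat it as a convenience rather than a supported API):

```matlab
% How many physical cores does MATLAB see, and how many computational
% threads is it currently allowed to use?
ncores   = feature('numcores');   % physical cores detected (undocumented)
nthreads = maxNumCompThreads;     % current computational thread limit
fprintf('%d physical cores, %d computational threads\n', ncores, nthreads);
```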
RAM: 32 GB is too much for what I currently do, but it provides headroom to start doing more intensive tasks. For 'normal' use I would say 2 GB/thread is a good target, but you are doing more intensive tasks. A quick search taught me that the 2 GB/thread wisdom has been around for quite some time now. Lately the number of threads has exploded (4 threads were quite normal; now 12 threads is available even in mid-range gamer builds). I don't really know what effect that has on the memory per core, but going to 4-8 GB per core (or per thread) doesn't sound like a strange choice. I don't have your workflow; if you find a way to create a benchmark, I can run it for you and tell you the memory stats on my system, but otherwise this is not my area of expertise. Just my two cents.
GPU: If you're going to do deep/machine learning you will have to go for large memory GPUs (i.e. Quadro/Tesla). Most other parallel workflows will run on the CPU, so even the integrated graphics you're using are fine. Once you get to GPUs that work well for DL/ML/ANN the cost of the GPU can easily exceed the cost of the rest of the components combined, while still resulting in a balanced build. I suspect there are other sites where you can find a lot of advice about what combination of components are ideal for a given budget.
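For what it's worth, the Parallel Computing Toolbox can answer the "could I use GPU acceleration?" question directly; a minimal check:

```matlab
% Check whether a CUDA-capable GPU is visible to the Parallel Computing
% Toolbox; a machine with only integrated Intel graphics reports 0.
if gpuDeviceCount > 0
    d = gpuDevice;
    fprintf('Found %s, compute capability %s\n', d.Name, d.ComputeCapability);
else
    disp('No supported GPU found -- GPU acceleration is not available.');
end
```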
I run out of memory with 32 gigabytes. Not every day, but it is not rare. If I only had 16 I would be running out a lot.
I was using over 16 GB even as I wrote that. When I closed down all my apps, I was still over 9 GB. System processes keep expanding... and so many side processes running these days.
Rik
Rik on 18 Aug 2020
Edited: Rik on 18 Aug 2020
After I upgraded my pc, my memory usage blew up. I suspect my idle RAM usage would decrease by 40 percent if I halved the total capacity. To be fair, I don't know what behavior I would expect on Macs.


Gerrit
Gerrit on 17 Aug 2020
Rik, I very much appreciate your (and all correspondents') time and effort and hope others do as well.
CPU: Is your description for Windows 10? I still use Windows 7 on this PC (one of the reasons I need to change it) and when ticking the Performance tab there is a menu option View -> CPU History -> One Graph Per CPU (or alternatively One Graph, All CPUs). I assume this does the same as your option; it shows eight individual graphs (four cores with two threads per core). I would send a screenshot of the Task Manager if I could, but Chrome crashed earlier on when I tried that, losing me my earlier text.
RAM: The large-data model I am working on at the moment is commercially sensitive and I couldn't send that. If I do think of a way to create a public large-data Simulink (benchmarking) model, I will post it on the MathWorks site.
GPU: If anyone has examples of the component combination sites that Rik refers to, could you please post them here?

1 Comment

I hope your organization made use of that extended protection deal with Microsoft if it is still running Windows 7 on internet-connected machines. (if you can attach a USB-stick to it and use that same stick on an internet-connected machine, even offline machines should be considered internet-connected)
It sounds to me like Walter's workflow is closer to yours, so his advice is probably more applicable. The last time I ran out of memory was when answering this question, where I found the maximum size imwrite can write to a PNG: about a 4 GB file.

