I primarily use Nvidia GPU cards for rendering, some ML, CUDA projects, etc. – and occasionally run benchmarks on the cards I have in different workstations for comparison.
Using OctaneBench (version 4.00c) – click here to download – here are some results.
GTX 970 4GB – Desktop PC – 94.16
GTX 1060 6GB – Alienware 17 R5 laptop – 91.31
GTX 1080 8GB – Desktop PC – 147.30
It seems a bit odd to me that a GTX 1060 is beaten by a GTX 970, but that might be because it is the mobile rather than the desktop version of the 1060; I am pretty sure I have run other tests, posted on this website, showing a GTX 1060 exceeding a GTX 970.
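For context, the gap between those two scores is small; here's a quick back-of-the-envelope check (just an illustrative one-liner, using the scores above):

```shell
# Percentage by which the desktop GTX 970 leads the mobile GTX 1060 in this run
awk 'BEGIN { printf "%.1f%%\n", (94.16 - 91.31) / 91.31 * 100 }'
# prints 3.1%
```

A roughly 3% gap is easily explained by the lower clocks of the laptop part.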
Quadro M600M 2GB – 21.23
“Machine learning is a subfield of computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence. In 1959, Arthur Samuel defined machine learning as a ‘Field of study that gives computers the ability to learn without being explicitly programmed’.” – Wikipedia
Some useful links I found while exploring deep/machine learning, AI, etc.:
Using the Superposition Benchmark software – click here to download.
For CineBench – click here to download the free benchmark software.
Compared with my custom main workstation PC, using dual cards: an 8GB GTX 1080 and a 4GB GTX 970.
With the same workstation, just with the 8GB GTX 1080 card.
From the above, and from what I have researched, it seems my main PC is limited a bit when the 4GB GTX 970 is in it.
Also note the processor is much older than the one in my new laptop below.
Below are the Alienware 17 R5 results.
Here are some surprising results generated from the new Blender Benchmark, part of the Blender Open Data project.
Quick summary/my findings:
- less render time running Linux [Linux Mint] than under Windows 10 [all background things turned off], on the same hardware
- SSDs help
Linux run: workstation with GTX 1080 8GB, GTX 970 4GB, 32GB RAM, quad-core Intel processor. No SSD.
Same hardware as above but run under Windows 10
Run on the same hardware, only using the 8GB GTX 1080, under Linux Mint – an impressive result showing, in my view and from the CUDA/Nvidia research I did, that the 4GB of the second graphics card limits the performance.
Home PC #2 results below: newer PC, no SSD, 2 x GTX 560 Ti.
Below are results from the Alienware 17 [4 years old] with a Samsung Evo 500GB SSD and a GTX 860M.
Updating to the latest CUDA development toolkit on Linux Mint 18.1 [with a GTX 1080 + GTX 970]:
First, update to the R390 driver, then install the CUDA 9.1 repo package (downloaded from Nvidia) and the toolkit:
sudo dpkg -i cuda-repo-ubuntu1704-9-1-local_9.1.85-1_amd64.deb
sudo apt-key add /var/cuda-repo-9-1-local/7fa2af80.pub
sudo apt-get update
sudo apt-get install cuda
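To confirm the install took, run nvcc --version and nvidia-smi on the machine itself. As a sketch (the sample string below mimics the last line nvcc prints for this 9.1 toolkit, since the real check only works on the box with the card):

```shell
# On the real machine: nvcc --version && nvidia-smi
# Offline sketch of pulling the release number out of nvcc's last output line:
sample='Cuda compilation tools, release 9.1, V9.1.85'
release=$(printf '%s\n' "$sample" | sed -n 's/.*release \([0-9.]*\),.*/\1/p')
echo "CUDA release: $release"
# prints: CUDA release: 9.1
```

nvidia-smi should also list both the GTX 1080 and the GTX 970.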
GTX 1080 at about 80 degrees C – doing 474 Sol/s
GTX 970 at about 56 degrees C – doing 275 Sol/s
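As a rough comparison from the two figures above (an illustrative one-liner, nothing more):

```shell
# Ratio of the two cards' Sol/s rates from the run above
awk 'BEGIN { printf "GTX 1080 / GTX 970 = %.2f\n", 474 / 275 }'
# prints: GTX 1080 / GTX 970 = 1.72
```

So the GTX 1080 is doing roughly 1.7x the GTX 970's rate, while also running considerably hotter.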
OS – Linux Mint