tl;dr: received wisdom is that, when buying a Windows laptop to run Minecraft, a discrete GPU is hugely important. It is!
I always want to understand the basis of any advice; however, there isn't much supporting evidence for this on the internet. Having just purchased a Toshiba Satellite L50-B-1P1 laptop for my son, which can run Minecraft on either its integrated Intel HD 4400 or its discrete AMD R7 M260, it's easy to gather performance stats and draw some conclusions.
On the Internet it's pretty much unanimous that you should be running a 64-bit version of Windows (8.1 in our case) and the 64-bit version of Java 8, and that Minecraft should be set up to ask Java for 2GB of RAM. Those are pretty logical recommendations, so that was the base setup.
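As a concrete illustration of the 2GB recommendation: `-Xmx` is the standard JVM flag that caps the heap size, and the launcher has a JVM-arguments field where it is normally set. A minimal sketch (the jar path here is hypothetical):

```python
# Sketch of how a 2 GB heap request is passed to Java.
# -Xmx2G caps the maximum heap at 2 GB; "minecraft.jar" is a placeholder,
# as the real launcher manages the jar and its arguments itself.
jvm_args = ["-Xmx2G"]
cmd = ["java", *jvm_args, "-jar", "minecraft.jar"]
print(" ".join(cmd))  # java -Xmx2G -jar minecraft.jar
```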
Discrete versus Integrated
These are all taken with the window maximised to 1366 x 705.
|GPU|Spot FPS|Minimum FPS (flying around)|
|---|---|---|
|Intel HD 4400|63|30|
|AMD R7 M260|109|45|
Clearly the discrete AMD R7 wins by a considerable margin and gives a much better overall experience. The performance of the HD 4400 isn't bad either, though; you could play this level with either chip. The AMD R7 definitely gives you more scope to increase the number of chunks drawn, pushing out the draw distance. Alternatively, you could install more render-intensive mods or turn on the more extreme world features.
Full Screen versus Half Screen
These all use the discrete AMD R7 M260.
|Resolution|Frames Per Second|
|---|---|
|1366 x 705|109|
|667 x 689|150|
In this case the frame rate shoots up as I (roughly) halve the area of the window Minecraft is drawing into. This suggests the speed of the game is directly affected by the number of pixels it has to draw, and is again limited by the graphics card.
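A quick back-of-the-envelope check on that claim, using the resolutions and frame rates from the table above (the comparison itself is my own arithmetic):

```python
# Pixel counts for the two window sizes tested above.
full = 1366 * 705   # maximised window
half = 667 * 689    # roughly half-size window

pixel_ratio = full / half   # how many times fewer pixels to draw
fps_ratio = 150 / 109       # observed frame-rate gain

print(round(pixel_ratio, 2))  # ~2.1x fewer pixels
print(round(fps_ratio, 2))    # ~1.38x higher frame rate
```

The frame rate doesn't scale fully with pixel count, which hints that some fixed per-frame cost (game logic, geometry) remains alongside the per-pixel drawing work.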
VBO versus no VBO
There's just one last thing. In the graphics options there's an option to use 'VBOs', or Vertex Buffer Objects. In a nutshell, when the game uses VBOs it creates chunks of work itself to hand to the graphics card; when it doesn't, it lets the graphics driver build up those chunks for it. As a games programmer I'd expect the game to run faster when it decides how to create the chunks of work. For Minecraft this isn't always the case!
|GPU|Spot FPS (VBOs on)|Minimum FPS (VBOs on)|
|---|---|---|
|Intel HD 4400|72|63|
|AMD R7 M260|103|45|
VBOs are faster for the Intel graphics chip; unfortunately, driver bugs cause the world to flicker in this case. For the AMD R7, Minecraft actually runs slower with VBOs turned on. The best thing to do is to try the setting both on and off and see which is better on your machine.
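The VBO distinction described above can be sketched in miniature. This is a conceptual illustration only, not real Minecraft or OpenGL code; the point is the difference in how many submissions the game makes:

```python
# Conceptual sketch of the two submission styles.
import array

vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]

# VBO style: the game flattens its vertex data into one buffer up front
# and hands the whole chunk of work to the graphics card in a single call.
vbo_buffer = array.array("f", [c for v in vertices for c in v])
vbo_submissions = 1

# Non-VBO style: vertices are handed over piecemeal, and the driver is
# left to batch them into GPU-sized chunks itself.
immediate_submissions = len(vertices)

print(vbo_submissions, immediate_submissions)  # 1 3
```

Whether the game or the driver does the batching better is exactly what the table above is measuring, and as it shows, the answer differs per graphics chip.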
The newest Intel GPUs aren't bad, provided they're an HD 4400 or better. Get a discrete GPU, though, for significantly better performance.
Replicating these tests
It would be interesting if there were more directly comparable performance measurements for Minecraft. The following should be enough information to reproduce these tests. Minecraft will display all the information you need on screen simply by pressing 'F3' in the game.
The performance of Minecraft itself seems to vary a lot from version to version; I was simply using the latest version, 1.8.1. For all the performance tests I turned vsync off in the graphics options and allowed the frame rate to run as high as it could. Normally I'd clamp the frame rate to 60 fps, as letting it run higher doesn't give any visible benefit but uses more processor power.
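The 60 fps clamp is easy to justify with frame-time arithmetic (my own numbers, assuming a typical 60 Hz laptop display):

```python
# Convert frame rates to per-frame time budgets in milliseconds.
def frame_ms(fps):
    return 1000.0 / fps

print(round(frame_ms(60), 1))   # 16.7 ms: all a 60 Hz display can show
print(round(frame_ms(109), 1))  # 9.2 ms: faster frames the screen never displays
```

Anything rendered faster than one frame per 16.7 ms is work the display simply cannot show, which is why uncapped frame rates only make sense for benchmarking.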
The integrated graphics chip is an Intel HD 4400 with the 4.2.0 10.18.10.3335 driver, and the discrete chip is the AMD R7 M260 with the 4.3.12798 driver. Intel clearly wins the driver-version-number-length competition.
Finally, I was flying around a world generated from the seed "Jools Performance Test" in creative mode, and took a spot reading at (-223, 75, 336) looking at (80.5, -4.5). That view is shown in the image above.
It would be great to get other people's results for their setups.