
Sunday, July 1, 2018

Ways of Using NVIDIA Graphics Card in a Linux Machine

There are different ways of using the NVIDIA graphics drivers on a Linux machine, such as:
1. Using it as the main GPU, i.e. configuring the X server to render everything through the NVIDIA driver (a minimal configuration sketch follows the cons below).
Cons:
a. Significantly higher power consumption compared to using the Intel HD GPU
b. Known artifacts: portions of the screen occasionally get corrupted. The corruption goes away when you hover the mouse over it, but it is still annoying
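For reference, a minimal sketch of what this option can look like: an /etc/X11/xorg.conf that tells the X server to use the proprietary nvidia driver for everything. The BusID below is only an example value, and your distribution may prefer a file under /etc/X11/xorg.conf.d/ instead; the Arch wiki page linked at the bottom has the full details.

# Write a minimal xorg.conf that makes the nvidia driver render everything.
# The BusID is an example; find the real one with: lspci | grep -E "VGA|3D"
sudo tee /etc/X11/xorg.conf > /dev/null <<'EOF'
Section "Device"
    Identifier "Nvidia Card"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection
EOF
# Restart the display manager (or reboot) for the change to take effect.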

2. Using it with Bumblebee to get a PRIME-like GPU-switching feature.
As most of you probably know, on Windows the default behavior is that the NVIDIA GPU is turned on only when it's needed. For example, when you're running a game it's turned on, but when you're browsing, editing a text file, or doing something else lightweight, it's turned off to conserve the battery. In Linux, however, such a feature doesn't work out of the box. The open-source community created an effort to get a similar feature: the Bumblebee project. With Bumblebee, the Intel HD GPU is the one that renders to the screen. The NVIDIA GPU, when requested, renders into an off-screen buffer, which is then copied into the Intel HD buffer to be displayed. Unlike Optimus on Windows, however, the switch between NVIDIA and Intel is not automated: you have to explicitly tell each program which GPU to use. A rough setup-and-usage sketch follows the cons below.
For more information: https://github.com/Bumblebee-Project/
Cons:
a. Setup can be pretty complex
b. Performance is not as good as using the NVIDIA GPU to render directly to the screen, because of the additional step of copying the NVIDIA-rendered buffer into the screen buffer, which consumes CPU time
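To make the workflow concrete, here is a rough sketch of the Bumblebee setup and usage on an Arch-based system. Package and service names differ between distributions, so treat this as an outline rather than exact instructions; the Bumblebee project page linked above has the full guide.

# Install Bumblebee plus the proprietary driver, add yourself to the
# bumblebee group, and enable the daemon (Arch names; other distros differ).
sudo pacman -S bumblebee nvidia mesa
sudo gpasswd -a $USER bumblebee
sudo systemctl enable bumblebeed.service
# After rebooting, explicitly run a program on the NVIDIA GPU
# (glxgears comes from the mesa-demos package):
optirun glxgears
# Anything started without optirun keeps using the Intel HD GPU.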

3. Using nvidia-xrun to run a separate X server instance backed by the NVIDIA GPU.
This is actually my favorite, because it is very easy to set up. To give a little bit of background: in Linux there is a component called the display server (on most desktops today, the X server), whose job is to provide an interface for applications to render things onto the screen. In other words, it's a bridge between applications and the display hardware; any application that wants to draw significant graphics on the screen uses the API provided by the X server.

So the idea here is to run two separate X servers. The first is used for the typical lightweight workload such as text editing, web browsing, etc., and is backed by the Intel HD driver. The second is backed by the NVIDIA driver, hence uses the NVIDIA card, and is only started when there is a need to run an application that requires it. This is very similar to the Bumblebee approach, except it is much simpler (a usage sketch is shown below).
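As a rough sketch of how that looks in practice, assuming nvidia-xrun is already installed (e.g. from the AUR) and a lightweight window manager such as openbox is available:

# Switch to a free virtual terminal first (e.g. Ctrl+Alt+F2) and log in there,
# since nvidia-xrun starts its own X server rather than using the current one.
# Start a whole session on the NVIDIA-backed X server:
nvidia-xrun openbox-session
# ...or run just a single application:
nvidia-xrun glxgears
# When you exit that session, you're back to the Intel-backed desktop
# on the other virtual terminal.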



Special thanks to:
https://wiki.archlinux.org/index.php/NVIDIA
https://github.com/Witko/nvidia-xrun
https://github.com/Bumblebee-Project/