May 30 2011

Crazy Bumble-weeks

It has now been about 4 weeks since I sat in a small hotel room in Norway and implemented the first lines of what was going to become the bumblebee project.

By now, the first blog page alone has passed 30,000 hits, I have about 200 followers on Twitter, and about 200 issues have been created on GitHub (about 130 closed)…

z0rc has forked the project as debumblebee and is now supporting Debian through that.

Joaquín is providing support for Arch Linux, and backbone handles Gentoo.

arnauddupuis and paulvriens are helping me with Fedora and openSUSE support for bumblebee, and other people are also beginning to submit patches and suggestions (thanks!)..

These have really been some crazy weeks, and things are not looking to settle down right now.. 😀

Thanks to all for your support, suggestions and reports..

 

Arch Linux Support:

http://aur.archlinux.org/packages.php?ID=49469

wiki:

https://wiki.archlinux.org/index.php/Bumblebee

Debian Linux Support:

https://github.com/z0rc/debumblebee

Gentoo Linux Support:

https://github.com/iegor/bumblebee-Gentoo-support


May 3 2011

Optimus on Linux Problem Solved

Hi all..

Bumblebee v.1.6.43 has been released…

These have been some crazy days, with A LOT of feedback and bug reports…

If you want to speed up the work, send me a Red Bull here.

If you get your machine running with bumblebee, please run the bumblebee-submitsystem.

Right now so many problems are being reported that I have completely lost the overview…

Therefore I will now introduce a new way to report bugs regarding bumblebee..

The procedure is as follows:

Install the newest version of Bumblebee…

If you have had any previous installation of Bumblebee/prime-ng, please run the bumblebee-uninstall tool and reinstall afterwards (no need to reboot)..

If you still have a problem, please create an issue on github:

https://github.com/MrMEEE/bumblebee/issues

Then run the bumblebee-bugreport tool and send the created package to mj@casalogic.dk
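In shell terms, the procedure looks roughly like this (a sketch only; bumblebee-uninstall and bumblebee-bugreport are the tools named above, and I am assuming both need root):

sudo bumblebee-uninstall    # remove any previous Bumblebee/prime-ng installation
                            # ... reinstall the newest Bumblebee, then retest ...
sudo bumblebee-bugreport    # creates the report package to mail in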

I know it’s easier just to send an email.. But this will help me get the bugs organised and hopefully provide help faster..

Thanks..

Martin Juhl

 

First of all… the project has been renamed to bumblebee (the support and right hand of Optimus Prime).

From now on all howto/software updates will be done on github: https://github.com/MrMEEE/bumblebee

Only status and tests will be kept here…

 

The installation should now be completely automatic…

3D is now enabled on both the Intel and the nVidia card at the same time..

The Intel card runs the desktop, and the nVidia card can be used for the tougher applications with the command “optirun32 <application>” or “optirun64 <application>”…
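For example, to run the glxgears test program on the nVidia card on a 64-bit system:

optirun64 glxgears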

So now we just need somebody to write an application that automatically balances which applications run on which card.. and shuts the nVidia card down when it is not in use..

But still I think it is a good step forward…

Ups:

We got nVidia acceleration in our Linux games and programs.. Wine as well..

We got CUDA support… also for SDK 4.0, which Windows doesn’t support with the Optimus drivers.

Downs:

A few programs don’t work.. so far I have only found Diablo 2.. which gives some weird double/single buffer error in Wine…
This seems to be an nVidia problem in general, and nothing to do with this solution.. Diablo works with the -w parameter for windowed mode, and people with just an nVidia card are reporting this problem as well..

Still no transparent/automatic switching like in Window$.. but a big step in the right direction..

No acceleration from the Intel card.. so Nvidia 3D or none…

VDPAU doesn’t seem to be working on the Intel screen, only on the non-watchable nVidia screen… we might need to implement something like VNC to get it working…

Drains the battery pretty fast… though not as bad as I feared… I get 2+ hours of battery…

Further Ideas:

Combine this with the ACPI module hack to get more battery time when the nVidia card is not used.. auto-shutdown??

Maybe a separate X server for the nVidia card is the way to go, so we can shut that down as well when not in use.. Working in prime-ng v2.0..

Try to get both the Intel and the nVidia libraries installed at the same time, so that 3D is enabled for both cards.. Working in prime-ng v2.0..

 

I will now update this with a version number.. this should make it easier to look for changes: v1.4.2 (latest: support for more laptops, and performance tests)

 

OUTDATED FROM HERE AND DOWN… GO TO GITHUB INSTEAD…

 

 

I have found a way to use the nVidia card in machines WITHOUT the Optimus mux…

There are still a few flaws… but in my view they are minor…

This has been tested successfully on:

Machine               User
Alienware M11X R2     Martin Juhl
Asus N82Jv            André Ventura
Acer EeePC 1215N      Ellington Santos
Asus N61Jv (X64Jv)    Ted
Acer Aspire 5745PG    Miguel Belmonte
Dell Vostro 3300      Peter Liedler
Acer EeePC 1215N      Fedor Indutny

First of all, I have this running on my Alienware M11X R2 on Ubuntu Natty 11.04 64-bit.. I haven’t tried it on any other configurations.. so I hope you will report back whether it works on other laptops (it should) and on other distributions…

Here it goes:

First of all download the following:

General:

http://www.martin-juhl.dk/optimus/xorg.conf

32-bit deb-based:

http://www.martin-juhl.dk/optimus/turbojpeg_1.11.1_i386.deb

http://www.martin-juhl.dk/optimus/VirtualGL_2.2.1_i386.deb

64-bit deb-based:

http://www.martin-juhl.dk/optimus/turbojpeg_1.11.1_amd64.deb

http://www.martin-juhl.dk/optimus/VirtualGL_2.2.1_amd64.deb

32-bit rpm-based:

http://www.martin-juhl.dk/optimus/turbojpeg-1.11.i386.rpm

http://www.martin-juhl.dk/optimus/VirtualGL-2.2.1.i386.rpm

64-bit rpm-based:

http://www.martin-juhl.dk/optimus/turbojpeg-1.11.x86_64.rpm

http://www.martin-juhl.dk/optimus/VirtualGL-2.2.1.x86_64.rpm

Source:

http://www.martin-juhl.dk/optimus/turbojpeg-ipp-1.11.1.tar.gz

http://www.martin-juhl.dk/optimus/VirtualGL-2.2.1.tar.gz

Files can also be found here:

http://sourceforge.net/projects/virtualgl/files/

Ok… Installation:

Start by installing the nvidia driver:

sudo aptitude install nvidia-current     (ubuntu)

Then put the xorg.conf in /etc/X11/.

It might be necessary to change the “ConnectedMonitor” option in the nVidia Device section, and the BusIDs for the Intel and nVidia cards (see the sketch after the table below)..

Here is what I got so far:

Laptop                Intel BusID   Nvidia BusID   ConnectedMonitor
Alienware M11X        PCI:0:2:0     PCI:1:0:0      CRT-0
Dell XPS 15           PCI:0:2:0     PCI:2:0:0      CRT-0
Asus N61Jv (X64Jv)    PCI:0:2:0     PCI:1:0:0      CRT-0
Acer EeePC 1215N      PCI:0:2:0     PCI:4:0:0      DFP-0
Acer Aspire 5745PG    PCI:0:2:0     PCI:1:0:0      DFP-0
Dell Vostro 3300      PCI:0:2:0     PCI:1:0:0      DFP-0
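As a rough sketch, the two Device sections in the xorg.conf look something like this (the Identifier names here are illustrative; substitute the BusID and ConnectedMonitor values from your row in the table above):

Section "Device"
    Identifier "Intel"
    Driver     "intel"
    BusID      "PCI:0:2:0"
EndSection

Section "Device"
    Identifier "nVidia"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"                  # nVidia BusID from the table
    Option     "ConnectedMonitor" "CRT-0"   # or DFP-0, depending on the laptop
EndSection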

 

After that, install the VirtualGL package you downloaded above:

sudo dpkg -i VirtualGL*        (deb)

or

sudo rpm -ihv VirtualGL*     (rpm)

Note:

Turbojpeg/libjpeg-turbo is not needed and actually gives lower performance.. I left the links in case you want to experiment..

Performance Tests can be found at the end…

Now run:

sudo vglserver_config

answer as below:

1) Configure server for use with VirtualGL in GLX mode
2) Unconfigure server for use with VirtualGL in GLX mode
X) Exit

Choose:
1

Restrict 3D X server access to vglusers group (recommended)?
[Y/n]
n

Restrict framebuffer device access to vglusers group (recommended)?
[Y/n]
n

Disable XTEST extension (recommended)?
[Y/n]
y
… Creating /etc/modprobe.d/virtualgl.conf to set requested permissions for
/dev/nvidia* …
… Attempting to remove nvidia module from memory so device permissions
will be reloaded …
ERROR: Module nvidia is in use
… Granting write permission to /dev/nvidia0 /dev/nvidiactl for all users …
… Modifying /etc/X11/xorg.conf to enable DRI permissions for
all users …
… Adding xhost +LOCAL: to /etc/kde4/kdm/Xsetup script …
… Disabling XTEST extension in /etc/kde4/kdm/kdmrc …

Done. You must restart the display manager for the changes to take effect.

IMPORTANT NOTE: Your system uses modprobe.d to set device permissions. You
must execute rmmod nvidia with the display manager stopped in order for the
new device permission settings to become effective.

1) Configure server for use with VirtualGL in GLX mode
2) Unconfigure server for use with VirtualGL in GLX mode
X) Exit

Choose:
x
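To act on the IMPORTANT NOTE above, a minimal sketch (assuming GDM as the display manager, as on standard Ubuntu; substitute kdm if you use KDE) is to switch to a text console with Ctrl+Alt+F1 and run:

sudo service gdm stop     # stop the display manager
sudo rmmod nvidia         # unload the module so the new device permissions apply
sudo service gdm start

The reboot later in this guide achieves the same thing.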

Then:

Append the following lines to /etc/profile, or to your ~/.bashrc if that exists..

VGL_DISPLAY=:0.1
export VGL_DISPLAY
VGL_COMPRESS=yuv
export VGL_COMPRESS
VGL_READBACK=fbo
export VGL_READBACK
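As a quick sanity check, you can source the file you edited and confirm that the variables are set:

source /etc/profile       # or ~/.bashrc, whichever you edited
echo $VGL_DISPLAY         # should print :0.1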

Now you need to make sure that your desktop environment runs the following command after starting up..

vglclient localhost &

In KDE this is achieved by creating ~/.kde/share/autostart/vglclient.desktop (you might need to create the autostart folder as well):

[Desktop Entry]
Comment[da]=
Comment=
Exec=vglclient localhost &
GenericName=
Icon=exec
MimeType=
Name=vglclient
Path=
StartupNotify=true
Terminal=false
TerminalOptions=
Type=Application
X-DBUS-ServiceName=
X-DBUS-StartupType=
X-KDE-SubstituteUID=false
X-KDE-Username=
X-Ubuntu-Gettext-Domain=desktop_kdebase

and reboot…

Hopefully your computer comes back up..

Now you should be able to start applications with:

vglrun <application>

and the nvidia card will be used for acceleration..

By the way, <application> needs to contain the full path to the application if it is not in your PATH…
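For example, running the glxgears test program (also used for the performance numbers below) on the nVidia card:

vglrun glxgears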

It is still the Intel card running the rest.. and for now I haven’t found a way to activate acceleration for both cards.. so no fancy Compiz effects.. but that’s no problem for me, as long as I can use my nVidia card for gaming :D…

Hope this will help someone..

Nvidia Always on in Gnome or KDE:

This is experimental.. there might be unforeseen problems:

KDE:

Create: /usr/share/xsessions/kde-plasma-vgl.desktop:

[Desktop Entry]
Encoding=UTF-8
Type=XSession
Exec=/usr/bin/startkde-vgl
TryExec=/usr/bin/startkde-vgl
Name=KDE Plasma Workspace Nvidia Accelerated
Comment=The desktop made by KDE

Create: /usr/bin/startkde-vgl:

#!/bin/bash
/usr/bin/vglrun -d :0.1 /usr/bin/startkde

and run:

chmod +x /usr/bin/startkde-vgl

 

Gnome:

Create: /usr/share/xsessions/gnome-vgl.desktop:

[Desktop Entry]
Type=XSession
Exec=/usr/bin/gnome-session-vgl
TryExec=/usr/bin/gnome-session-vgl
Name=GNOME Nvidia Accelerated
Comment=The GNU Network Object Model Environment. A complete, free and easy-to-use desktop environment
X-Ubuntu-Gettext-Domain=desktop_kdebase-workspace

Create: /usr/bin/gnome-session-vgl:

#!/bin/bash
/usr/bin/vglrun -d :0.1 /usr/bin/gnome-session

and run:

chmod +x /usr/bin/gnome-session-vgl

The accelerated sessions should now be available in GDM or KDM..

KDE and GNOME don’t work with desktop effects, but everything else seems to work.. (applications are automatically started on the nVidia card)

Performance:

I’m using glxgears and getting:

~ 1700 fps with libjpeg-turbo

7362 frames in 5.0 seconds = 1472.247 FPS
6651 frames in 5.0 seconds = 1330.055 FPS
6507 frames in 5.0 seconds = 1301.198 FPS
9399 frames in 5.0 seconds = 1879.629 FPS
9519 frames in 5.0 seconds = 1903.788 FPS
8857 frames in 5.0 seconds = 1771.247 FPS
9524 frames in 5.0 seconds = 1904.607 FPS
8545 frames in 5.0 seconds = 1708.921 FPS
9485 frames in 5.0 seconds = 1896.910 FPS
9495 frames in 5.0 seconds = 1898.814 FPS

~ 1600 fps with turbojpeg

6580 frames in 5.0 seconds = 1315.794 FPS
7177 frames in 5.0 seconds = 1435.397 FPS
9346 frames in 5.0 seconds = 1869.135 FPS
8206 frames in 5.0 seconds = 1641.097 FPS
6429 frames in 5.0 seconds = 1285.784 FPS
7006 frames in 5.0 seconds = 1401.094 FPS
6229 frames in 5.0 seconds = 1245.770 FPS
7559 frames in 5.0 seconds = 1511.768 FPS
8541 frames in 5.0 seconds = 1708.090 FPS
8503 frames in 5.0 seconds = 1700.564 FPS
8501 frames in 5.0 seconds = 1700.139 FPS
8540 frames in 5.0 seconds = 1707.930 FPS

~ 1900 fps with none of them installed

7621 frames in 5.0 seconds = 1524.158 FPS
9526 frames in 5.0 seconds = 1905.055 FPS
9504 frames in 5.0 seconds = 1900.725 FPS
9487 frames in 5.0 seconds = 1897.317 FPS
9419 frames in 5.0 seconds = 1883.613 FPS
8754 frames in 5.0 seconds = 1750.718 FPS
9435 frames in 5.0 seconds = 1886.882 FPS
9400 frames in 5.0 seconds = 1879.999 FPS
9408 frames in 5.0 seconds = 1881.523 FPS

~ 1600 fps with yuv compression (VGL_COMPRESS=yuv)

7888 frames in 5.0 seconds = 1577.544 FPS
7906 frames in 5.0 seconds = 1581.168 FPS
7963 frames in 5.0 seconds = 1592.556 FPS
8090 frames in 5.0 seconds = 1617.894 FPS
8122 frames in 5.0 seconds = 1624.336 FPS
8204 frames in 5.0 seconds = 1640.684 FPS
8377 frames in 5.0 seconds = 1675.184 FPS
8288 frames in 5.0 seconds = 1657.548 FPS
8399 frames in 5.0 seconds = 1679.671 FPS