Hi, I've been trying to get DRM/KMS working with the current graphics stack (xf86-video-ati 7.5, xserver-1.17) on an R200 series card. I assumed this should be working, since KMS was implemented for it a while back and it had been working with xf86-video-ati-6.x.
Unfortunately, it doesn't seem to work.
I've narrowed it down to drmSetInterfaceVersion() failing when called from the ATI driver (in radeon_kms.c). This is a bit strange, since /sys/class/drm/version correctly reports 1.1.0 20060810. Presumably it's getting the correct fd for the DRM master, otherwise it should have bailed earlier?
Googling confirms others have had the same issue, and generally the resolution has been to stick with the old driver.
Should this be working? Is it known to be broken?
On Mon, Jul 6, 2015 at 9:39 AM, Steven Newbury steve@snewbury.org.uk wrote:
> Should this be working? Is it known to be broken?
It should be working. Make sure the kernel driver has KMS enabled, the firmware is available, and that the kernel driver is loaded before starting X. If the kernel driver is not loaded before X starts you can get a version mismatch error.
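For completeness, on a typical setup those requirements look something like this (file paths and firmware names below are the usual defaults, not taken from this thread; adjust as needed):

```
# Kernel config: KMS-capable radeon driver, legacy fbdev driver off
CONFIG_DRM=y
CONFIG_DRM_RADEON=m
# CONFIG_FB_RADEON is not set

# /etc/modprobe.d/radeon.conf -- force KMS on even where the default is off
options radeon modeset=1

# Firmware the kernel driver loads for an R200 (shipped in linux-firmware)
# /lib/firmware/radeon/R200_cp.bin
```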
Alex
On Mon, 2015-07-06 at 12:25 -0400, Alex Deucher wrote:
> It should be working. Make sure the kernel driver has KMS enabled, the firmware is available, and that the kernel driver is loaded before starting X. If the kernel driver is not loaded before X starts you can get a version mismatch error.
Yes, using Gentoo's 4.1.1 kernel; the driver is definitely loaded with modeset=1, which is working, and all the sysfs entries are there. gdm manages to fall back to starting an X session without DRM, swrast-only; not something you want to experience on such a weak CPU!
Manually starting X fails with the "[drm] failed to set drm interface version." error.
It's a very old system, PCI-only(!) Coppermine-128, belonging to a friend. The system was previously (very slowly) running Ubuntu 10 LTS, I think. It's not my machine so I don't have continuous access, but R200 DRM/KMS was working. Apparently Ubuntu no longer supports R200, so no further updates were possible.
My friend can't afford a new machine at the moment, and since I'm a long-time Gentoo dev I took it upon myself to build him an optimized desktop with gcc-5, LTO where possible, -Os, and -march=pentium3 with the system's L1 and L2 cache size information. It's quite possible some part of the gfx stack is miscompiled; I tested it pretty thoroughly under qemu (with qxl) before deployment, but that of course didn't exercise the R200 driver. FWIW, other than the failing DRI, performance is surprisingly OK; not super fast obviously, but a *lot* better than under Ubuntu! (Start-up time is quicker by an order of magnitude!)
I'm attempting to downgrade the xserver and drivers (on the live system) to see if that works; as you can imagine, that takes a little while on a Coppermine-128! I'll find out tomorrow. Otherwise, I guess I'm recompiling the stack with gcc-4.9 and no LTO...
On Mon, Jul 6, 2015 at 2:40 PM, Steven Newbury steve@snewbury.org.uk wrote:
> Yes, using Gentoo's 4.1.1 kernel; the driver is definitely loaded with modeset=1, which is working, and all the sysfs entries are there.
If the kernel driver loads properly and you get a kms console you should be good to go.
> Manually starting X fails with the "[drm] failed to set drm interface version." error.
Maybe the ddx on that old system was built without KMS support?
Alex
On Mon, 2015-07-06 at 15:42 -0400, Alex Deucher wrote:
> Maybe the ddx on that old system was built without KMS support?
Everything is freshly compiled. The error itself is coming from radeon_kms.c:651 in the ddx.
On Mon, Jul 06, 2015 at 09:06:28PM +0100, Steven Newbury wrote:
> Everything is freshly compiled. The error itself is coming from radeon_kms.c:651 in the ddx.
Do you have the latest libdrm? We might have accidentally broken this for very old versions of libdrm (although it's surprising this would compile with everything else).
-Daniel
dri-devel mailing list dri-devel@lists.freedesktop.org http://lists.freedesktop.org/mailman/listinfo/dri-devel
On Mon, 2015-07-06 at 23:26 +0200, Daniel Vetter wrote:
> Do you have the latest libdrm? We might have accidentally broken this for very old versions of libdrm (although it's surprising this would compile with everything else). -Daniel
It's the latest in portage:
x11-libs/libdrm-2.4.62::gentoo USE="libkms -static-libs -valgrind" VIDEO_CARDS="radeon (-exynos) (-freedreno) -intel -nouveau (-omap) (-tegra) -vmware"
I'll know tomorrow whether downgrading works; if not, I'm suspecting a compiler bug. I have wondered, though, how much testing R200 could be getting with the latest stack.
If it is a miscompile, I guess it must be in either libdrm or the ddx; hopefully, at least, that shouldn't mean too much compiling!
Slightly off topic, I'm curious whether R200 could be used with Wayland. How difficult/possible would it be to get R200 working with GLES1? Obviously GLES2 would be a non-starter given it's OpenGL 1.3 hardware. Would GLES1 be sufficient to run a Wayland compositor? I'm guessing probably not..?
On Mon, 06 Jul 2015 22:50:30 +0100 Steven Newbury steve@snewbury.org.uk wrote:
> Would gles1 be sufficient to run a Wayland compositor, I'm guessing probably not..?
If you can find a Wayland compositor that is written to composite with GLES1, that's all you need from the "Wayland side". (Yeah, this has nothing to do with Wayland per se.) Compositing in itself without any effects is very simple, as long as you get the textures up.
Or, if you find a Wayland compositor written to use desktop OpenGL for compositing and does not use features your GL driver does not expose, that's good too.
Absolutely nothing about Wayland limits your choice of the GL flavour - even more so as the compositor is not running *on* Wayland.
Also, the question of running GL apps on Wayland is another matter entirely. There used to be a common misconception that Wayland had something to do with only allowing GLES.
Finally, there is the option of software rendering for composition...
Thanks, pq
On Tue, 2015-07-07 at 09:18 +0300, Pekka Paalanen wrote:
> Or, if you find a Wayland compositor written to use desktop OpenGL for compositing and does not use features your GL driver does not expose, that's good too.
Is desktop OpenGL accessible from "EGL_PLATFORM=drm"?
> Finally, there is the option of software rendering for composition...
Well, considering I was wondering about running Wayland on ancient hardware, perhaps software compositing wouldn't be ideal! ;-)
On Wed, 2015-07-08 at 21:56 +0100, Steven Newbury wrote:
> Is desktop OpenGL accessible from "EGL_PLATFORM=drm"?
To answer my own question, it seems that it is possible. I wonder if it works with mutter/cogl?
On 09.07.2015 06:01, Steven Newbury wrote:
> To answer my own question, it seems that it is possible. I wonder if it works with mutter/cogl?
It does.
However, your problem seems rather to be that gnome-shell/mutter doesn't support R200 anymore.
On Thu Jul 9 03:32:40 2015 GMT+0100, Michel Dänzer wrote:
On 09.07.2015 06:01, Steven Newbury wrote:
On Wed, 2015-07-08 at 21:56 +0100, Steven Newbury wrote:
On Tue, 2015-07-07 at 09:18 +0300, Pekka Paalanen wrote:
On Mon, 06 Jul 2015 22:50:30 +0100 Steven Newbury steve@snewbury.org.uk wrote:
Would gles1 be sufficient to run a Wayland compositor, I'm guessing probably not..?
If you can find a Wayland compositor that is written to composite with GLES1, that's all you need from the "Wayland side". (Yeah, this has nothing to do with Wayland per se.) Compositing in itself without any effects is very simple, as long as you get the textures up.
Or, if you find a Wayland compositor written to use desktop OpenGL for compositing and does not use features your GL driver does not expose, that's good too.
Is desktop OpenGL accessible from "EGL_PLATFORM=drm"?
To answer my own question, it seems that is possible. I wonder if it works with mutter/cogl???
It does.
However, your problem seems rather that gnome-shell/mutter doesn't support R200 anymore.
Yes, that's true. I wonder if I can revert the incompatible change, or, better, create an environment variable to revert to the compatible behaviour or something? I'll need to take a look...
[I'm going to be posting a lot from my phone, my laptop power jack has just broken, again! :-( You'd think after 3.5 decades of making laptops we'd have come up with a better design!]
On Thu, Jul 9, 2015 at 2:58 AM, Steven Newbury steve@snewbury.org.uk wrote:
> Yes, that's true. I wonder if I can revert the incompatible change, or, better, create an environment variable to revert to the compatible behaviour or something? I'll need to take a look...
I'm not sure how valid this is any more: https://bugs.freedesktop.org/show_bug.cgi?id=51658
The basic issue is that r1xx/r2xx hardware only has a limited number of render buffer formats, while it supports a lot of texture formats. Gnome shell expects to be able to render to the same formats it can texture from.
Alex
On Thu Jul 9 16:04:35 2015 GMT+0100, Alex Deucher wrote:
> The basic issue is that r1xx/r2xx hardware only has a limited number of render buffer formats, while it supports a lot of texture formats. Gnome shell expects to be able to render to the same formats it can texture from.
It looks like a bit of a hack, and it's a pity to lose valid texture formats, but I've applied the patches from the bug after a little manual intervention. It's building now; it will take some time!
On Thu Jul 9 17:02:12 2015 GMT+0100, Steven Newbury wrote:
> It looks like a bit of a hack, and it's a pity to lose valid texture formats, but I've applied the patches from the bug after a little manual intervention. It's building now; it will take some time!
Didn't work. Still get the same error! :-(
On the other hand, I've got muffin to work; the compositor is actually okay, so I thought I'd try cinnamon, but it's trying to use 1.5GB, which isn't going to work! ;-)
Will probably revert to Xfce...
On Mon Jul 6 22:26:25 2015 GMT+0100, Daniel Vetter wrote:
On Mon, Jul 06, 2015 at 09:06:28PM +0100, Steven Newbury wrote:
On Mon, 2015-07-06 at 15:42 -0400, Alex Deucher wrote:
On Mon, Jul 6, 2015 at 2:40 PM, Steven Newbury <steve@snewbury.org.uk
wrote: On Mon, 2015-07-06 at 12:25 -0400, Alex Deucher wrote:
On Mon, Jul 6, 2015 at 9:39 AM, Steven Newbury < steve@snewbury.org.uk
wrote: Hi, I've been trying to get DRM/KMS working with the current graphics stack (xf86-video-ati 7.5, xserver-1.17) on a R200 series card. I assumed this should be working since KMS was implemented for it a while back, and it has been working with xf86-video-ati-6.x.
Unfortunately, it doesn't seem to work.
I've narrowed it down to drmSetInterfaceVersion() failing when called from the ATI driver (in radeon_kms.c). This is a bit strange since, /sys/class/drm/version correctly reports 1.1.0 20060810. Presuably it's getting the correct fd for the DRM master otherwise it should bail earlier?
Googling confirms others have had the same issue, and generally the resolution has been to stick with the old driver.
Should this be working? Is it known to be broken?
It should be working. Make sure the kernel driver has kms enabled, firmware available, and that the kernel driver is loaded before starting X. If the kernel driver is not loaded before X starts you can get a version mis-match error.
Yes, using Gentoo's 4.1.1 kernel; the driver is definitely loaded with modeset=1, which is working, and all sysfs entries are there. gdm manages to fall back to starting up an X session without using DRM (swrast-only), not something you want to experience on such a weak CPU!
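For reference, the KMS toggle being discussed is the radeon kernel module parameter. A minimal sketch of pinning it via modprobe configuration (the file name is an assumption, and modeset=1 is already the default on kernels of this era):

```
# /etc/modprobe.d/radeon.conf  (hypothetical file name)
options radeon modeset=1
```

Whether it actually took effect can be checked with `cat /sys/module/radeon/parameters/modeset` once the module is loaded.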
If the kernel driver loads properly and you get a kms console you should be good to go.
Manually starting X fails with the "[drm] failed to set drm interface version." error.
Maybe the ddx with that old system was built without KMS support?
Everything is freshly compiled. The error itself is coming from radeon_kms.c:651 in the ddx.
Do you have latest libdrm? We might have accidentally broken this for very old versions of libdrm (although surprising this would compile with everything else). -Daniel
Alex
It's a very old system, PCI-only(!) Coppermine-128, belonging to a friend. The system was previously (very slowly) running Ubuntu 10 LTS, I think. It's not my machine so I'm not able to have continuous access, but R200 DRM/KMS was working. Apparently, Ubuntu no longer supports R200, so no further updates were possible.
My friend can't afford a new machine at the moment; and since I'm a long-time Gentoo dev I took it upon myself to build him an optimized desktop with gcc-5, where possible LTO, -Os, -march=pentium3 with the system's L1 and L2 cache size information. It's quite possible some part of the gfx stack is miscompiled; I tested it pretty thoroughly under qemu (with qxl) before deployment, but that of course didn't exercise the R200 driver. FWIW, other than the failing DRI, performance is surprisingly OK, not super fast obviously, but a *lot* better than under Ubuntu! (start-up time is a lot quicker, by an order of magnitude!)
I'm attempting to downgrade the xserver and drivers (on the live system) to see if that works, you can imagine that takes a little while on a Coppermine-128! I'll find out tomorrow. Otherwise, I guess I'm recompiling the stack with gcc-4.9 and no-LTO...
dri-devel mailing list dri-devel@lists.freedesktop.org http://lists.freedesktop.org/mailman/listinfo/dri-devel
-- Daniel Vetter Software Engineer, Intel Corporation http://blog.ffwll.ch
On Mon, 2015-07-06 at 23:26 +0200, Daniel Vetter wrote:
On Mon, Jul 06, 2015 at 09:06:28PM +0100, Steven Newbury wrote:
On Mon, 2015-07-06 at 15:42 -0400, Alex Deucher wrote:
On Mon, Jul 6, 2015 at 2:40 PM, Steven Newbury < steve@snewbury.org.uk
wrote: On Mon, 2015-07-06 at 12:25 -0400, Alex Deucher wrote:
On Mon, Jul 6, 2015 at 9:39 AM, Steven Newbury < steve@snewbury.org.uk
wrote: Hi, I've been trying to get DRM/KMS working with the current graphics stack (xf86-video-ati 7.5, xserver-1.17) on a R200 series card. I assumed this should be working since KMS was implemented for it a while back, and it has been working with xf86-video-ati-6.x.
Unfortunately, it doesn't seem to work.
I've narrowed it down to drmSetInterfaceVersion() failing when called from the ATI driver (in radeon_kms.c). This is a bit strange since /sys/class/drm/version correctly reports 1.1.0 20060810. Presumably it's getting the correct fd for the DRM master, otherwise it should bail earlier?
Googling confirms others have had the same issue, and generally the resolution has been to stick with the old driver.
Should this be working? Is it known to be broken?
It should be working. Make sure the kernel driver has kms enabled, firmware available, and that the kernel driver is loaded before starting X. If the kernel driver is not loaded before X starts you can get a version mis-match error.
Yes, using Gentoo's 4.1.1 kernel; the driver is definitely loaded with modeset=1, which is working, and all sysfs entries are there. gdm manages to fall back to starting up an X session without using DRM (swrast-only), not something you want to experience on such a weak CPU!
If the kernel driver loads properly and you get a kms console you should be good to go.
Manually starting X fails with the "[drm] failed to set drm interface version." error.
Maybe the ddx with that old system was built without KMS support?
Everything is freshly compiled. The error itself is coming from radeon_kms.c:651 in the ddx.
Do you have latest libdrm? We might have accidentally broken this for very old versions of libdrm (although surprising this would compile with everything else). -Daniel
Alex
It's a very old system, PCI-only(!) Coppermine-128, belonging to a friend. The system was previously (very slowly) running Ubuntu 10 LTS, I think. It's not my machine so I'm not able to have continuous access, but R200 DRM/KMS was working. Apparently, Ubuntu no longer supports R200, so no further updates were possible.
My friend can't afford a new machine at the moment; and since I'm a long-time Gentoo dev I took it upon myself to build him an optimized desktop with gcc-5, where possible LTO, -Os, -march=pentium3 with the system's L1 and L2 cache size information. It's quite possible some part of the gfx stack is miscompiled; I tested it pretty thoroughly under qemu (with qxl) before deployment, but that of course didn't exercise the R200 driver. FWIW, other than the failing DRI, performance is surprisingly OK, not super fast obviously, but a *lot* better than under Ubuntu! (start-up time is a lot quicker, by an order of magnitude!)
I'm attempting to downgrade the xserver and drivers (on the live system) to see if that works, you can imagine that takes a little while on a Coppermine-128! I'll find out tomorrow. Otherwise, I guess I'm recompiling the stack with gcc-4.9 and no-LTO...
Sorry about the accidental e-mail.
I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear.
Any other ideas?
On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury steve@snewbury.org.uk wrote:
I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear.
Any other ideas?
Can you start a non-X runlevel and start X manually as root (assuming you are using a login manager now)?
Alex
On Tue, 2015-07-07 at 10:12 -0400, Alex Deucher wrote:
On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury <steve@snewbury.org.uk
wrote:
I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear.
Any other ideas?
Can you start a non-X runlevel and start X manually as root (assuming you are using a login manager now)?
I've written a simple test based on the code from radeon_kms.c
I just need to try it on the actual hw now... :-)
On Tue Jul 7 15:12:28 2015 GMT+0100, Alex Deucher wrote:
On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury steve@snewbury.org.uk wrote:
I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear.
Any other ideas?
Can you start a non-X runlevel and start X manually as root (assuming you are using a login manager now)?
My test program worked fine. I considerably improved it over the version I posted. I'll send it to the list when I get back.
I removed the drmSetInterfaceVersion() from radeon_kms.c and it got much further. Starting Xserver as root apparently started normally, according to the log, although there was a permission denied error on mode set during init. I don't know whether it was related or not, but the display then hung with a non-blinking cursor. Strange to get a permission denied as root!
Starting GNOME via gdm gives a working slow X session but for some reason only uses sw dri even though the Xorg log shows r200 DRI2 as initialized. Perhaps it's a config error somewhere.. ?
startx as a regular user just works!
But mutter doesn't, perhaps that's why a gnome session isn't working. It just gives the following error: Cogl-ERROR **: Failed to create texture 2d due to size/format constraints
Mutter is supposed to work on r200, right?
On Wed, Jul 8, 2015 at 8:58 AM, Steven Newbury steve@snewbury.org.uk wrote:
On Tue Jul 7 15:12:28 2015 GMT+0100, Alex Deucher wrote:
On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury steve@snewbury.org.uk wrote:
I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear.
Any other ideas?
Can you start a non-X runlevel and start X manually as root (assuming you are using a login manager now)?
My test program worked fine. I considerably improved it over the version I posted. I'll send it to the list when I get back.
I removed the drmSetInterfaceVersion() from radeon_kms.c and it got much further. Starting Xserver as root apparently started normally, according to the log, although there was a permission denied error on mode set during init. I don't know whether it was related or not, but the display then hung with a non-blinking cursor. Strange to get a permission denied as root!
Starting GNOME via gdm gives a working slow X session but for some reason only uses sw dri even though the Xorg log shows r200 DRI2 as initialized. Perhaps it's a config error somewhere.. ?
startx as a regular user just works!
But mutter doesn't, perhaps that's why a gnome session isn't working. It just gives the following error: Cogl-ERROR **: Failed to create texture 2d due to size/format constraints
Mutter is supposed to work on r200, right?
IIRC it tries to use a render buffer format that's not supported by the hw.
Alex
On Wed Jul 8 14:20:28 2015 GMT+0100, Alex Deucher wrote:
On Wed, Jul 8, 2015 at 8:58 AM, Steven Newbury steve@snewbury.org.uk wrote:
On Tue Jul 7 15:12:28 2015 GMT+0100, Alex Deucher wrote:
On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury steve@snewbury.org.uk wrote:
I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear.
Any other ideas?
Can you start a non-X runlevel and start X manually as root (assuming you are using a login manager now)?
My test program worked fine. I considerably improved it over the version I posted. I'll send it to the list when I get back.
I removed the drmSetInterfaceVersion() from radeon_kms.c and it got much further. Starting Xserver as root apparently started normally, according to the log, although there was a permission denied error on mode set during init. I don't know whether it was related or not, but the display then hung with a non-blinking cursor. Strange to get a permission denied as root!
Starting GNOME via gdm gives a working slow X session but for some reason only uses sw dri even though the Xorg log shows r200 DRI2 as initialized. Perhaps it's a config error somewhere.. ?
startx as a regular user just works!
But mutter doesn't, perhaps that's why a gnome session isn't working. It just gives the following error: Cogl-ERROR **: Failed to create texture 2d due to size/format constraints
Mutter is supposed to work on r200, right?
IIRC it tries to use a render buffer format that's not supported by the hw.
Is there anything to be done about it? Have to use a different wm/compositor?
Any idea why removing the call from radeon_kms.c worked?
On Wed, Jul 8, 2015 at 9:53 AM, Steven Newbury steve@snewbury.org.uk wrote:
On Wed Jul 8 14:20:28 2015 GMT+0100, Alex Deucher wrote:
On Wed, Jul 8, 2015 at 8:58 AM, Steven Newbury steve@snewbury.org.uk wrote:
On Tue Jul 7 15:12:28 2015 GMT+0100, Alex Deucher wrote:
On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury steve@snewbury.org.uk wrote:
I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear.
Any other ideas?
Can you start a non-X runlevel and start X manually as root (assuming you are using a login manager now)?
My test program worked fine. I considerably improved it over the version I posted. I'll send it to the list when I get back.
I removed the drmSetInterfaceVersion() from radeon_kms.c and it got much further. Starting Xserver as root apparently started normally, according to the log, although there was a permission denied error on mode set during init. I don't know whether it was related or not, but the display then hung with a non-blinking cursor. Strange to get a permission denied as root!
Starting GNOME via gdm gives a working slow X session but for some reason only uses sw dri even though the Xorg log shows r200 DRI2 as initialized. Perhaps it's a config error somewhere.. ?
startx as a regular user just works!
But mutter doesn't, perhaps that's why a gnome session isn't working. It just gives the following error: Cogl-ERROR **: Failed to create texture 2d due to size/format constraints
Mutter is supposed to work on r200, right?
IIRC it tries to use a render buffer format that's not supported by the hw.
Is there anything to be done about it? Have to use a different wm/compositor?
Another wm or compositor may help.
Any idea why removing the call from radeon_kms.c worked?
No idea.
Alex
On 8 July 2015 at 14:55, Alex Deucher alexdeucher@gmail.com wrote:
On Wed, Jul 8, 2015 at 9:53 AM, Steven Newbury steve@snewbury.org.uk wrote:
On Wed Jul 8 14:20:28 2015 GMT+0100, Alex Deucher wrote:
On Wed, Jul 8, 2015 at 8:58 AM, Steven Newbury steve@snewbury.org.uk wrote:
On Tue Jul 7 15:12:28 2015 GMT+0100, Alex Deucher wrote:
On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury steve@snewbury.org.uk wrote:
I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear.
Any other ideas?
Can you start a non-X runlevel and start X manually as root (assuming you are using a login manager now)?
My test program worked fine. I considerably improved it over the version I posted. I'll send it to the list when I get back.
I removed the drmSetInterfaceVersion() from radeon_kms.c and it got much further. Starting Xserver as root apparently started normally, according to the log, although there was a permission denied error on mode set during init. I don't know whether it was related or not, but the display then hung with a non-blinking cursor. Strange to get a permission denied as root!
Starting GNOME via gdm gives a working slow X session but for some reason only uses sw dri even though the Xorg log shows r200 DRI2 as initialized. Perhaps it's a config error somewhere.. ?
startx as a regular user just works!
But mutter doesn't, perhaps that's why a gnome session isn't working. It just gives the following error: Cogl-ERROR **: Failed to create texture 2d due to size/format constraints
Mutter is supposed to work on r200, right?
IIRC it tries to use a render buffer format that's not supported by the hw.
Is there anything to be done about it? Have to use a different wm/compositor?
Another wm or compositor may help.
Any idea why removing the call from radeon_kms.c worked?
No idea.
From a quick look at the actual implementation, drmSetInterfaceVersion is not something that we want/need to use with KMS drivers.
In general the original issue sounds like the drm driver is not (fully) loaded before the xserver/ddx kicks in. Additionally it may be a race condition if something else (plymouth) is using the device. In the latter case DRM_MASTER might still be held by $(other_app), thus attempting either SetInterfaceVersion or any modeset operation will fail.
Personally I would add a healthy amount of printk/printf through the kernel drm + radeon and the ddx.
Cheers Emil
On Wed, 2015-07-08 at 17:10 +0100, Emil Velikov wrote:
On 8 July 2015 at 14:55, Alex Deucher alexdeucher@gmail.com wrote:
On Wed, Jul 8, 2015 at 9:53 AM, Steven Newbury < steve@snewbury.org.uk> wrote:
On Wed Jul 8 14:20:28 2015 GMT+0100, Alex Deucher wrote:
On Wed, Jul 8, 2015 at 8:58 AM, Steven Newbury < steve@snewbury.org.uk> wrote:
On Tue Jul 7 15:12:28 2015 GMT+0100, Alex Deucher wrote:
On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury <steve@snewbury.org.uk> wrote: I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference. I'm a bit at a loss. I suppose I could try writing a simple test for drmSetInterfaceVersion(). At least that should determine whether the xserver/ddx is in the clear. Any other ideas?
Can you start a non-X runlevel and start X manually as root (assuming you are using a login manager now)?
My test program worked fine. I considerably improved it over the version I posted. I'll send it to the list when I get back.
I removed the drmSetInterfaceVersion() from radeon_kms.c and it got much further. Starting Xserver as root apparently started normally, according to the log, although there was a permission denied error on mode set during init. I don't know whether it was related or not, but the display then hung with a non-blinking cursor. Strange to get a permission denied as root!
Starting GNOME via gdm gives a working slow X session but for some reason only uses sw dri even though the Xorg log shows r200 DRI2 as initialized. Perhaps it's a config error somewhere.. ?
startx as a regular user just works!
But mutter doesn't, perhaps that's why a gnome session isn't working. It just gives the following error: Cogl-ERROR **: Failed to create texture 2d due to size/format constraints
Mutter is supposed to work on r200, right?
IIRC it tries to use a render buffer format that's not supported by the hw.
Is there anything to be done about it? Have to use a different wm/compositor?
Another wm or compositor may help.
Any idea why removing the call from radeon_kms.c worked?
No idea.
From a quick look at the actual implementation, drmSetInterfaceVersion is not something that we want/need to use with KMS drivers.
In general the original issue sounds like the drm driver is not (fully) loaded before the xserver/ddx kicks in. Additionally it may be a race condition if something else (plymouth) is using the device. In the latter case DRM_MASTER might still be held by $(other_app), thus attempting either SetInterfaceVersion or any modeset operation will fail.
Sitting on a KMS console, "systemctl start gdm", no plymouth installed. It's just during Xserver initialisation that drmSetInterfaceVersion() fails. AFAIK Xserver startup is entirely single process, single thread. I've written a little test utility which works fine on the system in question.
Compile attached file with: gcc -O2 -o test-drm test-drm.c $(pkg-config --cflags libdrm) $(pkg-config --libs libdrm)
Personally I would add a healthy amount of printk/printf through the kernel drm + radeon and the ddx.
I guess it doesn't really matter since patching out the code "fixes" it...
Hi Steven,
A couple of more (wild) ideas, below but first...
On 8 July 2015 at 17:48, Steven Newbury steve@snewbury.org.uk wrote:
On Wed, 2015-07-08 at 17:10 +0100, Emil Velikov wrote:
On 8 July 2015 at 14:55, Alex Deucher alexdeucher@gmail.com wrote:
On Wed, Jul 8, 2015 at 9:53 AM, Steven Newbury < steve@snewbury.org.uk> wrote:
On Wed Jul 8 14:20:28 2015 GMT+0100, Alex Deucher wrote:
On Wed, Jul 8, 2015 at 8:58 AM, Steven Newbury < steve@snewbury.org.uk> wrote:
On Tue Jul 7 15:12:28 2015 GMT+0100, Alex Deucher wrote: On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury <steve@snewbury.org.uk> wrote: I've tried an xserver-1.16, and ddx, libdrm without LTO and with gcc4.9. Exactly the same thing. I wondered whether the unused i810 could be interfering but triggering a device "remove" before starting X made no difference.
I've completely missed out on this - you have an i810 in there? Tbh, I'm not sure how well (if at all) our userspace works when mixing UMS and KMS drivers. Toggling it off in the BIOS, blacklisting the kernel module, and checking if it's the boot vga and if X gets it right are nice things to test/try.
[snip]
Sitting on a KMS console, "systemctl start gdm", no plymouth installed.
Don't know what gdm is up to these days, but a while back it used to bring up one X session for the greeter, quickly tear it down and bring up another one for the desktop. Not a gdm person so don't quote me on that :-) Yet if that is (roughly) still the case, a race condition is likely on.
It's just during Xserver initialisation that drmSetInterfaceVersion() fails. AFAIK Xserver startup is entirely single process, single thread. I've written a little test utility which works fine on the system in question.
Compile attached file with: gcc -O2 -o test-drm test-drm.c $(pkg-config --cflags libdrm) $(pkg-config --libs libdrm)
Your test utility seems to do a lot more than the few lines in radeon_kms.c. Would it be possible that you've misinterpreted the output (relative to the Xorg.log)? I know I would :-)
Personally I would add a healthy amount of printk/printf through the kernel drm + radeon and the ddx.
I guess it doesn't really matter since patching out the code "fixes" it...
As you wish. Personally I tend to give it a bit more before giving up.
As this is getting a bit long/messy, I suspect a bugzilla entry with information/logs might be good. It's up to you, as I won't be able to help much more.
Good luck, Emil
On Wed, 2015-07-08 at 23:44 +0100, Emil Velikov wrote: ...
I guess it doesn't really matter since patching out the code "fixes" it...
As you wish. Personally I tend to give it a bit more before giving up.
I typically would, but it isn't my machine, and my priority is getting the system up as that's what I promised to do! Also, given there doesn't really seem to be any point to the call to drmSetInterfaceVersion(), as has been pointed out, simply removing it doesn't hurt - although it admittedly irks me as to how/why it can be failing.
As this is getting a bit long/messy, I suspect a bugzilla entry with information/logs might be good. It's up to you, as I won't be able to help much more.
Simply getting feedback has been helpful. Yes, I should open a bugzilla entry; I'll get to it after I have the system up and running. It's probably getting tough to maintain R200 support at this point, particularly on ancient pre-millennial PCI-only systems with buggy* BIOSes! ;-)
* The BIOS option to disable the onboard video is present, and the help text lists 3 options: Enabled - 512KB, Enabled - 1MB, Disabled. Only the first two options actually appear on the menu! Even better, the system only (re-)boots on the second attempt; the first attempt always hangs post-POST, before attempting to read from boot media!