Hi all
I am on a Raspberry Pi and I want to display fullscreen video, with a couple of overlay planes for controls / subtitles etc. The h/w can certainly do this. I need to be able to do this from a starting point where X is running.
I can successfully find X's output & crtc and grab that using xcb_randr_create_lease and use that handle to display video. So far so good. But I also want to have overlay planes for subtitles etc. The handle I've got from the lease only seems to have a PRIMARY & a CURSOR plane attached so I can't get anything there.
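For reference, my lease code boils down to something like this (heavily cut down, no error handling; the helper name is just mine):

#include <stdlib.h>
#include <xcb/xcb.h>
#include <xcb/randr.h>

/* Turn an already-located RandR crtc/output pair into a leased DRM fd. */
static int lease_drm_fd(xcb_connection_t *conn, xcb_window_t window,
                        xcb_randr_crtc_t crtc, xcb_randr_output_t output)
{
    xcb_randr_lease_t lease_id = xcb_generate_id(conn);

    xcb_randr_create_lease_cookie_t cookie =
        xcb_randr_create_lease(conn, window, lease_id,
                               1, 1, &crtc, &output);

    xcb_randr_create_lease_reply_t *reply =
        xcb_randr_create_lease_reply(conn, cookie, NULL);
    if (!reply || reply->nfd < 1)
        return -1;

    /* The reply carries the DRM master fd for the leased objects. */
    const int fd = xcb_randr_create_lease_reply_fds(conn, reply)[0];
    free(reply);
    return fd;
}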
How should I be going about getting some more planes to use for overlays? Pointers to documentation / examples gratefully received - so far my google-fu has failed to find anything that works.
I'm sorry if this is the wrong place to ask, but if there is a better place please say and I'll go there.
Many thanks
John Cox
On Tue, Aug 10, 2021 at 05:57:31PM +0100, John Cox wrote:
> Hi all
> I am on a Raspberry Pi and I want to display fullscreen video, with a couple of overlay planes for controls / subtitles etc. The h/w can certainly do this. I need to be able to do this from a starting point where X is running.
> I can successfully find X's output & crtc and grab that using xcb_randr_create_lease and use that handle to display video. So far so good. But I also want to have overlay planes for subtitles etc. The handle I've got from the lease only seems to have a PRIMARY & a CURSOR plane attached so I can't get anything there.
I think X just gives you a legacy lease for the crtc, and the kernel automatically adds the primary plane and cursor plane (if they exist) to that lease. Unless X is patched to enable plane support and add those explicitly to the lease, I don't think there's a way to do that.
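You can double-check what actually ended up in the lease by enumerating planes on the leased fd, roughly like this (untested sketch using libdrm):

#include <stdint.h>
#include <stdio.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* List every plane visible through the leased fd; with an unpatched X
 * server this should show just the primary and cursor planes. */
static void dump_leased_planes(int fd)
{
    /* Without this cap, primary/cursor planes are hidden. */
    drmSetClientCap(fd, DRM_CLIENT_CAP_UNIVERSAL_PLANES, 1);

    drmModePlaneResPtr res = drmModeGetPlaneResources(fd);
    if (!res)
        return;

    for (uint32_t i = 0; i < res->count_planes; i++) {
        drmModePlanePtr p = drmModeGetPlane(fd, res->planes[i]);
        if (!p)
            continue;
        printf("plane %u possible_crtcs 0x%x formats %u\n",
               p->plane_id, p->possible_crtcs, p->count_formats);
        drmModeFreePlane(p);
    }
    drmModeFreePlaneResources(res);
}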
For Wayland this is still in the works, so it might be good to check there that your use-case is properly supported. The protocol MR is here:
https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/67
On Tue, Aug 10, 2021 at 05:57:31PM +0100, John Cox wrote:
> I think X just gives you a legacy lease for the crtc, and the kernel automatically adds the primary plane and cursor plane (if they exist) to that lease. Unless X is patched to enable plane support and add those explicitly to the lease, I don't think there's a way to do that.
Bother. So near and yet so far. Thanks for the info.
> For Wayland this is still in the works, so it might be good to check there that your use-case is properly supported. The protocol MR is here:
> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/67
In overall protocol terms that doesn't seem so different from what X does, and I am far too inexperienced in Wayland / DRM to understand the subtleties. That MR seems to be done, so it is probably an inappropriate place to ask - where would you recommend as an appropriate forum?
Many thanks
John Cox
On Wednesday, August 11th, 2021 at 12:19, John Cox <jc@kynesim.co.uk> wrote:
> That MR seems to be done, so it is probably an inappropriate place to ask - where would you recommend as an appropriate forum?
For Wayland-related questions, you can ask on IRC or on the wayland-devel mailing list.
On Wednesday, August 11th, 2021 at 11:43, Daniel Vetter <daniel@ffwll.ch> wrote:
> For Wayland this is still in the works, so it might be good to check there that your use-case is properly supported. The protocol MR is here:
> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/67
The client requests a connector, and the compositor will decide which resources to lease. This may or may not include overlay planes. The connector you're interested in may or may not be available for lease.
What's your use-case? Why not use an xdg_toplevel and wl_subsurface?
DRM leases are not a good idea for regular applications. They don't properly integrate with the rest of the desktop, and won't get input events. Letting the compositor deal with KMS planes is the preferred approach.
Hi
> The client requests a connector, and the compositor will decide which resources to lease. This may or may not include overlay planes. The connector you're interested in may or may not be available for lease.
Fair point
> What's your use-case?
Raspberry Pi displaying video with subtitles or other controls. I was thinking of the fullscreen case, but if zero-copy video can be made to work to the main desktop then that would be even better.
If displaying 4k video the Pi does not have enough bandwidth left for even a single frame copy, convert or merge, so I need hardware scaling, composition & display taking the raw video frame (it's in a dmabuf). The raw video is in a somewhat unusual format; I'd expect the other layers to be ARGB. The Pi h/w can do this and I believe I can make it work via DRM if I own the screen, so that was where I started.
> Why not use an xdg_toplevel and wl_subsurface?
Probably because I am woefully underinformed about how I should be doing stuff properly. Please feel free to point me in the correct direction - any example that takes NV12 video (it isn't NV12, but if NV12 works then SAND can probably be made to work too) would be a great start. Also, Wayland hasn't yet come to the Pi, though it shortly will, using mutter.
> DRM leases are not a good idea for regular applications. They don't properly integrate with the rest of the desktop, and won't get input events. Letting the compositor deal with KMS planes is the preferred approach.
If that can be made to work then I agree I would like to do it like that.
Many thanks for the response.
John Cox
On Wednesday, August 11th, 2021 at 13:40, John Cox <jc@kynesim.co.uk> wrote:
> Raspberry Pi displaying video with subtitles or other controls. I was thinking of the fullscreen case, but if zero-copy video can be made to work to the main desktop then that would be even better.
> If displaying 4k video the Pi does not have enough bandwidth left for even a single frame copy, convert or merge, so I need hardware scaling, composition & display taking the raw video frame (it's in a dmabuf). The raw video is in a somewhat unusual format; I'd expect the other layers to be ARGB. The Pi h/w can do this and I believe I can make it work via DRM if I own the screen, so that was where I started.
>> Why not use an xdg_toplevel and wl_subsurface?
> Probably because I am woefully underinformed about how I should be doing stuff properly. Please feel free to point me in the correct direction - any example that takes NV12 video (it isn't NV12, but if NV12 works then SAND can probably be made to work too) would be a great start. Also, Wayland hasn't yet come to the Pi, though it shortly will, using mutter.
By SAND do you mean one of these vc4-specific buffer tilings [1]? e.g. BROADCOM_SAND64, SAND128 or SAND256?
[1]: https://drmdb.emersion.fr/formats?driver=vc4
The fullscreen case may work already on all major Wayland compositors, assuming the video size matches exactly the current mode. You'll need to use the linux-dmabuf Wayland extension to pass NV12 buffers to the compositor.
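Roughly like this, assuming the zwp_linux_dmabuf_v1 global is already bound, the generated protocol header is available, and the fd/offsets/strides/modifier come from your decoder (sketch, NV12 with two planes in one dmabuf):

#include <stdint.h>
#include <wayland-client.h>
#include <drm_fourcc.h>
#include "linux-dmabuf-unstable-v1-client-protocol.h"

/* Wrap a decoder dmabuf in a wl_buffer via linux-dmabuf. */
static struct wl_buffer *
import_nv12(struct zwp_linux_dmabuf_v1 *dmabuf,
            int fd, int width, int height,
            uint32_t offset[2], uint32_t stride[2], uint64_t modifier)
{
    struct zwp_linux_buffer_params_v1 *params =
        zwp_linux_dmabuf_v1_create_params(dmabuf);

    /* Plane 0: Y, plane 1: interleaved CbCr. */
    for (int i = 0; i < 2; i++)
        zwp_linux_buffer_params_v1_add(params, fd, i,
                                       offset[i], stride[i],
                                       modifier >> 32,
                                       modifier & 0xffffffff);

    struct wl_buffer *buf =
        zwp_linux_buffer_params_v1_create_immed(params, width, height,
                                                DRM_FORMAT_NV12, 0);
    zwp_linux_buffer_params_v1_destroy(params);
    return buf;
}

Then attach the buffer to the video wl_surface and commit as usual.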
If you want to add scaling into the mix, you'll need to use the viewporter extension as well. Most compositors aren't yet rigged up for direct scan-out; they'll fall back to composition. Weston is your best bet if you want to try this: it supports direct scan-out to multiple KMS planes with scaling and cropping. There is some active work in wlroots to support this. I'm not aware of any effort in this direction for mutter or kwin at the time of writing.
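The scaling part is then just a couple of requests on a wp_viewport, assuming the wp_viewporter global is bound (sketch):

#include <wayland-client.h>
#include "viewporter-client-protocol.h"

/* Scale the video surface with viewporter instead of copying: the source
 * rectangle is the full video frame, the destination is the size it
 * should cover on screen. Takes effect on the next wl_surface_commit(). */
static void scale_surface(struct wp_viewporter *viewporter,
                          struct wl_surface *surface,
                          int src_w, int src_h, int dst_w, int dst_h)
{
    struct wp_viewport *vp = wp_viewporter_get_viewport(viewporter, surface);

    wp_viewport_set_source(vp, wl_fixed_from_int(0), wl_fixed_from_int(0),
                           wl_fixed_from_int(src_w), wl_fixed_from_int(src_h));
    wp_viewport_set_destination(vp, dst_w, dst_h);
}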
If you want to also use KMS planes with other layers (RGBA or something else), then you'll need to set up wl_subsurfaces with the rest of the content. As said above, Weston will do its best to offload the composition work to KMS planes. You'll need to make sure each buffer you submit can be scanned out by the display engine -- there's not yet a generic way of doing it, but the upcoming linux-dmabuf hints protocol will fix that.
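And the extra layers would look something like this, assuming the wl_subcompositor global is bound (sketch; the helper name is made up):

#include <wayland-client.h>

/* Put the ARGB subtitle/controls layer in a subsurface above the video
 * surface so the compositor has the chance to offload it to a KMS plane. */
static struct wl_subsurface *
add_overlay(struct wl_subcompositor *subcompositor,
            struct wl_surface *overlay, struct wl_surface *video,
            int x, int y)
{
    struct wl_subsurface *sub =
        wl_subcompositor_get_subsurface(subcompositor, overlay, video);

    wl_subsurface_set_position(sub, x, y);
    wl_subsurface_place_above(sub, video);
    /* Desync so subtitle updates don't have to wait for video commits. */
    wl_subsurface_set_desync(sub);
    return sub;
}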
If you want to get started, maybe have a look at clients/simple-dmabuf-gbm in Weston.
Hope this helps!
> By SAND do you mean one of these vc4-specific buffer tilings [1]? e.g. BROADCOM_SAND64, SAND128 or SAND256?
Yes - for SAND8 (or SAND128 in your terms) DRM output we have the required types: NV12 plus a Broadcom modifier. Then there is SAND30 for 10-bit output, which fits the same column tiling but packs three 10-bit quantities into 32 bits with 2 junk (zero) bits. Again, we have a DRM definition for that, which I think may have made it upstream.
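For concreteness, the packing is along these lines (a sketch from memory, so take the exact bit order with a pinch of salt):

#include <stdint.h>

/* Three 10-bit samples packed into the low 30 bits of a 32-bit word;
 * the top 2 bits are padding (zero). Bit order here is from memory. */
static inline uint32_t sand30_pack(uint16_t s0, uint16_t s1, uint16_t s2)
{
    return (uint32_t)(s0 & 0x3ff) |
           ((uint32_t)(s1 & 0x3ff) << 10) |
           ((uint32_t)(s2 & 0x3ff) << 20);
}

/* idx = 0..2 */
static inline uint16_t sand30_unpack(uint32_t word, unsigned idx)
{
    return (word >> (10 * idx)) & 0x3ff;
}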
Very many thanks for the pointers - to a large extent my problem is that I don't know what should work, in order to build something around it and then work out why it doesn't. I've got video decode down pat, but modern display still eludes me - I grew up on STBs and the like where you could just use the h/w directly; now it's a lot more controlled.
Ta again
John Cox