https://bugs.freedesktop.org/show_bug.cgi?id=86781
Bug ID: 86781
Summary: enabling glamor causes jumpy VDPAU playback with 2x
framerate DI
Product: Mesa
Version: 10.3
Hardware: x86-64 (AMD64)
OS: Linux (All)
Status: NEW
Severity: normal
Priority: medium
Component: Drivers/Gallium/radeonsi
Assignee: dri-devel(a)lists.freedesktop.org
Reporter: warpme(a)o2.pl
VDPAU playback of interlaced content becomes jumpy when the user enables
glamor and deinterlacing (DI) is set to 2x.
The issue is seen under MythTV. The playback log reports:
2014-11-13 13:47:01.013753 I Player(1): Video is 3.18365 frames behind audio
(too slow), dropping frame to catch up.
2014-11-13 13:47:01.013783 I AOBase: Pause 1
2014-11-13 13:47:01.014047 I Player(1): Video is 3.45023 frames behind audio
(too slow), dropping frame to catch up.
2014-11-13 13:47:01.014064 I AOBase: Pause 1
2014-11-13 13:47:01.014171 I Player(1): Video is 3.40015 frames behind audio
(too slow), dropping frame to catch up.
2014-11-13 13:47:01.014188 I AOBase: Pause 1
2014-11-13 13:47:01.014286 I Player(1): Video is 3.1126 frames behind audio
(too slow), dropping frame to catch up.
The 2x DI problem manifests only when glamor is enabled. On Brazos with EXA, 2x DI
works perfectly, while enabling glamor triggers the issue.
On Kabini the issue is always present, since Kabini requires glamor.
Tests were done with Mesa 10.2.8/10.3.3 and xserver 1.16.1/1.16.2.
--
You are receiving this mail because:
You are the assignee for the bug.
https://bugs.freedesktop.org/show_bug.cgi?id=83226
Priority: medium
Bug ID: 83226
Assignee: dri-devel(a)lists.freedesktop.org
Summary: Allow use of ColorRange and ColorSpace in xorg.conf.d
files
Severity: enhancement
Classification: Unclassified
OS: Linux (All)
Reporter: john.ettedgui(a)gmail.com
Hardware: All
Status: NEW
Version: unspecified
Component: Drivers/Gallium/radeonsi
Product: Mesa
Hello,
This is a feature request; I hope this is the right place.
My main computer is connected to an HDTV that only supports a limited color
range, so I would like to be able to use limited range with radeon(si), but
unlike the Intel and Nvidia drivers it does not yet offer this option.
For now I can set my media players to limited range, but I would like to be
able to do it at the system level.
Being able to switch the color space would also be nice.
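For illustration, the requested configuration might look something like the following xorg.conf.d snippet. The option names are taken from this request's summary; they are hypothetical and not currently honored by the radeon/radeonsi stack, and the identifier and values are placeholders:

```
Section "Device"
    Identifier "AMD graphics"
    Driver     "radeon"
    # Hypothetical options from this feature request; not yet supported:
    Option     "ColorRange" "Limited"
    Option     "ColorSpace" "Rec709"
EndSection
```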
Thanks!
https://bugs.freedesktop.org/show_bug.cgi?id=80419
Priority: medium
Bug ID: 80419
Assignee: dri-devel(a)lists.freedesktop.org
Summary: XCOM: Enemy Unknown causes lockup
Severity: normal
Classification: Unclassified
OS: Linux (All)
Reporter: theamazingjanet(a)googlemail.com
Hardware: x86-64 (AMD64)
Status: NEW
Version: 10.1
Component: Drivers/Gallium/radeonsi
Product: Mesa
When running the recently released XCOM: Enemy Unknown, the game locks up the
display completely after a few minutes of play. The mouse cursor still moves,
but nothing is responsive. The whole display is affected, as evidenced by
running the game in windowed mode. Keyboard input is also unresponsive, forcing
a hard reset.
Ubuntu 14.04
AMD HD7770 2GB
Mesa 10.1.3
The only response from Feral on the issue was to use Catalyst.
https://bugs.freedesktop.org/show_bug.cgi?id=76484
Priority: medium
Bug ID: 76484
Assignee: dri-devel(a)lists.freedesktop.org
Summary: [radeonsi] Strike Suit Zero fails to start
Severity: normal
Classification: Unclassified
OS: Linux (All)
Reporter: lordheavym(a)gmail.com
Hardware: x86-64 (AMD64)
Status: NEW
Version: git
Component: Drivers/Gallium/radeonsi
Product: Mesa
* mesa-git
OpenGL renderer string: Gallium 0.4 on AMD PITCAIRN
OpenGL core profile version string: 3.3 (Core Profile) Mesa 10.2.0-devel
(git-e5f6b6d)
* llvm-3.5svn
The game segfaults when started.
The backtrace:
(gdb) bt
#0 0xf7fdb425 in __kernel_vsyscall ()
#1 0xf7930816 in raise () from /usr/lib32/libc.so.6
#2 0xf7931fa3 in abort () from /usr/lib32/libc.so.6
#3 0x083f80c5 in mspace_free (msp=0xd9683018, mem=0xbfde728) at
../XII/Code/Core/Memory/dlmalloc.cpp:5700
#4 0x083f474b in cDynamicPartition::Free (this=0xbf4dd80, lpMem=0xbfde728) at
../XII/Code/Core/Memory/Partition.cpp:645
#5 0x083f37d9 in CustomDelete (ptr=0xbfde728) at
../XII/Code/Core/Memory/MemoryManager.cpp:995
#6 CustomDelete (ptr=0xbfde728) at
../XII/Code/Core/Memory/MemoryManager.cpp:979
#7 0x083f386f in operator delete (ptr=0xbfde728) at
../XII/Code/Core/Memory/MemoryManager.cpp:1036
#8 0xd4a7186c in ?? () from /usr/lib32/libLLVM-3.5.0svn.so
#9 0xd3fdfd33 in LLVMDisposeMemoryBuffer () from
/usr/lib32/libLLVM-3.5.0svn.so
#10 0xd60075d6 in radeon_llvm_compile (M=M@entry=0xd9684170,
binary=binary@entry=0xffff81b0, gpu_family=0xd60e916d "pitcairn",
dump=dump@entry=0) at radeon_llvm_emit.c:132
#11 0xd5ff1480 in si_compile_llvm (sctx=sctx@entry=0xbf95e10,
shader=shader@entry=0xbfdb218, mod=0xd9684170) at si_shader.c:2302
#12 0xd5ff1d45 in si_pipe_shader_create (ctx=ctx@entry=0xbf95e10,
shader=shader@entry=0xbfdb218) at si_shader.c:2577
#13 0xd5ff7ddc in si_shader_select (ctx=ctx@entry=0xbf95e10,
sel=sel@entry=0xbfdb0d8) at si_state.c:2114
#14 0xd5ff7f0b in si_create_shader_state (ctx=0xbf95e10, state=0xffffcba8,
pipe_shader_type=1) at si_state.c:2146
#15 0xd5eef7a1 in ureg_create_shader (ureg=ureg@entry=0xbfbfe80,
pipe=pipe@entry=0xbf95e10, so=so@entry=0x0) at tgsi/tgsi_ureg.c:1719
#16 0xd5f1b9a2 in ureg_create_shader_with_so_and_destroy (so=0x0,
pipe=0xbf95e10, p=0xbfbfe80) at ./tgsi/tgsi_ureg.h:138
#17 ureg_create_shader_and_destroy (pipe=0xbf95e10, p=0xbfbfe80) at
./tgsi/tgsi_ureg.h:147
#18 util_make_empty_fragment_shader (pipe=pipe@entry=0xbf95e10) at
util/u_simple_shaders.c:392
#19 0xd5f000bb in util_blitter_create (pipe=pipe@entry=0xbf95e10) at
util/u_blitter.c:294
#20 0xd5feb88f in si_create_context (screen=0xbf95948, priv=0x0) at
si_pipe.c:173
#21 0xd5febc3d in radeonsi_screen_create (ws=ws@entry=0xbf95328) at
si_pipe.c:458
#22 0xd5c81098 in create_screen (fd=5) at drm_target.c:43
#23 0xd600c5bc in dri2_init_screen (sPriv=0xbf95198) at dri2.c:1088
#24 0xd5c822b6 in driCreateNewScreen2 (scrn=0, fd=5, extensions=0xbf71708,
driver_extensions=0xd619be24 <__driDriverExtensions>,
driver_configs=0xffffd0c0, data=0xbf71750) at dri_util.c:159
#25 0xf7f53b63 in dri2CreateScreen (screen=0, priv=0xbf70590) at
dri2_glx.c:1247
#26 0xf7f2c574 in AllocAndFetchScreenConfigs (priv=0xbf70590, dpy=0xbf57de8) at
glxext.c:779
#27 __glXInitialize (dpy=dpy@entry=0xbf57de8) at glxext.c:888
#28 0xf7f28aad in GetGLXPrivScreenConfig (dpy=dpy@entry=0xbf57de8,
scrn=scrn@entry=0, ppriv=ppriv@entry=0xffffd174, ppsc=ppsc@entry=0xffffd178) at
glxcmds.c:172
#29 0xf7f2932e in GetGLXPrivScreenConfig (ppsc=0xffffd178, ppriv=0xffffd174,
scrn=0, dpy=0xbf57de8) at glxcmds.c:168
#30 glXChooseVisual (dpy=0xbf57de8, screen=0, attribList=0xffffd2f4) at
glxcmds.c:1249
#31 0xf7cb8f81 in X11_GL_GetVisual (screen=0, display=0xbf57de8,
_this=0xbf573e0) at ../src/video/x11/SDL_x11opengl.c:488
#32 X11_GL_InitExtensions (_this=0xbf573e0) at
../src/video/x11/SDL_x11opengl.c:300
#33 0xf7cb97ba in X11_GL_LoadLibrary (_this=0xbf573e0, path=<optimized out>) at
../src/video/x11/SDL_x11opengl.c:220
#34 0xf7ca1573 in SDL_GL_LoadLibrary (path=0x0) at
../src/video/SDL_video.c:2302
#35 0xf7ca2e87 in SDL_CreateWindow (title=0xf7ccd14e "OpenGL test", x=-32,
y=-32, w=32, h=32, flags=10) at ../src/video/SDL_video.c:1210
#36 0xf7ca2d63 in ShouldUseTextureFramebuffer () at
../src/video/SDL_video.c:170
#37 SDL_VideoInit (driver_name=<optimized out>) at ../src/video/SDL_video.c:516
#38 0xf7c02bec in SDL_InitSubSystem (flags=25120) at ../src/SDL.c:158
#39 0xf7c02cb9 in SDL_Init (flags=8736) at ../src/SDL.c:243
#40 0x084756ac in cRenderer::DeviceInit (this=0xbb05f60 <gcRenderer>) at
../XII/Code/Render/Renderer/OpenGL/RendererGL.cpp:160
#41 0x0847707d in cRendererBase::Init (this=0xbb05f60 <gcRenderer>) at
../XII/Code/Render/Renderer/Renderer.cpp:97
#42 0x083f9bb6 in cTask::PerformInit (this=0xbb05f60 <gcRenderer>) at
../XII/Code/Core/Task/Task.cpp:70
#43 0x083f1900 in cKernel::Boot (this=0x8e75400 <gcKernel>) at
../XII/Code/Core/Kernel.cpp:118
#44 0x08084231 in main (argc=1, argv=0xffffd704) at
../XII/Code/Core/Platform/Unix/StartupUnix.cpp:45
https://bugs.freedesktop.org/show_bug.cgi?id=60879
Priority: medium
Bug ID: 60879
Assignee: dri-devel(a)lists.freedesktop.org
Summary: X11 can't start with acceleration enabled
Severity: blocker
Classification: Unclassified
OS: Linux (All)
Reporter: mustrumr97(a)gmail.com
Hardware: x86-64 (AMD64)
Status: NEW
Version: unspecified
Component: DRM/Radeon
Product: DRI
Created attachment 74857
--> https://bugs.freedesktop.org/attachment.cgi?id=74857&action=edit
Screenshot of Xorg
I've recently bought a Radeon HD 7000-series graphics card, a Sapphire Radeon HD 7870 XT.
I enabled glamor in xorg.conf.
The X server fails to start with glamor enabled. For 10 seconds it shows
nothing; I've attached a screenshot taken with my mobile phone after that point.
After killing X the monitor turns off as if the GPU is off (probably true).
X works if I disable glamor; however, that way I'd be forced to use swrast.
xorg-xserver - 1.13.2
xf86-video-ati - git or 7.1.0 (both tried)
glamor - git or 0.5 (both tried)
libdrm - git or 2.4.42 (both tried)
kernel - git or 3.7.7 (both tried)
mesa - git
Weston also fails to start; it just shows a black screen with a white stripe at
the top. Killing it returns me to a tty.
I'll attach the relevant parts of the kernel log from starting Xorg and Weston.
https://bugs.freedesktop.org/show_bug.cgi?id=86043
Bug ID: 86043
Summary: Optimus issue with libdrm 2.4.58
Product: DRI
Version: XOrg git
Hardware: Other
OS: All
Status: NEW
Severity: normal
Priority: medium
Component: libdrm
Assignee: dri-devel(a)lists.freedesktop.org
Reporter: jljusten(a)gmail.com
Users with Optimus systems are reporting that many games fail to run when
libdrm-intel is upgraded from 2.4.56 to 2.4.58. (Downgrading to 2.4.56
fixes the issue.)
Steam bug:
https://github.com/ValveSoftware/steam-for-linux/issues/3506
Debian bug:
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=768045
I don't have the hardware to confirm this issue.
I notice that libdrm changed some symbol visibility settings between
2.4.56 and 2.4.58.
https://bugs.freedesktop.org/show_bug.cgi?id=77204
Priority: medium
Bug ID: 77204
Assignee: dri-devel(a)lists.freedesktop.org
Summary: make fails due to wrong file names in man
Severity: normal
Classification: Unclassified
OS: All
Reporter: tamas.haraszti(a)uni-heidelberg.de
Hardware: Other
Status: NEW
Version: XOrg CVS
Component: libdrm
Product: DRI
make fails with:
/bin/sed: can't read drm-mm.7: No such file or directory
Looking into the man folder I find:
Makefile, Makefile.am, Makefile.in, drm.7, drm.xml, drm-kms.7, drm-kms.xml,
drm-memory.7, drm-memory.xml, drmAvailable.3, drmAvailable.xml,
drmHandleEvent.3, drmHandleEvent.xml, drmModeGetResources.3,
drmModeGetResources.xml, drm\-gem.7, drm\-mm.7, drm\-ttm.7
Thus all MANPAGE_ALIASES filenames contain an extra backslash.
Renaming the files allows make to proceed.
System: Gentoo Linux with the newest automake, cmake, bash, etc.
https://bugs.freedesktop.org/show_bug.cgi?id=72425
Priority: medium
Bug ID: 72425
Assignee: dri-devel(a)lists.freedesktop.org
Summary: divide by zero error in radeon_surface.c when opening
chrome with WebGL enabled
Severity: normal
Classification: Unclassified
OS: All
Reporter: crwulff(a)gmail.com
Hardware: Other
Status: NEW
Version: unspecified
Component: libdrm
Product: DRI
Created attachment 90384
--> https://bugs.freedesktop.org/attachment.cgi?id=90384&action=edit
Fix divide by zero in radeon_surface
Passing a tile_split of zero to eg_surface_init_2d causes a divide-by-zero
error. Launching Chromium with WebGL enabled on an AMD Llano (A8-3850) exhibits
this behavior, and WebGL then fails to work. The attached patch fixes the
problem and allows WebGL to work in Chrome on this platform.
https://bugs.freedesktop.org/show_bug.cgi?id=66332
Priority: medium
Bug ID: 66332
Assignee: dri-devel(a)lists.freedesktop.org
Summary: drmHandleEvent returns 0 on read() failure
Severity: normal
Classification: Unclassified
OS: All
Reporter: mgold(a)qnx.com
Hardware: All
Status: NEW
Version: XOrg CVS
Component: libdrm
Product: DRI
drmHandleEvent contains this code:
    len = read(fd, buffer, sizeof buffer);
    if (len == 0)
        return 0;
    if (len < sizeof *e)
        return -1;
In the (len < sizeof *e) check, len is converted to size_t (which is
unsigned), so when len is negative the "return -1" is never executed. Instead,
the function continues to the end and returns 0. (The documentation states that
drmHandleEvent will return -1 if the read fails.)
If there's an error like EBADF, the caller won't detect it and might end up
busy-waiting. Rewriting the condition as (len < (int)(sizeof *e)) will fix
this.
https://bugs.freedesktop.org/show_bug.cgi?id=61269
Priority: medium
Bug ID: 61269
Assignee: dri-devel(a)lists.freedesktop.org
Summary: Support libkms on FreeBSD
Severity: normal
Classification: Unclassified
OS: FreeBSD
Reporter: bugzilla(a)tecnocode.co.uk
Hardware: All
Status: NEW
Version: XOrg CVS
Component: libdrm
Product: DRI
Created attachment 75298
--> https://bugs.freedesktop.org/attachment.cgi?id=75298&action=edit
Fix detection of Intel-style atomic primitives on amd64 (patch by Brian Waters)
A series of 3 patches follows, allowing libdrm to be compiled with KMS
support on FreeBSD.