https://bugs.freedesktop.org/show_bug.cgi?id=71083
Priority: medium
Bug ID: 71083
Assignee: dri-devel(a)lists.freedesktop.org
Summary: (struct drm_encoder_helper_funcs)->mode_set not
re-called after display (and EDID) change
Severity: normal
Classification: Unclassified
OS: All
Reporter: zajec5(a)gmail.com
Hardware: Other
Status: NEW
Version: unspecified
Component: DRM/other
Product: DRI
I use my DCE5 Barts (HD69xx) with an Onkyo TX-SR605 AV receiver and one of the
following displays:
1) TV Sony Bravia KDL-52X3500
2) Projector Epson EH-TW6100
My problem is that when I change the display connected to the Onkyo's output,
the EDID changes, but drm doesn't call mode_set as long as I keep using the same
resolution. To force drm to call mode_set I have to change the resolution
(xrandr --output HDMI-0 --mode X) and then switch back to the mode I want.
While the display seems to work fine without that mode_set call, the audio
engine doesn't. As part of the modesetting handler we read ELD-related info
from the EDID and write it to the GPU's audio engine. Without that happening I
can't play audio correctly (the audio engine still sees info about the previous
device, not the current one).
I think mode_set should be called every time EDID changes. Is that right?
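For reference, the ELD the audio side needs is built from the CEA-861 extension
block of the EDID (the second 128 bytes of the dumps below). Here is a minimal
user-space sketch of how that data is located -- my own illustration (file name
sad_dump.c is made up), not the kernel's drm_edid_to_eld() path -- which walks
the extension's data block collection and prints the Short Audio Descriptors.
Feeding it the full Sony dump below, converted back to binary (e.g. with
"xxd -r -p"), shows what audio info the GPU keeps using when mode_set is not
re-run.

/* sad_dump.c: print the Short Audio Descriptors from a raw 256-byte EDID.
 * Illustration only; offsets follow the CEA-861 extension layout. */
#include <stdio.h>
#include <stdint.h>

static void dump_sads(const uint8_t *ext)       /* ext = 128-byte CEA block  */
{
    if (ext[0] != 0x02) {                       /* 0x02 = CEA-861 ext tag    */
        printf("block is not a CEA-861 extension\n");
        return;
    }
    int dtd_off = ext[2];                       /* start of detailed timings */
    for (int i = 4; i < dtd_off; ) {            /* data block collection     */
        int tag = ext[i] >> 5;                  /* bits 7-5: block type      */
        int len = ext[i] & 0x1f;                /* bits 4-0: payload length  */
        if (tag == 1)                           /* 1 = Audio Data Block      */
            for (int j = 1; j + 2 <= len; j += 3)
                printf("SAD: %02x %02x %02x\n",
                       ext[i + j], ext[i + j + 1], ext[i + j + 2]);
        i += len + 1;
    }
}

int main(void)
{
    uint8_t edid[256];
    if (fread(edid, 1, sizeof(edid), stdin) != sizeof(edid)) {
        fprintf(stderr, "expected a 256-byte EDID on stdin\n");
        return 1;
    }
    dump_sads(edid + 128);                      /* first extension block     */
    return 0;
}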
In case someone's curious:
1) EDID with Onkyo + Sony TV:
00ffffffffffff003dcb610700000000
0011010380a05a780a0dc9a057479827
12484c21080081800101010101010101
010101010101023a801871382d40582c
450040846300001e011d007251d01e20
6e28550040846300001e000000fc0054
582d53523630350a20202020000000fd
00303e0e460f000a2020202020200185
02034c705c1f03041213051420071610
15110206010f1e0b1a191d0e0a242625
2335097f070f7f071707503f06c05706
005f7e01671e00834f00006c030c0012
00b82dc000000000e3050301023a80d0
72382d40102c458040846300001e011d
00bc52d01e20b828554040846300001e
00000000000000000000000000000078
2) EDID with Onkyo + Epson projector:
00ffffffffffff004ca333d000000000
0c150104952616780aa0558d515a962a
1c505400000001010101010101010101
0101010101016a4d80a07038fc413020
36007ed710000018d49a80a07038fc41
302036007ed710000038000000fc0053
414d53554e470a2020202020000000fc
00313733485430322d4330310a200035
--
https://bugs.freedesktop.org/show_bug.cgi?id=69675
Priority: medium
Bug ID: 69675
Assignee: dri-devel(a)lists.freedesktop.org
Summary: audio broken in 24Hz/24p since 3.11 (regression)
Severity: normal
Classification: Unclassified
OS: All
Reporter: pierre-bugzilla(a)ossman.eu
Hardware: Other
Status: NEW
Version: unspecified
Component: DRM/other
Product: DRI
Bug 64503 is back again, but this time it isn't a case of PEBKAC. Instead it is
commit e6e792092e816bea0797995c886fb057c91d4546 that breaks things.
With 3.10 I have just this 24p mode in Xorg:
[ 47.361] (II) RADEON(0): Modeline "1920x1080"x24.0 74.25 1920 2558 2602
2750 1080 1084 1089 1125 +hsync +vsync (27.0 kHz e)
With 3.11 I have two:
Xorg.0.log:[ 56.189] (II) RADEON(0): Modeline "1920x1080"x24.0 74.25 1920
2558 2602 2750 1080 1084 1089 1125 +hsync +vsync (27.0 kHz e)
Xorg.0.log:[ 56.189] (II) RADEON(0): Modeline "1920x1080"x24.0 74.18 1920
2558 2602 2750 1080 1084 1089 1125 +hsync +vsync (27.0 kHz e)
And although the second one gives me an image, audio is royally screwed up.
Please revert, or at least give us a knob to disable these extra modes.
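For what it's worth, the second modeline looks like the 1000/1001 "NTSC rate"
variant that newer kernels synthesize for CEA modes: 74.25 MHz x 1000/1001 is
roughly 74.18 MHz, which turns 24.000 Hz into 23.976 Hz on the same 2750x1125
totals. A tiny sketch of the arithmetic (my own illustration, not code from the
offending commit):

/* alt_clock.c: why the 74.18 MHz modeline refreshes at ~23.976 Hz. */
#include <stdio.h>

int main(void)
{
    double clock  = 74250.0;                  /* kHz, CEA 1920x1080 at 24 Hz */
    double alt    = clock * 1000.0 / 1001.0;  /* ~74175.8 kHz, i.e. "74.18"  */
    double htotal = 2750.0, vtotal = 1125.0;  /* totals from the modelines   */

    printf("24p: %.3f Hz, alternate: %.3f Hz\n",
           clock * 1000.0 / (htotal * vtotal),
           alt   * 1000.0 / (htotal * vtotal));
    return 0;
}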
--
https://bugs.freedesktop.org/show_bug.cgi?id=53544
Bug #: 53544
Summary: Incorrect modeline due to incorrect EDID block for LG
SL80 TV
Classification: Unclassified
Product: DRI
Version: DRI CVS
Platform: Other
OS/Version: All
Status: NEW
Severity: normal
Priority: medium
Component: General
AssignedTo: dri-devel(a)lists.freedesktop.org
ReportedBy: paulepanter(a)users.sourceforge.net
The LG SL80 TV exhibits the same garbled screen as reported in #26294 [1] when
connecting an ASUS Eee PC 701 4G over VGA.
This is just a separate report for this particular TV; a patch is going to be
sent to the list. Log files are included in comments 26294#c20 and 26294#c21 of
the referenced report.
[1] https://bugs.freedesktop.org/show_bug.cgi?id=26294
--
https://bugs.freedesktop.org/show_bug.cgi?id=32821
Summary: [mach64] DRI doesn't work on SGRAM with 4096Kb
Product: DRI
Version: XOrg CVS
Platform: x86 (IA32)
OS/Version: Linux (All)
Status: NEW
Severity: major
Priority: medium
Component: DRM/other
AssignedTo: dri-devel(a)lists.freedesktop.org
ReportedBy: nss.zpeedzter(a)gmail.com
Created an attachment (id=41615)
--> (https://bugs.freedesktop.org/attachment.cgi?id=41615)
with default packages and default conf
I have an integrated ATI Rage Pro 2x on a Dell OptiPlex GX1, running Arch Linux.
Here's part of the Xorg log with the default packages and default configuration:
[ 23.698] (--) Depth 24 pixmap format is 32 bpp
[ 23.757] (WW) MACH64(0): DRI static buffer allocation failed -- need at
least 10000 kB video memory
....
....
[ 23.791] (II) MACH64(0): Direct rendering disabled
And the second situation: I installed mach64drm from the AUR, added an
xorg.conf, and switched to 16 bpp. Here's part of the Xorg log:
[ 25.394] (II) MACH64(0): I2C bus "Mach64" initialized.
[ 25.455] (WW) MACH64(0): DRI static buffer allocation failed -- need at
least 7593 kB video memory
....
....
[ 25.489] (II) MACH64(0): Direct rendering disabled
--
https://bugs.freedesktop.org/show_bug.cgi?id=28000
Summary: [Mobile GM965/GL96][Wine][3D] Display and textures
corrupted in Warhammer 40k Soulstorm
Product: DRI
Version: XOrg CVS
Platform: x86 (IA32)
OS/Version: Linux (All)
Status: NEW
Severity: critical
Priority: medium
Component: General
AssignedTo: dri-devel(a)lists.freedesktop.org
ReportedBy: fdelente(a)mail.cpod.fr
Created an attachment (id=35465)
--> (https://bugs.freedesktop.org/attachment.cgi?id=35465)
Xorg.0.log
$ lspci
00:02.0 VGA compatible controller: Intel Corporation Mobile GM965/GL960
Integrated Graphics Controller (rev 03)
00:02.1 Display controller: Intel Corporation Mobile GM965/GL960 Integrated
Graphics Controller (rev 03)
$ uname -m
i686
$ uname -a
Linux slick 2.6.32.3-smp #7 SMP PREEMPT Tue Apr 6 19:10:03 CEST 2010 i686
Intel(R) Core(TM)2 Duo CPU T5750 @ 2.00GHz GenuineIntel GNU/Linux
-- xf86-video-intel/xserver/mesa/libdrm version:
xf86-video-intel-2.11.0-i386-1
xorg-server-1.8.0-i386-1
Mesa-7.8.1-i386-1
libdrm-2.4.20-i386-1
-- kernel version: ("uname -r")
2.6.32.3-smp
-- Linux distribution:
Slackware 13.0 initially, but I recompiled many packages from source
-- Machine or mobo model:
Packard Bell BG46 fr
-- Display connector: (e.g. VGA, DVI, HDMI, S-video, ...)
VGA? LCD panel
3) Reproduce steps. Probability if not 100% reproducible.
100% reproducible: when starting Soulstorm, some bitmaps and 3D textures are
corrupted. The same game works flawlessly on another computer with an Nvidia
card.
4) Attachment:
-- Xorg.0.log
-- dmesg output (better with boot option "drm.debug=0x06")
-- xorg.conf (if you changed any default options)
No change there
-- screenshot or photo (optional, a picture is worth a thousand words)
-- output of "xrandr --verbose" for display mode issue
-- intel_reg_dumper output (see the guide) for display issue.
-- for GPU hang, get the last batch buffer (see the guide).
--
https://bugs.freedesktop.org/show_bug.cgi?id=28841
Summary: Heavy text corruption in virtual consoles using DRI
Product: DRI
Version: unspecified
Platform: x86 (IA32)
OS/Version: Linux (All)
Status: NEW
Severity: normal
Priority: medium
Component: General
AssignedTo: dri-devel(a)lists.freedesktop.org
ReportedBy: eblanca76(a)users.sourceforge.net
I have a clean install of the latest Ubuntu Lucid Lynx. I added framebuffer
support for the G550 (matroxfb), and text in the virtual consoles appears
corrupted: it is made of random pixels appearing far away from the expected
prompt. I can get back to clean text if I switch to Xorg (Ctrl+F7) and then back
to the virtual console, but any new text I type appears garbled again.
The only workaround for this is to disable DRI; then the text appears perfect
and no switching is needed. Of course, this way the system cannot support any
games.
I am attaching my current xorg.conf.
My config: ubuntu 10.04 (Lucid) 2.6.32-22-generic i686 GNU/Linux with
gnome 2.30.2 and xorg X server 1.7.6.
--
https://bugs.freedesktop.org/show_bug.cgi?id=48215
Bug #: 48215
Summary: [NV11] Display constantly switches on and off roughly
every 10 seconds.
Classification: Unclassified
Product: DRI
Version: XOrg CVS
Platform: x86 (IA32)
OS/Version: All
Status: NEW
Severity: normal
Priority: medium
Component: DRM/other
AssignedTo: dri-devel(a)lists.freedesktop.org
ReportedBy: alex.buell(a)munted.org.uk
Display constantly switches on and off roughly every 10 seconds on my NV11
machine. This started happening with 3.2.0.
Checking backwards, 3.1.0 works just fine. I am currently trying a bisect to
find the problem.
--
https://bugs.freedesktop.org/show_bug.cgi?id=52560
Bug #: 52560
Summary: Xorg -configure creates a useless xorg.conf for tdfx
Classification: Unclassified
Product: DRI
Version: XOrg CVS
Platform: All
OS/Version: Linux (All)
Status: NEW
Severity: normal
Priority: medium
Component: DRM/other
AssignedTo: dri-devel(a)lists.freedesktop.org
ReportedBy: tuxracer(a)hispeed.ch
I wanted to install my old Voodoo3/2000 PCI, which worked perfectly 2 or 3 years
ago.
Now, on current xorg-servers (at least 1.10, 1.11 and 1.12), I am unable to
configure working 3D. If I use the driver without any xorg.conf I at least get
software rendering, but if I create an xorg.conf using Xorg -configure there is
no 3D.
On old xorg-servers the only thing I had to do to get 3D on this card was to set
the resolution to 1024x768 at 16-bit color depth, and it worked.
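In case it helps anyone reproducing this, the kind of minimal stanza that used
to be enough on this card would look roughly like the snippet below. This is a
hand-written sketch with placeholder Identifier names, not output from
Xorg -configure, and the 16 bpp / 1024x768 values simply restate what worked
before:

Section "Device"
    Identifier "Voodoo3"          # placeholder name
    Driver     "tdfx"
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Voodoo3"        # must match the Device Identifier above
    DefaultDepth 16               # 16 bpp, as noted above
    SubSection "Display"
        Depth  16
        Modes  "1024x768"
    EndSubSection
EndSection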
--
https://bugs.freedesktop.org/show_bug.cgi?id=38364
Summary: Ignoring invalid EDID block 1 so entire EDID is invalid and
not just block 1
Product: DRI
Version: XOrg CVS
Platform: x86-64 (AMD64)
OS/Version: All
Status: NEW
Severity: normal
Priority: medium
Component: General
AssignedTo: dri-devel(a)lists.freedesktop.org
ReportedBy: zaverel(a)free.fr
Hello,
Everything is fine except that I can't change the resolution on my second VGA
monitor (a TV in fact, reported as DVI-I-2) anymore with my 9400GT.
I'm now on:
linux-2.6.39-gentoo-r1
xorg-server-1.10.2
xf86-video-nouveau-0.0.16_pre20110323
libdrm-2.4.25
Errors in dmesg are:
nouveau 0000:02:00.0: DVI-I-2: Ignoring invalid EDID block 1.
[drm:drm_edid_block_valid] *ERROR* Raw EDID:
<3>00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
<3>00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
<3>00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
<3>00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
<3>00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
<3>00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
<3>00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
<3>00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
Whatever mode I set with xrandr is not actually applied on my TV: although
xrandr says it's fine, my TV always reports 1202x670 at 50 Hz.
I tried with and without an xorg.conf.
My last working kernel is linux-2.6.36-gentoo-r6.
If I tweak drm_edid.c from kernel-2.6.39-gentoo-r1 like this:
--- drm_edid.c 2011-06-10 22:37:36.605848000 +0200
+++ linux/drivers/gpu/drm/drm_edid.c 2011-06-13 13:04:43.136786102 +0200
@@ -292,7 +292,7 @@
 			block + (valid_extensions + 1) * EDID_LENGTH,
 			j, EDID_LENGTH))
 		goto out;
-	if (drm_edid_block_valid(block + (valid_extensions + 1) * EDID_LENGTH)) {
+	if (drm_edid_block_valid(block + (valid_extensions + 0) * EDID_LENGTH)) {
 		valid_extensions++;
 		break;
 	}
With that change it works like before, but I'm not sure it is safe.
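For context, below is a rough user-space rendition of the two basic checks an
EDID block is subjected to -- a sketch only, not the kernel's
drm_edid_block_valid() source. The zeroed block 1 in the dump above fails
validation (as far as I can tell it trips the base-block signature check, since
its first byte is 0x00), and the +0 tweak makes the kernel validate the block it
has already accepted instead of the one it just read, so invalid extension data
is kept rather than dropped -- presumably why it "works", and also why it isn't
safe.

/* edid_check.c: the two basic validity tests, user-space sketch only. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

#define EDID_LENGTH 128

static const uint8_t edid_header[8] =
    { 0x00, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x00 };

/* Every block's 128 bytes must sum to 0 modulo 256. */
static int edid_checksum_ok(const uint8_t *block)
{
    uint8_t sum = 0;
    for (int i = 0; i < EDID_LENGTH; i++)
        sum += block[i];
    return sum == 0;
}

/* The base block must additionally start with the fixed EDID signature. */
static int edid_base_block_ok(const uint8_t *block)
{
    return memcmp(block, edid_header, sizeof(edid_header)) == 0 &&
           edid_checksum_ok(block);
}

int main(void)
{
    uint8_t zero_block[EDID_LENGTH] = { 0 };
    /* An all-zero block sums to zero, so the checksum alone can't reject it;
     * only the signature test catches it. */
    printf("zero block: checksum ok=%d, base signature ok=%d\n",
           edid_checksum_ok(zero_block), edid_base_block_ok(zero_block));
    return 0;
}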
More info at
http://lists.freedesktop.org/archives/nouveau/2011-June/008548.html
I can post logs here too, just tell me.
See you
--
https://bugs.freedesktop.org/show_bug.cgi?id=60879
Priority: medium
Bug ID: 60879
Assignee: dri-devel(a)lists.freedesktop.org
Summary: X11 can't start with acceleration enabled
Severity: blocker
Classification: Unclassified
OS: Linux (All)
Reporter: mustrumr97(a)gmail.com
Hardware: x86-64 (AMD64)
Status: NEW
Version: unspecified
Component: DRM/Radeon
Product: DRI
Created attachment 74857
--> https://bugs.freedesktop.org/attachment.cgi?id=74857&action=edit
Screenshot of Xorg
I've recently bought a Radeon 7000-series graphics card, a Sapphire Radeon HD
7870 XT.
I enabled glamor in xorg.conf.
The X server fails to start with glamor enabled: for 10 seconds it shows
nothing. I've attached a screenshot taken with my mobile phone after that.
After killing X the monitor turns off, as if the GPU were off (probably true).
X works if I disable glamor; however, that way I'd be forced to use swrast.
xorg-xserver - 1.13.2
xf86-video-ati - git or 7.1.0 (both tried)
glamor - git or 0.5 (both tried)
libdrm - git or 2.4.42 (both tried)
kernel - git or 3.7.7 (both tried)
mesa - git
Weston also fails to start. It just shows a black screen with a white stripe at
the top. Killing it returns me to a tty.
I'll attach parts of the kernel log from starting Xorg and Weston.
--