On 09.08.2010 09:33, Alex Deucher wrote:
2010/8/9 Marius Gröger <marius.groeger@googlemail.com>:
On 08.08.2010 20:09, Alex Deucher wrote:
2010/8/8 Marius Gröger <marius.groeger@googlemail.com>:
On 08.08.2010 18:22, Alex Deucher wrote:
Also, I was kind of hoping that once I could use 1280x720 for both the console and the X screen, it would allow me to switch between the two transparently. Instead, the TV notices the switch and needs some extra time to resync. Is this expected behaviour?
You mean switching underscan off and on or a VT switch? The hw has to reprogram the mode when it changes the underscan. As for a VT switch, it should just be changing the crtc base, but IIRC there was a bug where X and the console used slightly different modes in some cases.
VT switch. I use video=1280x720@50 on the command line and select the corresponding EDID resolution within X. Is this bug still pending or is my scenario supposed to work?
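For reference, if I read Documentation/fb/modedb.txt correctly, the general form of that option is roughly

    video=[<connector>:]<xres>x<yres>[M][R][-<bpp>][@<refresh>]

so in my case it is plainly video=1280x720@50, without naming a connector (connector names like HDMI-A-1 are just an example of what it could look like on my box).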
You may be seeing this issue: http://lists.x.org/archives/xorg-devel/2010-August/011743.html
"In the absence of the user specifying an overriding monitor configuration, trust the KMS drivers to have correctly probed the output modes."
Well, in my case I *am* specifying an overriding monitor configuration. Is there still a chance that video=1280x720@50 could mean something different than the corresponding mode in X that I explicitly choose?
Can you verify that the console and X are using the same modeline? The mode selected by video=1280x720@50 likely has different timing than the timing used by the mode in X. Is the VT switch smooth when you don't specify the mode on the console or in X (i.e., let the driver decide on its own)?
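Something along these lines should show the timings on both sides (assuming a reasonably recent xrandr and a kernel with drm.debug support):

    # X side: dump detailed timings, look at the mode marked "*current"
    xrandr --verbose

    # console/KMS side: boot with drm.debug=4 (or write 4 to
    # /sys/module/drm/parameters/debug) and check the modelines logged by the kernel
    dmesg | grep -i modeline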
Ah ok, "likely has different timing" - this is probably the issue here. I'll investigate that. Is there a way to influence the timing used by video=1280x720@50 so that it matches the one used in X? Or should I try to find out the console timing and use that as an xorg.conf modeline?
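Just so I know what I'd be aiming for: assuming the console ends up on the standard CEA-861 720p50 timing, I guess the xorg.conf side would look roughly like this (output name and numbers to be replaced by whatever the kernel actually reports):

    Section "Monitor"
        Identifier "HDMI-0"
        # CEA-861 1280x720@50: 74.25 MHz pixel clock, 1980x750 total
        Modeline   "1280x720_50" 74.25 1280 1720 1760 1980 720 725 730 750 +hsync +vsync
        Option     "PreferredMode" "1280x720_50"
    EndSection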
Regards,
Marius