This connector attribute allows you to enable or disable underscan on a digital output to compensate for panels that automatically overscan (e.g., many HDMI TVs). Valid values for the attribute are:
off  - forces underscan off
on   - forces underscan on
auto - enables underscan if an HDMI TV is connected, off otherwise
default value is auto.
Signed-off-by: Alex Deucher <alexdeucher@gmail.com>
---
 drivers/gpu/drm/radeon/atombios_crtc.c     |   36 +++++++++---------------
 drivers/gpu/drm/radeon/radeon_connectors.c |   23 +++++++++++++++
 drivers/gpu/drm/radeon/radeon_display.c    |   41 ++++++++++++++++++++++++++++
 drivers/gpu/drm/radeon/radeon_encoders.c   |    5 +++-
 drivers/gpu/drm/radeon/radeon_mode.h       |   18 +++++++++++-
 5 files changed, 98 insertions(+), 25 deletions(-)
diff --git a/drivers/gpu/drm/radeon/atombios_crtc.c b/drivers/gpu/drm/radeon/atombios_crtc.c
index a2e65d9..12ad512 100644
--- a/drivers/gpu/drm/radeon/atombios_crtc.c
+++ b/drivers/gpu/drm/radeon/atombios_crtc.c
@@ -44,10 +44,6 @@ static void atombios_overscan_setup(struct drm_crtc *crtc,
 
 	memset(&args, 0, sizeof(args));
 
-	args.usOverscanRight = 0;
-	args.usOverscanLeft = 0;
-	args.usOverscanBottom = 0;
-	args.usOverscanTop = 0;
 	args.ucCRTC = radeon_crtc->crtc_id;
 
 	switch (radeon_crtc->rmx_type) {
@@ -56,7 +52,6 @@ static void atombios_overscan_setup(struct drm_crtc *crtc,
 		args.usOverscanBottom = (adjusted_mode->crtc_vdisplay - mode->crtc_vdisplay) / 2;
 		args.usOverscanLeft = (adjusted_mode->crtc_hdisplay - mode->crtc_hdisplay) / 2;
 		args.usOverscanRight = (adjusted_mode->crtc_hdisplay - mode->crtc_hdisplay) / 2;
-		atom_execute_table(rdev->mode_info.atom_context, index, (uint32_t *)&args);
 		break;
 	case RMX_ASPECT:
 		a1 = mode->crtc_vdisplay * adjusted_mode->crtc_hdisplay;
@@ -69,17 +64,16 @@ static void atombios_overscan_setup(struct drm_crtc *crtc,
 			args.usOverscanLeft = (adjusted_mode->crtc_vdisplay - (a1 / mode->crtc_hdisplay)) / 2;
 			args.usOverscanRight = (adjusted_mode->crtc_vdisplay - (a1 / mode->crtc_hdisplay)) / 2;
 		}
-		atom_execute_table(rdev->mode_info.atom_context, index, (uint32_t *)&args);
 		break;
 	case RMX_FULL:
 	default:
-		args.usOverscanRight = 0;
-		args.usOverscanLeft = 0;
-		args.usOverscanBottom = 0;
-		args.usOverscanTop = 0;
-		atom_execute_table(rdev->mode_info.atom_context, index, (uint32_t *)&args);
+		args.usOverscanRight = radeon_crtc->h_border;
+		args.usOverscanLeft = radeon_crtc->h_border;
+		args.usOverscanBottom = radeon_crtc->v_border;
+		args.usOverscanTop = radeon_crtc->v_border;
 		break;
 	}
+	atom_execute_table(rdev->mode_info.atom_context, index, (uint32_t *)&args);
 }
 
 static void atombios_scaler_setup(struct drm_crtc *crtc)
@@ -282,22 +276,22 @@ atombios_set_crtc_dtd_timing(struct drm_crtc *crtc,
 	u16 misc = 0;
 
 	memset(&args, 0, sizeof(args));
-	args.usH_Size = cpu_to_le16(mode->crtc_hdisplay);
+	args.usH_Size = cpu_to_le16(mode->crtc_hdisplay - (radeon_crtc->h_border * 2));
 	args.usH_Blanking_Time =
-		cpu_to_le16(mode->crtc_hblank_end - mode->crtc_hdisplay);
-	args.usV_Size = cpu_to_le16(mode->crtc_vdisplay);
+		cpu_to_le16(mode->crtc_hblank_end - mode->crtc_hdisplay + (radeon_crtc->h_border * 2));
+	args.usV_Size = cpu_to_le16(mode->crtc_vdisplay - (radeon_crtc->v_border * 2));
 	args.usV_Blanking_Time =
-		cpu_to_le16(mode->crtc_vblank_end - mode->crtc_vdisplay);
+		cpu_to_le16(mode->crtc_vblank_end - mode->crtc_vdisplay + (radeon_crtc->v_border * 2));
 	args.usH_SyncOffset =
-		cpu_to_le16(mode->crtc_hsync_start - mode->crtc_hdisplay);
+		cpu_to_le16(mode->crtc_hsync_start - mode->crtc_hdisplay + radeon_crtc->h_border);
 	args.usH_SyncWidth =
 		cpu_to_le16(mode->crtc_hsync_end - mode->crtc_hsync_start);
 	args.usV_SyncOffset =
-		cpu_to_le16(mode->crtc_vsync_start - mode->crtc_vdisplay);
+		cpu_to_le16(mode->crtc_vsync_start - mode->crtc_vdisplay + radeon_crtc->v_border);
 	args.usV_SyncWidth =
 		cpu_to_le16(mode->crtc_vsync_end - mode->crtc_vsync_start);
-	/*args.ucH_Border = mode->hborder;*/
-	/*args.ucV_Border = mode->vborder;*/
+	args.ucH_Border = radeon_crtc->h_border;
+	args.ucV_Border = radeon_crtc->v_border;
 
 	if (mode->flags & DRM_MODE_FLAG_NVSYNC)
 		misc |= ATOM_VSYNC_POLARITY;
@@ -1176,10 +1170,8 @@ int atombios_crtc_mode_set(struct drm_crtc *crtc,
 	atombios_crtc_set_pll(crtc, adjusted_mode);
 	atombios_enable_ss(crtc);
 
-	if (ASIC_IS_DCE4(rdev))
+	if (ASIC_IS_AVIVO(rdev))
 		atombios_set_crtc_dtd_timing(crtc, adjusted_mode);
-	else if (ASIC_IS_AVIVO(rdev))
-		atombios_crtc_set_timing(crtc, adjusted_mode);
 	else {
 		atombios_crtc_set_timing(crtc, adjusted_mode);
 		if (radeon_crtc->crtc_id == 0)
diff --git a/drivers/gpu/drm/radeon/radeon_connectors.c b/drivers/gpu/drm/radeon/radeon_connectors.c
index 6b9aac7..609eda6 100644
--- a/drivers/gpu/drm/radeon/radeon_connectors.c
+++ b/drivers/gpu/drm/radeon/radeon_connectors.c
@@ -312,6 +312,20 @@ int radeon_connector_set_property(struct drm_connector *connector, struct drm_pr
 		}
 	}
 
+	if (property == rdev->mode_info.underscan_property) {
+		/* need to find digital encoder on connector */
+		encoder = radeon_find_encoder(connector, DRM_MODE_ENCODER_TMDS);
+		if (!encoder)
+			return 0;
+
+		radeon_encoder = to_radeon_encoder(encoder);
+
+		if (radeon_encoder->underscan_type != val) {
+			radeon_encoder->underscan_type = val;
+			radeon_property_change_mode(&radeon_encoder->base);
+		}
+	}
+
 	if (property == rdev->mode_info.tv_std_property) {
 		encoder = radeon_find_encoder(connector, DRM_MODE_ENCODER_TVDAC);
 		if (!encoder) {
@@ -1120,6 +1134,9 @@ radeon_add_atom_connector(struct drm_device *dev,
 			drm_connector_attach_property(&radeon_connector->base,
 						      rdev->mode_info.coherent_mode_property,
 						      1);
+			drm_connector_attach_property(&radeon_connector->base,
+						      rdev->mode_info.underscan_property,
+						      UNDERSCAN_AUTO);
 			if (connector_type == DRM_MODE_CONNECTOR_DVII) {
 				radeon_connector->dac_load_detect = true;
 				drm_connector_attach_property(&radeon_connector->base,
@@ -1145,6 +1162,9 @@ radeon_add_atom_connector(struct drm_device *dev,
 			drm_connector_attach_property(&radeon_connector->base,
 						      rdev->mode_info.coherent_mode_property,
 						      1);
+			drm_connector_attach_property(&radeon_connector->base,
+						      rdev->mode_info.underscan_property,
+						      UNDERSCAN_AUTO);
 		subpixel_order = SubPixelHorizontalRGB;
 		break;
 	case DRM_MODE_CONNECTOR_DisplayPort:
@@ -1176,6 +1196,9 @@ radeon_add_atom_connector(struct drm_device *dev,
 			drm_connector_attach_property(&radeon_connector->base,
 						      rdev->mode_info.coherent_mode_property,
 						      1);
+			drm_connector_attach_property(&radeon_connector->base,
+						      rdev->mode_info.underscan_property,
+						      UNDERSCAN_AUTO);
 		break;
 	case DRM_MODE_CONNECTOR_SVIDEO:
 	case DRM_MODE_CONNECTOR_Composite:
diff --git a/drivers/gpu/drm/radeon/radeon_display.c b/drivers/gpu/drm/radeon/radeon_display.c
index 12a5414..74dac96 100644
--- a/drivers/gpu/drm/radeon/radeon_display.c
+++ b/drivers/gpu/drm/radeon/radeon_display.c
@@ -921,6 +921,12 @@ static struct drm_prop_enum_list radeon_tv_std_enum_list[] =
 	{ TV_STD_SECAM, "secam" },
 };
 
+static struct drm_prop_enum_list radeon_underscan_enum_list[] =
+{	{ UNDERSCAN_OFF, "off" },
+	{ UNDERSCAN_ON, "on" },
+	{ UNDERSCAN_AUTO, "auto" },
+};
+
 static int radeon_modeset_create_props(struct radeon_device *rdev)
 {
 	int i, sz;
@@ -974,6 +980,18 @@ static int radeon_modeset_create_props(struct radeon_device *rdev)
 					 radeon_tv_std_enum_list[i].name);
 	}
 
+	sz = ARRAY_SIZE(radeon_underscan_enum_list);
+	rdev->mode_info.underscan_property =
+		drm_property_create(rdev->ddev,
+				    DRM_MODE_PROP_ENUM,
+				    "underscan", sz);
+	for (i = 0; i < sz; i++) {
+		drm_property_add_enum(rdev->mode_info.underscan_property,
+				      i,
+				      radeon_underscan_enum_list[i].type,
+				      radeon_underscan_enum_list[i].name);
+	}
+
 	return 0;
 }
@@ -1069,17 +1087,26 @@ bool radeon_crtc_scaling_mode_fixup(struct drm_crtc *crtc,
 				    struct drm_display_mode *adjusted_mode)
 {
 	struct drm_device *dev = crtc->dev;
+	struct radeon_device *rdev = dev->dev_private;
 	struct drm_encoder *encoder;
 	struct radeon_crtc *radeon_crtc = to_radeon_crtc(crtc);
 	struct radeon_encoder *radeon_encoder;
+	struct drm_connector *connector;
+	struct radeon_connector *radeon_connector;
 	bool first = true;
 	u32 src_v = 1, dst_v = 1;
 	u32 src_h = 1, dst_h = 1;
 
+	radeon_crtc->h_border = 0;
+	radeon_crtc->v_border = 0;
+
 	list_for_each_entry(encoder, &dev->mode_config.encoder_list, head) {
 		if (encoder->crtc != crtc)
 			continue;
 		radeon_encoder = to_radeon_encoder(encoder);
+		connector = radeon_get_connector_for_encoder(encoder);
+		radeon_connector = to_radeon_connector(connector);
+
 		if (first) {
 			/* set scaling */
 			if (radeon_encoder->rmx_type == RMX_OFF)
@@ -1097,6 +1124,20 @@ bool radeon_crtc_scaling_mode_fixup(struct drm_crtc *crtc,
 			memcpy(&radeon_crtc->native_mode,
 			       &radeon_encoder->native_mode,
 			       sizeof(struct drm_display_mode));
+
+			/* fix up for overscan on hdmi */
+			if (ASIC_IS_AVIVO(rdev) &&
+			    ((radeon_encoder->underscan_type == UNDERSCAN_ON) ||
+			     ((radeon_encoder->underscan_type == UNDERSCAN_AUTO) &&
+			      drm_detect_hdmi_monitor(radeon_connector->edid)))) {
+				radeon_crtc->h_border = (mode->hdisplay >> 5) + 16;
+				radeon_crtc->v_border = (mode->vdisplay >> 5) + 16;
+				radeon_crtc->rmx_type = RMX_FULL;
+				src_v = crtc->mode.vdisplay;
+				dst_v = crtc->mode.vdisplay - (radeon_crtc->v_border * 2);
+				src_h = crtc->mode.hdisplay;
+				dst_h = crtc->mode.hdisplay - (radeon_crtc->h_border * 2);
+			}
 			first = false;
 		} else {
 			if (radeon_crtc->rmx_type != radeon_encoder->rmx_type) {
diff --git a/drivers/gpu/drm/radeon/radeon_encoders.c b/drivers/gpu/drm/radeon/radeon_encoders.c
index 5e7a053..4a4ff98 100644
--- a/drivers/gpu/drm/radeon/radeon_encoders.c
+++ b/drivers/gpu/drm/radeon/radeon_encoders.c
@@ -212,7 +212,7 @@ void radeon_encoder_set_active_device(struct drm_encoder *encoder)
 	}
 }
 
-static struct drm_connector *
+struct drm_connector *
 radeon_get_connector_for_encoder(struct drm_encoder *encoder)
 {
 	struct drm_device *dev = encoder->dev;
@@ -1694,6 +1694,7 @@ radeon_add_atom_encoder(struct drm_device *dev, uint32_t encoder_id, uint32_t su
 	radeon_encoder->encoder_id = encoder_id;
 	radeon_encoder->devices = supported_device;
 	radeon_encoder->rmx_type = RMX_OFF;
+	radeon_encoder->underscan_type = UNDERSCAN_OFF;
 
 	switch (radeon_encoder->encoder_id) {
 	case ENCODER_OBJECT_ID_INTERNAL_LVDS:
@@ -1707,6 +1708,7 @@ radeon_add_atom_encoder(struct drm_device *dev, uint32_t encoder_id, uint32_t su
 		} else {
 			drm_encoder_init(dev, encoder, &radeon_atom_enc_funcs, DRM_MODE_ENCODER_TMDS);
 			radeon_encoder->enc_priv = radeon_atombios_set_dig_info(radeon_encoder);
+			radeon_encoder->underscan_type = UNDERSCAN_AUTO;
 		}
 		drm_encoder_helper_add(encoder, &radeon_atom_dig_helper_funcs);
 		break;
@@ -1736,6 +1738,7 @@ radeon_add_atom_encoder(struct drm_device *dev, uint32_t encoder_id, uint32_t su
 		} else {
 			drm_encoder_init(dev, encoder, &radeon_atom_enc_funcs, DRM_MODE_ENCODER_TMDS);
 			radeon_encoder->enc_priv = radeon_atombios_set_dig_info(radeon_encoder);
+			radeon_encoder->underscan_type = UNDERSCAN_AUTO;
 		}
 		drm_encoder_helper_add(encoder, &radeon_atom_dig_helper_funcs);
 		break;
diff --git a/drivers/gpu/drm/radeon/radeon_mode.h b/drivers/gpu/drm/radeon/radeon_mode.h
index 95696aa..71aea40 100644
--- a/drivers/gpu/drm/radeon/radeon_mode.h
+++ b/drivers/gpu/drm/radeon/radeon_mode.h
@@ -66,6 +66,12 @@ enum radeon_tv_std {
 	TV_STD_PAL_N,
 };
 
+enum radeon_underscan_type {
+	UNDERSCAN_OFF,
+	UNDERSCAN_ON,
+	UNDERSCAN_AUTO,
+};
+
 enum radeon_hpd_id {
 	RADEON_HPD_1 = 0,
 	RADEON_HPD_2,
@@ -226,10 +232,12 @@ struct radeon_mode_info {
 	struct drm_property *coherent_mode_property;
 	/* DAC enable load detect */
 	struct drm_property *load_detect_property;
-	/* TV standard load detect */
+	/* TV standard */
 	struct drm_property *tv_std_property;
 	/* legacy TMDS PLL detect */
 	struct drm_property *tmds_pll_property;
+	/* underscan */
+	struct drm_property *underscan_property;
 	/* hardcoded DFP edid from BIOS */
 	struct edid *bios_hardcoded_edid;
@@ -266,6 +274,8 @@ struct radeon_crtc {
 	uint32_t legacy_display_base_addr;
 	uint32_t legacy_cursor_offset;
 	enum radeon_rmx_type rmx_type;
+	u8 h_border;
+	u8 v_border;
 	fixed20_12 vsc;
 	fixed20_12 hsc;
 	struct drm_display_mode native_mode;
@@ -354,6 +364,7 @@ struct radeon_encoder {
 	uint32_t flags;
 	uint32_t pixel_clock;
 	enum radeon_rmx_type rmx_type;
+	enum radeon_underscan_type underscan_type;
 	struct drm_display_mode native_mode;
 	void *enc_priv;
 	int audio_polling_active;
@@ -392,7 +403,7 @@ struct radeon_connector {
 	uint32_t connector_id;
 	uint32_t devices;
 	struct radeon_i2c_chan *ddc_bus;
-	/* some systems have a an hdmi and vga port with a shared ddc line */
+	/* some systems have an hdmi and vga port with a shared ddc line */
 	bool shared_ddc;
 	bool use_digital;
 	/* we need to mind the EDID between detect
@@ -414,6 +425,9 @@ radeon_combios_get_tv_info(struct radeon_device *rdev);
 extern enum radeon_tv_std
 radeon_atombios_get_tv_info(struct radeon_device *rdev);
 
+extern struct drm_connector *
+radeon_get_connector_for_encoder(struct drm_encoder *encoder);
+
 extern void radeon_connector_hotplug(struct drm_connector *connector);
 extern bool radeon_dp_needs_link_train(struct radeon_connector *radeon_connector);
 extern int radeon_dp_mode_valid_helper(struct radeon_connector *radeon_connector,
On 04.08.2010 01:59, Alex Deucher wrote:
This connector attribute allows you to enable or disable underscan on a digital output to compensate for panels that automatically overscan (e.g., many HDMI TVs). Valid values for the attribute are:
off  - forces underscan off
on   - forces underscan on
auto - enables underscan if an HDMI TV is connected, off otherwise
default value is auto.
Terrific! Two questions:
- inevitably, on my TV set (SONY KDL 3000) this is now doing too much underscan. In pixels: without your patch, I used a custom modeline to map 1280x720p to 1220x680p, so I'm 40 pixels down in each dimension. How can I fix that?
- more of a general drm question I guess: in what way are the connector attributes available on the command line? I couldn't find a complete kernel command line or modprobe invocation.
Thanks Marius
2010/8/4 Marius Gröger marius.groeger@googlemail.com:
Terrific! Two questions:
- inevitably, on my TV set (SONY KDL 3000) this is now doing too much
underscan. In pixels: without your patch, I used a custom modeline to map 1280x720p to 1220x680p, so I'm 40 pixels down in each dimension. How to fix that?
Adjust radeon_crtc->v_border and radeon_crtc->h_border in the patch to whatever size you want.
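For illustration, the kind of local tweak meant here goes inside the underscan block the patch adds to radeon_crtc_scaling_mode_fixup(); the 30/20 pixel borders below are only example values, not something from the patch:

	/* fix up for overscan on hdmi */
	if (ASIC_IS_AVIVO(rdev) &&
	    ((radeon_encoder->underscan_type == UNDERSCAN_ON) ||
	     ((radeon_encoder->underscan_type == UNDERSCAN_AUTO) &&
	      drm_detect_hdmi_monitor(radeon_connector->edid)))) {
		/* hand-tuned borders instead of the computed ~4%:
		 * (mode->hdisplay >> 5) + 16 and (mode->vdisplay >> 5) + 16 */
		radeon_crtc->h_border = 30;	/* example value */
		radeon_crtc->v_border = 20;	/* example value */
		radeon_crtc->rmx_type = RMX_FULL;
		src_v = crtc->mode.vdisplay;
		dst_v = crtc->mode.vdisplay - (radeon_crtc->v_border * 2);
		src_h = crtc->mode.hdisplay;
		dst_h = crtc->mode.hdisplay - (radeon_crtc->h_border * 2);
	}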
- more of a general drm question I guess: in what way are the connector
attributes available on the command line? I couldn't find a complete kernel command line or modprobe invocation.
I guess you'll need to write an app to invoke the proper ioctls to change them at runtime. I'm not sure if one exists or not at the moment.
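For what it's worth, a small libdrm program should be enough. This is an untested sketch; it assumes the card is /dev/dri/card0, that the process can become DRM master (e.g. run from a plain VT), and that the connector exposes the "underscan" property added by the patch. The property values follow the enum order the patch registers: 0 = off, 1 = on, 2 = auto.

/* gcc set_underscan.c -o set_underscan $(pkg-config --cflags --libs libdrm) */
#include <fcntl.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
	int fd = open("/dev/dri/card0", O_RDWR);	/* assumed device node */
	drmModeRes *res;
	int i, j;

	if (fd < 0 || !(res = drmModeGetResources(fd)))
		return 1;

	for (i = 0; i < res->count_connectors; i++) {
		drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
		if (!conn)
			continue;
		for (j = 0; j < conn->count_props; j++) {
			drmModePropertyRes *prop = drmModeGetProperty(fd, conn->props[j]);
			if (!prop)
				continue;
			if (!strcmp(prop->name, "underscan"))
				drmModeConnectorSetProperty(fd, conn->connector_id,
							    prop->prop_id, 1 /* "on" */);
			drmModeFreeProperty(prop);
		}
		drmModeFreeConnector(conn);
	}
	drmModeFreeResources(res);
	return 0;
}

Per the patch, the new value is applied through radeon_connector_set_property(), which calls radeon_property_change_mode(), so the change takes effect with a modeset.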
Alex
On 04.08.2010 16:35, Alex Deucher wrote:
Adjust radeon_crtc->v_border and radeon_crtc->h_border in the patch to whatever size you want.
Thanks. It turns out that I need different values to fit the screen (probably due to native 1366/768 != 1280/720). This is of course at the cost of slightly changing the rendered ratio, but that's fine with me.
Any plans to make those values tunable?
Also, I kind of was hoping that once I could use 1280x720 for both the console and the X screen, it would allow me to switch between the two transparently. Instead, the TV takes notice of the switch and needs some extra syncing time. Is this expected behaviour?
Thanks Marius
2010/8/8 Marius Gröger marius.groeger@googlemail.com:
Thanks. It turns out that I need different values to fit the screen (probably due to native 1366/768 != 1280/720). This is of course at the cost of slightly changing the rendered ratio, but that's fine with me.
Any plans to make those values tunable?
Perhaps if there is enough demand.
Also, I kind of was hoping that once I could use 1280x720 for both the console and the X screen, it would allow me to switch between the two transparently. Instead, the TV takes notice of the switch and needs some extra syncing time. Is this expected behaviour?
You mean switching underscan off and on or a VT switch? The hw has to reprogram the mode when it changes the underscan. As for a VT switch, it should just be changing the crtc base, but IIRC there was a bug where X and the console used slightly different modes in some cases.
Alex
On 08.08.2010 18:22, Alex Deucher wrote:
You mean switching underscan off and on or a VT switch? The hw has to reprogram the mode when it changes the underscan. As for a VT switch, it should just be changing the crtc base, but IIRC there was a bug where X and the console used slightly different modes in some cases.
VT switch. I use video=1280x720@50 on the command line and select the corresponding EDID resolution within X. Is this bug still pending or is my scenario supposed to work?
Regards Marius
2010/8/8 Marius Gröger marius.groeger@googlemail.com:
VT switch. I use video=1280x720@50 on the command line and select the corresponding EDID resolution within X. Is this bug still pending or is my scenario supposed to work?
You may be seeing this issue: http://lists.x.org/archives/xorg-devel/2010-August/011743.html
Alex
On 08.08.2010 20:09, Alex Deucher wrote:
You may be seeing this issue: http://lists.x.org/archives/xorg-devel/2010-August/011743.html
"In the absence of the user specifying an overriding monitor configuration, trust the KMS drivers to have correctly probed the output modes."
Well, in my case I *am* specifying an overriding monitor configuration. Is there still a chance that video=1280x720@50 could mean something different than the corresponding mode in X that I explicitly choose?
Regards Marius
2010/8/9 Marius Gröger marius.groeger@googlemail.com:
"In the absence of the user specifying an overriding monitor configuration, trust the KMS drivers to have correctly probed the output modes."
Well, in my case I *am* specifying an overriding monitor configuration. Is there still a chance that video=1280x720@50 could mean something different than the corresponding mode in X that I explicitly choose?
Can you verify that the console and X are using the same modeline? The mode selected by video=1280x720@50 likely has different timing than the mode used in X. Is the VT switch smooth when you don't specify the mode on the console or X (i.e., let the driver decide on its own)?
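If it helps, the KMS side can be dumped with a few libdrm calls and compared against the modeline that xrandr --verbose reports from inside X. A rough sketch, again assuming /dev/dri/card0:

#include <fcntl.h>
#include <stdio.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
	int fd = open("/dev/dri/card0", O_RDONLY);	/* assumed device node */
	drmModeRes *res;
	int i;

	if (fd < 0 || !(res = drmModeGetResources(fd)))
		return 1;

	for (i = 0; i < res->count_crtcs; i++) {
		drmModeCrtc *crtc = drmModeGetCrtc(fd, res->crtcs[i]);
		if (!crtc)
			continue;
		if (crtc->mode_valid) {
			drmModeModeInfo *m = &crtc->mode;
			/* clock and h/v timings, roughly in modeline order */
			printf("crtc %u: %s %u  %u %u %u %u  %u %u %u %u\n",
			       crtc->crtc_id, m->name, m->clock,
			       m->hdisplay, m->hsync_start, m->hsync_end, m->htotal,
			       m->vdisplay, m->vsync_start, m->vsync_end, m->vtotal);
		}
		drmModeFreeCrtc(crtc);
	}
	drmModeFreeResources(res);
	return 0;
}

If those numbers differ from what X programs, that alone would explain the TV resyncing on every VT switch.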
Alex
On 09.08.2010 09:33, Alex Deucher wrote:
Can you verify that the console and X are using the same modeline? The mode selected by video=1280x720@50 likely has different timing than the mode used in X. Is the VT switch smooth when you don't specify the mode on the console or X (i.e., let the driver decide on its own)?
Ah ok, "likely has different timing" - this is probably the issue here. I'll be investigating this. Is there a way to influence the timing used by video=1280x720@50 to match the one used in X? Or should I try finding out about the console timing and use that as an xorg.conf modeline?
Regards Marius
2010/8/9 Marius Gröger marius.groeger@googlemail.com:
Ah ok, "likely has different timing" - this is probably the issue here. I'll be investigating this. Is there a way to influence the timing used by video=1280x720@50 to match the one used in X? Or should I try finding out about the console timing and use that as an xorg.conf modeline?
Probably the best bet is to match the X timing to the mode used by the console. IIRC, the drm fb code generates the mode using the cvt algo or something like that.
Alex