On Wed, Jun 17, 2015 at 04:14:07PM -0700, Doug Anderson wrote:
If you plug a DVI monitor into your HDMI port, you need to filter out pixel clocks > 165 MHz, because 165 MHz is the maximum clock rate at which we can run single-link DVI.
If you want to run high resolutions over DVI, you'd need some type of active adapter that pretended to be an HDMI sink, interpreted the signal, and produced a new dual-link DVI signal at a lower clock rate.
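As a rough sketch, a filter like that could live in the connector's mode_valid hook, along these lines; the 'sink_is_hdmi' flag and the container_of() layout are illustrative assumptions, not taken from the patch itself:

	static enum drm_mode_status
	dw_hdmi_connector_mode_valid(struct drm_connector *connector,
				     struct drm_display_mode *mode)
	{
		struct dw_hdmi *hdmi = container_of(connector, struct dw_hdmi,
						    connector);

		/* mode->clock is in kHz; single-link DVI tops out at 165 MHz */
		if (!hdmi->sink_is_hdmi && mode->clock > 165000)
			return MODE_CLOCK_HIGH;

		return MODE_OK;
	}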
Signed-off-by: Doug Anderson <dianders@chromium.org>
Note: this patch was tested against a 3.14 kernel with backports. It was only compile-tested against linux-next, but the code is sufficiently similar that I'm convinced it will work there.
Really? I have to wonder what your testing was...
	hdmi->vic = drm_match_cea_mode(mode);

	if (!hdmi->vic) {
		dev_dbg(hdmi->dev, "Non-CEA mode used in HDMI\n");
		hdmi->hdmi_data.video_mode.mdvi = true;
	} else {
		dev_dbg(hdmi->dev, "CEA mode used vic=%d\n", hdmi->vic);
		hdmi->hdmi_data.video_mode.mdvi = false;
	}
mdvi indicates whether the _currently set mode_ is a CEA mode or not (imho, it's mis-named). It doesn't indicate whether we have an HDMI display device or a DVI display device connected, which seems to be what you want to use it for below.
To sort that, you need to detect an HDMI display device by calling drm_detect_hdmi_monitor() on the EDID received from the device before parsing the modes, and save that value in a dw_hdmi struct member. I'd suggest a top-level struct member, not one buried in 'hdmi_data' or 'video_mode'.
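A minimal sketch of that suggestion in the connector's get_modes callback; the 'sink_is_hdmi' member name and the 'hdmi->ddc' i2c adapter are assumptions about the driver's layout, not something this mail specifies:

	static int dw_hdmi_connector_get_modes(struct drm_connector *connector)
	{
		struct dw_hdmi *hdmi = container_of(connector, struct dw_hdmi,
						    connector);
		struct edid *edid;
		int ret = 0;

		edid = drm_get_edid(connector, hdmi->ddc);
		if (edid) {
			/* Record the sink type before the modes are parsed,
			 * so mode filtering can key off it later. */
			hdmi->sink_is_hdmi = drm_detect_hdmi_monitor(edid);
			drm_mode_connector_update_edid_property(connector, edid);
			ret = drm_add_edid_modes(connector, edid);
			kfree(edid);
		}

		return ret;
	}

With the flag stored on dw_hdmi itself, both mode validation and the infoframe setup can key off the sink type rather than off whether the currently set mode happens to be a CEA mode.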