https://bugzilla.kernel.org/show_bug.cgi?id=172421
John (sirfixabyte@gmail.com) changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |sirfixabyte@gmail.com
--- Comment #19 from John (sirfixabyte@gmail.com) ---
I read all the previous comments, for and against adoption of the patch.

Question: why is it easy to find a pixel clock patch for Windows (the AMD/ATI Pixel Clock Patcher by 'ToastyX' at www.monitortests.com), available and supported since 2012, that has been successfully driving 4K screens (since January 2016) on the same Radeon cards being discussed here? Many people discuss it on that page and it appears to work. I personally tested it on a few older Radeon cards and it runs at 3840x2160 for me. I have NOT yet run it for hours (I rarely run Windows, and then only to test or fix PCs for others), and I have not attached temperature sensors to the heat sinks yet.
If the pixel clock generator circuitry is on the same die as everything else, then it shares a heat sink. The whole thing is designed so that, with the recommended airflow across that heat sink, the GPU remains functional. BUT that thermal budget can assume the worst case of driving 3 separate displays, i.e. utilizing the full capacity of the GPU.
IF this patch from Elmar Stellnberger were used to run a SINGLE 4K LCD at 3840x2160, with roughly a 30% overclock on the pixel clock generator, the GPU as a whole might be generating much less than its maximum heat, and the heatsink/fan could easily keep it cool.
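(Rough arithmetic, not from earlier comments in this bug: the required pixel clock is just horizontal total x vertical total x refresh rate. The little Python sketch below assumes CVT-reduced-blanking-style figures of about 80 extra pixels per line and 60 extra lines per frame; the function name and blanking numbers are my own illustration, and whether the result amounts to a "30% overclock" depends on which limit the driver actually enforces on a given card.)

    # Back-of-envelope pixel clock estimate.
    # Assumption: ~80 blanking pixels per line, ~60 blanking lines per frame
    # (reduced-blanking-style timings); real modelines will differ somewhat.
    def pixel_clock_mhz(hactive, vactive, refresh_hz, hblank=80, vblank=60):
        htotal = hactive + hblank
        vtotal = vactive + vblank
        return htotal * vtotal * refresh_hz / 1e6

    for hz in (30, 60):
        print("3840x2160 @ %d Hz needs roughly %.0f MHz"
              % (hz, pixel_clock_mhz(3840, 2160, hz)))
    # With these blanking assumptions this prints ~261 MHz at 30 Hz
    # and ~522 MHz at 60 Hz.

So driving a single 4K panel is mostly a question of the pixel clock / link rate, not of loading the shader cores, which is why the total heat output could stay well below the 3-display worst case.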
Does anyone have a maximum pixel clock specification for the various pixel clock generator designs on the various ATI/AMD dies? Did ATI set the limits because of HDMI cable bandwidth and the heatsink's overall ability to dissipate heat when the GPU is running at maximum speed/load on 3 screens? Is there a listed maximum voltage at which the PLL can run long-term without damage? In most integrated-circuit data sheets I have ever read, there is a relationship between maximum speed and die temperature.