Hello Thierry,
Commit [1] introduced a severe GPU performance regression on Tegra20 and Tegra30.
[1] https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/commit/?h...
Interestingly, the performance is okay on Tegra30 if CONFIG_TEGRA_HOST1X_FIREWALL=n, but that doesn't make a difference for Tegra20.
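
For reference, this is roughly how I toggle the firewall option before rebuilding the kernel (using the in-tree scripts/config helper; adjust for your own config workflow):

    # disable the Host1x job firewall in .config
    scripts/config --disable TEGRA_HOST1X_FIREWALL
    # refresh dependent options and rebuild
    make olddefconfig && make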
I told you about this problem on the #tegra IRC channel some time ago and you asked me to report it in a trackable form, so here it finally is.
You can reproduce the problem by running the test [2] like this: `grate/texture-filter -f -s`. At a 720p display resolution it should produce over 100 FPS, but currently it manages only ~11 FPS.
[2] https://github.com/grate-driver/grate/blob/master/tests/grate/texture-filter...
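
For completeness, my rough reproduction recipe (assuming the usual autotools flow in the grate repo; the exact build steps may differ on your setup):

    # fetch and build the grate test suite
    git clone https://github.com/grate-driver/grate && cd grate
    ./autogen.sh && make
    # run the texture-filter test with the flags mentioned above
    cd tests && grate/texture-filter -f -s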
Previously I was seeing some memory errors coming from the Host1x DMA, but I don't see any errors at all right now.
I don't see anything done horribly wrong in the offending commit.
Unfortunately, I haven't yet been able to dedicate enough time to sit down and debug the problem thoroughly. Please let me know if you find a solution; I'll be happy to test it. Thanks in advance!