On Fri, Dec 17, 2010 at 5:02 AM, David Sin <davidsin@ti.com> wrote:
On Thu, Dec 16, 2010 at 06:43:48PM +0100, Arnd Bergmann wrote:
As far as I can tell, both DMM and GEM at a high level manage objects in video memory. The IOMMU that you have on the OMAP hardware seems to resemble the GART that sits between PC-style video cards and main memory.
I don't know any details, but google quickly finds http://lwn.net/Articles/283798/ with a description of the initial GEM design. My main thought when looking over the DMM code was that this should not be tied too closely to a specific hardware, and GEM seems to be an existing abstraction that may fit what you need.
Arnd
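For context, the driver-side GEM flow Arnd is pointing at looks roughly like the sketch below. Only the drm_gem_* helpers are real in-kernel API; the dmm_* names are hypothetical and used purely for illustration, not taken from the actual DMM/TILER code.

#include <linux/slab.h>
#include <drm/drmP.h>

/* hypothetical driver-private object wrapping a GEM object */
struct dmm_object {
	struct drm_gem_object base;
	/* driver-private state: tiler area, IOMMU mapping, ... */
};

/* create a buffer of 'size' bytes and return a handle userspace can use */
static int dmm_gem_create(struct drm_device *dev, struct drm_file *file,
			  size_t size, u32 *handle)
{
	struct dmm_object *obj;
	int ret;

	obj = kzalloc(sizeof(*obj), GFP_KERNEL);
	if (!obj)
		return -ENOMEM;

	ret = drm_gem_object_init(dev, &obj->base, size);
	if (ret) {
		kfree(obj);
		return ret;
	}

	/* expose the object to userspace as an opaque integer handle */
	ret = drm_gem_handle_create(file, &obj->base, handle);

	/* drop the creation reference; the handle keeps the object alive */
	drm_gem_object_unreference_unlocked(&obj->base);
	return ret;
}

Userspace then only ever deals with the integer handle, regardless of whether a GART, an IOMMU or the TILER sits behind the object.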
Thanks for the pointer, Arnd. I also found a nice readme file in the gpu/drm directory, which points to a wiki and source code. I'll read into this and get back to you.
I get the impression with ARM graphics that you just have a lot of separate drivers for separate IP blocks, each providing some miscellaneous interface to userspace, while a binary driver binds all the functionality together into a useful whole. That seems like a really bad design.
Generally on x86, the tiling hw is part of the GPU and is exposed as part of a coherent GPU driver.
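As a concrete illustration of that, on Intel hardware the tiling mode is just an attribute of a GEM buffer object, set through the same i915 driver that owns the rest of the GPU. A minimal userspace sketch, assuming an open render node and omitting error handling:

#include <fcntl.h>
#include <string.h>
#include <sys/ioctl.h>
#include <drm/i915_drm.h>

int main(void)
{
	int fd = open("/dev/dri/card0", O_RDWR);

	/* allocate a 1 MiB GEM buffer object */
	struct drm_i915_gem_create create;
	memset(&create, 0, sizeof(create));
	create.size = 1024 * 1024;
	ioctl(fd, DRM_IOCTL_I915_GEM_CREATE, &create);

	/* ask the GPU driver to treat the buffer as X-tiled; the tiling
	 * state is tracked by the same driver that owns the buffer */
	struct drm_i915_gem_set_tiling tiling;
	memset(&tiling, 0, sizeof(tiling));
	tiling.handle = create.handle;
	tiling.tiling_mode = I915_TILING_X;
	tiling.stride = 4096;	/* bytes per tile row */
	ioctl(fd, DRM_IOCTL_I915_GEM_SET_TILING, &tiling);

	return 0;
}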
I'm just wondering what the use-cases for this tiler are and what open apps can use it for?
Dave.