What Linux driver subsystem/API is used for a simple screen/monitor device?

memtha:

I am developing an embedded system with a touchscreen. The touchscreen operates as both input and output, with a "virtual" keyboard overlaying the graphical output.

I have a working device driver that reads input from the touch sensor and translates it correctly to key presses, created with the help of this guide on kernel.org. I want to expand this driver to also handle image output to the screen.

I want to support both getty and X, with as little duplication as possible. I am running a minimal Debian variant with cherry-picked packages, such as minimal X. Note that I do not intend to attempt to get this driver into the repository pipeline, though I might dump it in a public GitHub repository.

Outputting screen images is presently done via a cringy workaround: a boot option to force rendering to the CPU's embedded graphics hardware, despite it not being connected to a display, and a daemon that continuously screen-scrapes that buffer, modifies a handful of pre-defined pixels to create the keyboard visual, and pushes it out to the real screen.
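For context, the daemon amounts to something like the following. This is a minimal sketch only, assuming the embedded graphics hardware is exposed as /dev/fb0 in RGB565 and that overlay_keyboard() and push_to_panel() are hypothetical stand-ins for the keyboard compositing and the GPIO transfer:

```c
#include <fcntl.h>
#include <stdint.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define XRES   320                       /* assumed panel geometry */
#define YRES   480
#define FRAME  (XRES * YRES * 2)         /* RGB565: 2 bytes per pixel */

void overlay_keyboard(uint16_t *frame);       /* hypothetical */
void push_to_panel(const uint16_t *frame);    /* hypothetical */

int main(void)
{
    static uint16_t frame[XRES * YRES];
    int fd = open("/dev/fb0", O_RDONLY);
    if (fd < 0)
        return 1;

    const uint16_t *gpu = mmap(NULL, FRAME, PROT_READ, MAP_SHARED, fd, 0);
    if (gpu == MAP_FAILED)
        return 1;

    for (;;) {
        memcpy(frame, gpu, FRAME);   /* the slow ~180 ms scrape */
        overlay_keyboard(frame);     /* paint the virtual-keyboard pixels */
        push_to_panel(frame);        /* the ~20 ms parallel-bus transfer */
    }
}
```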

This works as a proof of concept, proving that I do correctly understand the language the screen device expects, but is obviously sub-optimal.

kernel.org also has a guide for "DRM" device drivers, but that seems like serious overkill for what my hardware is capable of:

The Linux DRM layer contains code intended to support the needs of complex graphics devices, usually containing programmable pipelines well suited to 3D graphics acceleration.

None of my hardware has anything resembling 3D acceleration, so I conclude that this is probably not what I want.

What subsystem/API should I use? I figure one piece of missing terminology is what is holding back my searches, but any more information on how to accomplish this would be appreciated.

Hardware details (probably irrelevant): The CPU and screen communicate via 8080-esque parallel protocol, which the CPU does not support natively, so I'm emulating it with GPIOs (by manipulating registers via mmap).
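To illustrate the register-level access, here is a user-space sketch (a kernel-side driver would use ioremap() instead of mmap() on /dev/mem). The GPIO block base 0x20200000 and the GPSET0/GPCLR0 offsets are the BCM2835's; the data/strobe pin numbers are made-up placeholders for the real wiring, and pin function selection (GPFSEL) is omitted:

```c
#include <fcntl.h>
#include <stdint.h>
#include <sys/mman.h>
#include <unistd.h>

#define GPIO_BASE 0x20200000u   /* BCM2835 GPIO block physical address */
#define GPSET0    (0x1C / 4)    /* word offsets into the block */
#define GPCLR0    (0x28 / 4)

static volatile uint32_t *gpio;

static int gpio_map(void)
{
    int fd = open("/dev/mem", O_RDWR | O_SYNC);
    if (fd < 0)
        return -1;
    gpio = mmap(NULL, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, GPIO_BASE);
    close(fd);
    return gpio == MAP_FAILED ? -1 : 0;
}

/* One 8080-style write cycle: assume D0..D7 on GPIO0..7 and WR on GPIO8,
 * with the controller latching the data bus on the rising edge of WR. */
static void bus_write8(uint8_t b)
{
    gpio[GPCLR0] = 0xFFu | (1u << 8);   /* clear data lines, pull WR low */
    gpio[GPSET0] = b;                   /* drive the data byte */
    gpio[GPSET0] = 1u << 8;             /* WR high: panel latches the byte */
}
```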

Sending a complete screen image takes about 20 ms, but obtaining a complete copy from the embedded graphics buffer takes ~180 ms, so skipping that step is the most important objective. The screen hardware includes enough SGRAM to hold an entire frame's worth of data, and supports writing a rectangular sub-region, so a hook to only update the part of the screen that has changed would be desirable.
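The sub-region write would boil down to something like the sketch below. This assumes the panel controller follows the common MIPI-DCS-style command set (0x2A/0x2B set the column/page window, 0x2C starts a memory write), which should be confirmed against the controller datasheet; write_cmd() and write_data16() are hypothetical wrappers around the GPIO bus writes:

```c
#include <stdint.h>

void write_cmd(uint8_t cmd);      /* hypothetical: command write on the bus */
void write_data16(uint16_t d);    /* hypothetical: 16-bit parameter/pixel
                                     write, high byte first */

/* Restrict the controller's write window to one rectangle of its SGRAM. */
static void set_window(uint16_t x0, uint16_t y0, uint16_t x1, uint16_t y1)
{
    write_cmd(0x2A);              /* column address set */
    write_data16(x0);
    write_data16(x1);
    write_cmd(0x2B);              /* page (row) address set */
    write_data16(y0);
    write_data16(y1);
    write_cmd(0x2C);              /* memory write: pixels now stream into it */
}

/* Push only the changed rectangle; pixels fill the window row by row. */
static void update_rect(const uint16_t *pix, uint16_t x, uint16_t y,
                        uint16_t w, uint16_t h)
{
    set_window(x, y, x + w - 1, y + h - 1);
    for (uint32_t i = 0; i < (uint32_t)w * h; i++)
        write_data16(pix[i]);
}
```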

The screen is not particular about the timing of incoming data. The touch sensor input is handled by a purpose-built IC that communicates with the CPU via I²C, which the CPU does support. The present driver uses the linux/input-polldev.h interface. The CPU is a Broadcom BCM2835, the screen is a TFT with an embedded Himax HX8357 controller, the touchscreen sensor decoder is an ST STMPE610, and there is a voltage level shifter (Nexperia 74LVCH245A) in play between the HX8357 and the BCM2835. More details are available upon request.
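For completeness, a driver on that interface has roughly the following shape (the working driver presumably already does the equivalent); this is a sketch only, with stmpe610_read_xy() standing in for the real STMPE610 register reads, and the axis ranges and poll interval being guesses:

```c
#include <linux/i2c.h>
#include <linux/input.h>
#include <linux/input-polldev.h>
#include <linux/module.h>

/* Placeholder: the real driver reads the STMPE610 touch data over I2C here. */
static int stmpe610_read_xy(struct i2c_client *client, u16 *x, u16 *y,
                            bool *down)
{
    *x = *y = 0;
    *down = false;
    return 0;
}

static void touch_poll(struct input_polled_dev *poll_dev)
{
    struct i2c_client *client = poll_dev->private;
    u16 x, y;
    bool down;

    if (stmpe610_read_xy(client, &x, &y, &down))
        return;

    input_report_key(poll_dev->input, BTN_TOUCH, down);
    input_report_abs(poll_dev->input, ABS_X, x);
    input_report_abs(poll_dev->input, ABS_Y, y);
    input_sync(poll_dev->input);
}

static int touch_probe(struct i2c_client *client)
{
    struct input_polled_dev *poll_dev;

    poll_dev = devm_input_allocate_polled_device(&client->dev);
    if (!poll_dev)
        return -ENOMEM;

    poll_dev->private = client;
    poll_dev->poll = touch_poll;
    poll_dev->poll_interval = 20;                 /* ms between polls */
    poll_dev->input->name = "stmpe610-touch";
    input_set_capability(poll_dev->input, EV_KEY, BTN_TOUCH);
    input_set_abs_params(poll_dev->input, ABS_X, 0, 4095, 0, 0);
    input_set_abs_params(poll_dev->input, ABS_Y, 0, 4095, 0, 0);

    return input_register_polled_device(poll_dev);
}
```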

LasseF-H:

The terminology that you are missing is a framebuffer device (fbdev). You can find the kernel.org documentation for it here.
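To make that concrete, the skeleton of such a driver looks roughly like this. It is a sketch of where the pieces go, not a working driver: the geometry, the RGB565 assumption and hx8357_push_lines() are placeholders, and the deferred-I/O callback signature differs slightly between kernel versions. Deferred I/O is the "only push what changed" hook: the kernel collects the pages that userspace has written and hands them to your callback a short time later, from which the dirty lines can be derived and sent with the controller's sub-region write.

```c
#include <linux/fb.h>
#include <linux/mm.h>
#include <linux/module.h>
#include <linux/vmalloc.h>

#define XRES     320                 /* assumed geometry, RGB565 */
#define YRES     480
#define LINE_LEN (XRES * 2)

static void *shadow;                 /* in-RAM copy mirroring the panel's SGRAM */

/* Hypothetical helper: stream lines y1..y2 out over the GPIO parallel bus,
 * using the controller's rectangular write so only that band is sent. */
static void hx8357_push_lines(unsigned long y1, unsigned long y2)
{
    /* set_window(0, y1, XRES - 1, y2) followed by the pixel writes */
}

/* Deferred I/O: called a little after fbcon or X writes to the mapping,
 * with the list of dirtied pages, from which the dirty lines follow. */
static void panel_deferred_io(struct fb_info *info, struct list_head *pagelist)
{
    struct page *page;
    unsigned long y1 = YRES, y2 = 0;

    list_for_each_entry(page, pagelist, lru) {
        unsigned long start = page->index * PAGE_SIZE / LINE_LEN;

        y1 = min(y1, start);
        y2 = max(y2, start + PAGE_SIZE / LINE_LEN);
    }
    hx8357_push_lines(y1, min_t(unsigned long, y2, YRES - 1));
}

static struct fb_deferred_io panel_defio = {
    .delay       = HZ / 25,          /* batch writes into ~40 ms updates */
    .deferred_io = panel_deferred_io,
};

static struct fb_ops panel_fb_ops = {
    .owner        = THIS_MODULE,
    .fb_read      = fb_sys_read,
    .fb_write     = fb_sys_write,
    .fb_fillrect  = sys_fillrect,    /* draw into the shadow buffer... */
    .fb_copyarea  = sys_copyarea,    /* ...deferred I/O pushes it out */
    .fb_imageblit = sys_imageblit,
};

static int panel_register(struct device *dev)
{
    struct fb_info *info = framebuffer_alloc(0, dev);

    if (!info)
        return -ENOMEM;

    shadow = vzalloc(YRES * LINE_LEN);
    if (!shadow) {
        framebuffer_release(info);
        return -ENOMEM;
    }

    info->screen_buffer = shadow;
    info->fbops = &panel_fb_ops;
    info->fbdefio = &panel_defio;
    /* info->fix and info->var still need the resolution, RGB565 bitfield
     * layout and line_length filled in before registration. */
    fb_deferred_io_init(info);

    return register_framebuffer(info);
}
```

Once registered, the resulting /dev/fbN node is driven by fbcon, so getty works on the panel directly, and X can use the same node through its fbdev driver, which covers both of your targets without duplicating the output path or the screen-scraping daemon.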
