Thomas: The stuff below derives from what I mentioned to you long, long ago about my xterm, the one that "would be little different than xnest." I said it before, and I'll say it again: "vaporware is so delightfully easy to write." Anyway, I'm sending you a copy so that you can tell me I'm mad. :^)

Matt,

I've been thinking some more about terminals. First I'd like to expand the definition of terminal to include more modern usage: a terminal is what the user sits in front of. Right? This makes sense, because it's the be-all and end-all of the user's life, his beginning and his end, so yeah, it's terminal, and so is the associated addiction.

Let's consider the terminals I use, in other words the hardware I have arranged around the various computer chairs. You will recognize this one. :^)

    [Flattened ASCII diagram. Roughly: the antenna, NES, SNES, N64, and PSx
     feed a VCR and an A/V switch, which along with the DVD player run into
     "chainsaw" (two monitors on ATI and S3 cards, PS/2 keyboard and mouse,
     es1370 and i810 sound, bt878 capture, CD-RW, 3-1/2" and 5-1/4" drives,
     plus a gamepad, microphone, and radio); chainsaw's audio and video in
     turn run out to a stereo (MD-walkman, left/right speakers, MD r/w,
     cassettes, CD changer, radio), with a GBA plugged into the MD-walkman.]

Now *that's* a terminal. :^)

Now imagine chainsaw with an attached printer, scanner, modem, fax machine, digital camera, and, erhm, some USB junk. See, things can get a lot more complicated. Believe it or not, I have unused ports on chainsaw:

- Serial port
- Parallel port
- DVI/VGA port
- Composite out
- Composite in
- Microphone port
- Primary spk out
- Secondary spk out
- Many USB ports
- AMR (modem) slot
- AGP slot

All the other computers I use are relatively simple. If I mashed them all together, the result would have:

- PCMCIA slots
- Graphics card
- Builtin monitor
- Sound card
- Speakers
- 3-1/2" floppy
- ZIP drive
- Builtin keyboard
- Stick mouse
- Glidepoint mouse
- PS/2 mouse
- USB mouse
- PS/2 keyboard
- USB keyboard
- Port replicator
- CD-ROM/CD-RW
- Serial port(s)
- Parallel port
- External monitor
- S-Video out
- USB ports

What am I getting at? Well, as you can see, the UNIX notion of a terminal is pretty antiquated --- the most advanced capabilities I've seen supported are an attached printer and basic Tek graphics. Access to all the other hardware computers have these days goes through separate interfaces, and network access to those interfaces is usually difficult, or at least very... "diverse".

    Hardware device     Network forwarding mechanism
    ~~~~~~~~~~~~~~~     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Text display        telnet, ssh
    Basic keyboard      telnet, ssh
    Basic mouse         telnet, ssh
    Graphics display    XFree86
    Frame grabber       XFree86
    Raw keyboard        XFree86
    Raw mouse           XFree86
    Joystick            ?
    Sound               NAS, aRts (?)
    Printer             cups
    Storage             NFS
    CPU time            OpenMosix (?)

I didn't list VNC because it exports access to local graphics applications rather than exporting access to a local graphics terminal, so for our purposes it's backwards (although still useful?).
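Just to put that in UNIX terms, here's a throwaway sketch (mine, not part of any proposal) that treats "the terminal" as nothing more than the pile of device nodes sitting at one seat and reports which ones exist. The paths are ordinary Linux examples I picked, nothing official:

    /* Toy illustration: "the terminal" as the bundle of device nodes at one
     * physical seat. The device paths are typical Linux examples chosen for
     * illustration, not something any proposal defines. */
    #include <stdio.h>
    #include <unistd.h>

    static const char *seat_devices[] = {
        "/dev/tty0",   /* text display + keyboard, via the kernel vt layer */
        "/dev/psaux",  /* PS/2 mouse */
        "/dev/js0",    /* joystick */
        "/dev/dsp",    /* sound in/out */
        "/dev/video0", /* frame grabber (bt878) */
        "/dev/lp0",    /* printer */
        "/dev/fd0",    /* 3-1/2" floppy */
    };

    int main(void)
    {
        size_t i;
        for (i = 0; i < sizeof seat_devices / sizeof seat_devices[0]; i++)
            printf("%-12s %s\n", seat_devices[i],
                   access(seat_devices[i], F_OK) == 0 ? "present" : "absent");
        return 0;
    }

The point being: every one of those nodes is strictly local, and each row of the table above needs a different trick before a remote program can touch it.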
Example time!! I'm sitting in my dorm room at UTA. Since I don't trust the OIT to keep its network secure, I have placed a firewall between my terminal and the uplink to disallow all incoming connections. Okay, I want to do some work on omega, so I ssh over. Now I need a printout (teacher wants hardcopy), so what do I do? My options:

- Send the job to a lab printer, get up and walk to the lab, wait in line, and pick it up.
- Redirect the job to a file, scp it to my local machine, then send it to my printer.
- Install Samba, allow incoming connections on the Windows ports, then send the job from omega straight to my printer.
- Allow incoming connections on the cups port, convince the OIT to switch, then send the job from omega straight to my printer.

Of course, I choose door number two.

Next I need to copy my source code onto a floppy (teacher wants that too). How? I again have options:

- Move the files to public_html on omega, download to my diskette, move the files out of public_html, and hope no one else stole a copy.
- scp the files to my local machine, then copy them to my diskette.
- Install NFS, get r00t on omega, mount my floppy over the network, then copy to /mnt/andydisk.
- Configure my firewall to allow incoming FTP data, FTP the files from omega to my disk, and hope no one sniffed my password.

Again, door number two. This time it's not so bad, but it still requires that I either re-enter my password (annoying!) or use public/private keys (which as far as I can tell are broken or disabled on omega).

Meanwhile, I've been happily sending keyboard events and receiving a full-screen text display through my original ssh link, all courtesy of the UNIX tty system. ssh also allows port forwarding, which is useful in the following scenarios:

- Assuming I have the cups client software installed on omega, I could forward my cups server port through the ssh link and on omega set CUPS_SERVER to localhost:port. This would let me print to my local printer without fussing around with redirection and scp or walking to the nearest lab printer.
- If I want to do any graphics work, I can enable X forwarding in ssh (hopefully omega doesn't have that turned off as well...). Once the option is enabled, it's pretty much automatic, since when I log in the DISPLAY variable is properly set.

But this is done using ssh magic, with no help from the terminal system.

What I have in mind is a unification of all this junk. If all the hardware located at a single computer station is classified as being part of the terminal, then it should all be accessible to any program with access to your tty. For instance, if I ssh to another computer and run zsnes, it should be able to see my joystick and send back audio and a graphics display, all through the same link. If I run sox on another computer, it should be able to get sound from my local microphone. If I had a MIDI keyboard, it should be possible for a remotely-running synthesizer program to read my notes and convert them to audio that I hear through my speakers. And a more extreme example: it would be nice to export CPU time as part of the terminal interface (since it's otherwise wasted...), so that remotely-executing zsnes actually gets distributed between the host computer and my terminal.

An additional benefit would be the removal of vt.c (2.6) / console.c (2.4) from the kernel. It's not entirely tragic that Linux standardized on the vt102, but it's still policy in the kernel: a large amount of code emulating a device that doesn't really exist. What's more, it serves to hide the real device (the VGA), which on modern computers is *far* more powerful than any vt102.
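For reference, in text mode that "real device" is dead simple: each screen cell is just two bytes in video memory, a character and an attribute. Here's a little stand-alone sketch of the layout (it writes into a local array rather than the actual buffer at 0xB8000, so it's only an illustration, not a driver):

    /* Illustration only: VGA text mode is an array of two-byte cells,
     * (character, attribute), 80x25 of them starting at physical 0xB8000.
     * This writes into a stand-in array instead of real video memory. */
    #include <stdint.h>
    #include <stdio.h>

    #define COLS 80
    #define ROWS 25

    static void vga_putc(uint8_t *vram, int row, int col, uint8_t ch, uint8_t attr)
    {
        uint8_t *cell = vram + 2 * (row * COLS + col);
        cell[0] = ch;   /* code point in the card's character set (e.g. cp437) */
        cell[1] = attr; /* foreground/background colors plus blink bit */
    }

    int main(void)
    {
        static uint8_t fake_vram[ROWS * COLS * 2]; /* stand-in for 0xB8000 */
        const char *msg = "hello, world";
        int i;

        for (i = 0; msg[i] != '\0'; i++)
            vga_putc(fake_vram, 0, i, (uint8_t)msg[i], 0x07); /* grey on black */
        printf("%c%c\n", fake_vram[0], fake_vram[2]); /* prints "he" */
        return 0;
    }

Anything that can say "put this character, with this attribute, at this position" can drive it; the kernel's vt102 emulation is one big translator bolted onto the front of exactly that.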
Let's take the case of dosemu running QBASIC running that funky 3d text maze program I showed you long ago. (Hang on, it's complicated.)

The maze program knows exactly what it wants the screen to look like: a pixellated bitmap. It must translate that bitmap to LOCATE and PRINT CHR$ because that's what QBASIC expects. QBASIC translates those commands to int 10h BIOS calls. The BIOS converts from a tty-centric view (print string, change color, carriage return, etc.) to a VGA-centric view (write character and attribute to position) and pokes the appropriate values into memory. dosemu traps the resulting memory fault and internally emulates the MOV instructions, instead writing to a buffer representing what it thinks the screen should look like. At a regular interval it attempts to update the user's screen by sending the modified rows of text to slang (similar to curses; check it out). In the process dosemu must translate from the DOS character set (cp437, for instance) to the host character set (often latin1). Slang, unaware that dosemu fully cooked everything, tries to internally handle newlines and tabs and such, but of course it finds none, since QBASIC, the BIOS, and dosemu already did that. In the process it updates its own screen buffer; then, when it thinks it's time to flush, it converts that buffer to a series of characters and escape sequences, buffers those temporarily, and writes them to stdout. Linux, upon receipt of text on /dev/tty*, translates it from vt102-speak to a series of MOVs to VRAM.

How many buffers do we have? Maze knows what it wants the screen to look like, maybe QBASIC keeps a copy (probably not), the BIOS possibly keeps a copy, dosemu has a copy, slang knows what the screen looks like, Linux may keep a copy, and so does the VGA. Wow! Depending on how you look at it, that's anywhere from three to seven equivalent buffers!

Now let's change this a bit. First, let's make Maze avoid LOCATE/PRINT in favor of DEF SEG=&HB800 / POKE. Next, let's remove the vt102 from the kernel and replace it with a VGA driver. Finally, let's make a userland terminal daemon to group together the VGA, the keyboard, the mouse, the sound card, and so on (I make it sound so easy, hehehe). dosemu is connected to the terminal, which in turn is connected to the appropriate /dev nodes. Maze tells QBASIC to write to VRAM (thereby bypassing the BIOS tty driver). dosemu catches the faults and converts them to "write character and attribute to position" codes, which it sends to the terminal daemon. The terminal daemon converts these codes to a format the kernel understands (ioctls, streamed bytes, or better yet, writes through an mmap). The kernel executes the VGA calls by actually writing to VRAM. Much better, I think.

What if a program wants a vt102? The terminal daemon can emulate it, possibly through a separate translation program or plugin. What about pseudo-terminals? A new terminal daemon can be spawned and connected to a host application instead of (or in addition to) /dev nodes.

The tricky part is figuring out the communication between the application and the terminal daemon. The protocol would need to completely describe a huge amount of hardware, possibly everything from CueCat scanners to floppy drives to keyboards to homebrew LED panels, anything that can reasonably be classified as being part of a terminal. The underlying protocol should probably be a serialized byte stream so that it can most easily be forwarded using ssh, but there must be a wrapper library providing an easier means of programming all the available hardware. It should also be extensible with new protocols. For instance, if the program is running on the same machine as the graphics card, it should be able to request access to a shared memory segment mapped to the frame buffer.

And remember when I was talking about having a separate control channel? That wouldn't be necessary if all data to be displayed is encapsulated as control events, so rather than just sending "hello, world" the application (via the library) may send "21:print12:hello, world,," (syntax borrowed from http://cr.yp.to/proto/netstrings.txt just for fun). What I'm describing here is a universal X. Pheer. :^)
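To show what I mean by "encapsulated as control events," here's a throwaway sketch that produces exactly that byte string. The "print" event and its one-argument framing are made up on the spot; only the netstring encoding itself comes from the djb page above.

    /* Sketch: frame a made-up "print" event as nested netstrings.
     * Only the netstring format ("<length>:<data>,") is from
     * http://cr.yp.to/proto/netstrings.txt; the event itself is invented. */
    #include <stdio.h>
    #include <string.h>

    /* Write one netstring: "<decimal length>:<data>," */
    static void ns_write(FILE *out, const char *data, size_t len)
    {
        fprintf(out, "%lu:", (unsigned long)len);
        fwrite(data, 1, len, out);
        fputc(',', out);
    }

    /* A one-argument event: the event name followed by its argument as a
     * nested netstring, all wrapped in an outer netstring. */
    static void event_print(FILE *out, const char *text)
    {
        char payload[512];
        int n = snprintf(payload, sizeof payload, "print%lu:%s,",
                         (unsigned long)strlen(text), text);
        ns_write(out, payload, (size_t)n);
    }

    int main(void)
    {
        event_print(stdout, "hello, world"); /* emits 21:print12:hello, world,, */
        putchar('\n');
        return 0;
    }

A real protocol would need a pile of events (keystrokes, mouse motion, audio blocks, joystick state, and so on) and presumably some way for the two ends to negotiate which ones they understand, but the framing can stay this dumb.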
Also, the interface between the terminal daemon and the host (for pseudo-terminals) is nontrivial. Take the case of an xterm with scrollback. If the user hits Shift-PgUp or scootches the scrollbar, how does the xterm access the scrollback buffer? Where does it live, in the xterm or in the terminal daemon? If it's in the terminal daemon, then the xterm needs read access to it and a way to trim its maximum length. If it's in the xterm, then a very similar block of code will have to be implemented in xterm, eterm, wterm, aterm, dtterm, konsole, and all the other xterm clones. konsole has a feature allowing virtually infinite scrollback by writing old buffer contents to disk... would it be appropriate to add such a thing to the terminal daemon? Ack, where's the division of responsibility!?

Right about now I'd love to have this code ready to go so that I can create two terminal daemons, each connected to its own keyboard and monitor, then on each run dosemu running VSM. Then my work for The Movie Store would be finished. Hmm, I'd also need both terminal daemons to connect to the same printer, so a spooler would be necessary.

While I'm thinking of it, virtual consoles could be implemented as follows:

        Monitor  Keyboard  Mouse
           `--------.|.--------'
                    |||
             Terminal Daemon
                     |
                VC Switcher
                    |||
           .--------'|`--------.
    Terminal Daemon  |  Terminal Daemon
           |  Terminal Daemon  |
         bash        |        vim
                   getty

When the user presses, say, Alt+F1, the top terminal daemon would receive this event from the keyboard device and pass it through to the VC switcher, which would then disconnect whatever virtual console was previously connected in favor of console #1. Some protocol could be worked out for notifying the old console of the disconnect, possibly including revocation of its mmap of the display (if it had one). What's more, the VC switcher (which would work like GNU screen) could overlay the display with a notification of activity on a console not currently selected (contention between the VC switcher and the selected terminal daemon for this region of the screen would have to be resolved somehow, if the terminal daemon has an mmap of the actual hardware). Or it could divide the screen up into panes. Neat. :^)

As you might imagine, this idea needs much, much, MUCH more thought before it can possibly be turned into code. For now it's just a thought, perhaps something you might be interested in. If not, oh well, this email was fun to write. :^)

--
Andy Goth + unununium@openverse.com + http://ioioio.net/