This was how we did computer graphics before there were affordable (less than $50K) frame buffers in 1980.
In my 1975 MIT Digital Systems Lab, my team constructed a hardware Game of Life out of TTL gates. I designed the display as timed X-Y points on an oscilloscope. In those days a kilobyte of RAM still cost a hundred dollars, so the computer lab rationed the number of memory chips each team could use. I recall we stored the automata in a dual 64 x 64 bit frame buffer, two kilobytes overall.
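The dual-buffer scheme maps directly onto software: read generation N from one buffer while writing generation N+1 into the other. A generic sketch of the update rule, not a reconstruction of the original TTL design:

```python
# Double-buffered Game of Life on a 64 x 64 toroidal grid.
N = 64

def step(src):
    """Compute the next generation from src (N lists of N cells, 0/1)."""
    dst = [[0] * N for _ in range(N)]   # the "other" frame buffer
    for y in range(N):
        for x in range(N):
            # Count the eight neighbours, wrapping at the edges.
            live = sum(src[(y + dy) % N][(x + dx) % N]
                       for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                       if (dy, dx) != (0, 0))
            # Conway's rules: birth on 3, survival on 2 or 3.
            dst[y][x] = 1 if live == 3 or (live == 2 and src[y][x]) else 0
    return dst
```

Hardware gets the same effect by swapping which buffer the update logic reads and which it writes each generation.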
The first generation computer graphics languages were vector-oriented to support either oscilloscopes or pen-plotters.
My first color frame buffer terminal was a 512 x 512 x 8-bit AED, $30K in 1980. I think this costs less than a dollar on a low-end cellphone now.
Ancient desktop is today's (or at least yesterday's) embedded. I guess this dates me, but instead of taking the legacy Z80 embedded microcontroller class a long time ago, I took the then brand new 68hc11 class, although the Z80 guys got to use two DACs and a scope to display a cube on the scope as one of their labs. Now, using a MHz-class CPU to output a table at kHz speeds isn't very impressive, so logically the next step in the lab was rotating the cube. I don't know if they had to dream up their own trig or got a canned library. Maybe they canned the whole thing and just stored 30 or so frames of rotation data and switched which frame they displayed every tenth of a second or whatever. Dunno, I didn't take the class, but we got to see their work on lab day.
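The precomputed-frames approach guessed at above is easy to sketch: compute, say, 30 rotation steps of the cube's vertices once, then at display time just walk the table and feed X/Y pairs to the DACs. A hypothetical reconstruction, not the actual lab code:

```python
import math

# Unit cube vertices, centred on the origin.
VERTS = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
FRAMES = 30  # one full revolution split into 30 canned frames

def rotate_y(p, theta):
    """Rotate point p about the Y axis by theta radians."""
    x, y, z = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x + s * z, y, -s * x + c * z)

# All the trig happens once, up front; the display loop only indexes
# this table, which is well within reach of a slow micro.
table = [[rotate_y(v, 2 * math.pi * f / FRAMES) for v in VERTS]
         for f in range(FRAMES)]
```

At 30 frames per revolution and 8 vertices per frame, the whole table is a few hundred numbers, which fits comfortably in a small EPROM.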
I would imagine whatever took a "sheet of paper" sized development board with dedicated hardware DACs and eproms around 1990 could be done as a single chip solution today, assuming you can find a single chip solution that actually has two on board DACs. Or up the challenge by synthesizing PWM signals and low pass filtering them, essentially a class D amplifier controlled by software.
Surprisingly or not, PWM circuits (a counter and a binary comparator) are more common than DACs on today's microcontrollers. I blew up some somewhat expensive tweeters finding out that just because you can't hear the 31.275 kHz PWM carrier doesn't mean the amplifier isn't amplifying it.
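Driving audio out of a PWM pin boils down to mapping each sample to a duty cycle on that counter/comparator, then low-pass filtering the pin to strip the carrier. A sketch of building the duty-cycle table for one sine cycle (the 8-bit counter and sample count are illustrative, not tied to any particular part):

```python
import math

PWM_TOP = 255   # 8-bit counter period; carrier frequency = f_clk / 256
SAMPLES = 64    # one sine cycle stored as a lookup table

# Sine scaled into the duty-cycle range [0, PWM_TOP]. In firmware each
# entry would be written to the compare register once per PWM period;
# an analog low-pass filter after the pin removes the carrier (and
# protects the tweeters).
duty = [round((math.sin(2 * math.pi * i / SAMPLES) + 1) / 2 * PWM_TOP)
        for i in range(SAMPLES)]
```

The carrier sits at the counter rollover rate, so the filter cutoff has to land between the highest audio frequency you want and that rollover rate.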
It is definitely plausible that they were rendering their spinning cube in real time. A lot of people have written wolf3d-style raycasting engines for the Z80 (mostly Z80-based TI graphing calculators, which are clocked between 6 MHz and 15 MHz, much faster than a TRS-80 to be fair). This guy even made a doom-style engine: http://benryves.com/journal/3739423 (see animated screenshot). The stuff the demoscene has done with Z80 microcomputers can be even more impressive.
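Wolf3d-style raycasting is cheap precisely because each screen column needs only one ray marched through a 2-D grid until it hits a wall; column height is then just a reciprocal of that distance. A toy sketch of the per-column step (a coarse fixed-step march for clarity; real engines use an exact grid DDA and fixed-point math):

```python
import math

MAP = ["#####",
       "#...#",
       "#...#",
       "#####"]  # toy 2-D grid map; '#' is a wall

def cast(px, py, angle):
    """March a ray from (px, py) at `angle` until it hits a wall;
    return the distance travelled."""
    dx, dy = math.cos(angle), math.sin(angle)
    step, d = 0.05, 0.0
    while True:
        d += step
        x, y = px + dx * d, py + dy * d
        if MAP[int(y)][int(x)] == '#':
            return d
    # Screen wall height for this column would be k / d for some
    # constant k: nearer walls draw taller.
```

One such march per screen column, 60-ish columns on a calculator display, is a workload a Z80 can sustain at interactive frame rates.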
True, but I was concerned with it being a school lab: they were already demonstrating dual DACs and a semi-complicated linked-list data structure (well, OK, X and Y values and maybe just a dumb lookup table) in assembly. Asking them to add a third aspect, trig, might have been too much for one lab, so maybe they canned the rotation and just had a very large data-table animation rather than calculating on the fly.
My project for demo day in the 68hc11 class: the 68hc11 was a weirdo among microcontrollers for having an external memory bus, so I slapped a 32K SRAM onto it, making it a 32K microcontroller, and made what amounts to a drum sampler. Press this button to record a couple seconds off the on-board A/D, press that button to play it back through an off-board DAC. Nowadays of course microcontrollers rarely expose their memory bus, and you'd just buy a COTS microcontroller with more memory on the chip. We only had one day to prep for demo day, and asking the Z80 kids to do DACs, a vector algorithm, and trig all in one day might have been asking too much.
It's been a while, but I think the demo-day theme was that we all had to use at least one off-chip DAC.
I saw a video titler that solved the problem of expensive memory by keeping the frame buffer compressed and decompressing it at frame rate. I remember that composing the image was very slow (on a 4 MHz 68000). The compression algorithm was run-length based.
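Run-length coding is a natural fit for titler images, which are mostly long runs of background with short runs of text. A generic sketch of the idea, not the Vidstar's actual format:

```python
def rle_encode(scanline):
    """Encode a scanline as (count, value) pairs, runs capped at 255."""
    out = []
    for px in scanline:
        if out and out[-1][1] == px and out[-1][0] < 255:
            out[-1][0] += 1          # extend the current run
        else:
            out.append([1, px])      # start a new run
    return [(c, v) for c, v in out]

def rle_decode(runs):
    """Expand (count, value) pairs back into pixels at display time."""
    line = []
    for count, value in runs:
        line.extend([value] * count)
    return line
```

Decoding is a tight copy loop, fast enough to run at frame rate, while encoding (done once when the image is composed) can afford to be slow, which matches the behavior described above.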
The product was the Vidstar-2000 from Video Data Systems in 1984.
Some early flight sims used a trick of rendering just ahead of the electron beam. You only needed a small amount of RAM for a big screen, as long as you could keep up with the beam.
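"Racing the beam" amounts to generating each scanline just in time, so only a line's worth of buffer is ever resident instead of a whole frame. A toy illustration of the idea (the gradient scene is a placeholder):

```python
WIDTH, HEIGHT = 320, 240

def render_scanline(y):
    """Generate scanline y on demand instead of reading a frame buffer."""
    # Placeholder scene: a diagonal gradient.
    return [(x + y) % 256 for x in range(WIDTH)]

def scan_out():
    """Feed the display line by line; peak RAM is one scanline, not
    WIDTH * HEIGHT pixels."""
    line = None
    for y in range(HEIGHT):
        line = render_scanline(y)   # must finish before the beam arrives
        # ...shift `line` out to the video DAC here...
    return line  # last line, just to show the buffer size
```

The hard constraint is timing, not memory: render_scanline has to complete within one horizontal scan period, every line, with no slack for a slow frame.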