'I wonder if...': Moving ASCII art

'ASCII cinema' was all that I heard. I was sat in a weekly sprint meeting, my mind focused more on arranging my upcoming week of work than on the person offering a retrospective of their just-completed sprint, when my ears perked up at the mention of what sounded like something magical.

The reference was, of course, to asciinema, a tool for recording terminal sessions that was wholly relevant to the retrospective. But all I heard was, 'You could play The Godfather in the terminal.'

Such technology speaks for itself, surely?

We all joked about my absentmindedness and mused for a few minutes on the practicalities of playing back video as moving ASCII art, then continued with the retrospective. But my brain couldn't let go of the challenge; by the end of the meeting I had already mapped out the basic workings of the codec in my mind's IDE.

For those as lazy as me who just want to see the end result, this is what I managed to code in the space of about half an hour, using a GIF as input, JSON as the storage format and Golang as the programming language:

Make it so

Having decided to write the most [un]necessary codec I could think of, my first thought was how to store the data.

In the spirit of misusing technology, I decided that JSON would be the way to go. The format certainly lent itself to storing necessary metadata such as the frame rate, which I planned to preserve from the input file and match on playback with dynamically calculated sleep commands.
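As a rough sketch of the idea (the field names here are illustrative, not necessarily the exact ones I used), the top-level container is little more than a frame rate plus a list of frames:

// Illustrative sketch of the top-level JSON document.
type movie struct {
    FPS    int     `json:"fps"`    // frame rate preserved from the input GIF
    Frames []frame `json:"frames"` // key frame first, then delta frames
}

// frame maps row -> column -> colour value; its layout is described below.
type frame map[string]map[string]int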

I had recently gained some experience working with images at the pixel level for another project, so my go-to approach was to read in an image (a GIF version of a movie seemed to make sense) and grab the RGB value of each pixel, to be saved losslessly in some manner that could later be displayed in the terminal.
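Go's standard library handles that part; a sketch of the decode-and-visit loop, with error handling trimmed (it assumes the os and image/gif packages):

// Sketch: decode every frame of a GIF and visit each pixel's RGB values.
func readPixels(path string) error {
    f, err := os.Open(path)
    if err != nil {
        return err
    }
    defer f.Close()

    g, err := gif.DecodeAll(f) // all frames, plus per-frame delays for the frame rate
    if err != nil {
        return err
    }

    for _, img := range g.Image {
        b := img.Bounds()
        for y := b.Min.Y; y < b.Max.Y; y++ {
            for x := b.Min.X; x < b.Max.X; x++ {
                r, gr, bl, _ := img.At(x, y).RGBA() // 16-bit channel values
                _, _, _ = r, gr, bl                 // quantised into the palette later
            }
        }
    }
    return nil
}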

To begin with, I knew that I needed to store an initial key frame containing the colour values of all of the first frame's pixels. Coding in Golang, a map of maps of integers (an associative multidimensional array, in other languages) seemed to fit the bill, where the map keys would represent x/y pixel coordinates and the leaf values would be -1, 0 or 1 to denote the pixel's colour (I initially opted for a black / grey / white colour palette).

The JSON for such a key frame would look something like:

{  "1":           // Row 1    {      "1": 1,    // Column 1      "2": 0,    // Column 2      "3": -1    // Column 3    },  "2":           // Row 2    {      "1": 0,    // Column 1      "2": 0,    // Column 2      "3": 0     // Column 3    },  "3":           // Row 3    {      "1": -1,   // Column 1      "2": 1,    // Column 2      "3": 1     // Column 3    }}

When the values for such a data set are arranged in order, a basic picture emerges:

00000000000000
00001111110000
00011111111000
00111111111100
00011111111000
00011111111000
00001111110000
00000000000000
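Building such a key frame is then a short nested loop over the first frame's pixels. A rough sketch, assuming the image and strconv packages plus the quantise helper sketched further down:

// Sketch: quantise every pixel of the first frame into the key frame map.
func keyFrame(img image.Image) frame {
    b := img.Bounds()
    kf := make(frame, b.Dy())
    for y := b.Min.Y; y < b.Max.Y; y++ {
        row := make(map[string]int, b.Dx())
        for x := b.Min.X; x < b.Max.X; x++ {
            // 1-based string keys, to match the JSON above
            row[strconv.Itoa(x-b.Min.X+1)] = quantise(img.At(x, y))
        }
        kf[strconv.Itoa(y-b.Min.Y+1)] = row
    }
    return kf
}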

Subsequent frames would also have their pixels traversed, but only those pixels whose values differed from the previous frame would be stored on the next frame map. For example:

{  "1":           // Row 1    {      "2": 1,    // Column 2      "3": 0     // Column 3    },  "3":           // Row 3    {      "3": -1    // Column 3    }}

If the only change in a frame was that a character blinked, then only data for the pixels around their eyes would need to be stored.
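Computing those deltas is just a nested comparison loop; a rough sketch, where prev is the fully reconstructed previous frame rather than a bare delta:

// Sketch: keep only the pixels whose colour changed since the previous frame.
func deltaFrame(prev, curr frame) frame {
    delta := make(frame)
    for rowKey, row := range curr {
        for colKey, val := range row {
            if prev[rowKey][colKey] != val {
                if delta[rowKey] == nil {
                    delta[rowKey] = make(map[string]int)
                }
                delta[rowKey][colKey] = val
            }
        }
    }
    return delta
}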

Playing it all back

As mentioned previously, I opted for black, white and grey as the three colours of the palette. My theory was that you could get a fairly good image out of those three colours by taking the average of each pixel's RGB components, ((R+G+B)/3), and assigning the lower third to black, the middle third to grey and the upper third to white.
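In Go that bucketing is only a few lines. Which integer stands for which colour was an arbitrary choice; the sketch below assumes -1 for black, 0 for grey and 1 for white, and the image/color package:

// Sketch: average the RGB channels and bucket the result into thirds.
func quantise(c color.Color) int {
    r, g, b, _ := c.RGBA() // 16-bit channels, so values run 0-65535
    avg := (r + g + b) / 3
    switch {
    case avg < 65536/3:    // darkest third -> black
        return -1
    case avg < 2*65536/3:  // middle third -> grey
        return 0
    default:               // brightest third -> white
        return 1
    }
}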

Characters that covered an appropriate amount of space on-screen could be used to represent those colours (I chose a space, a slash and a capital W, respectively). Displaying them would simply involve building up multiple strings (one per scanline), flushing them all to the terminal once per frame and then issuing sleep and clear commands.
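A sketch of that playback step (fmt, strings and time imports assumed), with the frame already expanded from its map form into a plain slice of rows, and an ANSI escape standing in as one way of issuing the 'clear':

// Sketch: render one frame as text, hold it on screen, then clear.
var chars = map[int]string{-1: " ", 0: "/", 1: "W"} // black, grey, white

func playFrame(rows [][]int, delay time.Duration) {
    var sb strings.Builder
    for _, row := range rows {
        for _, v := range row {
            sb.WriteString(chars[v])
        }
        sb.WriteByte('\n')
    }
    fmt.Print(sb.String())     // flush the whole frame in one go
    time.Sleep(delay)          // match the source frame rate
    fmt.Print("\033[H\033[2J") // ANSI: cursor home, then clear screen
}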

The first test produced this:

Three 'colours' worked, but not well enough.

The result wasn't terrible, but the image also wasn't as clear as I had hoped. I needed to preserve more colour data. After some googling I managed to find a list of around 70 characters that covered a wide range of screen real estate.

I processed my test video again but the result was disappointing: the animation was mostly recognisable, but the resolution was simply too low to make the extra 'colours' worth it.

After much tinkering, a selection of ten characters seemed to yield the best results. This is what you saw in the first GIF.
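Swapping in a bigger palette only changes the bucketing. A sketch using a common ten-character ramp (not necessarily the exact ten I settled on):

// Sketch: map the 0-65535 brightness average onto a ten-character ramp.
var ramp = []rune(" .:-=+*#%@") // ordered roughly by how much of the cell each fills

func rampChar(avg uint32) rune {
    return ramp[int(avg)*len(ramp)/65536]
}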

Improvements I should have made

There were some obvious improvements that I could have made to the codec, had I not given up once I had proved to myself that, yes, I could watch The Godfather in the terminal, if I really wanted to.

Experimenting with a 500MiB JSON file (converted from a 20-minute TV show), I was faced with a lead time of around one minute while the data was loaded into memory. It strikes me that abandoning JSON and switching to a one-frame-per-line format would enable streaming to begin instantly.
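A streaming player would then be a line-at-a-time loop; a sketch, where decodeFrame and render are placeholders for a parser for the new per-line format and the rendering code above:

// Sketch: start playback as soon as the first line (the key frame) arrives.
func streamPlay(r io.Reader) error {
    sc := bufio.NewScanner(r)
    sc.Buffer(make([]byte, 0, 64*1024), 16*1024*1024) // key-frame lines can be long
    for sc.Scan() {
        f := decodeFrame(sc.Bytes()) // hypothetical parser for the per-line format
        render(f)                    // apply the delta and draw, as sketched earlier
    }
    return sc.Err()
}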

Switching to a non-JSON format would probably also yield lower file sizes, perhaps as small as 200MiB in this instance, assuming that low-impact delimiters were used in place of the quotes and braces that comprise a sizeable percentage of JSON strings.

It would also be interesting to explore run-length encoding for key frames (and/or those frames with over 50% of pixel values changed). Although I have not yet carried out any experiments to confirm it, I suspect that the small 'colour' palette would lend itself nicely to this kind of compression.
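The encoder itself would be tiny; a sketch over a single scanline of colour values:

// Sketch: run-length encode a scanline as {count, value} pairs.
func rle(row []int) [][2]int {
    var runs [][2]int
    for _, v := range row {
        if n := len(runs); n > 0 && runs[n-1][1] == v {
            runs[n-1][0]++ // extend the current run
            continue
        }
        runs = append(runs, [2]int{1, v}) // start a new run
    }
    return runs
}

The all-zero first scanline of the picture earlier in the post, for example, collapses to a single pair.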


Original Link: https://dev.to/ctowestmidlands/i-wonder-if-moving-ascii-art-5b67
