I've been posting here on the Gemlog for about six months now and an apparent pattern has emerged where I want to make a post every ten days or so. A week would be too often, and more than two weeks seems too infrequent.
(I could verify this by running a small script that averages the post intervals, but maybe I'll invest that time in more useful pursuits.)
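If I ever do bother, it would only take a few lines of Python. Something like this sketch, where the post dates are made-up placeholders rather than my actual publishing history (in practice they could be parsed from the .gmi filenames or the date line at the end of each post):

```
# Rough sketch: average the gap between posts, given their dates.
from datetime import date

# Hypothetical post dates, just for illustration.
post_dates = sorted([
    date(2021, 5, 10),
    date(2021, 5, 21),
    date(2021, 5, 30),
    date(2021, 6, 9),
])

gaps = [(b - a).days for a, b in zip(post_dates, post_dates[1:])]
print(f"average interval: {sum(gaps) / len(gaps):.1f} days")
```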
This post is just a summary of what has been going through my mind recently, so no cohesive narrative this time.
The past few days have been tiring, although mostly for mundane reasons: running around with my toddler at the summer cabin, waking up at night for various reasons (including a mini heat wave), and generally trying to keep up with household chores. I'm really feeling the lack of high-quality sleep.
Finger is an interesting old-school protocol. You can use it to serve arbitrary text based on an input string. The lack of hyperlinks in content is the only thing that prevents it from becoming interactive, although I suppose a client could just go ahead and detect URLs in the response.
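The whole protocol is tiny: a client opens a TCP connection to port 79, sends the query string terminated with CRLF, and reads whatever text comes back until the server closes the connection. A minimal sketch of such a client, with the host and user names below being placeholders:

```
# Minimal Finger client sketch: send "query\r\n" to port 79 and
# read the plain-text response until the server closes the socket.
import socket

def finger(host: str, query: str) -> str:
    with socket.create_connection((host, 79), timeout=10) as sock:
        sock.sendall(query.encode("utf-8") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

print(finger("example.com", "jake"))  # hypothetical host and user
```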
I wrote a Python script that reads a .gmi file and hardwraps it with some basic ASCII formatting, like centering and underlining headings so they stand out. I was planning to include ANSI color codes as well, but it appears those don't make it through old-school terminal Finger clients. I could use this to serve all the Gemini content here via Gopher, too, but I'll save that little project for a rainy day.
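This isn't the actual script, but the gist of it looks roughly like the following: headings get centered and underlined, link lines pass through untouched, and everything else is hard-wrapped to a fixed width (preformatted blocks are ignored here for brevity):

```
# Rough sketch: convert gemtext to hard-wrapped ASCII with
# centered, underlined headings.
import textwrap

WIDTH = 72

def gmi_to_ascii(gmi_text: str) -> str:
    out = []
    for line in gmi_text.splitlines():
        if line.startswith("# "):
            title = line[2:].strip()
            out.append(title.center(WIDTH))
            out.append(("=" * len(title)).center(WIDTH))
        elif line.startswith("## "):
            title = line[3:].strip()
            out.append(title.center(WIDTH))
            out.append(("-" * len(title)).center(WIDTH))
        elif line.startswith("=>"):
            out.append(line)  # leave link lines as-is
        elif line.strip():
            out.extend(textwrap.wrap(line, WIDTH))
        else:
            out.append("")
    return "\n".join(out)

# Example usage with one of the posts on this capsule:
with open("2021-06_10-day-poster.gmi", encoding="utf-8") as f:
    print(gmi_to_ascii(f.read()))
```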
In other news, it's WWDC week! I've been paying for an Apple developer account for a number of years, so this is when I get my money's worth: the privilege of running buggy pre-release operating systems. So far iPadOS 15 seems cool with its multitasking UI improvements. I won't be updating my phone any time soon as I prefer it to remain in normal working condition. The current TestFlight build of Lagrange seems to run like a champ without any new issues, which is nice.
Another Apple-related tidbit I've found interesting is that their Dolby Atmos Spatial Audio feature is now available. I'm sure better headphones will benefit more from it; I've been listening to some songs on AirPods Pro. The results are a bit hit-and-miss for me. Some songs just feel oddly muted, as if the equalizer were mistuned. Other songs show a marked improvement, particularly acoustic ones that use only a couple of distinct instruments. You can hear the spatial separation of acoustic guitars quite well, for example.
Spatial audio rendered via headphones is still just a magic trick, since the final audio ends up being stereo anyway. People's ears are shaped differently, so the modifications that need to be applied to audio frequencies (and timing? I'm not an expert) to simulate the multi-channel mix would ideally be optimized for each listener individually. I believe Sony discussed this at the time of the PlayStation 5 launch. Apple seems to be using a single generic audio profile for everyone.
In any case, I'll probably end up leaving the spatial audio setting enabled to get the benefit of these improved audio mixes. I also understand that in some cases (like watching video) spatial audio can respond to the listener turning their head, which I haven't tested myself but imagine will make the effect considerably more believable.
2021-06-09
CC-BY-SA 4.0
The original Gemtext version of this page can be accessed with a Gemini client: gemini://skyjake.fi/gemlog/2021-06_10-day-poster.gmi