Wednesday, September 11, 2019

Tech Talk (need coffee?)


I like that Wikipedia helped propagate the term "disambiguation", as indeed we have a need to disambiguate a lot of the time. From all the press given "machine learning", if you have that wired up with "AGI" (artificial general intelligence), you may be thinking we're on the brink of the Singularity.  Or you may think more as I do, that a radio hasn't a chance of inventing radio, no matter how sireny the songs.

The science fiction world we each live in really matters to each of us. But as a fish is to water, so are we to this ocean of ours.  Expressing it is nigh impossible in some ways.  That sense of a bubble with limits is still valid, however much we've turned to foams (cite Sloterdijk).

Perhaps your world features imminent AI takeover of human affairs.  Or maybe it features old people falling for expensive "we'll upload your intelligence" schemes.  We've seen all these plots in the movies.  I'm not suggesting either world is original in the sense of widely unshared.

I'm back to a full teaching load and want to give my Youtubes more time to just sit there and undergo fusion, or fission, or whatever the metaphor.  Translation:  I'm not making one today.  I've been on a roll lately, churning them out.

Am I positing a mental process out in rackspace somewhere?  Not exactly.

I'm suggesting more stochastic energy patterns, thanks to random search engine activity, with researchers wanting to track down some supposed factoid.

That there's a stash of Youtubes on all that "Bucky stuff" already out there becomes known, whether watched or not.  Nor is mine the only stash.

In my science fiction, we feature "grid talk" a lot, which includes appreciating California's commitment to open source (peer review).

Once a secretive security state thinks it's in charge, and responsible for covering up all hint of scandal, we've lost public oversight of a public utility.  The open source liberal arts were not about altruism first and foremost, so much as omni-triangulation and integrity.

You don't want to put all your eggs in a no-bottom or weak-bottom basket.

I've been going over some of the Youtubes about the prospects for AGI in the future.  What I look for is honesty about where we're at today, i.e. I have little patience for hush-hush "secret labs".

Regardless of where you stand on whether electronic circuitry might support consciousness, implicit endorsement of deliberately misleading science fiction doesn't look good in the rear view mirror.