I keep seeing advertisements online for lite/small/basic/dumb phones.  These usually promise to break the user away from the mind-numbing addiction to the doomscroll and allow them to once again see the world around them.  I am guessing that none of these products has much likelihood of succeeding in the marketplace because, at the end of the day, they are the electronic equivalent of a healthy diet and we all want pizza.

But I get the appeal.  I have gone to great lengths to simplify and cut back and escape the ultra-intrusive and soul-crushing miasma that is the modern internet, social media, news, hell even the gas pumps have Maria Menounos talking your ear off whenever you just want to fill ‘er up.  The world is loud.  Everybody everywhere wants a piece of everyone else, everybody wants to be viral and sticky and every available niche is being filled by noise.  It’s awful.  No wonder we tell ourselves that a simpler phone will save the day.  Seems like such an easy solution, but that’s an illusion.  The phone isn’t the problem.  Sure, the phone is a delivery device for the poison of our modern culture, the smartphone is to psychological poison as the cigarette is to carcinogens, but the real problem is the fascination with and addiction to the gazillion small hits of dopamine we get from ingesting the latest stupid headline, the latest trivial status update, the latest tweet, the latest TikTok video, the latest, the latest, the endless content ocean.

I put it to you that the mindless consumption of endless hours of low-value content and ephemeral news (always mostly bad) has never, in the history of humanity, been a healthy activity.  It was a little harder to do, back in the day, I’ll give you that, but only just.  You know what the Fox News MAGA Boomers have in common with their Zoomer grandkids?  The former keep a television on during all waking hours, feeding themselves an endless stream of targeted information chosen by an editorial staff in the service of advertisers, and the latter stare at a phone during all waking hours, feeding themselves an endless stream of targeted information chosen by an algorithm in the service of advertisers.  The Venn diagram is a circle.  Only the content differs.  The narrowcast, tailored, corporatized “social” web and app ecosystem is no more diverse, empowering, educational, or conducive to free thought than the old broadcast radio and television it has superseded.  At least there were three major networks broadcasting television to our parents’ generation and, you know, PBS, but for us it’s one bubble, crafted by tracking cookies, collaborative filters, and virality, an echo chamber at the personal level that gives Fox News programming a run for its money in its extreme lack of variety.

Reality has been so curated for us, our ideas and desires and personal situations, our friendships and family connections have been productized, monetized, and exploited so heavily, that we find ourselves in an almost absurd predicament as a society.  We technically have more access to all of the information in the world than any population in mankind’s history and yet on a daily basis we have to make such a violent and intentional effort to encounter it that it might as well not be there.  We are the least informed consumers, the least enlightened populace, and the most radically misinformed bunch of sad sacks that the modern post-enlightenment world has ever seen.

Of course, this is all in the service of scratching the itch of boredom.  We work at our jobs all day and we crave something interesting and corporations are really really incredibly good at giving us diversions.  Allegedly we want to know what’s happening in the world, connect with our friends, laugh at something silly, but really, it’s just that we are bored and don’t know what to do with that novel feeling in a world so filled with stimulation.  In fact, I would go so far as to say that we don’t even have a chance to get legitimately bored.  We simply find ourselves lacking a diversion, which is not the same thing.  We have forgotten how to just exist to such a level that we equate being alive with boredom.  We get an idle minute and we have to decide to be unconscious (sleepy time!) or to seek out something diverting.  Diversion wins.  We happily step into the most convenient available trap.  The Phone.  The TV.  Potayto.  Potahto.  So, you see, this isn’t a new problem and a simpler phone isn’t much of a solution.  What we need to do is learn to do nothing and have it be enough.  Allow inaction to occur.  Don’t call it boredom.  And don’t seek a diversion.  Here are some exercises you can try.

Exercise: Turn off all electronics.  Put them in a totally separate room.  Make a meal.  Eat it and give it your full attention.  Don’t shovel it in your mouth while scrolling Twitter.  Taste it.

Exercise: Switch out some piece of media consumption that you currently use a device for with its “obsolete” equivalent.  For example, you like your e-reader?  Read a print book for a change.  You love Spotify?  Dig out those old tapes or records or CDs from the closet and play one.  Experience the difference between streaming “media” into your bubble and the physical act of interacting with a physical piece of media.  Read a paper newspaper.

Exercise: Remember back to things you used to do to entertain yourself before you had a smartphone that you don’t do anymore.  Do that for a day.  See how it feels.

Exercise: Schedule times to be online for a week but otherwise, be offline by default.  For most of human history, and as recently as 10 years ago, most people were not carrying a phone around with them 24/7 and could not be pinged, messaged, rung up, or tweeted at and somehow, somehow, these brave ancestors survived.  Imagine a world in which your time was respected, in which nobody expected you to be waiting by the phone 24/7, in which nobody panicked if you went a day or two between texts.  How much pressure would that take off your shoulders?  How much relief would you feel?

Exercise: Find a news outlet that is honest, reliable, and without partisan bent and (if you must consume current events) make that your first stop of the day.  Before you encounter memes, spin, or your own bubble, try to be aware of a neutral reporting of facts, sans opinions.  Then, for bonus points, form your own opinions.

Exercise: Track the trackers.  Add an extension to your browser that alerts you to how many organizations track your every move online and block them.  Observe changes in your online experience.  Opt for media interactions that don’t track you and, even more to the point, don’t monetize your activity.  Buy products, not access, copies, not subscriptions.  Companies don’t track you if you aren’t being monetized.  When is the last time you actually owned a copy of a new album rather than just streaming it?

Look, I get it, we aren’t ever getting rid of this technology.  You’re not going to live this way all the time.  These are exercises intended to make you think about the choices you’re making on a daily basis.  Practices to gain some perspective.  Things you can try doing to make yourself more aware of the ways you are being catered to, manipulated, handled, exploited, and sold.  We aren’t going back to the “good old days”.  There aren’t any.  We are, however, going to wind up in Idiocracy if enough of us don’t get out of the bubbles and into reality.  So, you know, stop reading this.  I’m not tracking you or monetizing your eyeballs but still, get offline.  Paint something.  Play that xylophone you got at the yard sale.  Read a physical book.  Sit quietly in a room and listen to your environment.  This whole online thing is a fiction and you know it.  Shoo.

Hold tight, this one’s gonna get nerdy.

Let’s take a little trip in the way-back machine to the dawn of the desktop computing era, that time period that we seem to be incapable of escaping: 1984.  It was in 1984 that Steve Jobs famously unveiled the first “modern” computer, the prototype forerunner of all we use today, the original Macintosh.  It wasn’t the first GUI, or even the first one sold commercially, but it was the first commercially successful application of the idea that a computer has a mouse and a desktop and icons and a What You See Is What You Get user experience.

The original operating system for this original machine was primitive on every level.  It ran on a 400 kilobyte floppy disk and that included the ability to do text-to-speech.  A marvel of engineering and design, yes.  Influential like The Beatles? Yes.  But a strong foundation for future computer operating systems?  Hell no.

The original Macintosh System Software had ground-breaking interface design and clever engineering to do a lot with a little, but ten years after that original launch the world had changed a lot.  Now it was 1994 and in the decade following Hello, Microsoft had turned Windows into a Mac competitor on IBM PC compatibles and Apple had squandered its first-mover advantage and become something of an afterthought.

One lesser-known, but massively important, thing happened during the decade of Windows’ rise to prominence: Apple forced Steve Jobs out, not long after the Mac launched.  Jobs then started a new company called NeXT (he also bought Pixar from Lucasfilm and built that whole company, which is, like, the hugest footnote to a career ever, but I digress) and NeXT needed an operating system for their cool new computer, the Cube.  Jobs didn’t want a clever bit of under-engineering (à la the Mac System Software) for his Cube.  He wanted what the big boys had been using since the late 1960’s: Unix.  So, he built an operating system on a Unix variant out of Berkeley called BSD (the Berkeley Software Distribution).  The resulting operating system, NeXTStep, was not a commercial success and neither was the company he founded.

Though not commercially successful in the way Jobs intended, two big things can be laid at the doorstep of NeXT and Jobs in this time period.  First: the guy who single-handedly created the World Wide Web, Sir Tim Berners-Lee, invented the Web on a NeXT computer.  So, by 1994, when we suddenly had the blossoming Web, when the internet started becoming a part of the culture, it was born on NeXT.  If that had been the only contribution of NeXT to the world, it would have been enough, but another thing happened.  Apple, struggling to avoid bankruptcy, bought NeXT, brought Steve Jobs back (soon to become interim CEO), and turned NeXTStep into….  tada!  Mac OS X.  Which then begat iOS and all the other flavors of Mac OS X.

If you own a Mac, if you own an iPhone, if you own an iPad, if you own an Apple Watch, or an Apple TV, or any Apple product made in the last 15 years except a clickwheel iPod, you have used the current iteration of the NeXT platform that Jobs launched after being fired by Apple back in 1985.  This also means that you have been using BSD Unix under the covers, whether you realize it or not.

Another 10 years later, 2004, and this was all obvious.  The WWW had taken over the world, Apple was back, Mac OS X had launched and was headed towards success, Jobs was plotting the iPhone and iPad.  This is all history.  However, back in 1994, the big news was actually coming from Microsoft and the impending launch of Windows 95.  So let’s talk about that for a minute.

Windows was second to market with a GUI, and not technically superior to the original Macintosh operating system, but unlike Apple, Microsoft was determined to do something about it.  The previous version of Windows (3.1) was a 16-bit shell that ran on top of DOS.  Windows 95 was going to be a full 32-bit operating system like OS/2 Warp (Google it, it was a thing at the time) and, what’s more, it was going to include a new Microsoft dial-up service called MSN that would compete with AOL and CompuServe (no, there was no internet access yet, they missed that one).  In 1994 we all knew it was coming but we didn’t get it until August 1995.  I should know, I bought a copy on launch day.

Windows 95 did change the world.  The user interface conventions we take for granted on modern Windows computers all started on Win95.  Windows 95 is to today’s Windows what the original Macintosh is to the modern Mac, at least from a user-interface convention perspective.  What it lacked, just like the Mac of the era, was stability.  And that’s where the real story begins, not with Windows 95, but with the REAL progenitor of the modern Windows computer, a totally different thing called Windows NT.

Now, you might be thinking, I have heard of Windows 95, and Windows 98, Windows XP, etc., but what is Windows NT?  I’ll tell ya.  Windows NT, initially released in mid-1993, was a version of Windows that was designed around a new operating system kernel, the “New Technology” kernel.  A kernel, BTW, is like the heart of an operating system.  It controls the reading and writing of data, the execution of programs, communication with devices, all that stuff.  It is not the part you see on screen, with the windows and icons and stuff.  All of that is just the graphics.  So, back to NT.  The first few releases were intended for servers, not desktops, where they wouldn’t be asked to run games or general productivity applications and would also be expected not to crash.  During the second half of the 90’s, Windows lived two lives.  There was the one that normal users had (95/98/ME) and there was NT (which most users never even heard of).
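
To make that kernel-versus-graphics split concrete, here’s a tiny Python sketch (nothing NT-specific, just an illustration): even a humble file write-and-read is really your program asking whatever kernel is underneath to do the work via system calls, and the windows-and-icons layer never enters into it.

```python
import os
import tempfile

# Write and read a file using the low-level os interface.  Each of these
# calls drops down into the kernel (open/write/read/close system calls);
# the kernel is the thing that actually talks to the disk on our behalf.
path = os.path.join(tempfile.gettempdir(), "kernel_demo.txt")

fd = os.open(path, os.O_CREAT | os.O_WRONLY | os.O_TRUNC)  # ask the kernel to create/open
os.write(fd, b"the kernel did the real work here\n")        # ask the kernel to write it out
os.close(fd)                                                # ask the kernel to release the file

fd = os.open(path, os.O_RDONLY)
print(os.read(fd, 4096).decode(), end="")                   # ask the kernel to read it back
os.close(fd)
```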

The first time most normal users got their hands on Windows NT it was going by a new name: Windows XP.  The years leading up to Windows XP had allowed Microsoft to develop a strategy (“compatibility mode”) so that apps written for non-NT versions of Windows could run on NT, thereby allowing them to migrate to a more robust core for their operating system, the exact same thing Apple was trying to do with OS X.  In the case of Windows, the core was built around the NT kernel; in the case of the Mac, the core was built around a BSD-derived Unix kernel; in both cases the goal was to get users off the crappy 80’s foundation and onto something reliable.  You can have religious wars about the NT kernel vs BSD and the user-interface choices made by Apple and Microsoft, but in general, these two platforms making these major shifts created the operating environments for most of the devices we all use today including laptops, smartphones, and tablets.

Now, I’ve intentionally left something out of this picture and it’s a doozy.  Way back in 1991, a student at the University of Helsinki, Linus Torvalds, was learning operating system design with a Unix clone called MINIX and he was annoyed by the limitations.  So, he made his own and he shared it on the fledgling internet and the snowball was pushed from the top of the mountain.  His creation, eventually dubbed Linux (named for Linus himself) has steadily grown and improved and spread throughout the known computing universe.  By some estimates, over 90% of the servers on the internet run Linux.  In the world of servers and other computers that normal users don’t touch, Linux is the king.  You use it every day that you go online, and you probably don’t know it.

And for Android users, this is even more true.  Do you have an Android phone, tablet, or watch?  Guess what….  The NT kernel -> Windows.  BSD kernel -> macOS/iOS.  Linux kernel -> Android.

So, NT was designed so Microsoft could compete with Unix in the server business, but instead it became XP and (eventually) Windows 10.  BSD Unix was used by Steve Jobs to make NeXT, which became Mac OS X.  And Linux, a from-scratch clone of Unix, took over the server business instead of OSX or NT, and now it’s at the core of almost every mobile device not sold by Apple.

At the end of the day, Unix-style operating systems are OWNING and even Microsoft has figured this out.  Microsoft came out as huge proponents of Linux several years ago, principally spearheaded by the head of their Azure division.  If you don’t know what Azure is that just means that you aren’t a professional software developer.  It’s not a consumer product, it’s a server thing for people to run their apps on the internet, hosted by Microsoft, and it’s extremely Linux-friendly.  Likely, the folks at Microsoft realized they would have no choice but to support Linux if they wanted to have a cloud-server product since almost the entire server side of the internet is based on Linux.  And they were right.  The guy at Microsoft who ran the Azure division was a fella named Satya Nadella and if that name rings a bell it’s because he is now the CEO of Microsoft, having replaced Steve Ballmer who replaced Bill Gates.

OK, so, the guy who brought Linux into Microsoft is now running Microsoft, so what?  Where are you going with this, Sutter?  Well, remember how NT was a server thing and then became the new kernel for Windows a few years later with XP?  Well, there is increasing reason to believe that NT might be heading towards being replaced with, you guessed it, Linux.

A couple of years ago, Microsoft introduced a new Windows feature called Windows Subsystem for Linux or WSL.  WSL allowed a user to run a Linux environment within their Windows environment instead of dual-booting.  I tried it out and quite honestly I couldn’t see a use for it.  If I wanted to run Linux, I could run a full Linux environment.  If I wanted to make Windows more Unix-like there were a number of ways to do that.  WSL seemed like a solution in search of a problem.  But then they came out with WSL 2 (aka: Electric Boogaloo) and things got more interesting.  To radically over-simplify: version one translated what Linux applications asked for into things the Windows core could do.  In version 2, Windows basically runs the actual Linux kernel itself (in a lightweight virtual machine), no translation.  They have even announced support for Linux graphical applications (WSL was only a command-line thing before).
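
You can even see this from inside WSL itself.  A common (unofficial) heuristic is that the Linux kernel WSL boots identifies itself with a Microsoft build string in /proc/version, so a script can tell which kernel it is actually talking to.  A minimal Python sketch, assuming a Linux-style /proc filesystem:

```python
from pathlib import Path

def looks_like_wsl() -> bool:
    """Best-effort guess: WSL's Linux kernel reports a Microsoft build string
    in /proc/version.  This is a convention, not an official API."""
    try:
        return "microsoft" in Path("/proc/version").read_text().lower()
    except OSError:
        return False  # no /proc at all, so we're not on a Linux kernel

print("Running on WSL's Linux kernel?", looks_like_wsl())
```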

This is starting to sound familiar…  Mac OS X anyone?  Or perhaps Android?  Mac OS X is a graphical shell designed by Apple that happens to run on top of the BSD Unix core.  Android is a graphical shell designed by Google that happens to run on top of the Linux core.  It isn’t an insane leap of logic to envision a world in which Windows becomes a graphical shell designed by Microsoft that happens to run as a process on top of the Linux core (rather than the existing NT core).  The current evolution of the Windows Subsystem for Linux, the entire Azure cloud offering, and the fact that Satya is the CEO all point in this direction, potentially.

What this would mean is that we would have reached a point at which every major operating system is some variant of Unix.  Mac, Windows, Android, iOS, watchOS.  Microsoft already made the startling decision to give up on developing its own Internet Explorer and Edge engines and instead make a new version of “Edge” that is, at its core, Google Chrome (which itself descends from WebKit, the open-source HTML rendering engine that started life on Linux as part of the KDE desktop, was adopted by Apple for Safari, then by Google, and now Microsoft).  They’ve learned that nobody cares about the engine under the hood, they care about the look and feel and the apps they can run.  A Linux-powered iteration of Windows might seem like a leap, but frankly, it’s not.  People made the leap from Windows 98 to Windows XP via emulation and compatibility layers.  The same transformation today could be done less painfully thanks to existing open-source compatibility layers like WINE (software that lets you run Windows apps on Linux without Windows).  The merger of Win and Lin seems almost inevitable.

And I’m not the only one saying it.  Open source pioneer Eric S. Raymond has recently posited the same idea (http://esr.ibiblio.org/?p=8764).  I, for one, hope this trend continues.  With the release of macOS Catalina, Apple has taken some previously unprecedented actions towards making OSX into the most closed, most proprietary, least free computing platform ever built, a platform in which no software that is not blessed and sold by Apple will even be able to execute.  Their pending move to making their own processors will make them an even more radically closed platform, as their hardware too will be strictly proprietary.  As a proponent of open source, the freedom to repair, and the like, I can’t condone the purchase of such disposable and proprietary technology.  The Windows NT kernel has never been my favorite, I’ve always been a Unix guy at heart, so the idea that Windows might finally transform into Yet Another Unix Variant is one of the better possibilities I’ve run into lately.  Bring it on, Microsoft.  If this is how the Unix desktop finally conquers the market, as bizarre as the road traveled may have been, I’m ready for it.

I had a bit of an epiphany last night.  I don’t know if it’s particularly profound, but I feel like my eyes were opened to a few truths that I have long known and simply forgotten to apply in my life.

First thing.  I have been a music person my entire life.  I have listened to music, thought in music, sang to myself in the car, in the shower, written my own songs, recorded music, performed music, learned instruments, collected music, obsessed over music.  I know people who maybe own two or three CDs and casually listen to Spotify, you know, normal people.   In contrast, I have literally thousands of albums in numerous formats: vinyl records, shellac 78s, CDs, cassettes, reel to reel tapes, digital files, you name it.  OK, I don’t have any 8-track carts, gotta draw the line somewhere, but I do actually own a functional hand-cranked Columbia Grafonola record player. 

I’ve personally been involved with and worked on the recording of at least 40 albums or singles as either a performer, engineer, producer, or sometimes all of the above.  I have a recording studio in my basement.  I own dozens of musical instruments.  Guitars, basses, drum kits, keyboards, horns, accordions, slide whistles.  Hell, there is a documentary being made in which my musical endeavors and life’s work feature prominently. 

I say all of this to highlight the fact that you would be hard-pressed to find a person whose life is more obviously centered around music, which makes it all the more strange to me that I’ve been so out of touch, emotionally and professionally, with music for the last few years. 

I have played in several bands and participated in the documentary, but I haven’t released a new album of original music since a minor acoustic EP that I recorded in a day back in December 2014.  I used to wonder if something was wrong with me if I didn’t release an album a year, at least, and I’m now coming up on six years with nothing to show for it except for the memories of some gigs played, a handful of unfinished projects, and a few one-off songs or videos.  I have written and recorded things but I just haven’t been able to get into any sort of rhythm (pun intended) with my musical life.  I think that’s because I haven’t HAD a musical life.  Instead, I have been knee deep in the Miasma and it’s killed my sense of joy, wonder, and creativity.  At the same time, as a listener, I have allowed music to become background wallpaper to my daily life instead of truly engaging with it, appreciating it, eating, sleeping, and breathing it as I used to do.

The Miasma is a term I recently acquired from the book Fall; or, Dodge in Hell by author Neal Stephenson.  It is the catch-all term for the cultural wasteland of insanity, trolling, confirmation bias, misinformation, distortion, propaganda, bad blood, viral marketing, and lowest common denominator garbage that the modern internet has descended into.  Everything about the public discourse, the endless doomscrolling, the sheer end of the world nihilism of late stage capitalism, authoritarianism, stupidity, violence, and (bonus!) a global pandemic, it’s all so disheartening, so maddening, that turning on a television, reading a newspaper, looking at a social media feed, or visiting nearly any part of the internet for any reason is guaranteed to make whatever mood I am in worse.  Good moods become bad moods, bad moods become dire.

Instead of using music or meditation or poetry or art or any of the other tools at my disposal to counter the effects of the Miasma, I have fallen into an engagement trap based on the fact that, at one point, I used to love the internet.  I did.  I believed in it.  I thought it was a net-positive for humanity.  In the world before the web, communities were more physically isolated, knowledge harder to access, there was much more terra incognita.  The promise of the web and the connected digital society as laid out by luminaries like Ted Nelson, Vannevar Bush, Nicholas Negroponte, Alan Kay, and even Steve Jobs was so appealing.  It was almost like a second Enlightenment Age dawning.  All the world’s knowledge available, all the communications barriers broken.  How could this be anything other than an Objectively Good Thing?

Well, as it turns out, every silver lining has a cloud.  As it turns out, people were not historically hostile and tribal merely because of limited communications technology or limited access to information.  People are hostile and tribal because they have been made that way through billions of years of natural selection.  They require almost no incentive whatsoever to pick sides and develop animosity towards each other.  Kurt Vonnegut nailed it with his granfalloon concept.  Thanks to this programming, hyper-connecting all the people was always going to mean that the people who thrive on rancor, discord, and negativity would have louder voices and more power to shape our culture than they did before.  Capitalism, which naturally goes where the market leads, would naturally find ways to monetize and stoke this hostility and division in order to make money.  Religions and political parties would do the same, feeding the flames to advance power and agendas.  These are not new forces in human society, they existed as far back as written history records and likely much further back.  It turns out that the previous limits imposed by geography, technology, and access to information were also holding some of our tribalism and collective insanity in check by channeling it into narrow and somewhat isolated outlets.  That is no longer possible.  Thanks to the democratizing power of the internet, we now have all of the foibles and ridiculousness of our species running amok, unfettered, unchecked by any force, Enlightenment 2: Electric Boogaloo has given way to Idiocracy 2: Boogaloo Now Means Race War.

But wait a minute, I hear you saying, wasn’t this post about music?  Yeah, yeah, I’m getting there, it’s my blog, I wanna take a while to get to my point, that’s my prerogative.  Keep your shirt on.

OK, so, the Miasma was probably inevitable, in retrospect, but I didn’t anticipate it.  I believed, perhaps too strongly, in the positive and empowering aspects of the always on, hyper-connected, society.  I thought it would lead me to more creativity (more ability to share what you create is good, right?), more human connection (all my old friends are here, that’s gotta be good, right?), and all the old hassles of primitive technologies would be rendered obsolete by the wireless, simple, one device to rule them all vision of the smartphone as digital camera, digital music player, GPS, movie player, social life, VR headset, internet information appliance, dessert topping, floor wax, etc.   I was an early adopter.  I was a proponent.  I was a fan. 

I was wrong.

The all-in-one device is a marvel of convenience, but it makes focused attention on any one single thing of value extremely challenging.  Always being connected is great for knowing where to find a gas station while driving in an unknown area or for settling a bet about a piece of trivia with a friend, but it creates a constant psychological drag on the real-world experience of everyday life because you often feel compelled to use it just because it’s there and you’re bored for 5 whole consecutive seconds.  A globally connected platform for delivering creative work to audiences is theoretically empowering for artists, but since everybody throws everything out there, nothing feels special or unique or lasting; almost everything feels ephemeral, transient, meaningless, like a night at an open-mic where the entire audience is on-stage at once, talking at the same time. 

In the Miasma, all of these things that hypothetically could have been enriching, empowering, and inspiring have mostly turned to shit.  Devalued, corrupted, monetized, destroyed, and we as a society have been lessened to the extent where Donald Fucking Trump actually became President of the United States.  Think about that.  As far back as the 80’s that would have been the punchline to a joke about America failing as a country and IT.  ACTUALLY.  HAPPENED.

I can chart my decline in creative interest and output on a graph (yes, I’ve actually done this on paper) and it directly correlates to the rise of the post-Facebook/post-iPhone Miasma version of the internet.  My flagging interest in saying anything whatsoever to the world at large, my increasing disinterest in my OWN MUSICAL WORK, my general sense of despondency about anything, or anyone, anywhere, truly mattering at all, my ever deeper struggles with the blank page or blank tape, it all correlates perfectly to the amount of time I have spent online since the New Enlightenment turned into the Miasma. 

The question is, what is a boy to do?

The internet I fell in love with is gone, for good.  The world I grew up in is radically changed.  No use looking backwards, it is what it is.  I can limit my online time, work on my mindfulness, and swear a lot, but I can’t undo what’s been done.  This is where my job is.  This is where my friends are.  This is how the music and tech industries function.  If I want to work in technology and/or be a creative, I can’t pretend the cultural landscape is what it was 12 years ago. 

I think the answer, ironically(?), is hinted at in trends I am beginning to encounter in the habits of the generation being raised with hyper-connectivity and social networking since childhood.  They are not enamored of apps and smartthings, they don’t think they’re especially cool or interesting, and they don’t inherently think the digital stuff is better or worse than what came before it.  It’s all just tech.  This is why a lot of people these days are, apparently, rediscovering mixtapes made with actual cassettes.  I did not foresee cassettes coming back, but they are.  Why?  Making mix tapes with your own voice and choices of songs was fun when I was a kid and it’s still fun now.  Who cares that you can listen to the same songs on your phone on Spotify?  That doesn’t feel unique like a tape does.  Another example, my niece became obsessed with typewriters at age 12 despite having a smartphone and tablet.  People who didn’t experience the migration from analog to digital to networked are not inherently biased against the old tools and can even appreciate their quirks and limits but mostly they appreciate the physicality, the reality, of analog. 

The Miasma is an endless stream of mostly negative messages masquerading as news, relationships, and information which is tailored to hook you, personally, and to shape your world and your view of it.  Unconnected technology only puts out what you put into it, there is no agenda, no secret influencers.  Maybe the way to get creative again is, in part, to only use tools and technologies that don’t try to influence my behaviors. 

And while I do think that’s a part of it, the real insight I had is that the flip side of the Miasma is how it makes you, me, everybody who participates, into both influenced and influencer.  We are all trying to culturally signify our alignments, beliefs, and affiliations.  We are all posting selfies and liking posts and crafting a semi-public persona as a type of performance art.  This is not an environment that fosters or encourages actual creativity.  In fact, it’s an active impediment because it creates the illusion of creativity.

Taking a photograph and applying some funny filter to it or cobbling together a meme is an act of creation, sure, but it’s more craft than art.  It’s more like making a hand-print turkey painting than it is like writing a confessional poem.  These types of minor creative output are mostly imitative or derivative, and the primary value is amusing other people.  These are all performance, but not all art is performance.

I recently read something written by Jeff Buckley in the liner notes to the posthumously released collection of material he was working on at the time of his death, “Sketches for My Sweetheart the Drunk”.  He wrote the following about his songwriting:

There is also music I’ll make that will never-ever-ever be for sale. This is my music alone, this is my true home; from which all things are born and from which all my life will spring untainted and unworried, fully of my own body.

And this is something I have known for a very long time but I have let myself forget, the simple basic fact that you need to create first and foremost for your ears alone, for your heart alone, for your soul alone, if you want to have a home to share with others.  You can’t make that kind of art with the thoughts, feelings, opinions, or judgments of other people in mind.  You can’t be wondering if they will like you or what you have to say.  It’s not about them.  It’s the opposite of performance.  It’s self-exploration.  The more my life has become about the performances and manipulations of the Miasma, the more I’ve come to critically judge my own work and the less free I have felt to just play, explore, experiment, and enjoy the process of making music that nobody will ever hear.  I’ve been laboring under the false feeling that if I make music that I don’t think is “releasable” then I shouldn’t have bothered to make it.  When I was in high school sitting cross-legged on my bed with a four-track recorder recording ambient soundscapes about Tony Bennett or swarms of bees I wasn’t worrying about anybody hearing me or caring what I was doing…  I was having fun.

Fun.  Yes, fucking FUN.  Where is fun in 2020?  Where is joy in 2020?  Where is there joy to be found in the endless doomscroll of the Miasma or the viral marketing hellscape or the endless disgusting behavior of the bigots and fundamentalists or the constant manipulation of influencers and trends and memes and the barrage of messages and notifications and micro and macro time sinks of modern life?  I’ll tell you where it is.  Nowhere.  Missing in action.

And there, ladies and gentlemen, there is the key in all of this navel gazing.  Without fun, without joy, even the joy of painful catharsis (and yes, there is joy to be found in working through painful emotions, just think of the joy of relief when you remove a really bad splinter), what are you sharing?  What have you got other than an empty “look at me”? 

I’ve let the Miasma train me.  I’ve let it get me focused on publishing, producing, consuming and being consumed, constantly trying to drink a bottomless pool dry, and neglecting the square one of unplugging, playing, doing things just because they are interesting, making music for nobody else to hear, remembering that the bad news will still be there whether you look at it or not but that your soul won’t be if you don’t look after it.  When was the last time I just put on a record and listened to it without also being online?  When was the last time I picked up a guitar and just made something up with no plan?  When was the last time I turned away from all screens, tablet, television, phone or e-reader, and just lived in the world of the actual senses? 

I am not sure.  I know that my entire life was spent in real space up to a point, and then it started digitizing, and it eventually wound up twisted around this shared online fiction we now call a culture, but the answer is not about “going back”, it’s not about “disconnecting”, it’s about remembering that the Miasma cannot provide meaning, it cannot provide true joy, but music can, real life can, and if I want to find that again, I need only remember how to play, how to write for myself and myself alone, and then to make a conscious decision to stop participating in the endless performance.

I often find myself wondering how people can live in the 21st century, using computers and smart phones, benefiting from modern medicine and transportation, seeing the very fruits of scientific discovery in their daily lives EVERY.  SINGLE.  DAY.  while continuing to deny science and reason.

I once had an infuriating conversation with somebody who believed in the flat earth.  When I asked him if he ever used the GPS on his phone he said that of course he did.  When I explained that the very existence of GPS demonstrated that the world was not flat he proceeded to give his imaginary explanation for how the GPS system “really” worked using cell towers, not satellites.  When I explained to him that a) I am an engineer and I know first-hand it doesn’t work that way, b) GPS works even when you are nowhere near a cell tower, and c) the GPS system predates the cell towers in the first place, he was completely unmoved.  He had seen a video on YouTube and that was all the evidence he required.  Speaking to a person he knew who had first-hand knowledge of the topic was not as convincing as his “research”.

I asked him why, if the earth was flat, I was able to see the curve of the earth from the airplane I had been in the week before.  Apparently all airplane windows are designed in such a way that they distort your view to give you the illusion of a curved horizon, no matter what altitude or angle you are looking from.  This apparently includes the flat, front-facing windows that the pilots see out of as well.  It’s all a part of the conspiracy to keep the truth hidden.  NASA is responsible.

OK, but NASA is only in the USA.  What about all the other countries with space programs and airlines?  Why has no rogue nation ever told the truth about the fake GPS system and the flat horizon line?  Well, that just proves NASA is part of a global conspiracy, not a local one.

It went on like this.

For hours.

I demonstrated that I had satellite internet and showed him the line of sight to the satellite I was using.  I used the data latency times to show that the information had to travel a great distance from my home to the satellite, unlike the cell tower, and explained how that signal travel time could be used to measure the distance between a radio and a satellite, how triangulation worked, and how, if you know your distance from three satellites, you can determine your position in three-dimensional space.
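
If you want to see the geometry of that last bit rather than take my word for it, here’s a toy Python sketch.  It only covers the distance part (properly called trilateration, though everybody says triangulation); a real GPS receiver also has to solve for its own clock error, which is why it actually listens to a fourth satellite.  The satellite positions and ranges below are made up purely for illustration.

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Given three known points (p1, p2, p3) and the measured distances to
    them (r1, r2, r3), return the two geometric solutions for your position."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    # Build an orthonormal frame with p1 at the origin and p2 on the x-axis.
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = np.dot(ex, p3 - p1)
    ey = p3 - p1 - i * ex
    ey = ey / np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = np.dot(ey, p3 - p1)
    # Intersect the three spheres in that frame.
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))  # clamp tiny negative rounding error
    base = p1 + x * ex + y * ey
    return base + z * ez, base - z * ez

if __name__ == "__main__":
    truth = np.array([1.0, 2.0, 0.5])            # pretend receiver position
    sats = [np.array([10.0, 0.0, 20.0]),         # pretend satellite positions
            np.array([-5.0, 12.0, 18.0]),
            np.array([3.0, -9.0, 22.0])]
    ranges = [np.linalg.norm(truth - s) for s in sats]
    a, b = trilaterate(*sats, *ranges)
    print(a, b)  # one of the two candidates matches `truth`
```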

This got me nowhere.

I showed him photos I had personally taken of Jupiter and its moons that proved that they existed, they were round, and they were visible to the naked eye.  I demonstrated that the only way they would ALWAYS appear round is if they were spheres, because if they were discs, any change in their angle to our planet would change their visible shape.  I pointed out that nobody has ever observed any planetary body showing as anything other than a sphere so there was no reason to believe our planet should be any different.
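
To put numbers on that disc argument: a flat disc foreshortens as it tilts away from you, its apparent width shrinking with the cosine of the tilt angle, while a sphere presents the same circular outline from every direction.  A throwaway sketch:

```python
import math

diameter = 1.0  # arbitrary units; only the ratio matters
for tilt_deg in (0, 30, 60, 85):
    # A disc tilted by this angle projects to an ellipse whose minor axis
    # is diameter * cos(tilt); a sphere would stay 1.00 x 1.00 regardless.
    minor_axis = diameter * math.cos(math.radians(tilt_deg))
    print(f"disc tilted {tilt_deg:2d} degrees: appears {diameter:.2f} x {minor_axis:.2f}")
```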

None of this mattered.  Not even a little.

I showed him, using a lightbulb and a plate, how the flat earth model he believed in would mean that the sun would either ALWAYS be visible in the sky OR would set in the south and rise in the north.  He didn’t deny that the sun rises in the east and sets in the west, nor did he deny that the sun is only visible to half the planet at any given time, but he still clung to his belief even though it was clearly impossible for those facts to work in his model.

Many times in this conversation I said “I really don’t care what you believe and it’s not my job to change your mind, can we please talk about something else?” but he insisted that I needed to see the truth of his belief or else I would be falling for the Great Lie of the round earth and wouldn’t be able to accept the Bible and be saved and then he would proceed to trot out some other easily debunked “evidence”.

There was no escaping the topic, no possible way to change his mind, and no way for him to see that he was essentially eating an apple while denying the existence of apple trees every time he punched an address into Google Maps.

If he hadn’t been an uncle I hadn’t seen in years, I would have just given up on civil conversation and mocked him relentlessly until he at least agreed to talk about something, anything, else.  Or I would have shaken my head and walked away.  I did neither.  It was excruciating.

I’ve replayed this incident many times, trying very hard to imagine how his brain allowed him to hold onto such a patently, provably, demonstrably false idea even as it was being disproven right in front of him.  It was really something and has really taxed my imagination and empathy, while simultaneously giving me a valuable insight into reality-denialism in all its forms.  What is it like to be drowning while denying the existence of water?  How do people wind up like this?  Are they born this way?  Conditioned by religion?  What is it?

I still can’t explain it to my satisfaction.  When I was a religious believer, I believed because I was presented with cherry-picked and distorted evidence that gave me the illusion that my belief system was grounded in observable reality.  Eventually, I gained enough knowledge about reality to accept that my beliefs were fantasies and I stopped holding those beliefs.  I was amenable to reason as a believer and also as an unbeliever, once I had enough facts.  But there are people who are not just unreasonable, but who work like mad to simultaneously convince themselves that they are, in fact, reasonable rather than just accepting that reason has nothing to do with it.  They simply like the idea that there are fairies at the bottom of the garden, and that’s that.

I can respect the “blind faith” people about 3% more than the “here are my carefully selected ‘reasons’” people.  None of this would matter all that terribly much except that anti-reality political and religious organizations have massive amounts of power in the society I live in.  The anti-reality people are destroying the environment, killing tens of thousands of people during a pandemic by resisting science and reason, and have cost me any meaningful relationship with my parents and most of my surviving siblings.  I can’t easily let go of the hope/notion that there is some way to get through to them.  I know that is a foolish hope to cling to but it’s like an itch that I can’t stop scratching.

Whatever evolves on this planet to replace our species after these people drive us to extinction, whenever sentience next rears its ugly head on this planet after we are dust, I sincerely hope they find a cure.

I’m proud to announce the completion and pending release of a new album.  I have once again joined creative forces with Lemuel “Ace” Herlihy (aka: Michael Heuer) and a new album, entitled Amateurs, is the result.  Our last album was called Nininger and was released six years ago.  On that one we each contributed a 35+ minute ambient/noise piece under an anagram-derived moniker.  I was listed as Tasty Rerun and Michael as Lemuel “Ace” Herlihy.  This year we collaborated much more closely, composing and recording together in a series of studio sessions to create a very different beast.  Also, this time out, Tasty and Lem have decided to band together under the name Nova Pill Beam.

Mixing and mastering work is underway, release will be some time next month.  You’ve been warned.  🙂

I have been eligible to vote in American elections since the early 90’s but the first time I actually voted was in 2004.  I hadn’t previously voted because I was a member of a religion which specifically prohibited voting.  Weird, I know.  As soon as I was no longer a member of that group, I got excited about the opportunity to vote and that year I supported John Kerry in his failed bid to unseat incumbent president (and Rove/Cheney ventriloquist’s dummy) W.  It was an extremely disappointing experience that left me profoundly disturbed.  I found it very hard to understand why anybody who was even marginally conscious about anything happening in the country could ever vote for a man with the apparent IQ of a sea cucumber, but clearly I misunderestimated my fellow citizens.  It wouldn’t be the last time.

People, it turns out, are gullible, easily manipulated, unreliable, and generally bad at critical thinking.  Also, for the most part, poor judges of character.  I suppose I kinda knew this before 2004, but I was working pretty darn hard to avoid thinking about it.  I was aware that politics were a thing and people were passionate about them but I had also been taught that none of it mattered, it was all equally corrupt and bad, and the whole “political system of things” was doomed to destruction anyhow so I didn’t think I needed to worry my pretty little head about it.  And so I didn’t.

Once I started to see that politics is simply our name for “how the human species makes group decisions instead of just killing each other,” I began to realize that I was at the mercy of the collective bad decision-making, poor critical thinking skills, and gullibility of my neighbors, and that there was no Jesus on a horse with a flaming sword a-comin’ to save the day.  That was terrifying and I would like to report that it has gotten less so, but that would be a lie.  It is not less terrifying, and people do not inspire any more confidence in me today than they did before.  Probably less since 2016.

But here we all are and 2020 is here and the POTUS is a mob boss, the Russians and Republicans are strategic allies, every Democratic candidate on the table has a fatal flaw, and every left-leaning person I know is fighting with every other left-leaning person so we’re probably gonna get twelve more years of He Who Shall Not Be Named after he just goes ahead and declares himself president for life and suspends elections and since I am powerless to change these things, I need to figure out how to live with them and, ideally, not enter into a crippling depression.

The simplest option, perhaps the only one that really rises to the level of a solution, is just to tune out.  Go back to how I grew up.  Focus on music, family, personal development, art, and the rest, just show up to vote my conscience but, otherwise, simply ignore all the bad stuff that’s happening.  Don’t follow the news, don’t obsess over the soap opera, keep a distance.

This is much harder to do in my current life than it was when I was growing up.  I grew up in the pre-internet era when news was a paper delivered once a day which I mostly ignored, despite delivering it around the neighborhood.  I read the comics and skimmed the TV listings for movies or shows I might want to record, but beyond that I was pretty much unaware of and uninterested in the world outside my neighborhood.  No social media, no cable news, no office filled with co-workers with opinions.  It was simple.  Now, if I want to see what’s going on in my friends’ lives, I dip into social media and pretty soon I’m seeing political posts and I’m having opinions and the bubble is gone.  I work in an office on a computer all day, the internet is always happening, and I can choose not to look but it takes a lot more self-control.  It’s easy to avoid things that you have to go out of your way to see; it’s hard to avoid things that pop up on your screen or arrive in your inbox.

I’m not sure, either, that I would like to return to the ignorance is bliss stage of my life.  I wasn’t just uninformed, I was MIS-informed.  Because I wasn’t aware of actual events actually happening in the world around me, I was able to be fed a bunch of untrue information that formed the basis of the worldview promulgated by my religion and this kept me from thinking for myself for a very long time.  Long story short, I was insulated and thus slow in developing my critical thinking skills about the world even as I developed my intellectual capacities in other areas like music and computer programming.  Once you are out of a bubble, you can’t go back in.  The nature of bubbles is that they pop and then they no longer exist.

At the risk of over-simplifying then, I see three options.  Go back to being Bubble Boy, lose my damn mind over every new outrage, or, option three, balance.

Here are my fledgling rules for finding balance in a world of political insanity:

1. Don’t over-consume.  Read the news once or twice a week to stay informed on major events, but avoid binging, avoid politics talk shows, podcasts, cable news, blogs, and the obsessive 24/7 coverage.

2. Don’t fuel negative feelings, find positive things to do.  When exposed to the latest Trump outrage or Republican violation of law, morality, the constitution, and basic human decency, you can either fume and stew or put something good into the world instead.  Finish an unfinished project, write a song, listen to a new record, watch a classic film you’ve been meaning to watch, read a novel.  The world doesn’t get better without good things happening, do something positive in response to a negative.  If you let bad people and events paralyze you, the end result is less good in the world.

3. Participate, but moderately.  Vote when you get to vote.  Be informed enough to make good decisions.  Maybe even volunteer to do some canvassing, but refrain from activities that serve nobody but yourself.  Fighting with people online isn’t going to make any change happen.  Neither is checking out completely and staying home.  Participate in the democracy like it matters, but don’t think your passion alone can change the world, and don’t allow yourself to become so disenchanted that the bastards win.

Informed, meaningful participation, plus just enough news intake, and a commitment to contributing my time and energy to positive things as a way to fight against the negative ones: those are really the three guidelines I’m going to try to stick to.  Feel free to remind me I said this next time I find myself ranting or obsessing.  I’ll appreciate the reminder even if I say, “I know but….”

Two blog posts in a row, what??

This morning I finished reading an anthology volume called Great Modern Short Novels or something to that effect. The novellas were:

  1. Lost Horizon (James Hilton)
  2. The Red Pony (John Steinbeck)
  3. The Third Man (Graham Greene)
  4. A Single Pebble (John Hersey)
  5. The Light In The Piazza (Elizabeth Spencer)
  6. Seize the Day (Saul Bellow)
  7. Breakfast at Tiffany’s (Truman Capote)

I had never read any of them and I enjoyed them all.  I had seen the film adaptations of Lost Horizon, The Third Man, and Breakfast at Tiffany’s, but even those held some surprises in the reading.  Breakfast at Tiffany’s specifically is much more modern than the film version would have you believe. 

Among multiple pieces of dialog that I found surprising for 1958 was when Holly Golightly was talking about marriage and said “If I were free to choose from everybody alive, just snap my fingers and say come here you, I wouldn’t pick Jose.  Nehru, he’s nearer the mark.  Wendell Willkie.  I’d settle for [Greta] Garbo any day.  Why not?  A person ought to be able to marry men or women or—listen, if you came to me and said you wanted to hitch up with Man O’ War, I’d respect your feeling.  No, I’m serious.  Love should be allowed.  I’m all for it.  Not that I’ve got a pretty good idea what it is.”

Same sex marriage being casually discussed by a character in a novel in 1958?  And it’s far from the only instance in the book.  On another occasion she suggests that Rusty Trawler should “settle down and play house with a nice fatherly truck driver”. 

That’s not the only dialog that seems more apropos to 2020 than 1958.  You know that part in the movie where she’s trying to get her cat to leave and she tells the cat to “Beat it!” and “Scram!”?  In the book she also tells the cat to “Fuck off!”  Hard for me to picture Audrey Hepburn voicing that dialog in the movie.

Like I said, modern.  The story has problematic elements, but I am just noting that I was a bit surprised by Holly Golightly, despite seeing the film version.  In the book she is barely 19 years old but she’s had eleven lovers (“not counting anything that happened before I was thirteen because, after all, that just doesn’t count”), she talks about dykes and gay marriage and bi-sexuality, drops an F-bomb, happily takes money from sugar daddies, and rather than staying with Paul at the end, she leaves the country and he ends up with the cat.  This is hardly a new revelation (https://www.theparisreview.org/blog/2018/12/21/was-holly-golightly-bisexual/) but it was definitely not the BaT I am familiar with.

That’s a lot of knobs and sliders and wires…

I haven’t released a new album since The Coal Room EP in December of 2014 but that doesn’t mean I haven’t been musically active. The last six years have been rather productive, actually. I spent some time as a member of the band Robots From the Future, also a stint in the band Fistful of Data’s, I played a handful of solo sets, I released a couple singles with music videos, I sat in on drums for a couple “adult jam” style gigs with friends, and then I joined up with my current band, Awkward Bodies and we have played some great shows and released a few singles as well. Still, I have missed being a recording artist unto myself so this year I hope to change that a little.

I’ve accumulated a bit of a backlog of solo material, enough for more than one album. I’ve winnowed the list down to a decent dozen or so and I expect that I may just start working my way through recording them. It’s no exaggeration to say I’m overdue for a legit solo album.

Since I was in sixth grade I’ve always been involved in one band or musical project or another and have only had a few years where I didn’t at least appear on some sort of recording that was distributed for public consumption in some way. Last year I appeared on three Awkward Bodies singles, the year before that I released a video single for one of my own songs, Ostrich, but 2015-2017 was a three-year dry spell right on the heels of The Coal Room (and the rather obscure RFTF single “Bloody Baby” which appeared on a comp). And that’s OK. I’m at peace with it. During that time I took some sax lessons, I played shows in a few bands, I kitted out a new studio at a new house, I wrote and recorded new music, I was involved in a pretty cool upcoming Nuclear Gopher documentary… basically, I was very active, just not in the recording arena. But I miss it.

Just this week, I had a recording session with Lemuel “Ace” Herlihy (Michael Heuer) for what will (hopefully) be a followup to our 2014 ambient/experimental/noise smash hit album “Nininger”. The mess in the photo above is from the session. That was really fun and if all goes according to plan we should see that record done in time to submit it for this year’s RPM Challenge. I’m thinking we should call it “Nininger 2: Electric Boogaloo (Return of Ignatius Donnelly)”.

Awkward Bodies has been an extremely rewarding experience and tonight we will be having our first practice with our new lineup. Lem Herlihy and I will get together again next week, I hope, and maybe, just maybe, I’ll do what I did with The Coal Room and lavoneloveletter and just take a single session to get the bones of a whole new album laid down and then finish it off with overdubs over the course of a few weeks/months. I have the songs. What is stopping me?

I haven’t updated this here blog for a few months because I couldn’t. The admin login was broken and I kept meaning to find time to fix it but failing. Today I am happy to say I figured it out, got the site updated to WordPress 5.3, and switched the theme to the new Twenty Twenty theme. I plan to tweak that a bit, but hey, at least the site is fixed. Woot.

I will never forget the first time I encountered the internet. It was 1994 and I was working at my very first computer programming job at a small sales-lead management company near Minneapolis. I had written a DOS program called EDT that used a modem to dial up to various magazine publishers and download their sales leads over the phone. One day my manager, Michelle, entered my cubicle and handed me a piece of paper and said, “I am not sure how this works, but this publisher says that they want to provide their files over something called the internet. Can EDT do that? I signed us up for an internet service account, these are the instructions to get started with our username and password.”

It was the first time I had ever heard the word “internet”. I took a look at the printed instructions. They were from a dial-up ISP called Skypoint. There was a phone number and instructions to connect with a z-modem terminal program. EDT supported z-modem so I dialed up and connected to the internet for the very first time using the very first piece of software that I wrote at my very first job as a computer programmer. Once I was in their text-based menu system, I managed to follow the directions to download something called Trumpet WinSock, which added support for something called TCP/IP to my Windows 3.1 machine, and I was also able to get a piece of software called Mosaic 0.89, which was a browser for something called the World Wide Web.

It took me the better part of the afternoon, but pretty soon I was able to access the sales leads via something called FTP and load my first web page at skypoint.net, and my life was never quite the same. I signed up for a personal Skypoint account almost immediately.

The internet of that era consisted primarily of email, listservs, FTP sites, Gopher servers, a fledgling and quite small Web that was almost entirely text-based, Usenet, IRC chat, and dial-up telnet access for when you just wanted to efficiently access information instead of fiddling around with graphics. The dial-up speed was insanely slow; my modem could only connect at 9600 bps, about one sixth of what we would now think of as “dial-up speed”. Windows 95 didn’t exist yet, and when it launched it didn’t even ship with a web browser, because Bill Gates wasn’t yet sure the internet was going to be a thing worth doing.

The internet I met in 1994 bore very little resemblance to the internet of 2019. It was global but personal, open yet idiosyncratic, difficult to navigate but immensely rewarding. I would come into the office early just to spend an hour or two exploring. It felt like the beginning of a massive revolution, a cultural shift, that would change everything for the better. I fell in love and for the subsequent 25 years I have stayed online and worked and lived on the cutting edge of internet and computer technology. I have owned many computers, built many websites and web applications, met countless people, and rarely gone more than a day or two without a visit to that virtual electronic universe.

About 10 years ago the internet underwent a profound change with the move to mobile broadband, the centralization of e-commerce, the rise of social media, and the eventual dominance of the online world by the Big Tech companies: Apple, Google, Amazon, Facebook, and (to a much lesser extent) Microsoft. The wild, weird, somewhat chaotic world of the internet I first fell in love with started to be replaced by the corporate internet we all interact with today, and I’m here to say that when that happened, we collectively lost something, and I miss it.

I no longer love the internet. In fact, I kind of hate it.

I grew up in a world that had three television channels on VHF and one or two low-watt local stations that were sometimes watchable on UHF. If you wanted to watch TV, you watched whatever happened to be on those channels. Even when I bought my family’s first VCR with my paper route money, I still had to read the TV listings in the newspaper, circle shows I might want to see, and program the VCR to record them if I didn’t want to miss them. Media types, music, movies, TV shows, books, they were all very different from each other, not simply different types of bit streams, and if you wanted to hear an album or read a book or watch a movie, it was not as simple as firing up Spotify or Netflix. It was incredibly inconvenient and required a lot of planning and intentionality. Back when I first encountered the internet, I was thrilled that it promised to create exactly the world we now have. The ability to read all the books, watch all the movies, hear all the music, it seemed like such a great idea, and it really was, but now that we are here, I find that it is not without a price, and the price is impact. It turns out that when everything is convenient and available, nothing seems all that terribly valuable or interesting, and distraction becomes a serious concern, as does complacency.

Everything, from the works of Marcel Proust to a cat chasing a laser pointer, melts into a sort of stew of sameness. People seem less interesting when you just see what they post on Instagram. To paraphrase a bard from the early 1990s, it feels like there’s 57 channels and nothing’s on. When I go online today, instead of a sense of wonder and curiosity, I feel a vague disgust and boredom, and this makes me sad. Everybody performing for each other, myself included, every click and site visit tracked for SEO and marketing purposes, ultra-intrusive ads, and an endless stream of trivia. This is not the internet I fell in love with.

I’ve recently decided to try to do something about this, but I have yet to find what I feel has been lost. In the last couple of years I have made several major changes to my computer habits. I replaced my iPhone with an Android phone, shut all the notifications off, and started using a web browser that blocks all trackers. I have implemented rules for myself for social media usage and limited how often and to what extent I engage in Twitter, Instagram, and the dreaded Facebook. I’ve even gone so far as to rehabilitate a few old computers that don’t have Wi-Fi or modern web browsers so I can use computers to do things like writing and music production without the temptation to lose hours of my life to memes and viral videos, news and gossip, and the rest of the modern digital stream of endless distraction. I’ve blacklisted some websites to remind myself to keep away from them, and all of this just seems to make me feel a little more resentful.
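For anyone wondering what “blacklisting some websites” actually looks like in practice, here is a minimal sketch of the idea, assuming a Unix-style /etc/hosts file. The domains are hypothetical stand-ins rather than my real list, and on an actual machine you would run this with admin rights and keep a backup of the hosts file first.

```python
#!/usr/bin/env python3
"""Append a small distraction blocklist to /etc/hosts.

Sketch only: the domains below are hypothetical examples, and this assumes
a Unix-like system where you can edit /etc/hosts (run it with sudo).
"""

HOSTS_FILE = "/etc/hosts"
MARKER = "# --- personal distraction blocklist ---"

# Hypothetical examples; substitute whatever you actually want to avoid.
BLOCKED = ["example-newsfeed.com", "example-socialnetwork.com"]


def build_entries(domains):
    # Point each domain (and its www. variant) at 0.0.0.0 so it never loads.
    lines = [MARKER]
    for domain in domains:
        lines.append(f"0.0.0.0 {domain}")
        lines.append(f"0.0.0.0 www.{domain}")
    return "\n".join(lines) + "\n"


def main():
    with open(HOSTS_FILE, "r+", encoding="utf-8") as hosts:
        contents = hosts.read()
        if MARKER in contents:
            print("Blocklist already present; nothing to do.")
            return
        # The read above left the file position at the end, so this appends.
        hosts.write("\n" + build_entries(BLOCKED))
        print(f"Added {len(BLOCKED)} domains to {HOSTS_FILE}")


if __name__ == "__main__":
    main()
```

It’s a blunt instrument, which is sort of the point: the friction of undoing it is usually enough to make me notice what I’m doing.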

I don’t like feeling like I have to be on defense every time I go online. I don’t like the default assumption that I should always be available to respond to tweets, texts, IMs, or even phone calls. It’s not like I want to be some sort of Luddite, not at all. I love the speed and power of modern computers and I love having access to the world’s media on-call, and I probably have more technology in my backpack on a daily basis than most normal people have in their homes, but I’m really struggling to enjoy what we have collectively created. What was once special is now common, and what was once empowering now feels like a sink for time, attention, and energy with very little reward. Where once computers made me feel more creative, they now feel more like they are trying to seduce me into mindless consumption and the amount of work required to regulate the intrusiveness of the technology is tiring and disheartening.

I think there is a philosophical difference in the way computers were designed, envisioned, and used during the initial phase of personal computing and the role they play in modern life and, personally, I’ve found that the earlier ethos fits my personality better than the current one.

Take Apple, for example. When the original Macintosh computer was being designed, the vision Steve Jobs had was “a bicycle for the mind.” Maybe not the most obvious metaphor, sure, but I always liked it. A computer was a tool that allowed the user the power to create things and do things that would otherwise have been beyond their reach, in the way that a bicycle expands the power of our personal mobility but, unlike a car, does not replace it. In high school I used to go to Kinko’s to use the desktop publishing Macs to create and print inserts and labels for tapes to distribute music recorded by my brother and me, which we then sold at school. Later, when video editing became possible thanks to the early iMacs with FireWire/DV cameras, I made short films and even wrote a few screenplays with the hope of shooting a low budget feature. Digital audio workstation software allowed Rhett and me to record albums of much higher sound quality and complexity than would have ever been possible with earlier tape machines. New machines gave us higher quality graphics, faster and easier manipulation of video and audio and images, and more storage and connectivity to cameras and audio equipment. The early internet gave us a platform to promote our work to random strangers all over the world. For years Apple seemed to be focused on making better and better tools for creativity, until they decided the real money was in apps and phones and watches and titanium credit cards. Where once they focused on empowering the user to create, the current focus of Apple is creating a sticky consumer experience, and they are very good at it, but the difference in the product is striking.

It is actually hard to get anything useful done on a computer that is constantly pushing notifications and updates at you, that has social media sharing embedded in all the programs, that has limited ports for connecting to other devices, limited storage that encourages using the company’s cloud, and “consumer” grade applications so dumbed down that your biggest creative decisions are which filters to apply or which canned beats to loop.

This train of thought has caused me to start considering alternatives, and I’ve found a couple and surprised even myself. I started thinking about, of all things, knives. I have a pocket knife that I use for one thing or another almost every day. It’s a single blade with a wooden handle. Nothing fancy. I used to carry a Swiss Army knife with a bunch of tools built in: a corkscrew, a screwdriver, a toothpick, a pair of scissors. But 99% of the time I discovered that I didn’t need a crappy version of all of those tools, I just needed a good version of a small knife, so I switched. I chose specialization over versatility. I thought about what that philosophy would mean for writing. If a modern computer is the ultimate digital Swiss Army knife, what would a single purpose writing computer need? The answer was: not very much. A clear screen, a great keyboard, the ability to save text in a format that could be edited and potentially printed or published with modern tools, and most importantly, no distractions. I found something called the Freewrite, a Kickstarter-launched device with an e-ink screen, Wi-Fi, and a kick-ass mechanical keyboard, but it was rather expensive. As I was contemplating it, I realized I already had experience with a computer that was phenomenal for writing and little else, and it was inexpensive to boot. I remembered that I had once owned a 1991 Apple Macintosh PowerBook 170, a chunky, black and white, extremely primitive laptop, and that I had written tens of thousands of words with it before “upgrading”. A few months later, I’ve reacquired and refurbished a few old PowerBooks, all predating the turn of the century, and it has worked. I’ve taken to writing again and it’s fantastic. Trains of thought stay on their tracks, distractions disappear, and writing is fun again. I’m writing this on a 1997 PowerBook 3400c using a copy of Word that is so old that the built-in dictionary doesn’t know the word “internet”.

This computer can connect to a network. I can download and install old applications on it, but it can’t surf the modern web. It has a full-size keyboard with desktop-style keys, unlike a modern MacBook with its low-profile chiclet-style keys. The screen is clear and crisp, more than enough for text. There are no push notifications, and there hasn’t been a software update available since Bill Clinton was president. As a tool for focusing, in a contemplative manner, on forming thoughts into sentences and paragraphs, it’s nearly perfect, but it is not social, and its multimedia capabilities are laughable.

After I write this, I will transfer the file to my modern laptop, run it through spellcheck and do some editing and proofreading and then I will post it on my blog. It is almost a dead certainty that I wouldn’t have written anything this long if I was working on my modern machine. I just don’t have that kind of self-control.
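The transfer itself has one wrinkle worth mentioning: plain text saved on a classic Mac typically comes over with Mac Roman encoding and bare carriage-return line endings, which confuse modern tools. Here is a rough sketch of the cleanup step I have in mind, assuming the draft gets saved out of that ancient copy of Word as plain text; the file names are just examples.

```python
#!/usr/bin/env python3
"""Normalize a plain-text draft written on a classic Mac for a modern machine.

Sketch only: assumes the file was saved as plain text, encoded as Mac Roman
(the usual classic Mac OS encoding) with bare CR line endings.
"""

import sys


def convert(src_path: str, dest_path: str) -> None:
    # Read raw bytes and decode from Mac Roman.
    with open(src_path, "rb") as src:
        text = src.read().decode("mac_roman")

    # Classic Mac OS ends lines with a bare carriage return; modern tools want LF.
    text = text.replace("\r\n", "\n").replace("\r", "\n")

    # Write back out as UTF-8 with Unix line endings.
    with open(dest_path, "w", encoding="utf-8", newline="\n") as dest:
        dest.write(text)


if __name__ == "__main__":
    # Example usage: python3 convert_draft.py draft_from_powerbook.txt draft_utf8.txt
    convert(sys.argv[1], sys.argv[2])
```

After that, the draft opens cleanly in anything modern and the rest is ordinary editing.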

This experience of using a single-purpose computer to do a focused task has been a sort of revelation. I have found that my writing productivity has jumped so spectacularly with the switch to this older style machine that it caused me to consider taking a similar approach to two other areas of interest: audio production and video editing.

When it comes to audio production, going back to an old computer seems to me like an absurd idea. When I want to record music I just want to focus on the performance. A vocal part, a drum track, a bass line. I don’t want to fuss with storage limits or old technology. So, I considered what I would need. First, I wanted the ability to record at least eight tracks of audio simultaneously with very high sound quality. Second, I wanted to be able to easily transfer a recording to a computer for editing, mixing, and mastering. Third, I wanted to be limited to audio recording, again with no distractions. This led me to purchase the first dedicated recording machine I’ve owned in over 20 years, a 32-track all-in-one digital studio, which was a few hundred bucks, not much more than what I paid for a cassette four-track in the 90’s. I have had a couple of sessions with it and again, it has been an extremely rewarding experience. It’s been many years since I’ve been able to just plug in and hit record. I usually spend 45 minutes getting things set up, I get prompted to install updates to my recording software, and I lose my momentum and the session fizzles out. With a dedicated machine, I have none of that. Unless it breaks, I can just turn it on and work.

After I finish tracking a song, I just take the memory card out, connect it to my computer, transfer the files and edit and mix. My god I’ve missed working like that.

I don’t think I’ll ever go back to the Swiss Army knife.

Video editing will be a bit more of a challenge, but I have a plan for that as well. Shooting video is the easy part. I have a Nikon digital SLR as well as a few other HD camera options sitting around. They aren’t 4K, but that’s OK. I will use a relatively modern computer for editing video (it would be silly not to), but I don’t intend to work on my mobile, connected laptop. No, I am taking a desktop machine, an older Mac, and setting it up with an extra monitor, a copy of Final Cut Pro, and no internet connection unless I choose to plug it in. When I want to cut something together, I will shoot the footage, load it up on that machine, and edit it, away from the digital noise. When it’s ready I will export it and share it with the world. That’s the only thing I will use that computer for, and as long as it can do that task well enough for my needs, that’s all it will do.

What I’m sacrificing in convenience, I’m making up for in focus. That’s the deal, and it’s a fair trade.

The modern internet is soul killing. Modern computers are so infested with it that they make focused creativity challenging, but it’s a solvable problem and I look forward to a year of enjoying computers and their role in creativity again, even if it means I spend less time online. The old internet is gone anyhow, the new one is boring and intrusive, I might as well get back to riding bicycles.

One more thing…

I started this whole post talking about the demise of the old, weird internet and the rise of corporate consumer computing, but I’ve only discussed solutions for restoring my sanity in creative domains. What about the actual internet? Well, they may be less popular than they once were, but many of the old ways of doing things online still technically exist, and new ways are being invented. Blogs are still out there, even if Medium has hijacked them. People are still out there, even if Facebook has turned them all into an endless feed of reality programming. There is a fledgling movement to reclaim the internet and make it personal again. I recently discovered a site called IndieWeb that encourages the development of ways to share things online without Amazon, Facebook, or Google, and I plan to get mindful on that front as well. In the next few months I hope to start running my own web server again, on my home connection, with my own domain, and take control of my online presence, data, and profile. It’s not enough to not be tracked while shopping Amazon for vegan jerky; I want to host and control my own words, images, music, and videos. It will take some work, but that’s nothing new. It was a lot of work to establish NuclearGopher.com and distribute music 20 years ago, and very little has been done in the last decade to make being independent of mega-corporations easier, but it’s work worth doing and work that I know how to do. I hate the internet as I’m currently experiencing it, but I haven’t given up on the idea of being connected to the wide, weird world, no matter what the monetizers and influencers want me to do. I’m looking forward to making this interesting again, and I hope others do the same.
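For what it’s worth, the very first step of running my own web server doesn’t have to be complicated. Here is a bare-bones sketch of a starting point, just a folder of hand-made pages served over HTTP; a real setup would still need a domain pointed at my home connection, port forwarding on the router, and TLS, and the folder name here is only an example.

```python
#!/usr/bin/env python3
"""Serve a folder of hand-written pages over HTTP using only the standard library.

Sketch only: the directory name is an example, and a real self-hosted site
would also need a domain, router port forwarding, and TLS in front of it.
"""

from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

SITE_DIR = "public_html"   # example folder of hand-made HTML pages
PORT = 8080                # forward port 80 on the home router to this


def main():
    handler = partial(SimpleHTTPRequestHandler, directory=SITE_DIR)
    server = HTTPServer(("0.0.0.0", PORT), handler)
    print(f"Serving {SITE_DIR} on port {PORT} - press Ctrl+C to stop")
    server.serve_forever()


if __name__ == "__main__":
    main()
```

It’s not much, but neither was the early web, and that’s rather the point.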