Might as Well Write

Not long ago, a friend of mine pointed out that I seem to enjoy giving advice to smart and/or creative people who need an emotional nudge toward feeling better about doing the things they love. I did a little digging, and maybe he’s right.

I also enjoyed helping student writers when I taught, offering my idiosyncratic advice on how to actually write, sure, but also on how to be the kind of person who can at least endure and maybe even thrive in a creative life.

So I decided to create a new blog called Might As Well Write where, like Cheryl Strayed in her Dear Sugar columns, I accept questions from creative people on subjects ranging from day jobs to detail and answer them with the homespun country wisdom I learned at my pappy’s knee.

Or at least the wisdom I learned by wringing tepid success from a lifetime of possibility.

The title comes from the idea that many creative people seem to have about whether it’s “worth” writing, whether all the time is justified by fame or sales or appreciation. I used to wonder that, too, until I eventually confessed a simple truth to myself:

What the hell else am I doing? Is writing for ninety minutes a night taking time away from drilling clean wells in Bangladesh? Of course not. I might as well write, tinkering and fishing my way to readers, perhaps one at a time.

So Might As Well Write isn’t a blog about finding instant fame or going viral. It isn’t about eleven sure-fire story structures that jack directly into our primitive brains.

It’s about treating our creative selves well so that we can persist and grow little by little into people pursuing something like art.

So drop on by, subscribe to the Twitter feed, and maybe even ask a question!

The Good Father

You know who I don’t talk about enough? This guy.

That’s Larry Hall, my stepfather. He married my mom after my parents divorced, and when I met him I thought he was a dreamy, irresponsible goofball completely unlike my father.

Luckily, I was right. Not so luckily, it took me a while to appreciate that fact.

There were seriously some years in the late Eighties and early Nineties when it was legitimately questionable that I’d grow up to be a decent human being. (It might still be. Ha!) The most glaring model of masculinity so far in my life led me to confuse assholery with strength, and I’m sure there were times when Larry checked his watch and calendar for when I’d move the hell out for college…or prison.

Yet through all of that, he introduced me to the Middle Earth Role Playing game and typed my first serious story for submission to Hitchcock’s and watched the premiere episode of Star Trek: The Next Generation with me. More importantly, he made my mother safe and comfortable for the first time in a quarter century, and though I didn’t know it, he did the same for me.

Now, I’m not going to kid you into thinking you’d want to HIRE Larry to work at your company or government agency or to guard your building as he did briefly later in life. He often seemed drawn to jobs with cool accessories, like badges or one of those pointy bishop hats, but his priority was getting home to talk endlessly with my mother while smoking on the back porch, or to read and nap afterward.

He was kind of a hedonist that way, though an inexpensive one. Which worked out well because they were almost always broke in the way society measures wealth.

He and my mother and younger brother Andrew went through some terrible times financially because, alas, the world doesn’t know what to do with people who work to live instead of the other way around. Yet even through their deprivation (and believe me, it was fucked up), they had this weird fatalistic cheerfulness that said, “Life is totally shit just beyond the glow of our flashlights, but at least we have those.”

When he met my mother while they both worked at a social service agency, he told her he was gay. She reported this to me with whispered drama before he came to visit the first time, like we should hide our most fabulous glitter from him. And yes, he had a gentle voice most of the time and liked fragrances and was better at dancing than any of us.

He might have told her he was gay to avoid the risk of romantic entanglement before he was ready, or maybe he was bisexual. It doesn’t really matter.

He later married Mother, which made him at least a Dianne-a-sexual, and that worked out just fine for both of them.

The last thing any of us needed was a capital-M Man, at least as they usually assert themselves in the world. Larry balanced the masculine and feminine with a wide streak of empathy and perceptiveness that men are encouraged to extinguish.

The most masculine thing I ever saw Larry do was rush out with a gun to investigate an intruder, though unfortunately it was a tiny pearl-handled .22 that might have been better kept in a garter belt than a holster. But let me tell you, he carried that thing outside into the darkness like a New York City cop to protect us, so again…the balance.

I don’t know how he’d feel about me telling these stories about him. I hope he wouldn’t be embarrassed because I’m not making fun of him at all. These are the stories of a hero.

Before he died, Larry said he was proud of the man I’d become, perhaps with the unspoken clause of that sentence being “…despite all the signs.” That meant more to me than any praise I could have gotten from the guy on my birth certificate because Larry knew manhood, all of it, and his path wasn’t narrow.

The night before he married my mother, there was a small bachelor party with his best man, my brother-in-law Marty, and teenaged me. Marty had to drive him home because Larry had gotten a little tipsy, and at one point had to ask whether to turn off at an exit or not.

“Go forward,” Larry slurred. “Never straight.”

That’s manhood. That’s fatherhood. And I wish I could tell him I know that because of him.

Great Uncle Will, What Was the Pandemic Like?

“Well, Merricat, we handled it in a very American way. We started off with a blithe dismissal of the threat followed by a sudden panic, which triggered hoarding and conspiracy theories that many tried to challenge with argument and virtue signaling. Once we got our footing, though, we marketed and profiteered our way to the end until even the CDC got bored with it.”

“Was it really a big deal?”

“It turns out, honey, that 584,000 deaths are a big deal to some kinds of people but not to others. That was a weird lesson we learned when wearing a lightweight cloth was worth storming state capitols to prevent.”

“But we overreacted, right?”

“To all crises, there’s really only the choice to underreact or overreact because nobody knows what perfectly reacting looks like until it’s all over. So in America, we split the difference by overreacting in some ways and grossly underreacting in others.”

“Was the Old Internet involved?”

“Boy, was it! That’s where sneering malcontents competing for meanness points used to live. They liked to overreact to overreactions because they magically knew exactly what we should have done all along from flunking out of biology in community college. They were the kind of people who’d give us shit for swatting a wasp with a 2×4 when that’s all we had.”

“Is criticism bad?”

“Good faith criticism sure isn’t, but even if we’d discovered that masks were useless — which we didn’t — what was the harm of trying them? Oh, that’s right: our deeply entrenched freedom to be profoundly stupid. We were a country back then that confused the LIBERTY to hit ourselves in the face with a brick with the NECESSITY of it.”

“Is the pandemic why we’re living in this abandoned mental asylum?”

“Oh, of course not. The real disaster came a few years later with Pronoun War I and II, followed by the Great Gas Surplus when panicked simpletons blew up all the Shell stations because fuel got suspiciously cheap. Then we had no way to ship food across the country and many had to literally air fry their guns to eat them instead. It was a crazy time.”

“I wish I’d lived to see all that.”

“Well, it certainly showed us who we were. Now finish your sriracha armadillo ear soup and cover your facial vacuole with the cloth mask like I showed you. I think a radioactive tornado of non-biodegradable drinking straws is coming.”

Advice for a Wunderkind

Someone I know has come home for the summer after their first difficult year of college, struggling with anxiety and depression. They started as a junior because of dual-enrollment credits, and they’re attending on a prestigious scholarship in an arcane and technical subject.

Nobody that brilliant would ask me directly for advice, so I’ll write it here to be taken or left.


When I started college thirty years ago this coming August, I had the weird kind of smarts that were hard to distinguish from being a deranged idiot. I think I had a lot of raw mental horsepower that I didn’t know how to control, like putting a rocket engine on a unicycle, and back then it was common for educators to step back and just hope people like me wouldn’t become serial killers.

My cunning vocational scheme was to earn a PhD in English, become a famous writer with an adoring audience of voting age readers, and then run for public office to become the first truly literary President who would save the world with a sense of wit and epic drama.  

Even Manson’s Helter Skelter made more sense than that.

This “genius” showed up with a single milk crate of possessions and his sister had to buy him a toaster because he didn’t think of that.

I felt like a lot of people had high expectations for me, but none higher than my own. I didn’t have the added pressure of your scholarships and a global pandemic thrumming through everything in life like the cosmic microwave background, so I can only imagine how much harder it is for you.

The odd thing about being a wunderkind is that people are either watching you intensely, waiting for you to perform, or they’re not looking at you at all, and it’s hard to tell which is worse.

  • If they’re watching, you have the pressure of living up to their hopes and investment in you.
  • If they’re not watching, you have to decide what YOU find important enough to do, and it’s hard to imagine anything worth doing if it isn’t measured or judged.  

When faced with the risk of disappointing other people or disappointing myself, I always went for the latter. I never took a single creative writing course in college because I couldn’t face the possibility that I’d fail at something that mattered so much to me.

(That may not be an issue for you, but I’m guessing that being thought of as smart in your subject isn’t helping.)

It took me years to learn that the key to getting anything done for people like us with anxiety and depression is to lower the stakes. Even though it feels like we’re failing for not taking our work seriously enough, the truth is that we’re taking it too seriously.

Pressure amplifies anxiety and depression, and though some of it isn’t avoidable, one kind is: the pressure you put on yourself, especially at the start of something new.

Here are some things to know:

  • It’s likely that some experimentation is still required for whatever medication you’re taking. It took about three years for them to get mine straight, and I’ve been taking the same meds for fifteen years now. And while I’m perhaps not a paragon of productivity, I’m getting shit done and not hating myself, so there’s that.
  • It’s also likely that no amount of rational thought will lead you out of the way you’re feeling, so don’t make yourself feel worse for your “failure” to do that.
  • One thing you’ll learn as you get older is to take some of your own thoughts less seriously. We’re smart, but not every thought is gold. Some of what we think comes from bad evolutionary software or failures of neurotransmitters, and one easy way to know the difference is to assume that almost all judgments of your own abilities are deeply skewed.  
  • Meditation can help you separate your thoughts from your “self.” I’m told LSD can do that too, but I’ve never tried it. Meditation is probably safer.  
  • The most powerful word I know for people intimidated by their own work is “tinker.” I don’t try things or plan them or even do the most important things…I tinker with them, playing around to see what works or feels better in each iterating experiment.
  • You’ve gotten where you are quickly, but you actually have a lot more time than you think you do. You don’t have to decide everything right now.
  • If you have discovered that you don’t love the thing you thought you did, that’s perfectly okay. If you’ve discovered that you do still love it but not in this time and place and circumstance, that’s okay, too. If you’ve discovered that you love it so much that you’re afraid to risk screwing it up, then just know that nothing stays screwed up forever.
  • If you woke up tomorrow and said, “Fuck this, I’m going to HVAC school,” every member of your family would have your back.
  • Take this summer and do absolutely nothing of consequence. I’d suggest walking, though. Maybe not in the middle of the night in the hood like I used to. Ask yourself some questions in a journal, too; one that always works for me is, “What the fuck am I doing here, really?”

That’s a lot to take in, and it’s possible your summer is over now that you’ve read it. The upshot is this:

There are people who will care about you no matter how you decide to apply your talent.

Try to be one of them.

My Computer History: End Program

I’ve been writing recently about my Gen X computer experiences from the days when they required more patience and commitment than the convenient appliances of today.

I wrote about how the TI-99/4A was my gateway to computing, and how I learned persistence from the TRS-80 Model III, and how the Commodore 64 inspired my ingenuity, and how the Apple II+ made me an amateur scientist, and how the Apple //c kicked off my writing. I wrote about the weird outliers of the Commodore SX-64 and Amiga 500 that may have been too clever for their own good. And I wrote about how the PS/2 and Macintosh labs in college set up my vacillating loyalties that persist even today.

I’m lucky that I’ve seen both worlds, enjoying the hobbyist tinkering of BASIC and also the vast interconnectivity of the devices we have today. I feel a little like someone who witnessed the Wright Brothers taking flight and then the Moon landing six decades later, with each magical in its time and context.

It was a lot easier to be accidentally dumb back then, with only nearby resources to learn from. If you lived somewhere with a lousy library or culture of idiocy, you could have huge gaps in your knowledge that were hard to even know about, much less fix. I learned to shave from a high school friend who described it over the phone, but it took my stepfather to tell me to use warm water instead of cold. Now I could learn that on YouTube.

You felt weirder then, too, with only the sample size of your town or neighborhood to go by. There weren’t many alternative models for how to be, and most of them came down to stark divisions instead of gradations: male or female, white or not, American or not, rich or poor, able or “handicapped,” straight or queer, dreamer or worker, city or country, north or south, bookish or mechanical. When a deaf boy joined my second-grade class, I thought his hearing aid was a Walkman and he was a badass rebel who refused to take it off.

It’s harder to be that dumb today, but some people are making their best go of it.

Now, with access to all the knowledge on the planet, you have to work hard not to know things – to choose only a few sources of information that reinforce your perspective and filter out others. But humans are humans and our brains take a lot of energy, so it’s natural that we fall into the shortcuts of prejudice if we don’t fight against them.

Those old computers taught me that ideas are malleable, refined through a process of active experimentation until they “work.” That hacking mentality has been the cornerstone of my writing practice as well as my perception of the world as a giant experiment ever in refinement.

What I learned from years of SYNTAX ERRORS is that life can be debugged and improved, though perhaps never perfected. There’s only “works” and “works better” for more people.  

It’s useful to know that even if “answers” are far easier to search for these days, we still have to do the experimenting ourselves.

Who’s ready for some programming?

My Computer History: Oh, Yeah, the Consoles

[I’ve been on a nostalgia trip lately thinking about the computers that influenced me growing up. They were a perfect metaphor for our latch-key generation: “Here’s a device with limited instructions. Good luck!” I know they changed the way I think, and this week, I’ll be blogging about the early computers that influenced me.]

Astute readers may have noticed that I haven’t yet mentioned any of the video game consoles that influenced me in my Gen X childhood, and there’s a very good reason for that.

If you were to engineer a torture device for people with untreated anxiety disorder, you’d invent “games” that hint at the possibility of winning by skill but endlessly rob it from you with no save states, no endings, terrible controls, and buggy programming.

I mean, this is what we were working with:

And this:

Did I have fun with my Atari 2600 as a kid? I did. But starting there, I probably lost years of mileage on my heart and lungs yelling at shitty console games with only a few shining stars to keep my hope alive that the next one would be better.

I can list the few video games that, through a combination of design and story, may actually have contributed something to my life. Note how many of them were on consoles.

  • Eamon (1980, Apple II), which was a fantasy text-adventure you could customize.
  • Raiders of the Lost Ark (1982, Atari 2600), which had a story and an ending.
  • Taipan (1982, Apple II), which was great for trading commodities and becoming a drug dealer before Grand Theft Auto made it cool.
  • Lords of Conquest (1986, Commodore 64), which was a Risk-like game that I enjoyed playing with Norman.
  • The Legend of Zelda (1986, NES), which had a save game and a storyline plus lots of secrets to discover.
  • Wasteland (1988, Apple II/Commodore 64), which had an engrossing storyline and witty writing.
  • Wing Commander (1990, PC), which had a great storyline and fun flight sim physics.
  • X-Wing (1993)/TIE Fighter (1994, PC), both of which were deeply story-driven with great flight-sim features.
  • Tomb Raider (1996, PlayStation), which barely makes it onto this list because the archaeological visuals only SLIGHTLY outweigh the frustrating gameplay.
  • Jedi Knight/Mysteries of the Sith (1997, PC), again with good stories.
  • Outlaws (1997, PC), again with a great story and wit.
  • Metal Gear Solid (1998, PlayStation), frustrating as fuck but a great story.
  • Half-Life (1998, PC)/Half-Life 2 (2004, PC), both featuring amazing game play and storytelling as well as multiplayer fun.
  • Neverwinter Nights (2002, PC), with a great storyline and game mechanics (though the graphics were odd).
  • Knights of the Old Republic (2003, PC), with one of the best storylines in a game ever.
  • Jedi Outcast (2002, PC)/Jedi Academy (2003, PC), also with great storytelling, level design, and lightsaber physics.
  • Lord of the Rings Online (2007, PC), which may be one of my favorites of all time because of the gorgeous accuracy and absorption of the world building, as well as the enjoyment of playing with friends.
  • Portal (2007, PC) /Portal 2 (2011, PC), which are both arch and witty games with mind-bending physics puzzles.
  • Red Dead Redemption (2010, XBOX 360), which had an engaging story and a lot of fun territory to explore.
  • Borderlands 2 (2012, PC), which tells an amazing story with some of the best characters I’ve seen in a game, including THE best villain.
  • Pillars of Eternity (2015, PC), a beautiful distillation of everything that SHOULD have been good about Baldur’s Gate and Icewind Dale.

Also, a few runners-up:

  • Far Cry 3 and Far Cry 5, which are both gorgeous games with interesting but very problematic stories. I feel pretty icky through much of Far Cry 3, and the ending of Far Cry 5 pisses me off.
  • River Raid and Frogger on the Atari 2600 are sentimental favorites because my mother loved them so much, sitting down after dinner on the floor with a cigarette and playing them for hours.
  • Various generations of shooters (Medal of Honor, Call of Duty, Rainbow Six) were fun to play with friends until the Adderall-twitchy ten-year-olds took over.

I freely confess that my exposure to video games has been spotty and my needs are unusual: I like tinkering and exploring and fighting if it’s relatively easy. I’m not looking for a tooth-and-nail struggle in my leisure hours because that’s what real life is for.

If you’re remembering the consoles of the 80s with an abiding fondness, here’s my challenge: pick up one of the new remakes of those consoles or an emulator, and time how long you enjoy playing it.

On the whole, I’m a little like a high-functioning alcoholic. I know that I’ve lost countless hours to video games that I didn’t really enjoy while I was avoiding other things. But the good ones…they were pretty good.

My Computer History: IBM PS/2 vs. Macintosh IIsi

[I’ve been on a nostalgia trip lately thinking about the computers that influenced me growing up. They were a perfect metaphor for our latch-key generation: “Here’s a device with limited instructions. Good luck!” I know they changed the way I think, and this week, I’ll be blogging about the early computers that influenced me.]

In this entry, I’m off to college with two computers that dueled for my heart.

IBM PS/2 Model 70

Released: April 1987 – April 1993

Specs: 6MB RAM, 386DX 16 MHz CPU

Apple Macintosh IIsi

Released: October 1990 – March 1993

Specs: 17MB RAM, 68030 20 MHz CPU

In Fall of 1991, I started college at the University of Florida.

That summer, I’d sold almost all of my vintage computers (the Commodore 64, the TI-99/4A, the VIC-20, and the Apple II+) for spending money.

(Funny story, that: I called in sick to my boss at the inventory service so I could go to a Don Henley concert, and he growled that I didn’t have to show up. When I called him back a week or so later, he told me he’d fired me. That led to one of the best summers of my life, and those computers were a cheap price to pay.)

So when I arrived at UF, I was computerless (though I brought a giant box of Apple disks in case, I don’t know, I bumped into somebody who still had one).

Me in the dorm, with the giant beige box of disks on the second shelf.

That meant I had to do all of my writing at the CSE (Computer Science Engineering) building at UF, where there was a huge ground floor computer lab with rows and rows of computers.

  • About 40% were VAX minicomputer dumb terminals where you could write programs for class and browse this weird global network through something called Gopher to argue on rec.arts.nerdy about whether the Enterprise could defeat a Star Destroyer.
  • About 55% were IBM PS/2 Model 70s, almost always in full use because students seemed to know how to use them better.
  • About 5% were Macintosh IIsi machines, which sat off to the side or in weird secret labs around campus, perfect for someone like me who wanted to avoid crowds.

There were a few basic phases of my writing at UF:

  • Letters by hand, sent to woo a young woman.
  • Papers written on the PS/2 machines in WordPerfect 5.1 in Courier New font (because that’s all that would print), until I figured out that the Macintoshes were almost always available.
  • Stories written on the Macintosh IIsi, deeply emo and also meant to woo the same young woman.
  • Papers written on the Macintosh IIsi, all in Avant Garde font.
  • Stories (deeply influenced by Lovecraft and Bradbury) written by hand in libraries while waiting for the young woman to finish studying her endless mathematics and statistics.
  • Stories written on the PS/2 or an inherited 486 SX after “discovering” in the Writer’s Market guide that you could send them to magazines you’d never heard of in case, you know, they wanted them.

I never took a single creative writing class at UF, mostly because I was terrified to be told that I sucked. I managed to write two works of meta-fiction in lieu of research papers for a couple of classes, though that was hardly playing the varsity team; those professors were probably relieved not to read turgid declarative interpretations of 17th century drama.

Though even my turgid declarative interpretive papers tried to be entertaining, at least.

In one of my classes, a student seated in front of me noticed I was line-editing one of my stories and asked if she could read it. That started a semester of us handing pages back and forth, and the result became a story called “The Trespasser,” which appeared eight years later in Cemetery Dance.

(Cool story: I was a Republican back then, largely as an act of deeply repressed rebellion, and I was wearing a Jeb! t-shirt on campus one day when I bumped into my editing friend. Her face seemed to sharpen into a cone of utter revulsion, and she never spoke to me again. Maybe I deserved it.)

In 1994, before I graduated, I got “serious” about writing, which unfortunately meant reading a lot of Writer’s Digest books, studying psychology for characterization, and outlining a Grand Unified Theory of how I’d write my fiction. Once I had all of that in hand, I’d be ready to start.

If I could go back and give myself some writing advice, I’d tell myself these things:

  • More Irving and Jackson and King than Lovecraft, trust me. Make the situations and people abominable, not the prose.
  • Photocopy “The Body” out of Different Seasons and type it in word-for-word. Notice how scenes and paragraphs begin and end, how characters express their reactions and feelings, and how much description is required to set up a place.
  • There’s no grand theory of writing for you or anyone else. You keep typing the next entertaining thing (either a cool experience or some conflict or both) until it sounds good.
  • The model should be journalism, not literature: getting a lot of words down fast and not focusing so much on all the things critics impose afterward like symbolism.
  • It’s ugly, but WordPerfect 5.1 on an IBM computer keeps up with your typing better than the Macintosh, and the less that stands between you and the words, the better.

The Macintosh might have won, but there was no way I could afford one of my own, and the writing was pretty much on the wall for IBMs being in every office.

The advent of these IBM compatibles was more or less the end of computing as a hobby for me. After this, it was some programming at work in Visual Basic or .NET, some assembly and repair of computers, and a whole bunch of angry video gaming.

They became appliances after that, which I rather regret.

My Computer History: Commodore SX-64 and Amiga 500

[I’ve been on a nostalgia trip lately thinking about the computers that influenced me growing up. They were a perfect metaphor for our latch-key generation: “Here’s a device with limited instructions. Good luck!” I know they changed the way I think, and this week, I’ll be blogging about the early computers that influenced me.]

In this entry, I’ll cover two computers that had some influence on me, though it was brief.

Commodore SX-64

Released: 1984 – 1986

Specs: 64KB RAM, 6510 1.02 MHz CPU

My friend Norman introduced me to a lot of geeky and fannish things, but he was especially an early adopter of computers. It was like he’d been waiting his whole life to enjoy something so logical and structured and safe. He owned a Commodore SX-64.  

The SX-64 was the “luggable” version of the Commodore 64, sporting a 23-pound weight to challenge your back and a five-inch screen to challenge your eyesight. That must have been fun for spreadsheet users; it was bad enough for gamers like Norman and me.

In the afternoons following school, I’d walk over to his house and we’d play either Autoduel or Lords of Conquest on the machine, setting it on its back with the screen facing upward so we could squint at it from above. Norman’s mother would offer us off-brand soda and oranges, and those visits were a great respite in the early days after my parents’ divorce.

Norman and his family were Japanese-American, and his folks had been interned in the camps out west during World War II. His mother Mary wrote a speech about her experiences for Toastmasters that she shared with me when she found out I wanted to be a writer, and it was very well-written.

I sometimes got the sense that they thought I was more likely to share that story than Norman was, though I never have because I feel like it was only loaned to me. Still, I was honored she thought of me in that way.

It was important to me at that time, too, to see other kinds of people leading other kinds of lives because my notion of “normal” was so skewed.

Not long before I went off to college, Norman sold the SX-64 to me because, small screen or not, it had a disk drive and my regular Commodore didn’t. I somehow had it in my head that I could write college papers on the thing, but I learned quickly it was too obsolete even with an external monitor.

I did play one of my favorite video games of all time on it, though: Wasteland. For a (relatively) primitive computer game, it had a great story with interesting characters. One time I was playing it during a difficult high school tumult of some kind, and when I rescued a character from the brink of death, he said, “Let’s go kick some ass!” I was taken aback by that, but it filled me with a strange sense of hope.

I guess it just hit me the right way at the right time.

Commodore Amiga 500

Released: 1987 – 1991

Specs: 512KB RAM, 68000 7.16 MHz CPU

The reason Norman could sell me his SX-64 was that he upgraded to an Amiga 500. I used it only a few times, but it was an astounding machine with the first real GUI I’d ever seen, as well as the first 3.5” disk drive and mouse I’d ever used.

I remember looking at the mouse and thinking how counter-intuitive it was compared to a joystick, and I thought it would never catch on. Sixteen-year-old Will thought we’d be working in Photoshop with a joystick in 2021. Also, he thought he’d be the President of the United States, so he wasn’t a keen predictor of the future.

It was on Norman’s Amiga that I first played Lemmings (meh) and Marble Madness, which I loved.

What was significant to me about the Amiga was that it was a computer that I could tell was already doomed. IBM clones were invading schools and companies, and the Amiga seemed like the last real hobbyist machine that was just odd enough to be enjoyed by a limited group of people. It would live a little longer as the source of many animated TV titles you remember from the late 80s and early 90s, helped by a device called the Video Toaster.

I guess the Amiga was my first introduction to the principle that being amazing isn’t always enough.

My Computer History: Apple //c

[I’ve been on a nostalgia trip lately thinking about the computers that influenced me growing up. They were a perfect metaphor for our latch-key generation: “Here’s a device with limited instructions. Good luck!” I know they changed the way I think, and this week, I’ll be blogging about the early computers that influenced me.]

Apple //c

Released: April 1984 – August 1988

Specs: 128KB RAM, 65C02 CPU

In my junior year of high school, I started to write more papers for English and other classes, and neither my handwriting nor a typewriter would cut it anymore. I had a Brother daisy wheel printer for my Apple II+, but it sounded like small arms fire crackling across Beirut whenever it cranked something out on its tractor feed paper.

Plus AppleWorks, one of the early word processing programs, didn’t work on my Apple II+.

Fortunately for me, our high school’s library had an Apple //c lurking in one corner by the exit doors, and the librarian (Ms. Newnan) was kind enough to let me skip my lunches and hang out there instead of the noisy cafeteria where I’d been assigned to be.

I spent that time doing homework and papers at first as part of my renewed interest in doing well enough in school to get the hell out of that town, but then it slowly dawned on me.

I could write stories on this thing.

For most of my childhood, I’d told people that I wanted to be a movie director when I grew up, thinking that was the main way that I could make stories come alive for an audience. From what I’d read about George Lucas and Steven Spielberg, I assumed that they just showed up with a story they wanted to tell and then ordered a crew and actors around to do it.

When I wrote and performed stories for my class as a kid, I usually wrote them as scripts that only I could decipher and perform, sometimes improvising on the fly based on their reactions.

Once I got to high school, I’d read enough to recognize that real stories had paragraphs and descriptions and dialogue. My interests became more literary, and my answer to the slightly-changed question (“What do you plan to do for a career?”) metamorphosed into becoming an easily-employable college professor who wrote fiction in his copious free time.

(Hey, we didn’t have the Internet then to disabuse us of our delusions. What little I saw of my guidance counselor, she could only predict that I’d end up in college or prison.)

Access to the library’s Apple //c meant that I could type my work and correct it, and that led to my first attempts at stories with beginnings, middles, and ends. Most of the early ones were fairy tale-like, but then they grew more and more solid as time went on.

(It’s interesting to me that my own storytelling evolution followed that of humans: ad hoc performances to drama to simple myths to more complex stories.)

Around this time, the show Twin Peaks had already aired its first season, and I was fascinated by the idea of an FBI agent like Dale Cooper for whom emotion was an important part of his detective work. As a proto-emo kid myself, fighting evil while looking dramatically cool to girls appealed to me deeply.

So when we were assigned to write our own Canterbury Tale after reading them for English class, mine was “The F.B.I. Agent’s Tale,” not about Dale Cooper but about an agent a little more like me at the time: creative and heartsick, turning his back on the ordinary world to fight evil alone.

I packed a lot into that story, including some indecision about my future, my split interests between the artistic and the practical, and my desire to do good in the world without having to be around people too much. It had a sad ending, of course, which I perceived as the most likely outcome for me given my high school romances so far.

It was not even close to a good story.

But it was a whole story, and I’d written it with passion from my own (bizarrely perceived) experience. I sat down like a professional, typed in the words, rewrote them as necessary to sound better and more convincing, and birthed a blobby and emotional Thing to share with the world.

It took a lot more writing for those lessons to stick, especially the one about steering toward emotion and mutated personal experience instead of jokes and gags.

I’m not sure any of that could have happened as easily if I’d written by hand or on a typewriter. I needed the friendly (to me, after all my programming) feel of a computer keyboard to invite me in, and what I’d learned about development taught me how to shape things over and over until they felt right.

I spent years afterward trying to learn the “right” way to write a story with scene and structure, character and motivation, plot points and arcs…and really, I more or less had the basic technique down in 1991. All that was left was increasing my practice and improving my taste so I knew when not to stop fixing.

The first real story of my career started because one librarian perceived that I needed a quiet place and an Apple //c more than to be in the cafeteria. I’m grateful she did.

My Computer History: Apple II+

[I’ve been on a nostalgia trip lately thinking about the computers that influenced me growing up. They were a perfect metaphor for our latch-key generation: “Here’s a device with limited instructions. Good luck!” I know they changed the way I think, and this week, I’ll be blogging about the early computers that influenced me.]

Apple ][+

Released: June 1979 – December 1982

Specs: 64KB RAM, 6502 CPU

Wait, are we regressing in time? No, just in technology.

Nice as the Commodore 64 was, all I had for it was the tape drive, and for some odd reason, Commodore 64 software and hardware weren’t all that common near my small town. It was much easier to get software for the computers my school used, and those were Apple IIs.

Luckily for me, a college student at UF ran out of pot money midway through the semester, and my sister bought his Apple II+ for me in 1988. The good news about Apple computers at the time was that you could expand them, and mine ended up with 64KB of RAM, an 80-column card, dual disk drives, a printer, and a speech synthesizer.

Plus I had access through friends to a huge library of software and games.

My favorites for the Apple II were:

There were a few others, but when I played games on the Apple, it was mostly these.

Much like the Millennium Falcon, however, my Apple II had some issues that required constant calibration and tinkering, especially with the disk drives…so a good part of my time using it involved having the case open or off altogether.

By God, in my day, we EARNED our video games!

The Planiverse, 2D Worlds, and 3D Lives

Probably the most important influence of the Apple II came from programming on it. In 1989, my friend William Simmons introduced me to a book called The Planiverse by A.K. Dewdney that curved the trajectory of my life.

The Planiverse tells the fictional story of Dewdney’s computer science class accidentally stumbling upon a two-dimensional alien world while programming a simulation, and it combines theorizing about 2D physics, chemistry, engineering, and biology with the spiritual quest of its main character, YNDRD. The students watch YNDRD as he makes his way across his continent on a pilgrimage, and what he finds was one of my first encounters with the idea of finding the truth through the intersection of many perspectives.

It was the kind of book that cracks open your skull at just the right time in your life and changes the way you see the world forever.

Part technical manual and part philosophical exploration, the book has influenced me in ways I didn’t realize until writing this paragraph: stories told as fake non-fiction, technical writing, explorations of the soul taken through technology, epic journeys that end up back at the self.

What I took from it in 1989 was the desire to simulate a world on my Apple II. I didn’t think I’d contact a two-dimensional world, but I was fascinated by the idea of creating a world and defining its rules.

So I worked on a program called 2DWORLD in which square block animals explore a digital savannah, grazing on stationary “plants” or chasing “prey” if they come within a detectable range.   

Each animal’s “instinct” was basically this:

IF FOOD is visible, MOVE toward it one square, ELSE move in a random direction. IF FOOD not found after five iterations, DIE.

This was not a visually sophisticated simulation, just a dark green screen with colored blocks twitching after one another on it. But I could experiment with putting in fewer plants or more plants, fewer predators or more predators, and see how long the simulation would sustain itself.
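For the curious, here’s roughly what that instinct amounts to in code. The original BASIC is long gone, so this is a minimal modern Python sketch of the idea; the grid size, vision range, and every name in it are my stand-ins, not a reconstruction of the actual program.

import random

GRID = 20          # square savannah, GRID x GRID cells (my stand-in)
VISION = 3         # how far an animal can "see" food (my stand-in)
STARVE_AFTER = 5   # the original rule: no food after five iterations, die

def step(animal, plants):
    """Move one animal one square according to the instinct rule."""
    x, y = animal["pos"]
    visible = [p for p in plants
               if abs(p[0] - x) <= VISION and abs(p[1] - y) <= VISION]
    if visible:
        # IF FOOD is visible, MOVE toward it one square...
        tx, ty = min(visible, key=lambda p: abs(p[0] - x) + abs(p[1] - y))
        x += (tx > x) - (tx < x)
        y += (ty > y) - (ty < y)
    else:
        # ...ELSE move in a random direction.
        dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        x = max(0, min(GRID - 1, x + dx))
        y = max(0, min(GRID - 1, y + dy))
    animal["pos"] = (x, y)
    if (x, y) in plants:
        plants.discard((x, y))    # graze the plant, reset the hunger clock
        animal["hunger"] = 0
    else:
        animal["hunger"] += 1     # IF FOOD not found after five iterations, DIE

plants = {(random.randrange(GRID), random.randrange(GRID)) for _ in range(30)}
herd = [{"pos": (random.randrange(GRID), random.randrange(GRID)), "hunger": 0}
        for _ in range(5)]

ticks = 0
while herd and plants:
    for animal in herd:
        step(animal, plants)
    herd = [a for a in herd if a["hunger"] < STARVE_AFTER]
    ticks += 1
print(f"The savannah sustained itself for {ticks} ticks.")

Vary the number of plants or animals in those two starting lines and the final tick count swings around, which is exactly the kind of experimenting I was doing with the original.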

When I visited my sister and her husband over the summer at UF, I spent a lot of time in Marston Science Library looking up the theory behind computer simulations, and I wrote it all up in a paper that I presented with my science fair project.

I got a C (eventually) because, as my non-computer-savvy science teacher put it, “This could be Pac-Man for all I know.” That would have been far more disheartening if I hadn’t won the Computers division of the science fair with that project.

What did I learn there?

  • Experts aren’t always experts.
  • Even science can be done out of love and joy, one iteration at a time, following the next logical hypothesis until you discover something wondrous.

Fortunately, I have a positive attitude about it.