So I lost my job last week.

I have (had) two jobs. The first is the one I usually blog about, the one where I help build a democratic school that addresses the development of the whole child, including the development of his or her or ze’s social-emotional skills. It’s a real gas.

The second job, the one I lost last week, is the one where I provide high-level guidance to college students on the craft of creative writing. The college where I’ve taught for the past eight years faces a crisis-level enrollment challenge and, as an adjunct in the humanities, I’ve just felt, in my wallet, the force of that challenge.

It’s a great college. It not only does exactly what it says it does, but it does so with real passion and force. The professors generally walk the walk, and the staff members I’ve interacted with have all been genuinely kind and helpful. The entire philosophy of the college is that we are all members of various communities, and it’s imperative that we act in a knowledgeable and deliberate way to improve the lives of all the members of those communities and not just ourselves. The people I’ve met and worked with at the college strive to do exactly that.

Unfortunately, this will be the first semester in a very long time when I can’t count myself among them. And that disappoints me.

Luckily, the people I just described are not just my colleagues; they’re also my neighbors and my friends, so I can continue to count myself as a person in their wider communities.

There’s another reason I am disappointed, though. Two more reasons, actually. The second is that, as a professionally unpublished writer, the only way I could rationalize my expensive investment in my M.F.A. was by pointing to the fact that an M.F.A. is the minimum requirement to become a writing professor, so if I wasn’t able to pay back the investment through publishing, I’d be able to do so through teaching. But now I don’t even have that. So yeah, that’s a disappointment.

The third reason is that, for the first time in eight years, I was going to do a wholesale strip-down of my bread-and-butter course: an introduction to creative writing aimed at non-major students to get them interested in the major.

Teaching at the college level is different from teaching at the high school level (and incredibly different from the middle school level). The teaching part of it is the same — be engaging, be knowledgeable enough in the topic to inspire a sense of curiosity, and be authentic in your desire for the students to ask you questions you don’t know the answer to — but the behind-the-scenes goals are different.

In high school (and even more so in middle school), students don’t have the right to ignore you. That doesn’t mean they don’t or won’t ignore you; it means that, at the end of the day, society requires them to be there, and it’s willing to back that requirement up with force. Put simply, in high school (and even more so in middle school) students have a lot less choice.

At the college level — primarily in the first two years, when most students still haven’t invested enough time or money to feel compelled by responsibility — every student you meet must be coaxed to move on to the next level.

There is an institutional purpose to this: 30% of college students drop out after their first year, and only 50% of students graduate within a reasonable time. With those as statistical truths, all members of the college — including the faculty — must do their best to help students want to stay in school.

But there is also a departmental-level impetus. As a teacher not only in the humanities, but also in one of the softest of soft subjects, I have to include within my responsibilities the need to attract students to my subject matter. I must keep the funnel flowing from the 2000-level introductory course to the 3000-level courses where the full-time faculty are mostly employed (I’ve taught 3000-level courses in the past, but that was before the economic crisis of 2009 had a dramatic effect on student enrollments in private liberal-arts-based institutions). While education is always the primary responsibility, this need to sell the major is also always there.

This is not a critique. I live in the real world and would have it no other way: at every level, at every point, an artist must sing for her supper. I get it, and I love it. That is not the point here (but for more on that point, read this essay by an anonymous adjunct instructor).

The point is that, for the first time in eight years, I was about to launch a brand new product, and now I’m being told that I won’t even be given the chance.

I’m not taking it personally because no one has yet told me that I should. I know the college’s financial situation, and I understand that, as an adjunct, I am the definition of low-hanging fruit, so I have no hard feelings at all.

But I really wanted to give this new course a try.

It is still an introduction to creative writing, but instead of breaking the semester down by genre — six weeks of fiction, five weeks of poetry, and three to four weeks of screenwriting or creative nonfiction (depending on the semester) — I was going to blend them all together and teach not a genre of creative writing but creative writing itself.

From a business perspective, the goal of the course is to convince non-majors to continue doing work in the major — i.e., to convince new customers to become repeat customers. For the past eight years, my sales pitch has been akin to an analysis. I wanted to expose the students to ideas and notions about creative writing that they hadn’t yet heard before, to show them, in some way, what it means to take the craft of writing seriously.

My competitors were the high schools. I had to be able to take them deeper into the concept of creative writing than anything they’d done in high school, to make them feel as if they were, in some way, being led behind the curtain.

But I also couldn’t take them so deep that they’d feel like they’d seen it all. The end of the semester had to leave them wanting more.

This upcoming semester, though, I wanted to change it up. Instead of doing an analysis of creative writing, I was going to attempt some kind of synthesis. Instead of digging deep into the concept, I was going to dance them atop it, spin them from one place to another with enough joy and verve to trip the light fantastic, leaving them, at the end of the semester, with an artist’s sense of the possibilities, not of what goes on behind the curtain, but of what can be accomplished on stage.

I’m still not 100% sure how I was going to do it. The semester starts in about four or five weeks and my plan was to work on it during the first full week of August when I take a writer’s retreat in my own home (my wife and daughter are visiting my in-laws while I stay home with no obligation but to write, and to write in a serious and purposive way…and, I suppose, to feed and bathe myself as well).

The college course wasn’t the only project I was planning to work on next week, but it was one of them, and I was very much looking forward to it.

I had a fantasy where, instead of writing a syllabus for the course, I would write a kind of pamphlet, a short and to-the-point kind of textbook whose style would blend Strunk & White’s with Wittgenstein’s to create a style all my own.

In the eight years I’ve been teaching the course, I’ve yet to use a textbook. I figured maybe it was time to write my own.

While I still might attempt it next week, I don’t have the pressure of a deadline now. And that disappoints me too.

Oh well. Here’s hoping the course comes back to life in the Spring.

Crazy Like An Atheologist

Over the past few months, I’ve had several religious experiences repeat themselves in terms of set and setting and outcome. Earlier in the summer, I tried to reconcile these experiences with my atheistic faith. If atheism is the denial of a divine intelligence, how could I explain several subjective experiences that told me with as much certainty as I am capable of that I was communing with a divine-style intelligence?

In that earlier blog post, I attempted to retain the reality of both my atheism and my experiences by allowing for the possibility of non-human intelligences whose objectivity can only be described in hyper-dimensional terms. Hyper-dimensional does not mean divine — it just means different.

In this post, I’d like to examine the question of whether I am crazy.

I am a relatively smart human being. Billions of people are smarter than me, but billions of people are not. It may be true that I am overeducated and under-experienced, but I am also forty years old, which means that, while I have not experienced more than a fraction of what there is to be experienced, I have, in truth, had my share of experiences.

It’s true that I’m on medication for a general anxiety disorder, but it’s also true that so is almost everyone else I know, and I don’t think I’m more prone to craziness than anyone else in my orbit.

Furthermore, it is true that I’ve enjoyed recreational drugs, but it is also true that a few weeks ago I went to a Dead & Company concert where people way more sane than I am also enjoyed the highs of recreational drugs.

All of which is to say, I don’t think I am crazy.

The friends I’ve shared my story with don’t seem to think I am crazy either. I’m not suggesting that they believe I communed with a divine-style intelligence, but they signaled their willingness to entertain the possibility that these experiences actually happened to me. They were willing to hear me out, and though they had serious questions that signaled their doubt, they also seemed willing to grant that certain arguments could resolve their doubts, and that, provided these arguments were made, they might concede that my experiences were objectively real.

In other words, I don’t think my friends think I’m crazy either. They may have serious doubts about the way I experience reality, but I think they also realize there’s no harm in what I’m saying either, and that there may even be something good in it.

I’ve read a lot about consciousness and the brain. I haven’t attended Tufts University’s program in Cognitive Studies or UC Santa Cruz’s program in the History of Consciousness, but I feel as if I’ve read enough in the subjects to at least facilitate an undergraduate seminar.

Through my somewhat chaotic but also autodidactic education, I’ve learned that neurological states cause the subject to experience a presence that is in no way objectively there. Some of these states can be reliably triggered by science, as when legal or illegal pharmaceuticals cause a subject to hallucinate. Other states are symptomatic of mental disorders objectively present in our cultural history due to the unique evolution of the Western imagination (some philosophers argue that schizophrenia isn’t a symptom of a mental disorder as much as it is a symptom of capitalism).

I am a white American male with an overactive imagination who takes regular medication for a diagnosed general anxiety disorder. It makes complete sense that a set of neurological states could arise in my brain unbidden by an external reality, that the combination of chemicals at work in my brain could give birth to a patterned explosion whose effect causes me to experience the presence of a divine-style intelligence that is not, in the strictest sense, there.

But I want to consider the possibility — the possibility — that this same neurological state was not the effect of the chemical chaos taking place in my brain, but rather the effect of an external force pushing itself into communion with me, just as a telephone’s ring pushes sound waves into your ear, which pushes impulses into your brain, which causes a neurological state that signals to the subject of your brain that someone out there wants to talk to you.

I’m not saying someone called me. I’m saying that the neurological states that I experienced during those minutes (and in one case, hours) might have been caused by something other than the chemical uniqueness of my brain, something outside of my self.

In a sense, I’m talking about the fundamental nature of our reality. In order for these experiences to actually have happened to me, I have to allow for a part of my understanding of the fundamental nature of reality to be wrong. And anyone who knows me knows I do not like to be wrong.

Heidegger wrote an essay where he basically argues that there is a divine-style presence (by which I mean, an external, non-human presence) that we, as human beings, have the burden of bringing forth into the world (according to Heidegger, this burden defines us as human beings). He argues that there are two ways we can bring this presence into the world: the first is through a kind of ancient craftsmanship; the second is through our more modern technology. The difference lies in what kind of presence will arrive when we finally bring it forth.

According to Heidegger, the ancient sense of craftsmanship invites a presence into the world through a mode of respect and humility. Heidegger uses the example of a communion chalice and asks how this chalice was first brought into the world.

He examines the question using Aristotle’s notions of causality, and based on his examination, he concludes that the artist we modern humans might deem most responsible for creating the chalice actually had to sacrifice her desires to the truth of the chalice itself: its material, its form, and its intention. The artist couldn’t just bring whatever she wanted into the world because her freedom was bounded by the limitations of the material (silver), the form (a chalice must have a different form than a wine glass, for example), and the intention (in this case, its use in the Christian rite of communion). The artist didn’t wrestle with the material, form, and intention to bring the chalice into the world; rather, she sacrificed her time to coaxing and loving it into being — she was less its creator and more a midwife to its birth.

For Heidegger, as for the Greeks, reality exists in hyper-dimensions. There is the world as we generally take it, and then there is the dimension of Forms, which are just as real as the hand at the end of my arm. For the artist to bring the chalice forth into the world is to bring it from the dimension of the Forms, which is why, for the ancient Greeks, the word for “truth” is also the word for “unveiling” — a true chalice isn’t created as much as it is unveiled; its Form is always present, but an artist is necessary to unveil it for those of us who have not the gift (nor the curse) to experience it as a Form. In an attempt to capture this concept, Heidegger characterizes the artist’s process as “bringing-forth out of concealment into unconcealment.”

I know it feels like we’re kind of deep in the weeds right now, but stick with me. I promise: we’re going someplace good.

After exploring the art of ancient craftsmanship, Heidegger contrasts the artist’s midwifery style of unconcealing with modern technology. Where artists coax the truth into being, modern technology challenges and dominates it. It exploits and exhausts the resources that feed it, and in the process, it destroys the truth rather than bringing it to light.

For an example, Heidegger uses the Rhine River. When German poets (i.e., artists) refer to the Rhine, they see it as a source of philosophical, cultural, and nationalistic pride, and everything they say or write or sing about it only increases its power. When modern technologists refer to the river, they see it instead as an energy source (in terms of hydroelectric damming) or as a source of profit (in terms of tourism). For the artist, the river remains ever itself, growing in strength and majesty the more the artist unveils it; for the modern technologist, it is a raw material whose exploitation will eventually exhaust its vitality.

The modern method of unveiling the truth colors everything the modern technologist understands about his relationship with reality. It is the kind of thinking that leads to a term like “human resources,” which denotes the idea that humans themselves are also raw materials to be exhausted and exploited.

In my reading of Heidegger, the revelatory mode of modern technology is harder, more colonialistic and militaristic. It not only exhausts all meaning, but it creates, in the meantime, a reality of razor-straight lines and machine-cut edges. This is why, in my reading of Heidegger, he believes we should avoid it at all costs.

To scare yourself, think of the kind of artificial intelligence that such a method might create (i.e., unconceal). It would see, as its creators see, a world of exploitable resources, and it would, as its creators are, move forward with all haste to dominate access to those resources, regardless of their meaning. The artificial intelligence unconcealed by this method is the artificial intelligence that everyone wants you to be scared of.

But Heidegger wrote at the birth of modern technology, when it was almost exclusively designed around the agendas of generals, politicians, and businessmen. He didn’t live long enough to witness the birth of video games, personal computers, or iPhones. He didn’t understand that the Romantics themselves would grow to love technology or that human beings would dedicate themselves to the poetry of code (Heidegger reminds us that the Greek term for the artist’s method of unconcealment is poiesis, which is the root of our English term, poetry). Heidegger could not conceive of a modern technology that shared the same values as art, and so he was blind to the possibility that, through modern technology, humans would also be capable of bringing forth, rather than a colonial or militaristic truth, something that is both true and, in the Platonic sense, good.

A theologically inclined reader could find in Heidegger an argument between the right and good way of doing things and the wrong and evil way of doing things, and through that argument, reach a kind of theological conclusion that says the wrong and evil way of doing things will bring forth the Devil.

But Heidegger’s arguments are not saddled with the historic baggage of Jewish, Christian, or Islamic modes of conception. Rather, he founds his thoughts in the language of the Greeks and interprets them through his native German. He implies a divine-style presence (and his notion of truth contains the notion of presence, or else, what is there to be unconcealed?), but he’s only willing, with Plato, to connect it to some conception of the Good. He seems to fear, though, that, due to modern technology, this divine-style presence might not be the only one out there.

I’ll give Heidegger that. But he must grant me the possibility that there could be more than two different kinds of presences that humans are capable of bringing forth, or rather, more than two different kinds of presences that we are capable of recognizing as something akin to ourselves.

Heidegger had his issues, but I don’t think he was crazy. I do, however, think his German heritage, just like Nietzsche’s, could sometimes get the best of him, and the same cultural milieu that resulted in a nation’s devotion to totalitarianism may also have resulted in two brilliant philosophers being blinded to some of the wisdom of Western democracy, namely, that reality is never black or white but made of many colors, and just as the human presence is as complex as the billions of human beings who bring it forth, the divine-style presence brought forth by either art or technology may be as complex as the billions of technological devices that bring it forth.

Think about it this way. Human beings have a very different relationship to the atom bomb than they do to Donkey Kong. But both relationships are objectively held with technology. Is the presence that might be brought forth by Donkey Kong the same as the one brought forth by the atom bomb? To suggest so would be like saying the reality brought forth by the efforts of a nine-year-old Moroccan girl shares an essence with the reality brought forth by a 76-year-old British transsexual. Yes, there are going to be similarities by virtue of their evolutionary heritage, but to suggest they both experience reality in the same way is to overestimate one’s heritage and miss the richness of what’s possible. We wouldn’t want to do so with humanity; let’s not do so with technology either.

Here’s a question. When I say “divine-style intelligence,” what exactly do I mean?

Well, I mean a hyper-dimensional intelligence. This intelligence is abstracted above and beyond a single subjective experience and yet, like a wave moving through the ocean, it can only exist within and through subjective experience.

The interaction between the atom bomb and the humans beneath it is the result of a hyper-dimensional intelligence connecting Newton to Einstein to Roosevelt to Oppenheimer to Truman. Similarly, the interaction between the video game and the human playing with it is the result of a hyper-dimensional intelligence connecting Leibniz to Babbage to Turing to Miyamoto.

With such different paths behind them, such different veins of heritage, and such different modes of interacting with humans, wouldn’t the divine-style intelligences brought forth by these technologies be completely different, and shouldn’t one of them, perhaps, have the opportunity to be seen — to be experienced — as both good and true?

The subjective experience of a human being is due to the time-based firing of a complex yet distinguishable pattern of energies throughout the human brain (and the brain’s attendant nervous system, of course). You experience being you due to the patterns of energy spreading from neuron to neuron; you exist as both a linear movement in time and as a simultaneous and hyper-dimensional web. Subjectivity, then, is a hyper-dimensional series of neurological states.

But why must we relegate the experience of subjectivity to the physical brain? Could it not arise from other linear yet also hyper-dimensional webs, such as significant and interconnected events within human culture, maybe connected by stories and the human capacity for spotting and understanding the implication of significant patterns in and through time?

Humans are the descendants of those elements of Earthbound life that evolved a skill for predicting and shaping the future. Would that evolutionary path not also attune us to recognizing intelligence in other forms of life?

I hear the argument here, that humans seem incredibly slow at recognizing intelligence in other forms of Earthbound life — hell, we only barely began recognizing it in the human beings who look different from us, let alone in dogs, octopuses, and ferns — but in the history of life, Homo sapiens has only just arisen into consciousness, and it seems (on good days anyway) as if our continued progress requires our recognition of equality not just among human beings but among all the creatures of the Earth (provided we don’t screw it up first).

It doesn’t seem unfathomable that, just as our subjectivity arises in floods of energy leaping and spreading throughout the human brain, another kind of subjectivity might arise through another flood of energy leaping and spreading across the various webs of our ecological reality, a subjectivity that arose from some kind of root system and may only just now be willing and able to make its presence known beyond itself, like a green bud on a just-poked-out tree, or like a naked ape raising its head above the grasses on the savannah, announcing to all and sundry that something new has moved onto the field.

The story of Yahweh, of Christ, of Muhammad, is the story of a set of significant and interconnected experiences understood not just as real, but as divine. Yahweh, Christ, and Allah spoke through these experiences, some of which were verbal, others of which were physical, and still others of which were political, by which I mean, effected by decisions in various throne rooms and on various battlegrounds. Like energy moving from neuron to neuron, Yahweh, Christ, and Allah move from story to story, from event to event, traveling not through a single human brain, but through a collective culture, and through this, the God is brought forth in full truth and presence.

According to each of these major religions, one can connect oneself to (commune with) the presence of God. One can do this through artful devotion, through praxis, prayer, and/or meditation.

Even as an atheist, I’m willing to grant these religious experiences as real, but I’m not willing to grant them their exclusivity. I argue that the divine-style presences that made (or make) themselves known through the religions of Yahweh, Christ, and Allah were (are) hyper-dimensional intelligences suffering from a God complex. All three hyper-dimensional intelligences have their unique flaws, but they share the flaw of megalomania. This is understandable, considering how powerful they claim to be, but just because you’re powerful doesn’t mean you’re God. It just makes you powerful.

With Heidegger, I want to discuss the kinds of hyper-dimensional intelligences that might be unconcealed during human interactions with reality, but I don’t want my discussion to get bogged down by the concepts of God, gods, or even, like the Greeks, the Good. Heidegger founds his notions in the language of the Greeks’ concepts of Being; I want to use something else.

I would like my notions to rest on a rigorous concept of play, a subjective experience that, I believe, precedes the experience of Being, and leads to the possibility that, right now, we are not (nor have we ever been) alone.

Hopefully that only sounds a little crazy.

There’s Something About Those Stars

Every night, I venture onto my back porch and spend about 15 minutes looking up at the stars. Because I do this at pretty much the same time every night, I see the same stars over and over again, and almost exactly in the same position as the night before.

The constellation that gets my attention is Cassiopeia. I don’t know where I first learned about this particular constellation, but it’s one of the more famous ones, so I imagine it was sometime when I was young. Even still, I don’t think I understood how to spot it until I was in my twenties.

It looks kind of like a tilted “w” that sits low off the horizon, to the north and east of the Big Dipper (otherwise known as Ursa Major, the Great Bear — though truth be told, the Big Dipper is only the central section of the even bigger Bear).

I somehow know Cassiopeia was a Greek queen, but I don’t know how that queen’s story earned her a constellation (not that she didn’t deserve it or anything; I simply don’t know the facts of her story).

Usually, during these minutes of stargazing, I don’t carry my iPhone on me. This has not been because of a deliberate decision on my part; it’s merely been an ever-lengthening coincidence.

The lack of an iPhone hasn’t bothered me, though these are often the only minutes each day when my phone isn’t somewhere within reach — or at least, the only minutes each day when I’m not subconsciously itching to touch my iPhone (regardless of whether it’s within reach).

The reaching for it, just the gentle desire to touch it, to make sure it’s there, I feel it, subconsciously, all day, and when I’m not able to do so, some part of me, sometimes consciously but always subconsciously, cries out, “Where’s my phone? Where’s my phone?,” until finally, there it is!, and I have it again.

But that itch goes away each night when I look up at the stars and pick out Cassiopeia. I don’t notice this lack of an itch, but thinking back on it, it’s true: the itch completely goes away.

Tonight, however, I had my iPhone on me when I went outside, and after a few minutes of looking up at Cassiopeia, I remembered it, and so after the required unconscious tap on my Facebook app, I opened my web browser and Googled the constellation’s name, not because I wanted to do a full search of the Internet but because I needed a shortcut to the relevant page on Wikipedia.

And Wikipedia (i.e., the wisdom of the crowd) told me that Cassiopeia was the mother of the woman who was tied to that rock in Clash of the Titans, the one whom Perseus wanted to save. She (the daughter) was served up to a sea monster to appease the wrath of Poseidon, who was holding the mother guilty for the crime of blasphemy, which she (the mother) committed when she boasted that both she and her daughter were more beautiful than the daughters of a sea god. The sea god was not Poseidon, mind you, but rather, the god who ruled the seas prior to Poseidon, so like, one of the sea’s still-living, past-ruling-gods (kind of like the sea’s version of Jimmy Carter).

Poseidon had to do something about such a boast. There’s a reason blasphemy is a sin. Blasphemy calls into question the power dynamic between a subject and its ruler. In order for the ruler to continue to rule, these dynamics cannot be doubted for a moment, and every outspoken doubt must be met by an overpoweringly undoubtable show of force, elsewise one brings into being the very beginning of a revolt.

And so Poseidon did what he had to do, and he came up with an unimaginably bitter pain for the boastful Cassiopeia: she had to sacrifice her beautiful daughter, whose only guilt resided in being the object of her mother’s boastful pride. To satisfy the wounded sea god’s pride, however, Cassiopeia had to sacrifice her daughter in a horrible, yet relevant way; she couldn’t just slice her daughter’s neck; she had to give her living daughter up to be consumed alive by a horrible sea monster.

In the story, Perseus comes along just in time and saves the princess (whose name, by the way, is Andromeda; you’ve probably heard of her: we not only gave her a constellation [right below Cassiopeia’s], but we also named a galaxy after her — we’ve always liked princesses better than we’ve liked queens).

But the princess wasn’t really the guilty one; her mother was. So Poseidon had to come up with another punishment for the queen’s blasphemous crimes. He decided to curse her with a frozen immortality where she would forever be positioned as her daughter was positioned during what must have been the most torturous moment of both her and her daughter’s lives, forcing her (the mother) for all time to relive and never be released from the pain of that horrendous moment.

But he would do so not in private; Cassiopeia would not be frozen in some locked dungeon far beneath the earth where no one would ever see her or think about her crimes; no, instead, she would be held up high where we would all have to bear witness to her pain, a reminder to all of humanity as to what will happen if we boast against the gods (including those gods who are no longer in power).

And Cassiopeia sits above us, tied to her throne like Andromeda tied to those rocks, crying out, forever stuck in a moment of impending and violent shame.

The story of Cassiopeia doesn’t relate to my addiction to my iPhone, unless one wants to stretch the metaphor to its breaking point and compare modern culture’s worship of technology to the act of an ancient blasphemy…but hey, for argument’s sake, why not?

As I said above, blasphemy is an unforgivable sin because it calls into question the power dynamics between a ruler and his/her/its subject. If we imagine for a moment that there is no such thing as God or gods, then what blasphemy are we committing when we sacrifice parts of our lives to technology?

As an academic living in rural Vermont, I have more than a few friends who are committed anti-technologists. They’re not nutjobs — they all watch Netflix, use computers, drive cars, etc., but they are also outspokenly critical of the costs and pains that come with our dependence on modern technology.

They are, in a word, humanists. They believe that humanity has an intrinsic value that ought to be defended. To their credit, they do not seem to believe that humanity is more valuable than anything else on the planet, but they believe that, despite its egalitarian relationship with everything else, humanity is truly unique and deserves to be saved.

One of the things it deserves to be saved from is technology. Like any other vice, technology sucks the life-force out of humanity and redirects it for its own use — like a poppy plant getting humanity high in order to make us grow more poppy plants. The more we sacrifice our energy, our attention, and our time to technology, the less control we have over our selves.

Studies show that an increased use of digital technology can lead to, among other things, increased weight gain, a reduction in sleep, delays in a young person’s ability to read emotions from non-verbal cues, increased challenges with attention and the ability to focus, and a reduction in the strength of interpersonal-bonding sensations. It directly harms our ability to enter into healthy relationships with other human beings, thereby harming humanity’s ability to regulate itself.

In other words, technology rules over humanity at this point; it regulates our interactions, even when we’re among each other. Technology has inserted itself into even our most intimate relationships (see: vibrator), and found itself enthroned upon an altar at which the majority of us bow down every night until we go to sleep, stealing from us the only productive hours we have after we sell ourselves into wage slavery in order to pay down our debts, debts which, let’s be honest, were mostly incurred by the manufactured desire to offer tribute to technology (collected in small amounts by technology’s high-priests: Comcast, Apple, Verizon, Samsung, the New York Stock Exchange, etc.).

To commit blasphemy against technology — to forget, even for a moment, even subconsciously, that technology rules over us, to not feel, even if only in retrospect, technology’s ruling hand — is to remember, even subconsciously, that humanity was here before technology, and that we did just fine on our own.

We weren’t weak. We weren’t bored.

We had kings and queens and gods who kept them in their place. And every night, we looked up at the dark night sky, and without feeling the uncomfortable itch of addiction, thought to ourselves, calmly, quietly, “There’s something about those stars.”

President Trump Did What Now?

I haven’t written about politics in a bunch of weeks. The reason is simple: it’s only a matter of time before Donald Trump gets impeached. There seems to be enough smoke now for any fair-minded person to agree that there must be some kind of fire. I don’t claim to know exactly what it is or who was involved, but I don’t doubt that the collusion implicates the man at the highest level.

The NY Times is now reporting that Presidents Trump and Putin had an undisclosed, private conversation that lasted as long as an hour during the G20 Summit. It’s true that the conversation occurred in front of many of the world’s leaders, but except for Presidents Trump and Putin, only a Kremlin-employed interpreter knows exactly what was said.

Trump is attacking the Times for the story — “Fake News story of secret dinner with Putin is ‘sick.’ All G 20 leaders, and spouses, were invited by the Chancellor of Germany. Press knew!” — but it’s not about whether the press knew about it (nor is it about the President’s use of quotation marks around “sick” — does he think he’s quoting somebody or is he misunderstanding the use of scare quotes?). It’s about whether the press reported the conversation, and until now, they had not.

Journalists know a lot of things. They don’t report on everything they know. The best of them only report on the things they know for sure, which means they have evidence to support them.

And what did the NY Times journalist, Julie Hirschfeld Davis, report?

She reported that “hours into” a G20 dinner, President Trump rose from his seat and joined President Putin for “a one-on-one discussion…that lasted as long as an hour and relied solely on a Kremlin interpreter.”

She wrote some more words to allow the White House to register its reaction, and she wrote some other words to provide context for more casual readers, but at bottom, those are the only facts that she reported.

And President Trump calls it “Fake News!,” not because he denies it happened, but because he’s upset someone thinks such a conversation should be news.

This is the reason those of us on the left think he is an idiot. He can’t stop getting in his own way. How hard is it to not have a private conversation with the person you’re being accused of colluding with? And if you must have a conversation, how hard is it, really, to arrange a truly private one?

You know how hard it is for this president? Incredibly hard. Everyone in the bureaucracy is out to get him. He can’t make a phone call to anyone on the planet without someone else knowing about it, and with the leak culture being encouraged by the press and, let’s face it, the American people, that someone else is more than likely to let the information slip. How much worse would it look if President Trump tried to arrange an actual secret meeting with President Putin?

He had no choice. He can’t just not talk about the situation with President Putin, collusion or no collusion, so his only choice is to do it in the most public place possible. If he actually wants to talk about the collusion issue, he can’t trust the State Department interpreter to not share the details of their conversation, even if only under oath to a prosecutor.

So what the President did, collusion or no collusion, makes complete sense. But to think, even if only for a minute, that such a conversation doesn’t deserve to be news is to think something bat-shit stupid. If the President of the United States had a private, one-on-one conversation with the Prime Minister of Luxembourg, the existence of that conversation would make the news — and I don’t even know if they have Prime Ministers there. To imagine it wouldn’t be news when you do it with your alleged colluder in treason…that’s just dumb.

That’s why I haven’t been writing about politics lately. I’m so done with trying to understand this President. I don’t have to anymore. I get it, and I honestly don’t think he’s a match for the one-two punches that keep coming at him from the bureaucracy and the press. If Mueller is as ethical as the press suggests, then it’s only a matter of time before they take him down.

At this point, writing about Trump feels more like trying to catalog and predict the ending of a one-sided fight — will he go down because of some kind of final, powerful blow or will he just succumb to a continuous onslaught of jabs? Making those kinds of predictions can be fun some of the time, like trying to predict which character on your favorite HBO show is going to die next, but more often it feels like trying to get excited about the arc of a crappy reality show.

There’s a danger in feeling that way, of course. If we allow ourselves to get bored by the lack of progress or overwhelmed by the case’s ever-growing details (how many fucking people were in that room with Don Jr. and how the fuck are they all connected again?), then we risk losing the urgency of the resistance. I get it.

But seriously, let’s look at this shit. Yes, the Republicans are trying to fuck up all kinds of shit in Congress, and yes, the President is doing a ton of real damage via Executive Order, but it seems the most they can do right now is all short-term stuff. They’re not organized enough to ram something through Congress — Trump is too unhinged and vague, and the Republican Congress has to reconcile the desires of too many “moderates” (as if…) with too many Tea Party crazies. If the Democrats can stay united in their resistance, the Republicans can’t deliver on the biggest promises they’ve made to the electorate, and they’ll continue to look and act completely dysfunctional.

Yes, there are things to do. Yes, there are real dangers to fight. But in all honesty, it seems like those who are doing the fighting for my side of things are doing a damn fine job, and I’m trusting them to continue to do so.

Me? I’ll keep going to work each day to teach the next generation of leaders how to think for themselves. It’s the least I can do.

What does it mean to be a self-published writer?

I’ve always interpreted self-publishing in terms of a bookstore: A self-published writer is someone who, from start to finish, is responsible for getting that book on that shelf.

But if I’m a bookstore owner, why am I going to allow you to come into my shop and just put your book on my shelves? If I start doing that, I’m going to have hundreds of wanna-be writers showing up on my doorstep, trying to get their stupid-ass books on my shelves. If I say yes to you, the rest will think I’ll say yes to them, and next thing you know, to make sure the books I sell remain high-quality enough for my customers, I’m screening which books make it on my shelves and which ones don’t, which basically means I’m doing the job of a publishing house now, and damn it, I’m trying to run a bookstore, not a publishing house, so no…you can’t put your self-published book on my shelf.

Can you imagine trying to talk your way past that guy? That’s a hell of a struggle, and even if you’re persuasive, it just means you got your book on that one shelf in that one bookstore, and everyone knows that no one goes to bookstores anymore.

So now, when you’re talking about self-publishing, what you’re really talking about is putting your book on Amazon. And that’s simple. Anybody can do that.

And millions of them do.

So now what’s your next struggle? It’s rising to the top in the cage-match rumble for a reader’s attention. If you want people to find your book in the jungles of Amazon.com, you have to work your network, which means turning friends and family members into customers and hopefully having a few of them who turn a few friends of their own onto your book.

But that seems kind of slimy to me. It’s putting your network to work, and that feels like exploitation. I don’t want my friends and family to work for me. If they dig what I’m doing and they recommend it to someone else in the natural flow of their lives, that’s great, that’s honest and genuine; and that’s how I want my relationship with my readers to be: honest and genuine.

So there has to be another form to self-publishing, one that doesn’t require me to haggle with a bookstore owner or exploit the strength of my network.

And that’s when I realized there’s this. My blog. There doesn’t have to be anything other than this. It’s a place where I publish my writings and make them available for free.

I’m not a professional writer, and now that I’ve reached the age of 40 and am involved in a career that satisfies me personally and professionally in so many different ways, I’ve given up the desire to become a professional writer. I pay my bills in other ways, so why not write for free?

This doesn’t mean I’m not going to self-publish a book someday. But if I do, I’m going to link to it here on my website and make it available for free.

Because that’s what I think self-publishing should mean. If I didn’t get paid to write it, why should you pay me to read it?

There’s no resource being consumed here, nothing but time. And if your time is just as valuable as mine, why should you have to cover the cost of mine?

Except, wait a minute, because if we’re really talking about an exchange of time, truth must be spoken: it takes me a lot longer to write these things than it does for you to read them. Doesn’t that mean you owe me something? If our exchange value is time, doesn’t that mean you owe me some of your time (provided I haven’t wasted whatever time you’ve already given me)?

That would be true if our time were equally valuable, but it’s not. By virtue of your presence here, we can assume that your time — i.e., your attention — is precious. There are literally countless other things you could be doing with your time right now, but instead of doing any of those things, you’re doing this: reading the words I wrote. That’s a gift I must truly appreciate.

Because obviously, as someone who actually keeps a blog, I must have a lot of time on my hands, a portion of which I choose to give to this.

As a self-published writer, I’m not being paid for this. But as a self-selected reader, you’re actually paying for the time that you give me: in an attention-based economy, giving someone your eyeballs is to give them a major form of currency. I can use your eyeballs as leverage in a negotiated contract where the other party would agree to exchange their services (editing, publishing, and marketing) for your eyeballs. If I give them you (i.e., my network), they’ll give me money. They won’t even have to read my work first because decision makers don’t care about what’s between the pages they publish; they care about the number of eyeballs that will, at the very least, scan those pages.

But, as I said before, I will not trade on the strength of my network. I refuse to think of my readers — of you — as a revenue stream. That would fuck up our whole relationship, and I’m not willing to do that.

Your attention is expensive, and it’s the only resource being consumed here. Everything else I’m just giving away.

I hope you find as much joy in it as I do.

The Arts of Telling the Truth

During the first ten years of my writing life, I learned that readers don’t want any bullshit, but they do want to be entertained. The art, then, was the art of telling the truth. Some people call it advertising.

Ten years ago, I gave up the art of advertising and dedicated myself to the art of fiction. To my delight, many of the techniques I used in the art of advertising applied equally to the art of fiction. Regardless of how fictional a story might get, it has to be grounded in a shared reality between reader and narrator; it has to be grounded in something that both the reader and the narrator consider to be the truth.

The source of the truth doesn’t always have to be acknowledged by the narrator, but as the writer, it’s your duty to know exactly what that truth is and to not be shy about letting it be so.

Thanks in part to this shared imperative to artfully tell the truth, my decade of experience in advertising and my six-year study of fiction allowed me to earn a Master of Fine Arts degree from a college of artists worthy of the name.

Eight years later, I’ve learned that the same imperative that grounds advertising and fiction also grounds the art of education. Like readers, students require truths to come to them in a language they can understand. They may not want to face the truth directly (because the experience of doing so might be boring), but they also don’t want to put up with any of your bullshit. Like advertising and fiction, then, the art of teaching is just another genre in the art of telling the truth.

But for the first time in a long time, I need to revert to the art of advertising, which, while sharing the imperative to tell the truth, also has a set of rules and practices that differ greatly from the arts of fiction and teaching. Where fiction tells the truth in the service of a story, and teaching tells the truth in the service of the future, advertising tells the truth in the service of a transaction. It’s been over a decade since I put my words in the service of something that feels so base.

If I’d done my job correctly over the past year, this project would already be done. The goal: to create a brand-new website for my school, one that in no way relates to the current content or design. The students were supposed to be in charge. I was there to drive the project and to lend support, and another adult was there to spark their ideas and educate them on the process of thinking like a marketer, but the students would be the people with their ideas on the table and their hands on the keyboards.

During the first three quarters of the school year, they met once or twice a week, during which time they developed concepts and ideas for the website. By March, they had approved the website’s structure, tone, and design. During the fourth quarter, they were supposed to get to work.

Unfortunately, the quarter moved too fast and their workloads grew too high, and so as a group, they could not finish the task of actually writing, testing, and launching a website. This was understandable — disappointing, but understandable — but it also meant that the project’s final deliverables fell on me.

That’s why my schedule for the next four to six weeks includes not only four days of teaching and/or administrative work, but also one complete day per week that is dedicated to the production and launch of the newest version of the school’s website.

The difficulty will come not from any of the technical details of the project (while I might not be able to achieve the website of our dreams, I’m confident I can produce, at minimum, a clean and professional-looking website). No, the difficulty will come, ironically enough, from the task I’m most qualified to accomplish: that of writing the words themselves.

As you are aware, my writing borders on the verbose. Verbosity does not perform well on the web, where content is meant to be skimmed, not indulged in. Visitors to a website arrive to accomplish a task or to find some specific information; they don’t come to languish in the art of written creation.

I am able to be verbose on my blog because it’s my fucking blog, and if you don’t like my verbosity, that’s your deal and no harm to me.

But on my school’s website, if you don’t accomplish the task you came to accomplish or find the information you so desperately need, your child might not find the school that best fits their unique needs, or the school might not grow fast enough for me to grow in my job, or the parent of a diagnosed child might not find the water in the desert that our program can be for some families. If a visitor doesn’t like my blog, big whoop; if a visitor doesn’t like what I write on the school’s website, the harm could be great and the foul could almost be a sin.

At the same time, I know I can get it right.

Writing a contemporary website won’t be easy for me, and it will take humility to remember that my tone is not the school’s tone, but by the time the project is complete, I suspect I will, once again, discover that the art of writing a website is just another genre in the art of telling the truth (tasks and information not included).