Joe Hall Interview
More Human than Humans
Amy Wright: How did you get to be the mayor of a trailer park?
Joe Hall: A pre-existing condition, applied retroactively to several thousand dollars' worth of medical bills, helped land me in a trailer park. I then appointed myself mayor.
AW: If you had to narrow it down to one figurehead, whose likeness would you bear on the front of your bow?
JH: Joe Hill.
JH: It makes sense to put an itinerant person on the bow of a ship, my ship. And if I have to have a figurehead, I like Joe Hill because he didn’t try to become a figurehead and most modern people wouldn’t recognize him. Which maybe doesn’t make him a figurehead. But at a certain point in time he was so dangerous as an IWW martyr that the USPS seized an envelope with his ashes in it, considering these contents possibly subversive. These ashes eventually found their way into the national archives and also Billy Bragg’s stomach. Which seems about right. Having Joe Hill on my bow would also lead to increasingly less diverting conversations about our similar names. It would also remind me to put my values into action outside of the arena of the page. Otherwise, I’ll never appear in a folk song. Oh, I’ll never appear in a folk song.
AW: Would you rather be a spider or a snail?
JH: The problem is I’m snail-like. Wanting to carry my home on my back. But it does not work. I move and move and never quite take it with me. So I would like to be a spider. A wild spider that destroys weaker things.
AW: Flann O’Brien says “A boil is a fright if you get it in the wrong place.” Have you ever found yourself in the wrong place, and how’d you get there?
JH: I have been in strange places—at the end of a conveyor belt at a Toys ’R Us shipping plant, working a portable sawmill, driving a forklift (briefly). But the strangest place I have been is the Paper Mill building in Georgetown. That was the fall of 2004, and I was working as a production assistant for a video production company that made commercials for state and national political campaigns. Propaganda, basically. Our messages weren’t just lies—we also had to lie in putting together the commercials. It was great fun hammering Republicans and using campaign contributions to buy sushi from Dean and Deluca’s, but I also felt like I was losing my mind. I didn’t belong there. It was absurd. I remember my boss being shocked that I didn’t know how to cut an avocado for a salad.
AW: If you and some friends were in a band, what would you name it?
JH: This happened. We were called “Nanobots: A Rock Opera.” We wrote about twenty songs. They were mostly folk songs about nanobots that built robots who were more human than humans.
AW: How so?
JH: This was a collaboration with my friend Tom, so I always risk getting this wrong. Essentially, we agreed that humans have largely failed at being human. We’re boring, insensitive, and egotistical. Once we identify ways in which to decrease our capacity to feel and think, we relentlessly pursue those ways. That’s at least how humans are in the dystopic future-present of our folk-rock opera. So the more-human-than-human robots are the opposite. They feel and react to things in intense, fluid ways. They’re also very folksy. Often the most sophisticated technology they use is themselves. Or something along those lines. But why try to explain this in prose? The lyrics of “Behind Robot Lines” speak for themselves. Here a human who has infiltrated a robot family is reporting back to the remaining humans on robot culture in song:
They’re really earnest speakers
They’re jokers with their friends
Most are good to their families
Most look out for their kin
They raise robot families
They keep robot dogs
They write robot poetry
They live in robot nature
They cut robot logs
AW: What do you know you still need to do before you die?
JH: I know that I want to undertake a writing project so gargantuan and absurd that it is doomed to failure.
AW: What would that mean—for it to fail?
JH: My friend Chad recently reminded me that William Carlos Williams’ friends often tried to talk him out of going forward with Paterson. A successful failure should probably make people mad even hearing about it. So perhaps, first and foremost, it should fail in a very fundamental way. It should be “a bad concept.” And pursuing this bad concept should threaten to disrupt not only one’s sense of poetic agency but also to reformulate one’s social relations in unpredictable ways. Every public failure is a secret success.
AW: So maybe the axiom is wrong—to err is human, but to fail divine, and robots wouldn’t be afraid of failure.
JH: Yes! Absolutely. The robots would be afraid, but they would deal with that fear. Robots are not self-regulating individualists; robot culture can be tightly knit. The web of connections between robot friends and family is dense and multistranded. The robot would know that if he fails, other robots will be there to try to get him on his robot feet again, so the robot would go ahead anyway and risk failure—thus achieving robo-humanity.
Forgiveness depends on violation and having a significant relationship with the violator. Most of the time we’re sitting around without anyone to forgive, making the axiom—if not wrong—mostly irrelevant.