For Duke Magazine:
For Crooked Scoreboard:
- About Uber and “The Process”
For this blog:
- About Susan Collins (from February!!)
- And finally, perhaps most importantly, some warnings about soup consumption and state fairs
I joined a soccer league this winter. It’s a combination of the division I played in this fall and the two above it; during the season opener a few Saturdays back, my team, which had a number of holdovers from the fall squad, spent the first forty-five getting summarily stomped. Rarely did the ball leave our side of the field; every couple of minutes, it seemed, our goalkeeper was valiantly snuffing out another odd-man rush. By the time we gathered at halftime, a number of us were talking, with varying levels of jocularity, about quitting the league and resting up for the lesser demands of the spring season.
At that time, the score was 1-0. Had that been an American football game, we’d have been down to our third-stringers, having lost the first two lines of the depth chart to catastrophically galling injuries. Our coach would have sworn at us at halftime, blaming a lack of effort for our crippling talent deficit; we would have come out, ready to kill, and continued to run hopelessly into a buzzsaw.
But soccer, of course, is a different game; more so than brute relentlessness, offensive success in the sport requires a modicum of a creative burst–of inspired, atypical performance. There is no three-yards-and-a-cloud-of-dust strategy in soccer that begets sustainable, positive outcomes. We know that it’s not a solvable game, and certainly not solvable by effort alone. It requires brilliance–and luck.
Maybe those elements make soccer unpalatable to Americans–success in the sport runs counter to American dogma. While perhaps the most American element of football is its territorial field layout–you can read (as many have) yardage gains as representing imperialistic successes–I’d argue the second-most is its incentive structure: the first down. Football simultaneously favors grinding consistency and devalues intermittent brilliance. Gaining four yards every play, ad infinitum, is vastly superior to a twenty-yard chunk every five plays. Being solid is preferable to being streaky.
Football inherently promises great things to those who don’t stop–who make use of every play, who don’t cheat (and earn penalties), who don’t set their sights too high (and risk incompletions or back-breaking turnovers). The optimal offensive strategy, in theory, is boring; there are no highlights, just a culmination of sustained brutality that results in a seven-point bonus.
This strategy, though, does not express itself in any modern iteration of the game. Perhaps one of Alabama’s recent championship teams comes closest–steadily, unrelentingly plying the opposition into submission. But teams, increasingly, pass the ball, guaranteeing failure roughly forty percent of the time. It’s a bastardization of the sport’s ethos–remember, the forward pass didn’t exist in the game originally–and yet, now such a gameplan is optimal. The de facto best strategy has risk. The de facto best strategy seeks moments of brilliance. There are diminishing returns, inevitably, from ramming one’s helmet into a defensive line time and again. Ultimately, even in the sport designed to showcase this very virtue, hard work and consistency are simultaneously unachievable and insufficient.
The juxtaposition of football’s incentive structure and its optimal strategy represents, I believe, the gap between the false promises made to America’s underclasses and their reality. We learn to value the worker who dutifully punches the clock each day, while shunning the flighty idealist. We defund the creative outlets–first arts and music classes for our youth, then the national endowments for adults–in favor of insipid but utilitarian alternatives. We treat our post-school lives as an exercise in avoiding setbacks–be they crippling debt or persistent unemployment. We take solace in following a path of small success after small success, steadfast in our belief that such plodding progress will be enough.
Meanwhile, we–and I’m primarily talking to my fellow millennials here–are not getting ahead in any meaningful way. We’re routinely underemployed, and those who have jobs are taking a smaller and smaller share of the profits. We’re playing by the rules of a system designed for a reality that doesn’t exist, still telling ourselves we’re on the path to prosperity when that path has been abolished. We’re trying to eventually buy houses when the affordable housing we want doesn’t exist; we’re putting our money into our 401(k)s when those returns aren’t what they once were, and retirement doesn’t start when it once did. Essentially, we’re running the ball on every down against a stacked box and expecting that to be enough.
What’s the solution? Well, this is the sad thing: There isn’t one, or, at least, there isn’t a sustainable one. Like a successful football offense, to get ahead, one has to take risks. One has to accept that some unemployment will lead to a greater payoff down the road, that going into debt won’t be a permanent burden. By taking on this risk, some will get screwed. Inevitably, it will be those who are already worse-off–those who weren’t born on third base with a familial fallback of wealth already in place, those who have to provide the land’s fat rather than live off it themselves–they will be the ones who fail more frequently.
The other solution is broader, both idealistic and naive. It’s a perverse, broken sport that the working class is forced to play: We could work to change the rules, so that not only the extremely talented manage to advance down the field. We could give everyone a wider safety net–perhaps a universal basic income, perhaps free college for all. We could make unions more feasible and popular, earning workers more benefits. We could, of course, tax the 1% much more progressively. But mostly, we could awake from our collective daze and stop pretending this current system is worth keeping. This game has rarely been fun to play; now, it’s worse than ever. The time has come, I think, to at least ask if there’s something better out there.
For the most part, India pale ales are fine. Some, in fact, are really, really good: Maine’s Lunch and Dinner, for example, fit the bill if you’re into ELITE IPAs; most any mainstream Sierra Nevada offering does a bang-up job, and you don’t have to get a goddamn cash advance to afford one. People like to hate on IPAs–mostly because they want to hate on the people who love IPAs, which I totally get, and partly because the hop-industrial complex that is overrunning craft beer is both concerning and insane. On the merits of the India Pale Ale genre itself, though, it’s a weak case. IPAs are harsh, and they’re a bit of an acquired taste. But, Christ, isn’t all beer?
So I’m not here to denigrate the entire genus of IPAs; nor am I looking to wrap myself in its flag and become its bearer in beer debates that reach a level of, like, near-jingoism. Both IPA lovers and detractors make good points, although maybe not quite the points they think.
I think the argument that bros try–and fail–to make when they worship at the altar of IPAs isn’t that IPAs are the best beers. When they unwittingly advance such an argument, it comes across, like a bad IPA, as overly strong and unpleasant. The more subtle justification is that IPAs–due to their more complex rendering of hops and ancillary ingredients–have a higher floor than any other genre of beer. The worst IPA undeniably beats the worst lager. This isn’t surprising: IPAs are simply more costly to make. There’s a minimum threshold required to produce, I don’t know, even Abita’s Andygator IPA (the worst beer I’ve ever willingly purchased), and that’s a taller task than Milwaukee’s Best is able to clear. If you’re making a blind choice–assuming cost is no object–and you’re hoping to minimize your risk of selecting a terrible beer, get an IPA.
Note that this is different from saying IPAs are the best. That’s an argument my brother made a few weeks ago, over Thanksgiving, as we stepped inside the esteemed Sam’s Quik Shop in Durham to select a few six-packs. Look at all the IPAs here, he said. When a shop like this has that many IPAs–I’m paraphrasing at this point, but it’s necessary to convey the pigheadedness of the sentiment–that’s just proof that IPAs are the best.
First of all, his claim reeks of confirmation bias; I highly doubt Sam’s selection is so imbalanced. But even if it were true, that’s a classic misconstruing of supply and demand. IPAs get stocked because they’ll inevitably get bought. While Sam’s tries to find the best beers in a particular genre, the store’s still resigned to offering what it thinks it can sell. And, as the number of people convincing themselves that IPAs are superior swells to a critical mass–which, clearly, is what’s happening in the American beer markets–the best strategy for a shop like Sam’s is to heavily concentrate on IPA sales. (I’d also imagine that IPAs, because of their perception of prestige, carry a healthier markup than other beers.) Said more simply, the phenomenon is demand-driven, and most prominently exacerbated by people firmly entrenched in the “IPAs are great” camp.
IPA drinkers also like to construct impenetrable feedback loops. They flock together. “I prefer IPAs,” says one, by way of small talk. “Me too,” says the second, in this very boring yet starkly realistic scenario. Then they abscond to the bar, offering each other recommendations on which beers are best, never fathoming dropping below 6.0% ABV or 40 IBUs. Perhaps someone joins them for the second round and offers to buy them a drink. “Another one of these,” they’ll say, posturing for their IPA compatriot while ignoring their amber-ale-drinking or, god forbid, hefeweizen-loving friend.
Entire classifications of beer, then, get tossed out in the contest of who can consume the most hops.
But this myopia cultivates something more displeasing: beer-chasing. Not the beverage itself, but the inexhaustible pursuit of having had all the best beers. One of my old coworkers once talked about how he preferred scotch to whisky, not because the former was better in his mind, but because it was more manageable to drink an encyclopedic quantity of it. Whisky (and whiskey) spans the globe, but scotch is at least confined to one country. Given enough time, you could drink your way across Scotland. Having every type of the other liquor, though, is an obvious fool’s errand.
Untappd has unlocked such a phenomenon for beer lovers: The popular app, designed for the user to rate beers and select, again, the best beer from a range of options, makes it easy to build a robust portfolio. The in-app badges, too, incentivize novelty in beer choice: A “Master” has sampled 200 different beers; an “Elite” user, 2,500. Each decision at the bar, then, contains a trade-off: Do I want what I enjoy, or do I want what will impress my digital and real-life peers?
I think, fundamentally, this is the problem with beer drinkers–and IPA drinkers in particular. Too much choice has proved crippling; anyone who has visited Sam’s, or even just the neighborhood gas station’s now-mandatory “Beer Cave,” knows how the duration and agony of the selection scale with the breadth of options. Consuming beer is rarely not a good time, but the average drinker needs direction. More often than not, that entails putting on blinders. Only drinking IPAs. Only drinking new beers. Only drinking the selections from a particular craft brewery–Maine Beer Co., Wicked Weed, Stone, etc.
What I hope for is a return to simpler times. Yes, this is the golden age of craft beer, but there needs to be a way to simply have a favorite yet standard beer–a reliable choice, readily available. Something to drink regularly and not feel like a pud. Not every purchase needs to be a momentous event. Not every pint glass needs to come with a checklist attached and an Instagrammable pour. Corona encourages customers to Find Your Beach, but my plea actually takes a step back from that: Please, Just Find Your Beer. I’m sure it’s fine.
I think I would like Raleigh, in the next round of Major League Soccer bids, whenever those come to pass, to make the list of expansion cities. (The city was passed up this week in the first wave of finalists, but a second selection period is on the horizon.) As a soccer fan and Durham resident, I’d be delighted to have a top-tier team–even if it must compete in the insipid MLS–within driving distance. And as someone who increasingly feels a sense of civic and regional pride, I’d love to claim a local professional team in a sport that doesn’t seem–sorry, ‘Canes fans–horribly anachronistic in North Carolina.
A quick summary: The bid that Steve Malik, the local tech entrepreneur and owner of the area’s flagship soccer franchise, North Carolina FC, crafted doesn’t rely on bottomless founts of public financing; he promises to privately fund the $150 million bill a new stadium would entail. The location would not grossly disrupt a neighborhood–like, say, the Verizon Center (the Capital One Arena as of August) did in Washington D.C.’s now-parodic husk of a Chinatown–but rather nestle between the existing commercial spaces of Seaboard Station and Halifax Mall. The proposed buildings to be razed are strictly government facilities. I’m skeptical the stadium can create the revenue and jobs Malik says it will–most projects fail to hit these marks–but by and large this proposal is as innocuous as expansion projects come.
Still, I hesitate. Raleigh and Durham face major questions now and will continue to face them in years to come. They are vibrant communities now earning just recognition for that vibrancy, perennial features on the clickbait rankings circuit–among the best cities for millennials, young entrepreneurs, anyone who has ever used a Juicero while texting, etc. They are changing rapidly: Durham projects a population increase of twenty-two percent by 2030; Raleigh-Cary is the fastest-growing metro area in the state and the fourteenth-fastest-growing in the country. In the mayoral elections this past November, in Raleigh and Durham alike, the lack of affordable housing was arguably the most ubiquitous and pressing issue.
With that topic in mind, focusing on a luxury like the MLS seems irresponsible. But the MLS isn’t the most harrowing potential guest from afar: No, that would be Amazon.
Raleigh and Durham are still semi-longshots to land the Seattle-based company’s second headquarters, but they’re pining for the opportunity to dole out tax breaks and other incentives to Amazon, were Jeff Bezos to be sufficiently wooed. The consideration of such a development is frightening to anyone hoping the Triangle will maintain a semblance of livability and culture. The 50,000 jobs (heavy eye-rolling) the company promises will, if Seattle’s experience proves an accurate harbinger, be given disproportionately to white, well-to-do males; in concert, rents will skyrocket (in Seattle, they have doubled in just the past five years), homelessness will rise, traffic will grind to a halt, and public resources will be consumed at a rate beyond that which the higher tax base can support. (These negative factors–among others–contributed to the decisions of San Jose and San Antonio not to bid for Amazon.) This process has already begun, but if and when Amazon comes to the Triangle, the gentrification of D.C.’s Chinatown, and Brooklyn, and post-Katrina New Orleans will quickly loom over the Oak and Bull Cities much like the inevitably-erected $2,000-a-month apartment buildings will.
There is, undeniably, an excitement to these bids. Both Amazon and the MLS are big names. It would be *cool* for the Triangle to host either of them: Seeing the somewhat-overlooked region mentioned in national headlines, just by virtue of its being considered, provides a nice dopamine drip.
What is the opportunity cost, though? In that downtown Raleigh space Malik proposed, what else could go there? Affordable housing stock? A no-ticket-necessary public park from which everyone can derive some benefit? And the eight million square feet that Amazon will reportedly need for its new campus: If that can be carved out for a corporation, how can the city make use of it to benefit its current tax base?
Or will the incoming outsiders be the only ones to enjoy the land’s newfound fruits?
In a sense, the MLS bid is fine–in a vacuum. But if getting a sports team becomes merely another step in a large-scale revitalization–an influx of disruptive investment, a whitewashing of culture that is an inherent byproduct of urban privatization, a commitment to building glitzy new toys for some citizens to enjoy while ignoring the essential services its other citizens lack–it’s much less palatable.
And the alacrity with which Raleigh, Durham, and the entire region have sought to please this corporate behemoth–with apparently little consideration for the downstream effects Amazon’s presence will have on its citizenry–can only be viewed with trepidation. Economists have evinced, time and time again, the fallacy of sports stadiums as a magnet for investment. I’m thrilled that Malik’s MLS bid, then, is perhaps the most sensible possible model. But I’m growing afraid that, in a future urban analysis, cities like Raleigh and Durham will feature in the telling of a cautionary yet not unfamiliar tale, the one where a tornado of a corporation rolls through a town of storm chasers.
Let me start by saying I’m no puritan. I’ve been known to have a bit of a devilish tongue. From time to time, my thoughts have maybe even breached the boundary of what one considers “unwholesome.” So I write here not to litigate, judge, or certainly censor, but rather advise: Don’t eat soup in public. It’s disgusting.
This isn’t a clever, sly, in-crowd euphemism. No, I mean this literally: Don’t eat soup–or bisque, or chowder, or even chili below a certain viscosity–in public. I know that we’re steadily approaching winter. I know you’re cold and hungry, maybe even parched, and that soup–that dietary utility player–seems like the solution to all of your problems. Do it. Go buy some soup. I respect your choice as a consumer.
But I don’t respect your right to consume that slop in my general vicinity. Much like electricity, water, and internet prior to next month, we should treat a genial lunchtime–one without neighbors interrupting conversations every ten seconds with a puckering minestronal SCHLURPPP–as a utility. We should respect the sanctity of the public dining table, be it in the office, the cafe, or the food court, and refuse to besmirch it with some pud who has to furiously blow (look at him, he’s NURTURING THE SOUP) on his disgraced slurry to make it palatable.
I’m spilling no secrets when I say soup has inherent structural flaws. Once soup has become the right temperature, it is then too cold. To eat enjoyably, soup requires a mechanic’s tinkering, a Buddhist’s patience. Its fluidity necessitates the manual precision of a calligrapher–better yet, of a sculptor–to put the liquid into its vessel; the final transfer, likewise, begs Dizzy Gillespie’s lung capacity and embouchure to avoid disaster.
That’s before getting into the etiquette for eating soup–the delicate coiffing of the spoon away from oneself, the dizzying heights the spoon must rise before one can dip one’s head to meet it. It is a hopeless, frantic extraction that inevitably ends in dripping, slobbering imperfection. Yes, there are “proper” ways to eat soup. But there’s no way to eat soup and retain one’s dignity. If you engage in this act in my line of sight, I’m afraid I can no longer respect you.
The truth is, when eating soup, you’re consuming a scalding liquid from a height at which liquid shouldn’t be consumed. There’s a reason drinks arrive in graspable, vertical containers. There’s a reason drinks, McDonald’s coffee aside, invariably arrive within an acceptable range of temperatures. Most food and drink make sense; soup doesn’t. (That sentence also presents a tertiary sub-argument against soup, that your public consumption of it will invite the galling, faux-intellectual pondering of whether one “eats” or “drinks” soup. The correct answer is neither: When having soup in public, one “burdens society.”)
Perhaps this take comes across as elitist. After all, public soup kitchens are a great, necessary service. For others, soup is simply an economical option, the most rational budgetary choice. Surely, Lucas, you’re not advocating that these soup-buyers be forced to languish alone, shamefully downing their entrees in solitude whilst the Paninarazzi gaggle together in unfettered gaiety?
My response is three-pronged: First, my advice is mainly targeting those foodies in Corporate America, for whom I am skeptical that soup is always the cheapest option: take Panera Bread, for example, where soups, salads, and half-sandwiches are priced equivalently in the “Pick 2” menu, despite the caloric deficits that soups offer. Second, I think a large-scale coordination of purchasing decisions can change the way in which we consume soup. Should bowls become cylindrical? Should spoons be replaced with very wide straws? Should we revive Juicero by reducing every soup to its late-capitalism, inevitably all-liquid state? These are the questions we need to be asking.
Third, soup consumption can be inoffensive in public–but only when undertaken unanimously. When the souping is wholly communal, that is, when everyone is being disgusting, there is no dignity to be claimed, no pearls to be clutched. It is a barbaric experience and roundly objectionable to an outsider, but with no one claiming offense, the outrage is vaporous, like the famous Zen koan: the sound of one hand clapping back.
In that way, we need to treat eating soup like eating peanuts–hazardous in public. You can eat whatever repulsive things you want–however you want–in the privacy of your own home. But, please, as you go about your day, SCHLURPPP on this: As soon as you step outside, once you start eating soup, you start affecting people’s lives.
Kentucky Governor Matt Bevin on Monday tweeted, “To all those political opportunists who are seizing on the tragedy in Las Vegas to call for more gun regs…You can’t regulate evil…” He posted this at 10:38 A.M., fewer than twelve hours after the massacre in Las Vegas that left fifty-eight dead and over five hundred wounded, the deadliest mass shooting in modern American history.
I’m singling out Bevin for what is a pretty typical Republican* response in these all-too-frequent circumstances.** First comes the standard “thoughts and prayers” post, usually noting how saddened the congressperson–and often his wife, too, because the Republican is statistically a heterosexual, connubial male–is by the circumstances. Then, there’s the punting of the political football, the request to give it a day’s breath, if not longer, before even attempting to discuss gun control in any serious capacity.
(*Yes, some Democrats, Bernie included, have complicated pasts when it comes to gun control legislation. On the whole, though, they’re greatly in favor of heightened gun control, much more so than their Republican counterparts.)
(**These circumstances being a white guy perpetrating a mass shooting, rather than a person of color and/or, god forbid, a Muslim. Those response playbooks are sinister in their own right.)
But inevitably, if anyone pushes back on these initial postings, what comes out next is the helplessness. The idea that this tragedy stemmed from a singular actor, a “lone wolf,” the mere bad apple of the harvest. The thought that no amount of paperwork and background checks could have prevented the final outcome. The conclusion that, given how preordained (and yet, somehow unforeseeable) these situations are, there’s very little to be done, except to pray for more “good guys with guns” next time. (Judging by the performance of publicly-traded gun stocks Monday, these hopes will be answered.) Mostly, there’s the finality and assured belief that something this horrendous, to paraphrase Bevin, can’t be regulated.
This attitude, of course, is a complete crock. It’s an attitude that, due to the sweeping, unfettered political lobbying power of the NRA, exists for questions of gun control but very few other issues. Does the right feel helpless when it comes to questions of abortion–after all, “nasty women” will manage to find the relevant doctors one way or another–or does it seek to restrict the funding for and accessibility to clinics throughout the country? Do Republicans throw up their hands on the issue of voter fraud–if you want to cheat the system, sooner or later you’ll manage to cheat–or do they milk every last ounce of difficulty out of the registration process to adversely affect the marginalized, time-strapped voter? Illegal immigrants will do whatever is necessary to ruin America, the Fox News-tinted logic goes, so, then, why bother building a wall or eliminating DACA?
The truth is these regulations, whether or not they’re 100 percent effective, have some impact. The victims and targets of these policies know this; Republicans, certainly, know this, which is why in so many instances they eliminate regulations that hinder their goals. It’s for this reason that they want lower taxes for the rich. It’s for this reason they want to undo the Dodd-Frank Act and enable more perfidious banking practices. It’s for this reason that the right won’t abide the slightest, most token bit of lip service when it comes to combating climate change.
The regulations we have in place in America–and those we don’t–are no accident. Whether it’s regarding gun control or any other hot-button issue, the Republican party line is clear: Undesirable, unacceptable, and unfavorable actions and activities can be regulated to the point of effective extinction; everything else, well, can’t. As such, this dichotomy invites a ridiculous but necessary question. For the right, will the carnage that we saw in Las Vegas–not to mention Sandy Hook, Orlando, and the 272 other mass shootings that have already occurred in America in 2017–ever truly become unacceptable?
Most people are dunking on Taylor Swift’s new song, as well they should. More so than her typical overwrought, focus-grouped, maudlin efforts, this song, “Look What You Made Me Do,” repulses in every sense of the word. It’s a steaming pile of factory-farmed, eastern North Carolina hogshit, whose downstream impact will be felt for generations.
But this song was doomed–beyond its aimless lyrics, uninspired production, and general absence of intent beyond an attempted photobombing of an uninterested news cycle–by its length. At three minutes, thirty-one seconds, Swift’s latest is right in the sweet spot of the worst songs that get produced. The good three-to-four-minute song, much like the effective 600-word essay, or the seventy-minute movie that clips along nicely, does not exist.
Songs trying to make it on the radio have coalesced around this length, a function of history (radio stations could only play music from 78s and, later, 45s, each of which held three or so minutes per side) and the limited attention spans of listeners. As technology developed, average song length grew, but it has since retreated from its early-nineties peak, settling in around 3:45.
It’s understandable, then, that music of this length has been produced in the past. But now, with the proliferation of streaming services and the decreased influence of the radio, there exists a chance to make better songs. Specifically, shorter or longer songs, because any song whose length starts with a 3 that’s immediately followed by a colon is–to use the technical term–trash.
The three-to-four-minute song can’t help being the way it is. It didn’t ask to be the awkward, gangly musical teenager that thinks it’s doing well but, in fact, is a tornado of idiocy, bad intentions and ill-conceived ideas. More to the point, the three-to-four-minute song inherently serves no purpose. Think of your typical radio play: Odds are it has a brief intro, two verses, each followed by a chorus, a third part with no purpose but to extend the song and, perhaps, invoke a key change, and a final chorus. If the artist is popular, maybe the third part features one of his/her friends; if the artist wants to have some flair, a saxophone. And then, after a seemingly brisk 3:39 (“Shake It Off”) or 3:46 (The Chainsmokers’ “Roses”) or 3:47 (“Despacito”) that simultaneously rambles on and on, the song draws to a close.
Friends, this is a horrible framework in which to operate. Let me be blunt: If an idea can be fully expressed in two and a half verses, perhaps it’s not worth writing about! In fact, most songs fall into this category. The easiest way to improve an overcooked piece is to simplify; you can edit those thoughts down to simply two verses, and we, the audience, can get out in a cool 2:18 (like with “She Loves You,” by a little band called the Beatles). That song simply starts with the chorus. It’s delightfully quick. Nat King Cole’s “L-O-V-E”? Just two verses, a trumpet break, and then those two verses again. The entire conceit is simple, and it didn’t require Sammy Davis Jr. stepping in to upset things by spelling some different word halfway through. Which is good, because even at 2:30 it starts to seem stretched. The Marvin Gaye and Tammi Terrell duet “Ain’t No Mountain High Enough” actually follows a lot of the foolhardy road map above, but since, on top of maintaining a healthy, robust BPM, it doesn’t dawdle on the intro–four bars–or the choruses–a mere eight bars each–the whole thing is a scant 2:24. Even Bill Withers’ “Ain’t No Sunshine” ends in 2:04, despite eighty percent of it consisting of him saying “I know.” Short songs work.
Long songs work, too. Long songs–the good ones, at least–are the kinds of songs that don’t get merely tossed off. They’re produced by way of significant, strenuous mental effort; their length is justified, earned. Could you imagine a three-minute version of “Hallelujah,” or “Stairway to Heaven,” or “Purple Rain”? Even a song like Pink Floyd’s “The Great Gig in the Sky,” which, honestly, has no real structure, works at its extended length (4:40); to cut a minute would simply not make sense. These songs are statements, and more often than not, 200 seconds or so isn’t enough for a meaningful statement.
The artist can also take a cue from EDM, which, for all of its drops and dopaminal excesses, at least knows how to create an effect on its listeners. Just say screw it. Do you know how long the intro to Earth, Wind & Fire’s “Got to Get You Into My Life” is, before they actually start the first verse? One minute, eleven seconds. (The popular yet much inferior “September” is, of course, 3:35.) You ever hear “Vertigo/Relight My Fire” by Dan Hartman? No words until 3:57 in. It’s great, and the full album version is 9:44.
I will concede that good songs occasionally fall within the three-to-four-minute parameters, but not by design. The last example is a useful case study: The proper treatment of “Relight My Fire” is as the main course, following the four-minute appetizer of “Vertigo.” As such, most treatments pair these songs together. A single featuring only the latter song exists, presumably for radio purposes (its runtime is 3:42). I have never heard it, because every version I’ve found on YouTube features the extended play. I’m sure the cleaved, barren version works; clearly, though, it’s inferior, as the fans have spoken and demanded the sprawling if indulgent effort, not the GarageBand-esque, training-wheels-attached starter-kit edition.
To cite a more recent example, take the Bruno Mars-Mark Ronson collaboration “Uptown Funk”: The album version is 4:28, yet the radio version is 3:57. From what I can tell, the intro in the radio version has been somewhat abbreviated but the ending has been completely neutered, with one of the best horn parts of the whole song (around 4:10) becoming a casualty in the race to the finish line. The 3:57 version still bangs–after all, it’s “Uptown Funk”–but it exists in lieu of a much better song that’s a mere 13 percent longer.
In that sense, I guess, a more conservative take is warranted; it’s also more damning. It’s not impossible to have a good three-to-four-minute song. But the easiest way to do that is to first create an incredible two-minute or five-minute song and then make it demonstrably worse. If you’re aiming for that sweet spot to begin with, then I question your taste, your level of mental complexity, and your general purpose, and I also reserve the right to turn off the radio, just like *that*.
“Oh, wow,” said the writer, now liberated from the incessant bleating on his eardrums, free to spend three-and-a-half minutes however he pleases. “Look what you made me do.”
The Steelers lost in the playoffs yesterday. There was a time when such an event would send me into a weeklong funk. That it doesn’t now can be chalked up both to my heightened age and my proportional apathy about sports–and relief that I don’t have to watch Tom Brady mince my team’s secondary for the umpteenth time next Sunday. But really, I think the loss to the Jaguars also laid bare some basic truths about football’s structure that warrant consideration.
Let’s start with the game itself. The most scintillating aspect of the Steelers yesterday was their receiving corps. Antonio Brown (2x), Martavis Bryant, and the versatile running back Le’Veon Bell all made tremendous touchdown grabs; Bell had a second score on a lateral where he caught the ball at the ten and made the remaining defenders look silly. From an aesthetic perspective, the Steelers shined yesterday: Pretty much every play they made was an overt demonstration of supernatural physical talent and coordination. It was–in combination with the Vikings’ dramatic last-second victory over the Saints–a day of football that reminds you that, oh yeah, football can be pretty fun.
So we’re all good, right? Well, even in that list of sterling performers were some insidious strains of the NFL’s flawed ethos. Brown was playing through a torn calf muscle that–although it didn’t hinder his bottom line–nevertheless seemed to affect him throughout. On the Steelers’ last drive, down ten with a minute to go and the game essentially out of hand, Brown wasn’t out there even though Bryant, Bell, and JuJu Smith-Schuster all were. He did what fans–and coaches, and owners–expect of their players, which is to play through injury. He played yesterday at “not close to 100 percent,” according to a source. But, the logic goes, that’s fine: The bill will come later. For now, he needs to do his job.
It’s a tortured philosophy. Players risk injury every game; when they inevitably suffer setbacks, they’re expected to risk further injury for the betterment of the team by playing before they’re ready–even though they’re endangering their future income when making such a decision. NFL contracts aren’t fully guaranteed; most players can be cut at little-to-no expense.
Bell, the inimitable running back and yesterday’s other star, has made waves with his upsetting of the apple cart that is the NFL labor market: He’s talking retirement, despite being 25, in the prime of a running back’s career. The reasoning? He played this season under the franchise tag–a one-year contract that, for superstars like him, represents wage suppression and a source of long-term instability. (A franchised player’s pay is the average of the top five paid players at the position or 120% of his previous pay, whichever is greater. For Bell, arguably the best in the league, such a sum won’t be close to his true market compensation.) The Steelers are threatening to hand him the franchise tag again, which means he must play productively and healthily for another year to earn the multi-year contract he deserves. Again, the player must sacrifice in the short term, with little guarantee–or likelihood–things will work out for him.
But the most uncomfortable element surrounding the NFL and the Steelers remains the specter of Ryan Shazier. Shazier, the middle linebacker who suffered a spinal injury six weeks ago and has just recently regained feeling in his legs yet is still wheelchair-bound, continues to maintain a positive disposition despite his condition. He’s shown up in the press box for multiple games; he spoke to the team at halftime during Sunday’s loss. Team members wore his shirt featuring the motto “Shalieve” before the Jaguars playoff game; a portion of the shirt’s proceeds will go to spinal research and The Boys and Girls Club of Western PA. Shazier in recent weeks has become a mascot, a Gipper-like figure for the team to rally around. Had the Steelers won, and even continued on to the Super Bowl, I have little doubt he would have been a Media Day staple.
What’s unfortunate, though, is that this storyline of esprit de corps almost allows the Steelers and, by extension, the NFL to turn a catastrophe into a branding mechanism. A disastrous injury–rather than sparking conversations about improving player safety, difficult discussions about whether and how inextricable elements of football’s violence can be eradicated–merely becomes a tangential movement, much like the blackballing of Colin Kaepernick for protesting police brutality somehow turned into Jerry Jones taking a knee for “unity.” Shazier’s paralysis feeds into a Rovellian blog post. A player, with his agency lost, transitions to a motivator for the betterment of the team. A man becomes a brand, and the league manages to sidestep any individuality and humanizing elements, and to avoid acknowledging–yet again–a blemish on its shield.
I don’t mean to blame Shazier for remaining positive, or for making T-shirts for this good cause. I’m happy–and amazed–that he’s able to turn a negative into some semblance of a positive like this. I blame the league, though, for being complicit in the aftermath of his injury, for not directly addressing the risks football poses for players and taking responsibility for them. On top of the other compensation disadvantages the NFL imposes upon its labor pool, NFL players on average have a career of 3.5 years, and the league’s average pay and minimum pay are the lowest among the big four U.S. sports. I’ve focused on stars in this post, but for the NFL’s second and third tiers, the situation is even more dire.
The solution or sacrifice can’t always come from the individual. The plights of three great players–Brown, Bell, and Shazier–all indict the system. Which means, quite simply, the system has to do better. I’d be lying if I said I was hopeful.
My previous job was as a management consultant. For the record, I don’t recommend it. As a profession, it’s ripe for parody. Consulting work can loosely be bucketed–that’s Office suite slang for grouped–across three…buckets: The first contains recommendations that confirm the company’s prior knowledge and can now, weaponized by the newfound confirmation, be acted upon; the second is filled with ideas that are intriguing but, whether due to internal politics, inertia, or general distrust, will never see the light of day. (A third, much more diminutive bucket boasts interesting proposals that get deployed and substantially change the business.)
Very few of the findings consultants provide are actually incremental. Their analyses package neatly into PowerPoints, layered with matrices and flow charts that belie any sense of complication, but at their core, these slideshows provide besuited executives a chance to nod their heads at the prevailing wisdom that they’ve just now heard in a marginally novel form. Alternatively, the executives might also shake their heads at counterarguments, at which point the consultants blinkingly read the room, saying they can revisit the numbers and “touch base” next week.
That doesn’t mean, however, that consultancies aren’t valuable to firms: Businesses have recognized the power an outside voice can have in reaffirming conventional concepts. For example, if a company needs, or just wants, to downsize, it’s easier to point to a sobering McKinsey analysis than to start pink-slipping and white-boxing folks on a random Friday. (Not every consulting project is like Up in the Air, but still.) There’s an insulating layer–from anger, from accountability–that companies are buying for themselves when they hire consultants. Consultants invariably arise from the top universities, and from the tops of those universities’ classes. They are, by any quantifiable metric, smart. Smart is good. If things go awry after hiring smart people and cushioning them with luxurious resources and bottomless expense accounts, then, well, shucks.
The executives were just gallantly trying to follow their consultants’ wisdom, the logic goes. What else were they supposed to do?
Religion works the same way–that is, it occupies the same role–as consulting, but for individuals. That’s not a knock on religion: Many of its tenets, like much of the advice consultants provide, are positive, good, cromulent ideas. Consultants try to improve the business world; religion works on the social, moral world. The latter is invariably more noble, but it’s not divorced from the unsavory practices of the first. And too often, the veneer of surface-level virtue that religion offers merely enables bigots to manufacture an excuse.
It’s hard for me to believe, for instance, that an unblemished soul would pick up the Bible and decide to hate gay people based on a few choice paragraphs–that those paragraphs would outweigh much of the balance of the Bible’s other teachings of, roughly, “try to be a good person.” The idea that Masterpiece Cakeshop’s Jack Phillips could look at a couple and, because they were two men, refuse them a cake seems less an act that would inherently offend God and more, like, well, Jack just being a dick.
But religion provides a level of detachment. The cakemaker isn’t hateful; instead, he’s devout, principled.
Plenty of religious people accept gay marriage–two-thirds of Catholics, for example, and a higher percentage of white mainline Protestants–working from the very same texts as those who believe it a sin. God’s word, then, isn’t absolute. There’s wiggle room.
I also struggle to believe God has strong thoughts on abortion, or vaccinations, but these, too, are arguments the religious right advances. There are ways to stretch God’s word–his recommendations, essentially–to fit one’s agenda: The Bible doesn’t mention abortion, but a control-F for “murder” turns up a few results.
Likewise, there are ways to fill in the gaps in the good book. Since the Bible certainly does not address all these phenomena–the book’s authors unsurprisingly had little concept of, say, penicillin–a Principled Man can say that any sort of vaccination is against God’s Will: That declaration, alone, makes the practice haram. Making such a case, when considering children who are at great risk for particular diseases, arguably clashes with one of the Ten Commandments (“Thou Shalt Not Kill”).
Here, we see that an internal argument can be waged for either side, if one wants. How does one decide? My best guess is that the ultimate recommendation will hew closely to what one had predetermined was right and wrong. And if the evidence doesn’t quite match the conclusion one prefers, then perhaps the “numbers”–the passages of interest–can warrant a closer examination until they nicely coalesce.
These Principled Folks are just trying to obey God’s wisdom, the logic goes. What else are they supposed to do?
Most things in life don’t have intrinsic value: Ideas, currency, and skills, for example, all only become worth something when a critical mass of people decide they’re worth something. The corollary, though, is that once something becomes “worthy”–say, a consultant’s advice–then a secondary halo of worth can be gleaned from a surface-level adherence to the first. In the past decade, as “big data” became hot, companies could become “cutting-edge” because they were “data-driven.” It didn’t matter how the data would be applied, or even how well it would be used, just that the data and the company name were included in the same search-engine-optimized press release.
In the corporate world, such a pattern is odious, but the harm tends to be limited to the firm itself. If a company runs itself into the ground with deleterious decision-making, no one’s life is significantly hindered.
When considering religion and the broader world, though, it’s a concern. God is good, we have all agreed; by listening to God’s word, people can be good. And, well, good people try to do good things–even if that requires them to hold some beliefs that hurt others. So much so that, in America, the right to practice those beliefs was noted before any other in the Bill of Rights.
But what they’re doing–bad things, over and over again–can’t be excused because it’s in the name of God. It’s no different from–or better than–a corporate calculation, a mere way to delegate touchy subjects and defer hate. By couching social (and scientific) issues in questions of God’s Will and Principles and Ethics and all this Pence-ian dogma–and not just an analysis–we do ourselves and society a disservice. We’re letting people pass off their misbegotten, bigoted beliefs on a higher power. We’re letting them evade the argument, allowing them to hide behind a facade of an ideological fortress.
I’d argue the saddest part, though, is we’re letting them off easy because we don’t question them. We don’t ask whether these people are listening to God, or simply waiting for an outside voice to tell them what they want to hear.
That guy who beat the absolute piss out of Rand Paul
Someone who has read and, like, understands Gravity’s Rainbow
I don’t know, maybe a person of color or something
A Mets fan (not Bernie Madoff)
A Browns fan not from Cleveland
Literally anyone involved with the OJ trial
The cast and crew of Ballers
Oh, and if that person of color is a woman, that’d be awesome
David Lynch’s personal assistant
Guy/gal with a foot fetish
Norm Macdonald, again
Those big bad, scary antifa folks
Especially the dude in Charlottesville who had a homemade flamethrower
Guy with a handlebar mustache
Guy with a soul patch
Person who serially over-shares on Facebook
Person who RTs Twitter bots
Someone who genuinely enjoys Andy Borowitz, Darren Rovell’s analysis, and/or Nickelback
Anita Hill, for example, wouldn’t be a bad choice
The guy who invented Crocs
Any of the writers who were passed over for David Brooks’ job
Tuesday’s a big day, in the sense that each opportunity the Democrats have to rebalance an ever-shifting electoral map constitutes a big day. There’s a sick voyeurism to the proceedings, if you’re not in a major state like Virginia, New Jersey, or Alabama: In North Carolina, for example, I’ll only be voting on a new mayor and city council. Any right-wing carnage that emerges in the state to my immediate north is, to some degree, not my problem. I can take it in as a soccer fan does a last-second opposition goal that puts his team, already down 5-1, down 6-1. It sucks, but if the comically disastrous outcome results in the hiring of a new manager (read: a DNC shakeup, new platform, etc.), then maybe it’s actually worth it.
Nevertheless, I feel compelled to deliver the paternalistic advice that wherever you are, you should vote. It’s easy. This year more than any, the lines won’t be bad. (And similarly, this year more than any, your vote is likely to make a difference.) You’ll get a sticker. You’ll skip work. It’s not cool, exactly, but it’s surprisingly useful.
And if you’re bothering to get out the vote, please do one small-but-important thing: Vote smartly. Look to your local papers for advice, for details on candidates’ platforms. Pick one of the favorites who sounds the most appealing. Protest votes, on the national level, are merely a gesture; at the local level, they have a more tangible, damaging impact.
Depending on where you’re coming from, this is likely obvious or an incredibly specific warning, but there’s a reason for it, I swear.
Just this past week, Maine governor Paul LePage–a slightly more human version of Mitch McConnell–vetoed a voter-approved bill that would legalize recreational marijuana in the state. He linked it, spuriously, to the developing opioid issue in Maine, an issue that he’s fought by denying treatment to those without insurance and merely looking to expand prison facilities. But by LePage’s standards, his failure to respect the will of Maine voters is rather far down on his list of crimes and embarrassments.
He’s a governor who, during his first month in office, went out of his way to blatantly ignore MLK Day.
He’s a governor who, in a textbook case of foot-in-mouth disease, has compared the IRS to the Gestapo.
He’s a governor who has issued a record number of vetoes for a governor, to the degree that he, leaning fully into his role as a political heel, named his dog Veto.
He’s a governor who has blackmailed a school charity organization–threatening to withhold state funds–over their selection of a new president. He’s a governor who, after that episode, can claim to have survived an impeachment attempt.
He’s a governor who cried fake news before anyone else, overtly telling newspapers he didn’t want their endorsement in the run-up to his 2014 election.
He’s a governor who, before taking the Blaine House, made his living running Marden’s, the local Maine surplus chain. It’s a depressing place, a Salvation Army with no mission beyond profits, where you paw through crap that’s only available because some flatbed tipped over on I-95. It is the perfect place for LePage to thrive: Second only to the now-abandoned paper mills that dot Maine’s eastern corridor, it’s the purest, saddest distillation of capitalism in the Pine Tree State.
On that note, he’s a governor whose perhaps only feasible promise was job growth; and yet, from 2009-2014, while the U.S. economy grew nearly 10% and the New England economy grew nearly 6%, Maine’s economy shrank.
He’s a governor who blamed the growing drug trade in Maine on “guys with the name D-Money, Smoothie, Shifty” who “come from Connecticut and New York, they come up here, they sell their heroin, they go back home. Incidentally, half the time they impregnate a young, white girl before they leave, which is a real sad thing because then we have another issue we have to deal with down the road.” He, when asked about these comments, claimed to have a binder of such criminals; when reporters filed a request to see the binder, LePage dialed up one of them and called him a “son of a bitch, socialist cocksucker.”
LePage is, in short, horrible. He is so bad that the state voted in 2016 to adopt a new way to vote–instant-runoff voting–that would prevent such a catastrophic, undesired figure from taking control in subsequent elections; in short, he is so bad that, like a bumbling time traveler, he has killed off his future self.
You see, in 2010, LePage won a de facto 3-way race with a mere 38.1% plurality, as independent Eliot Cutler and Democrat Libby Mitchell split 54.7% of the votes among them; fewer than ten thousand votes separated LePage and Cutler. In 2014, LePage won again, but still with Cutler and the lead Democratic candidate splitting the majority of votes.
The new system would punish candidates with “high negatives,” making it much harder for an odious figure–one who alienates sixty percent of the electorate yet prevails because almost any alternative seems better, splintering the opposition vote–to win. Alas, this system–the first of its kind in the nation–hasn’t yet been implemented; based on a recent ruling from the Maine Supreme Court, no elections will be affected until 2021 at the earliest. There’s no guarantee that other states will follow suit.
Really, the only reason the new system received such a groundswell of support and made it to the ballot stage is that Maine’s current governor is an absolute clownshow. On one hand, it’s nice to see those on the left, when pushed to the brink, actually doing something. But of course, from Maine’s perspective, these ends really, really, really don’t justify the means.
With instant runoff voting on hiatus, preventing future episodes like Maine’s requires contributions throughout the electoral process–from candidate selection to town hall participation to local advocacy. Today, this wave must culminate with smart choices in the voting booth. There are countless more LePages waiting in the wings–Roy Moore in Alabama, Ed Gillespie in Virginia, and many sprinkled throughout city council and mayoral ballots. This Tuesday is a chance to learn, to reflect, to respond.
I don’t particularly care for Halloween. Much like puns, April Fool’s Day, or describing literally anything as “fake news,” it presents an opportunity for the masses to clear a very low bar of both cleverness and pop culture fluency. The end result, then, equates to a clunky game of Cards Against Humanity, where all the cards are written by the players, players who, on average, constitute the core demographic for Alec Baldwin’s insipid Trump impression. It is a holiday for everyone–most of whom are unfunny, few of whom are witty–to try to be funny and witty.
Effectively, it is a roving open mic night that has merged with Comic-Con. It’s awful.
I’m not the first Halloween curmudgeon, though. There have always been people who are lazy with costumes, those who piss on others’ jack-o-lantern carvings, or the neighbors whose houses wouldn’t get knocked on for fear of razor blade apples. But those attacks are, to be honest, played out. As the rest of our holidays have evolved–Thanksgiving with Turducken, Columbus Day with not existing–it’s time for the haters of Halloween to look themselves in the mirror; get a tummy tuck, some yoga pants, and maybe a new and ethnically-ambiguous personal trainer; and start moving to make this October 31 one to remember with these suggestions: