I confess: I am but one week away from commemorating my 40th year on this planet, and I have yet to ever play The Game of Life. This is not due to some ethical or existential objection to simulating the course of one’s existence upon a square slab of cardboard, but rather due to my friends and me having spent our youthful recreation time with Star Wars toys and kindly ol’ Super Mario. I never got around to playing Candyland either.
As beloved as this board game may be, with its plastic minivans, its cruel cash-drains and generous paydays, buried deep within its roots is a transformative story. The original version of the game, concocted by Mr. Milton Bradley himself, elevated the concept of gaming from prescriptive quests for moral elevation to a more practical and modernized measure of success. More importantly, it came packaged with choice.
The Game of Life as we know it (well, as you probably know it, since I’ve never played the thing) features one early decision: go to school or get a job. After that, each soul is subjected to the whim of the spiteful spinner, suggesting that life is but a cavalcade of random collisions, and that we are always at the mercy of the fickle flick of fate. Mr. Bradley’s outlook on destiny was far more empowering.
Tracing the Bradley lineage would suggest that a rather dreary definition of “life” could have taken center-stage in his outlook. The family tree was planted in America in 1635, and since then its bark shows the hatchet-marks of murder, Indian attack, kidnapping, and at one point hot embers being poured into an infant’s mouth. When Milton finally squeezed his way onto the planet in 1836, the Bradleys were a little less prone to being butchered, but far from being economic titans.
Here’s the part where the guy twitching in the hungry crosshairs of 40 tells you how the music topping the charts these days won’t inspire so much as a quiver of his trigger finger. But really, who cares? The purveyors of popular song have no interest in capturing my iTunes money. Just as my parents wondered desperately who on earth would want anyone to Rock Them Amadeus, I can’t fathom why Iggy Azalea’s “Fancy”, a piece of simplistic monotony with a Clueless rip-off video, spent a month this summer at #1.
Ever since the end of the halcyon days of ’80s pop, the soundtrack that flipped the pages of my childhood, I have paid scant attention to the Billboard Hot 100 chart. While MC Hammer’s parachute pants flapped in the raucous wind of his success, my high school friends and I were discovering the mystical quests within the grooves of Led Zeppelin and Pink Floyd records. So I guess I haven’t been hip in about twenty-five years. I’m okay with that.
I’d always been a trifle suspicious of this chart anyway. What is it counting? Sales? Radio airplay? Likelihood of ending up as a parody on Weird Al’s next album? There is actually some math to this madness, and it’s far too complex for my mid-week brain to tabulate without a nap under its belt. But I’ll do my best.
For almost two decades prior to the Hot 100’s debut in the pages of Billboard (yes kids, Billboard was and is an actual magazine; a magazine is kind of like Buzzfeed.com made out of trees), the chart tabulators kept track of three separate stats: the best-sellers, the songs most played by disc jockeys, and the songs most played in jukeboxes. That last one was key, as a disgraceful clump of radio stations were refusing to play rock ’n’ roll in the mid-1950s. Billboard had to track what was big with the kids.
Do you sometimes feel as though you’re haunted by bad mojo? Do you sense a crinkly shadow slurping up your footsteps, stalking you with hand-wringing deviousness and an insidious yen to muck up your days with the swift slap of a paranormal brute? Well, I have good news for you. You are almost definitely wrong, and it’s entirely possible that you’ve been soaking your brain too long in the tart brine of unjustified paranoia.
While it’s true that some people appear dogged by a mystical and unspoken conflict with their electronic devices, watching them break down at a rate far exceeding average, no one sporting an official science-badge in the brim of their hat has stepped forward and confirmed this phenomenon. There is no bio-electronic battlefield, no psychic-binary clash of DNA and circuit-board synapses. Yet most of us can relate stories of friends or relatives whose luck with electronics is notoriously foul. Folks who cycle through crapped-out cell phones more frequently than shampoo bottles, or whose computers are swimming in a vortex of perpetual blue-screen mayhem.
Maybe there’s something to this madness. It’s not like all of science has ruled this out. Take, for example, Austrian theoretical physicist Wolfgang Pauli.
Wolfgang was no slouch. Nominated by Albert Einstein, he snagged the Nobel Prize in Physics for developing the Pauli Exclusion Principle, which has to do with quantum mechanics, spin theory, and a star-studded cast of concepts I won’t pretend to understand. Pauli’s lasting reputation among those of us whose brains aren’t tuned to the frequency of theoretical physics is his bizarre effect on lab equipment.