The guy who invented email died earlier this month. Although at this stage in email’s evolution, we probably would rather bury Raymond Tomlinson than praise him, there is no denying its influence on how we conduct our personal and business correspondence today.
I have been using email since 1985, giving Ray a bit of a head start to perfect things after he first taught computers attached to ARPANET to address each other individually in 1971. By 1986 I was actually in charge of our entire corporate email system, a responsibility I was as far from being qualified for as being in charge of refueling the space station. Luckily being in charge meant having a group of people who did the actual work of making sure the system was alive and well and accurately transmitting electronic messages coast to coast. My crowning achievement was overseeing the day we had to break the system into two parts to accommodate the growth in user base, which happened on a weekend so as not to cause inconvenience. And that in itself tells you how far we have evolved (or actually, devolved): it was perfectly okay to go without email on a Saturday and Sunday and the world did not end. Fortunately the electronic surgery went off without a hitch because I have no idea what I could have done if everything had gone south except head south myself to Paraguay or somewhere else without extradition.
Email initially replaced paper memos as the primary device for internal communication. Like email, memos arrived in your personal inbox. Like email, memos were often useless distractions, such as announcing a redesign of the spaces in the parking lot (an actual memo I actually got in 1979). Unlike email, it was easier to say you didn’t get the memo and therefore willfully ignore the shift in lines of the parking spaces.
But to give it its due, email is perhaps one of the few electronic innovations that actually contributed to the (as yet unattained) goal of the paperless office because it made the paper memo and its cousin the three-part inter-office communication form obsolete. Oh, except for the guys (and they were all guys) who had their secretaries print out their emails before they would read them, and then have them filed after duly stamping them with the ‘read’ stamp.
Of course email may have made a dent in paper consumption but it created an entirely new problem: inbox bloat. That’s because it’s possible for anyone to issue memos to anyone else all day and, even worse, all night. It also allows the dreaded ‘reply all’ function, which we all know is the 10th circle of hell.
For reasons that escape me, if you search for ‘memo template’ you will get 535,000 results. I personally have not seen the electronically created version of a paper memo since about 1990. However, one thing that was good about them was there were rules about when to use them, rules about what to say in them, and especially, rules about who to send them to – part of what we used to call the ‘rules of business’ communication. And that’s really what companies are trying to re-institute when they make forays into trying to rein in rampant email. If they would just go back to first principles of memo best practices, the problem would be solved. Oops – gotta go – my inbox just pinged me.
We should all know the difference between what happens online or on screen and what actually happens In Real Life (IRL) but somehow it is sometimes very hard to distinguish between these two things. Isn’t IRL where we spend all of our minutes, hours and days, and isn’t NRL (Not Real Life, a term I may have just coined and don’t even think about trying to steal it) the antithesis of actual things that matter? But I digress. Here are some things we see every day on various screens and devices that I wish actually existed in real life.
1. Good wardrobes for working women. Although apparently, The Good Wife gets most of her clothes off the rack, the outfits sported by all of the women on Suits and by Claire Underwood as she swans around the White House are all figments of someone’s very customized imagination. There are endless instances of beautiful neutral colored shifts, effortlessly glamorous trench coats and leisurewear that never ever existed IRL. If TV shows can’t even find clothing appropriate for women in positions of power, how could any normal human being ever hope to look as well turned-out? And even if Alicia does manage to make do with ensembles that do exist IRL, she can only do it by spending $1000 on a blouse (double that on shoes).
2. Technology interfaces. Have you noticed that any technology that appears in a movie or television show bears no resemblance to functionality available on your IRL devices, even if it isn’t science fiction? For example, any time something is copied from one place to another, the documents show as a thumbnail and then whisk themselves neatly off the screen in a chic little ‘whoosh’. Or if something is being transferred secretly from a phone to a thumbdrive (something I’m sure happens every day), you see each individual file being transmitted in a line-by-analog-line to the transportable device. IRL we still get the ‘blue screen of death’ more often than we thought was possible in the 21st century.
3. Apartments that people who do not come from a family of billionaires can afford in New York, LA, San Fran (fill in the desirable city here), etc. I’m sure that Carrie Bradshaw’s midtown bachelor pad would rent for at least $10,000 a month while her earnings as a freelance writer would be more in line with having 5 roommates in a two bedroom slum. People on TV also never have dirty dishes, piles of laundry or cat barf on their carpets. Or maybe they do, and they just have an army of cleaning people that works away quietly in the dead of night to make sure everything is shipshape by morning. IRL we wake up to the same mess we left the night before, compounded by fresh cat barf to step in while searching for something relatively clean to wear.
4. Pets that make money. Of course a handful of people have been able to monetize their pet’s cuteness, but if you do the math it works out to about .0000001% of the pet population that is paying their own way. My cats, for example, are perfectly content to sit back (or lie down) and live a completely work-free life, even though they are at least as adorable as any star of the Internet Cat Festival. IRL pets remain on the wrong side of the household balance sheet.
5. Food that dances like no one is watching. The most recent issue of Bon Appetit is entirely devoted to taking pictures of food, including the ‘rules’ for taking photos at dinner. If you need proof that society as we know it has devolved to a dangerous level of vacuousness, you now have it. Photogenic food is now mandatory, as is the requirement to take and share photos of every meal, snack and raw ingredient. IRL, you are certainly allowed to admire what’s on your dinner plate, but then please just get on with the business of actually eating the food.
Yes, it’s a thing. And a very lucrative thing apparently. There is a job called being a ‘namer’. People pay people to come up with names for their idea, product, concept, movie, drug, colour, and I’m sure babies if you happen to be Kanye and Kim (do you really think they came up with North and Saint on their own?).
In fact the drug thing is kind of the Olympics of naming because there are so many rules around what you are allowed to call drugs (at least in the half of the world that claims to be civilized). One of the gauntlets a pharmaceutical name must run is the dreaded ‘doctor’s handwriting’ test: the name in question must not be easily mistaken for something else when a pharmacist deciphers it. Something like cyanamid (a deliquescent caustic crystalline compound) turning into cyanide (any chemical compound containing the monovalent combining group CN). Oh wait – maybe not a good example.
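By way of illustration only – regulators don’t publish their algorithms, so this is a crude, assumed stand-in for the handwriting test – one machine-friendly proxy is string edit distance: names that are only a few keystrokes apart are that much easier to confuse.

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming:
    the minimum number of single-character insertions, deletions,
    or substitutions needed to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # delete from a
                            curr[j - 1] + 1,             # insert into a
                            prev[j - 1] + (ca != cb)))   # substitute
        prev = curr
    return prev[-1]

# Only three edits separate a caustic compound from a poison.
print(edit_distance("cyanamid", "cyanide"))  # 3
```

Three edits out of eight letters is uncomfortably close – which is presumably why the real test involves actual scrawled handwriting and actual pharmacists, not just arithmetic.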
We do know, however, there are names that can go horribly wrong. The classic is the perhaps apocryphal tale of the Chevy Nova’s launch in Latin America (feel free to look that one up). Another of the better examples of ‘what were they thinking’ happened in 2002, when thanks to the Enron-related downfall of Arthur Andersen, the major accounting firms were all forced to send their ‘advisory’ practices to wander on their own in the desert. PricewaterhouseCoopers (in itself a poster child for naming gone very wrong) decided to call its newly minted consulting offspring ‘Monday’. According to those who do the math, this exercise in rebranding cost $110 million, or roughly 2 percent of the offshoot’s projected annual revenue. Much hilarity ensued. I can report, having been an actual employee of PwC (the post-Monday retrenchment name for the consulting arm), that I am not at all surprised they were hoodwinked into buying that particular version of the Emperor’s new clothes. But honestly, what would any sane person be thinking? Here are three examples of what they failed to anticipate.
1. “I don’t like Mondays.” They missed this one because of course no accountant would know anything about the Boomtown Rats, only rat races in general.
2. “Monday, Monday, can’t trust that day.” In fact, as we know, every other day of the week is fine. So perhaps they should have moved further down the list of potential names.
3. And finally my personal favourite: I Bought Monday. Because probably to escape the shame of retrenching and recoup some of the tithe to the ‘namer’, PwC managed to convince IBM to buy its consulting arm and thus unceremoniously kicked Monday to the curb. In a more ironic turn of events, once the statute of limitations on accounting firms having advisory businesses expired, PwC proceeded to poach back its key employees from IBM. But that’s another story…
So next time you notice – or even more telling, don’t notice – a name, remember that it didn’t happen by accident. Imax and Jello and Coke and Apple and Starbucks and yes, even IBM, are all proof that names matter.
Blogs didn’t exist when Elvis and John Lennon died. There was no Instagram to fill up with selfies of chance encounters with newly dead music celebrities. And there was no Facebook on which to repost eulogies and elegies and bootleg YouTube clips of long lost performances of dubious provenance. Despite the handicap of no internet at the time, I do remember exactly what I was doing when I heard the news (oh boy) in both cases.
I was folding laundry in the late afternoon of August 16, 1977 when word of Elvis’ demise came over the airwaves. I can’t say I was a huge Elvis fan but he wasn’t completely absent from the soundtrack of my youth. Maybe it was his relatively young age combined with his larger than life persona that made it a shocking event. And indeed there were many who were unable to believe in a less-than-immortal Elvis (come to think of it, that faction is still alive and well).
With John Lennon it was a little more of a momentous event, him being a critical component of the most commercially successful band in the history of popular music (and who, like Elvis, has made way more money dead than alive). He was also young – just turned 40 – but the most unfathomable part was being taken out by a North American ‘fan’. And in the true litmus test of how impactful this event was, his death was announced by Howard Cosell on Monday Night Football. Since I don’t watch football on Monday nights or otherwise, I didn’t find out until the morning of December 9, 1980 when it seemed peculiar that all of the songs on the radio featured John Lennon. Lennon is on record as crediting Elvis for getting him out of Liverpool and he got a chance to meet Elvis when the Beatles were on their 1965 summer tour in the U.S., although apparently he found the meeting less than earth shattering.
Which brings us to David Bowie. I was also not a huge fan, him being more of a guy thing, what with his obsession with spiders and Mars and probably puppy dogs’ tails. I did warm to him slightly in his disco years, when I am sure his diehard fans just locked themselves in their basements and listened to endless loops of Space Oddity on their reel to reel tape players. But I digress. Although it might seem far-fetched, there are way fewer than six degrees of separation between Elvis and the Thin White Duke. First, Elvis supposedly approached Bowie in 1977 to produce one of his records. Second, ‘Golden Years’ is supposedly about Elvis. Now of course it is impossible to verify either of these ‘facts’, being that dead men don’t tell tales, but I like them anyway. But what is true is that both David Bowie (Jones) and Elvis Presley were born on January 8.
And then even more dominos started to keel over:
• Glenn Frey from the Eagles, whose favourite way to sum things up was apparently “Ladies and gentlemen – Elvis has left the building”. Or at least that’s according to Joe Walsh so maybe take that one with a grain of salt.
• Dale Griffin from Mott the Hoople, who I am pretty sure never met Elvis but would not have had any semblance of success without David Bowie because they wouldn’t have had ‘All the Young Dudes’ handed to them on a silver platter, and also because Glam Rock would never have been invented.
• Mic Gillette from Tower of Power, who I am also pretty sure never met Elvis. But in case you are looking for a change of job, Tower of Power is currently searching for a new lead singer (not to replace Mic – he was a brass player) if you don’t mind being on the road 200 days a year.
And now it’s not because of too many deep fried bacon sandwiches or assassins wanting more than 15 minutes of fame or falling prey to doctors with unlimited prescription pads. Instead, it’s what takes us all out in the end – bad genes, bad timing and just bad luck.
I’m sure I’m not the only person who thinks that New Year’s Eve is a very strange construct. In fact I know I’m not because there was an article about this very thing in the paper today and there is probably a similar one published every December 31st. With apologies for bandwagoning, here are my several cents about this issue.
What is a year anyway? Since you asked, the scientific definition of a year is the orbital period of the earth moving around the sun. So that seems pretty straightforward and something we can all agree on. Oh, except of course if you are a follower of geocentrism (which yes – is actually a thing – that unsurprisingly is apparently tied up somehow with creationism and probably has Sarah Palin as patron saint). But putting aside ‘earth as the centre of the universe’ truthers, the concept of ‘year’ all goes out the window after we move beyond the orbit thing, because we insist on trying to subdivide and count time.
Time, of course, has its own existential challenges. According to Mr. Webster, “time is a measure in which events can be ordered from the past through the present into the future, and also the measure of durations of events and the intervals between them.” So in other words, time is something we invented in order to explain things like time. But I digress.
At some point we started to assign numbers to years, or orbital periods of approximately 365 days, although the 365 day thing is also arbitrary because the actual length of an orbital year depends on how you measure it: a tropical year, for example, lasts 365.24219 days, while a sidereal year runs about 20 minutes longer. Anyhow, when we started assigning numbers to years it was long after the start of our orbital birth, or about 4.5 billion years ago if you are in one camp, or 6 to 10,000 years ago if you are in another. So either way you slice it, we are way beyond any year starting with 2000.
Then let’s take the decision to decide a new year begins at the beginning of a month those of us who use the Gregorian calendar call January. January didn’t even exist until about 700 BC (or around 4,539,997,285 for purists) and it wasn’t until 46 BC (4,539,997,939 on the real calendar) that Julius Caesar decreed in his wisdom that forevermore January 1 would be the start of a new year. Then it literally all went to hell sometime in the Middle Ages when it was decided that all things ancient Rome were pagan and we had to choose some other random point in the year to be the start of a new orbit around the sun, most commonly the vernal equinox which also pretty much corresponded to the Christian Easter observance. Which kind of makes more sense, because spring is much more like a new beginning than winter in my book. Alas, this temporary logic was reversed when towards the end of the 16th century (4,539,998,435 to be precise) the Gregorian calendar restored January 1 as the anointed start of a new year in the Christian world.
So have a great 2016, or 5776, or 1437, or 4,540,000,001. Whatever it is, it’s still the first day of the rest of your life.
Apparently George Boole has a crater on the moon named after him. But that is not his key claim to fame. Without Mr. Boole, Google would not exist nor would the internet itself because computers wouldn’t exist, which is perhaps why they recently honoured his 200th birthday with a doodle. Despite my dubious relationship with math, I can credit Mr. Boole with a large part of my career direction.
One arm of philosophy (and in case you didn’t know, it contains more arms than an octopus) is logic. Logic is, of course, the study of valid reasoning (as opposed to invalid reasoning, which of course is the study of Donald Trump). It goes all the way back to Aristotle or about as far back as you can go, philosophy-wise. I have an entire degree in philosophy in spite of my deep dive into logic, which danced too close to the flame of mathematics to be healthy for my GPA. And this is where Mr. Boole first tried to trip me up with his invention of Boolean algebra. In his own words: “No general method for the solution of questions in the theory of probabilities can be established that does not explicitly recognize, not only the special numerical bases of the science, but also those universal laws of thought which are the basis of all reasoning, and which, whatever they may be as to their essence, are at least mathematical as to their form.” Right. Got it. Is it lunch time yet?
But somehow I was able to grab my B.A. in good enough order to escape to graduate school. To be exact, to library school where, contrary to popular belief, you do not learn how to shelve books (well, I guess yes you do, in the form of learning the Dewey Decimal classification system and the Library of Congress classification system, both of which are powers to be reckoned with), but how to find the answers to questions by searching through the existing base of knowledge. So I guess you could say I have an entire degree in research. A degree that has been lucratively applied only in the context of not being employed as an actual librarian (but I digress).
The internet is where we researchers live these days. Most people do not venture past the simple yet elegant Google search box or any other simple search presented on the front page of most websites. Pity the fools, for they do not know the power of dipping their fingers in the Boolean pool. Because Mr. Boole figured out the power of three simple words: And, Or, Not. Unfortunately the Boolean search capability is well hidden on the interweb, perhaps to prevent people from actually finding what they want as opposed to what the ‘sponsored content’ people want you to find. In fact, to get to the Google advanced search you have to go to google.com/advanced_search. And so it goes. The more we know the more we can’t access it.
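To make those three magic words concrete, here is a minimal sketch – using a made-up toy index, not any real search engine’s internals – of how And, Or, and Not carve up a set of documents:

```python
# Toy inverted index: each word maps to the set of document ids containing it.
index = {
    "cats":    {1, 2, 5},
    "dogs":    {2, 3},
    "rabbits": {4, 5},
}
universe = {1, 2, 3, 4, 5}  # every document we know about

def AND(a, b):
    return a & b            # documents containing both terms

def OR(a, b):
    return a | b            # documents containing either term

def NOT(universe, a):
    return universe - a     # documents that lack the term

print(AND(index["cats"], index["dogs"]))    # {2}
print(OR(index["cats"], index["rabbits"]))  # {1, 2, 4, 5}
print(NOT(universe, index["dogs"]))         # {1, 4, 5}
```

Every advanced-search form is essentially a friendlier skin over these three set operations, which is why a librarian armed with them can outsearch the sponsored-content crowd.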
According to those who figure these things out, we are approaching the zettabyte threshold of information on the internet. Who knows what we’ll get to after zettabytes (yottabytes, apparently), but each zettabyte is 1,000 exabytes. One exabyte is 1,000 petabytes. One petabyte is 1,000 terabytes. One terabyte is 1,000 gigabytes. You get the drift. But all I know is the knowledge of how to access all this ‘knowledge’ (of course excluding zettabytes devoted to cat pictures and Kim Kardashian’s rear end) is sorely lacking. Maybe that library degree will become even more valuable as we move deeper into the new millennium. Or not.
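For the arithmetically inclined, the ladder above is just repeated multiplication by 1,000 – a quick sketch to confirm that one zettabyte works out to a trillion gigabytes:

```python
# Climb the unit ladder from gigabyte to zettabyte, one factor of 1,000 per rung.
units = ["gigabyte", "terabyte", "petabyte", "exabyte", "zettabyte"]

gigabytes = 1
for _ in units[1:]:      # four rungs above gigabyte
    gigabytes *= 1000

print(f"one zettabyte = {gigabytes:,} gigabytes")  # one trillion
```

(Using decimal steps of 1,000, as in the text; hardware folks who prefer powers of 1,024 would call these zebibytes and get a slightly bigger number.)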
The third most quoted collection of written works, behind the Bible and Shakespeare’s oeuvre, turns 150 this year. Although all of these contenders feature stories that stretch the boundary of credulity (Walking on water? Fairies doing matchmaking?), Alice in Wonderland (and her further adventures on the wrong side of the looking glass) surely wins the prize for sheer, blatant nonsense. But, of course, in a good way. Here are some of the ways in which Lewis Carroll’s influence has permeated our culture over the 19th, 20th, and 21st centuries.
1. Carroll invented the ‘portmanteau’: combining two or more words and their meanings into a new word. Without him, we would not be bothered by smog, be able to stay in a motel, visit Tanzania, wear a skort to brunch, or eat turducken. Portmanteaus are also crucial to the very fabric of popular culture, as Bennifer, Brangelina, and TomKat can attest.
2. The effect that Alice and her friends have had on the English language ranges far beyond our ability to name celebrity couples. I would venture to guess that many of the internet quotes and aphorisms attributed to the Dalai Lama or George Takei (which actually might be the same person) are straight from the pages of Alice in Wonderland. Ever put all the king’s horses and all the king’s men on an impossible job or realize that if you don’t know where you are going any road will take you? I rest my case.
3. As you may know, Through the Looking Glass is structured as a chess game. This led to the creation of ‘Alice’ chess in 1953, which then morphed (literally) into Quantum Chess, where a chess piece is not a static thing but a ‘superposition’ of fluid properties that cause it to act in any chess role depending on circumstance. This makes Lewis Carroll an honorary physicist and by extension an honorary rocket scientist. Top that, Mr. Shakespeare.
4. Somebody has claimed every possible domain name related to Alice in Wonderland including Jabberwocky. The reason the website owner in question chose jabberwocky.com was because in the initial days of URL land grab, it was the only thing available that had any remote affiliation to his life (that being his favourite poem). As a result, he ended up with a click bait generator rivaling ‘5 Things You Need to Know About Kim Kardashian’s 10 Favourite Vegetables’, because of the enduring fascination with all things Alice and the fact that no one can remember exactly how the poem goes when the occasion arises.
5. Finally, in perhaps the most sentient sentiment, the Mad Hatter (via the 2010 film adaptation, if not Carroll’s own pen) delivers the immortal words: “You used to be much more muchier. You’ve lost your muchness.” And indeed, how much muchness would we all be missing if we had never met the Mad Hatter, suffered the wrath of the Red Queen or believed in talking rabbits.
So now we begin our forced march towards winter. There is no choice but to put one foot in front of the other and soldier on into certain peril. November’s gloom looms on the horizon and March won’t enter the frame until six months from now. It’s almost time to pack up the cans and condiments, put away the deck chairs, and clean the fridge before sealing up the cottage time capsule for yet another year. A task that’s enough like every Sunday night to feel familiar yet so very different.
Everything takes on a particular gravitas when you know it will be the last time you do it – or maybe not really the last time ever, but the last time before winter, and the last time before summer rolls around again. Some of this is because of the delicate nature of certain end-of-season chores like winterizing the plumbing, which has clear potential for disaster, while others are equally liable to end in tragedy, but less obviously so. Things like:
1. Forgetting to repatriate the items essential for the winter beach vacation. Unless you wait until March (which everyone knows is not a valid substitute for a January or February trip), good luck trying to buy suitable clothes.
2. Putting the sheets away in a brand new place that seems so logical and such a brilliant solution to whatever the problem was with where you used to put them away. Only it won’t seem so logical come April.
3. Placing half-read books back on the bookshelf without a bookmark, especially if the only reason they were half-read is that they got forgotten when a new shiny object of reading desire shoved them down to the bottom of the pile. There is nothing worse than picking up a book in the spring and thinking you have read it already even though you only made it part way through.
4. Not adequately cleaning the oven. Thanksgiving oven detritus tends to become even more petrified over winter. And if you are under the mistaken assumption the oven was left clean in the fall and fail to do a spring inspection, be afraid, be very afraid.
5. Leaving the lounge chair cushions in the ‘dock box’, hoping they will remain unmolested by rodents (with and without bushy tails) that are known to prefer a comfortable winter camping spot.
But, as the saying goes, to everything there is a season. If not for winter how could we embrace spring? There’s a time to hunker down and a time to fling open the windows. A time to plant daffodils and a time to accidentally dig them up in the early spring while planting something else. And, of course, a time to just enjoy the time we are in right now instead of some distant glimmer of summer.
This just in: recent research has shown that the most significant rise in shopping dollar expenditure in the past 10 years has been at big box and warehouse stores, not via online retail purchases. At the risk of dating myself, I remember when the word ‘ecommerce’ was coined and the time when our firm’s nascent ecommerce consulting practice was shrouded in mystery and black magic: the blind leading the gullible with high hopes and deep pockets. Although certainly a large dollar volume moves through Amazon’s virtual stores (and I don’t think there is anything you can’t buy from Amazon, from coffins to plastic surgery), people spend much more time researching potential purchases online than actually getting out their credit cards to complete the transaction. The relevant statistic is that 60 to 80% of internet shopping carts get abandoned long before check out time. So one might say (and in fact I am about to) ecommerce is less than meets the eye.
But back to the whole big box thing. It used to be we had department stores where you could buy just about anything: hardware, wedding dresses, cheese and watches. Gradually (or in some cases drastically), these stores pared down and eliminated their wide range of departments. Suddenly you couldn’t buy fabric at the same place you bought sewing machines or pick up a cake and a cake plate at one go. There are about as many explanations for this as there are explainers. Competition from specialty stores, the prohibitive cost of managing too many SKUs, enthusiastically embracing the 80/20 rule, the high cost of downtown rents, and of course, the rise of internet commerce making both physical location and inventory location irrelevant.
Another factor at play here is of course the pace of suburban sprawl. Downtown is far from the only place to shop, and why would you drive downtown and pay for parking when you can drive to your local big box plaza and park as long as you like for free? That’s the good news.
The bad news is your shopping choices consist of Costco and Super Walmart. Costco has to be the ultimate flag bearer for conspicuous consumption. Anyone who has shopped there knows the exact reason why it contributes immensely to the increase in retail expenditure: they don’t call it ‘big box’ for nothing. It is impossible to buy less than two of most things, and certainly impossible not to buy very large quantities of the things they sell by the each. And of course that means it is impossible to get out of there for less than triple digits. But look what you can buy: electronics, jewelry, furniture, lawn mowers, garden sheds, mattresses, clothing, books, toys, flowers, etc. Wait a minute – doesn’t that sound kind of like a department store?
Except the department stores I remember didn’t have fluorescent lights, warehouse shelving and aisles clogged with large shopping carts overflowing with more food than any family should be consuming in an average month and lots of screaming children (who previously were only a palpable presence in the children’s clothing department). Then, instead of dealing with an efficient, smartly dressed, pearl clad saleswoman, you get to line up like cattle being ushered through the slaughterhouse gates to be processed through checkout.
So today I start my one person crusade to bring back the small box store. The store where you can find small quantities of things. Where they wrap them in tissue paper and place them in a bag. The store around the corner from where you live. Where they even might know your name.
About 60 years ago the world changed. And not because that was when Jonas Salk’s polio vaccine was brought to market, almost eradicating the danger of juvenile paralysis, or because it was when the St. Lawrence Seaway opened up commercial shipping between Montreal and Lake Ontario. It’s because that’s when ‘rock and roll’ was officially born, courtesy of a movie called Blackboard Jungle, which featured a song called “Rock Around the Clock”. Unfortunately, with all due respect to Bill Haley and the Comets (and of course, by that I mean not much respect was due), it was a particularly bad example of the new genre of music that was about to overtake popular culture for the foreseeable future.
Apparently, though, ‘they’ have recently (and apparently belatedly) announced the death of rock and roll dominance in popular culture. Rather, according to the people who know these things, we have been in the post-rock era ever since 1991 when Niggaz4life, by N.W.A., sold nearly a million copies in its first seven days and claimed the number one spot on the Billboard 200 – the first time that a rap group had accomplished this feat in the 45-year history of album rankings.
For those of us who were born on the cusp of one of the most disruptive ages of popular music and who have never known a time when ‘rock and roll’ did not exist (and indeed literally grew up with it), it is hard to fathom waiting with bated breath for a new single from Ke$ha (who I assume is part of the hip hop genre, but admit I’m not ‘hip’ enough to know for sure), not only because it doesn’t qualify as something that might be anticipated, but also because the notion of delayed gratification has completely gone out the window. You can download the latest tunes even before they have officially been released and that’s kind of the way it works these days: Consuming tune by tune rather than album by album.
But that was also true back in the day when we pooled our allowances to buy and trade the latest singles at about $1 each. Come to think of it, I’m sure it wasn’t a coincidence that iTunes launched at 99 cents per song, tying into the same psychological marketing trick that if something costs so little it is easy to consume in bulk. The difference was that when we spent our hard hoarded money we got two songs for the price of one – the A side and the B side. When you bought ‘There’s a Kind of Hush’ you also got ‘No Milk Today’. When you bought ‘Yellow Submarine’ you also got ‘Eleanor Rigby’. And so on and so on.
And I think reports of the death of rock and roll are somewhat premature. The list of major tours in 2015 includes the Rolling Stones, Van Morrison, The Who, AC/DC, U2 and Smashing Pumpkins – admittedly an eclectic lineup of ‘classics’ but proof of longevity nonetheless. Only time will tell whether anyone will show up to see Kanye or Shad or Jay-Z the equivalent number of years from now, but my guess is their appeal will be about as faded as their tattoos and as tarnished as thirty-year-old bling.