Jan 30, 2009

Advertising’s Opportunity


One of the recurrent “silver lining” themes of the current economic meltdown is the notion that our best and brightest minds will no longer be sucked in by the lure of Wall Street and will actually gravitate towards jobs where they can make a difference and impact the culture.

And while no one would put advertising in the same league as teaching or medicine, it does offer a chance to impact the culture.

So how does the industry start attracting the best and brightest once again?

There are several issues that need to be overcome before advertising once again becomes an attractive industry to work in:

Salary (or lack thereof): A combination of shrinking profits and pressure from the ubiquitous holding companies has put agency compensation in a downward spiral over the past two decades. So what was once a road to a very comfortable lifestyle back in the Mad Men days is now a path to just-barely-doing-okay. If advertising is going to be competing for talent with consulting firms, internet companies and Hollywood, there needs to be a pot of gold at the end of the rainbow to attract people long term.

Evil: From the new series Trust Me that’s set in a Chicago ad agency to the 40+ comments on an Adweek story about Ogilvy & Mather, the word “evil” comes up a lot. As in ad agencies have devolved into notoriously evil places to work in ways that few other industries can match. That’s saying a lot. There needs to be a serious reevaluation of the culture of backstabbing and prima donnas and disrespect if we are to attract the best talent, especially from a generation that’s grown up in a culture of praise. Ditto the culture that calls ageism, racism and sexism part of the landscape. That has to stop too.

Relevance: Agencies have stopped being relevant. Much of what they produce is reactive, out-of-date and lacks any real thought process. Hence their reputation for obsessing over the exact right shade of blue while ignoring the actual purpose of the ad in question. Agencies need to get back to their position as thought leaders, as the people who invent trends and influence pop culture. That means hiring people who can think and actually giving them some authority.

All three of the elements I’ve listed are interdependent. In other words, good people may put up with a lower salary in exchange for a supportive work environment and a chance to do something that makes them famous.

But it does seem like it’s now or never for the ad business.

If advertising, as a business, is to be anything more than an interchangeable collection of subcontractors, these changes need to start happening now. It’s not an impossible task. Just one that needs to be acknowledged and worked on.

And Super Bowl weekend, when the ad industry is at the pinnacle of its relevance, would seem an ideal time to start.

Jan 28, 2009

The Shelf Life of Revolutions – Part 2


The rapid growth of the web in the mid to late 90s resulted in a fundamental shift in the way products were marketed and the value we attached to them. By allowing access to a vast selection of merchandise, the companies that gave birth to what’s now known as “Web 1.0” made price a secondary focus.

The forty-year run of our Western consumer society meant that an entire generation had grown up with a steady supply of well-priced consumer goods, making their existence less of a novelty than a given. So selection was the next frontier: once we’d become accustomed to having all these goods, we wanted the exact ones we wanted when we wanted them. And Web 1.0 businesses like Amazon.com were only too happy to oblige.

This focus on selection was echoed offline as well, with the growth of superstores like Wal-Mart, who, in addition to low prices, offered an incredible selection of basic consumer goods: every kind of soap known to man (or at least to the marketing departments at Unilever and Procter & Gamble) as well as every conceivable type of consumer good right down to groceries. While low price was Wal-Mart’s price of entry, their vast selection was what made them famous.

In the media, this shift to a selection-based model was mirrored by the proliferation of cable TV channels. Whereas the Price Era was defined by the three TV networks, the Selection Era, with its line-up of single-interest networks like CNN, ESPN and The Weather Channel, told us that we could pick and choose at will from a growing array of content. Our choices would no longer be dictated to us.

That explosion, which Bruce Springsteen noted as early as 1992 in his hit “57 Channels And Nothing On,” paved the way for the rapid growth of the internet several years later. The explosion of the web not only allowed us to buy anything we wanted, it also allowed us to read about anything we wanted, as an army of professionals and amateurs rushed to throw up content on any and all topics. On the internet, it wasn’t necessarily about quality but about choice: we were no longer limited by the confines of our local libraries or bookstores: we could read whatever we wanted whenever we wanted. Media had become all about selection.

Advertising soon followed suit as the lowly banner ad became the perfect vehicle for The Selection Era. Rather than have to sit through whatever commercials the TV networks threw at us, we were able to select the banner ads we wanted. And initially we did select: in the early days of Web 1.0 following a banner was as interesting as anything else online. It wasn’t until later, when content started to have real value, that clicking on a banner became beside the point, something advertisers are still coming to grips with ten years later.

Politically, the brief period between the end of the Cold War and the 9/11 attacks provided a respite from any real global conflict as the entire world seemed headed for freedom and democracy. This interbellum Pax Americana gave birth to the concept of “globalization,” as new markets opened up and emerging middle class societies in Asia, Africa and Latin America provided us with millions (if not billions) of new consumers for all our goods and services.

The Selection Era proved to be short lived, however, as consumers quickly grew used to being able to have whatever goods they wanted, whenever they wanted them. It was a behavior that survived both the shock of the post-9/11 landscape and the collapse of the Web 1.0 bubble. Having gotten used to the benefits of selection, consumers were looking beyond that and demanding something of more value along with their products, something more satisfying and personal.

And so began the Service Era, which will be the subject of next week’s post.

The Shelf Life of Revolutions - Part 1
The Shelf Life of Revolutions - Part 3

Jan 26, 2009

Those Who Ignore History Are Doomed To Repeat It

Dave Trott is a UK ad legend whose influence extends way beyond the London ad scene.

His blog, simply called Dave Trott's Blog, is a real inspiration: he shares everything he's learned over the past few decades, and it's all still incredibly relevant and important to the challenges we face today.

Anyone who's read this blog for a while knows that I am skeptical of "experts," "thought leaders" and the like.

But Trott is the real deal. So much basic knowledge of how marketing should work is lost on those working in digital media and (particularly) social media because of their insistence that all the old rules should go out the window. Oh, and because some of them never actually worked in marketing, advertising, PR or corporate communications.

But I digress.

Read Trott's blog. You will learn so much more from it than from any self-evident list of "Ten Great Ways To Improve Your Tweeting."

Really.

Jan 25, 2009

Tipping Point

Now that a goodly percentage of my Facebook friends are people who don't work in advertising, marketing or anything remotely related, I've stopped having Twitter automatically update my Facebook status.

The result is my Facebook status updates are mostly about broader life events while my Twitter updates are primarily work related.

It's a bit more work, but I find that my friends and family members are now actually reading my Facebook updates (and commenting on them) rather than dismissing them as "more of Alan's work-related geekery." And since I finally figured out how to link Facebook to Delicious, I can share articles I like without overwhelming people, since Facebook posts them as a list. (A trick I learned from Noah Brier, who I often joke serves as my personal online librarian.)

Not sure if this is a harbinger of things to come or my own personal tic, but curious to hear how others are bridging the gap.

Jan 24, 2009

Social Media Is Still Only Social If You’re Alone


There’s a lot of hoopla around the CNN/Facebook pairing during the inauguration and I admit to participating in it both as a user and as a blogger.

But before we get carried away with calling this the dawn of a new era, let’s remember why this proved to be such an attractive option: the inauguration was a daytime event that took place on a workday. That means a whole lot of people were alone in their offices and welcomed the chance to interact with their friends.

If those same people were home, watching the inauguration with their families and/or friends, they would not be talking to people on Facebook.

Okay, maybe in certain dysfunctional families they would be, but you get the picture. Twittering, status updating, blog commenting all involve taking yourself out of whatever real life situation you are in and inserting yourself into a virtual one. It’s every bit as annoying and disrespectful to the other people in the room as the coworker who feels compelled to answer several personal cell phone calls in the middle of a meeting. And let's be honest: most people's real world friends are not going to stand for that sort of narcissistic behavior.

TV/Social media integration is great for shared events like the Super Bowl or the Oscars and a real boon for anyone without a viewing companion. It’s even a great way to do a quick, discreet check to see what “everyone else” is thinking. But as vital and useful as they are, our virtual communities are never going to replace our real ones. And those of us at the forefront of the social media movement only look foolish when we insist otherwise.

In other words, it's time to put down the Kool Aid.

Jan 20, 2009

I Had No Idea

Yesterday I wrote a post asking why Facebook hadn't launched its own Twitter-like service for the Status Update, given how many people already use Twitter to update Facebook.

I had no idea that Facebook was planning on answering that question today via their link-up with the CNN.com coverage of the inauguration.

The CNN feed wasn't perfect-- it dropped out a lot-- but the Facebook integration worked really nicely and allowed for real time conversation of the sort that went on via Twitter during the debates. (David Burn of AdPulp articulates some of the reasons here.)

But probably the most crucial thing here is that many social media newcomers got to see what a shared experience felt like. Regardless of platform. And while clearly the inauguration itself was far more significant, in our little corner of the world, this felt pretty significant too.

Time For iYahoo?

So I had really come to rely on the iPhone optimized iGoogle home page.

I had it set up so that every morning I could read the New York Times headlines, the local weather and my Google calendar (with appointments) all from the same page.

It was a great solution and the page looked really nice. But this weekend, Google pulled the plug on the iPhone optimized page, redirecting it to the cheesy-looking mobile page, where Google calendar and the Wall Street Journal feed don't work and the rest of the functionality is pretty useless.

What's worse, they didn't tell anyone: like nearly every other pissed-off user on this Google help forum page, I thought my phone was broken for a day or so. Really a bad move on Google's part.

This creates a golden opportunity for others, Yahoo in particular, to create iPhone optimized versions of their portal pages. Let's see if they grab it.

Jan 19, 2009

Will Facebook Be The Death of Twitter?


An epidemic of endless retweeting is making Twitter a whole lot less fun at exactly the same time the “Here Comes Everybody” rush to Facebook (allegedly 600,000 people a day) is making that platform a whole lot more fun. And it’s left me wondering what that means for Twitter.

Those of us in social media have all witnessed the recent arrival of dozens upon dozens of our real world friends and relatives on Facebook. Everyone from old classmates to far-flung relatives to current day friends and neighbors seems to be on there. And as this article from Time magazine notes, while other social media platforms encourage people to connect with strangers, “Facebook is more geared toward helping people maintain existing connections.”

That’s a huge plus in our time-starved world and Facebook is rapidly becoming the default way for people of all ages to stay in touch with their real-life friends and relatives, the people who really do care what they made for breakfast and what their vacation pictures look like. Cementing these relationships is enjoyable because interacting with people we know and like is enjoyable. (For the most part.) And the one-stop-shopping aspect of Facebook makes that platform all the more appealing.

Twitter, on the other hand, is becoming a lot less fun. The Carnival Barkers have invaded, as users bent on “creating value for others” and showing off their vast knowledge of social media are turning the Twitterstream into a virtual version of the county fair. Step right up ladies and gentlemen, get your retweets here!!!

“RT: Another amazing insight from the king of insights: http://tinyurl.com/xxx!!”

“RT: YOU MUST READ THIS POST IMMEDIATELY: http://bitly.com/yyy !!!”

“RT: The Most Useful Post You’ll Read This Year: 5 Ways To Make Your Tweets Count!! http://is.gd/zzz”


No one seems to have told the Carnival Barkers that there just aren’t 37 brilliantly insightful blog posts being written each day. (Most weeks, we’re lucky to find just one.) And while it’s easy enough to filter out the noise they're making, the resultant atmosphere has a chilling effect on conversation and can make Twitter feel like a giant echo chamber.

But that’s just part of it. Facebook already does all the things that Twitter does: status updates, links and conversation, and it does them better and more efficiently.

I can post what I'm doing (in more than 140 characters) to my Facebook status update.

People can comment on my update and see their comment as part of a threaded conversation.

I can see links people have posted and, rather than the Twitter formula of clicking some oblique is.gd or tinyurl address, I get the headline, first paragraph and comments others have left about the link before I decide whether to click through to it.

I can chat and email with my friends over Facebook, one on one, which is a much more effective way to have a conversation.

What’s more, I’m sharing all of these things with actual friends, people with whom I have a real world relationship. That means I can put all of their output into context, something I can’t really do with complete strangers on Twitter.

So then what exactly is the point of Twitter?

I don’t have an answer for that one. Right now, the main benefits seem to be that it’s searchable. And that it’s easily accessed from my phone or a free standing app.

The latter isn’t particularly difficult to replicate and frankly I’m surprised that Facebook hasn’t attempted to do so (though I assume that’s what their attempt to purchase Twitter was all about since many people already use Twitter to update their Facebook status.)

The former-- searchable status updates-- falls under the issue of privacy concerns and in reality is of far more value to marketers seeking free research than it is to actual users, who likely prefer their status updates remain private.

Twitter does encourage us to interact with complete strangers and that alone-- the chance to expand our networks-- may be reason enough for us to use it. But I find that most "power users" use apps like TweetDeck to segregate the strangers in their Twitter streams, a functionality that's far less elegant than Facebook's "less about this person/more about this person" options.

So here's my question: if Twitter devolves into a series of personal press releases then who is actually going to read them? I mean it’s one thing to broadcast your own PR, another to spend your day perusing everyone else’s. That feels like work and work isn’t a whole lot of fun. It may be that Twitter works best as a news broadcasting service, similar to the one I mentioned in this post. And that the serial retweeters become just another broadcast channel.

I’m not going to make any predictions here-- the only certain thing about new technology is the uncertainty. But it will be interesting to see how it all plays out.

UPDATE: In talking with commenters on here this morning, it's becoming clear that Twitter's most distinguishing characteristic is the way it encourages connections between complete strangers. If a significant number of people feel strongly about that, then Twitter may be able to carve out a niche for itself, despite (or perhaps because of) its asymmetric follower/following protocol.

UPDATE 2: If you don't feel like reading through 30+ comments: Where we're netting out is we're wondering why Facebook doesn't launch something very much like Twitter using the Facebook Status Updates feature. (e.g. a free standing app that auto-updates a la Twhirl or TweetDeck.) The difference would be that on the Facebook version, the conversation would be between people who all knew each other, whereas on Twitter it's often between strangers.

My take is that there would be room for both, but the Facebook version would be much more popular, given the over-30 crowd's aversion to strangers online and desire to spend what time they have connecting with their existing networks.