Archive for the ‘ Contemporary Media Issues ’ Category

Taking TV off of the small screen

More than three years ago, I was navigating the sensory overload that is Times Square when a rather ordinary print billboard managed to stand out among all the digital clutter.

The advertisement touted Ajira Airways and was nondescript in just about every way, except that I had never heard of Ajira Airways. Amid Times Square’s abundance of flashy advertisements and noise, its minimalism was striking. And then there were those numbers scrawled along one of the corners, looking like graffiti at first: 4, 8, 15, 16, 23, 42.

If you’re a fan of the hit television drama Lost, you know those numbers are not random graffiti. They’re the giveaway clue that Ajira Airways is yet another in an intricate network of fake websites, videos, companies and characters deployed by Lost’s producers to deepen the show’s mythology. It would be another two seasons before Ajira even emerged as a major plot point. By then, the observant and obsessed contingent of Lost’s fanbase had already noticed the billboard and grown familiar with Ajira through its website.

Lost’s commitment to multi-platform storytelling isn’t unique. More and more programs are expanding from television into multimedia platforms to enrich the viewer experience. It’s a method taken to extreme lengths with The Matrix trilogy, which incorporated anime, comics, video games and websites as supporting narratives in the canon of the three films. Today the producers of dramas like Lost and comedies such as How I Met Your Mother create dummy websites and videos so the virtual world inhabited by their characters is more fully realized.

It makes for some neat content, and it plays right into the trend of participatory storytelling, where the audience seeks out new information rather than just sitting back and listening. Presentations from several of my classmates this semester have detailed the ways these programs seek to interact with their most avid viewers using online media.

But does it serve the television show? That’s a tricky question requiring a precarious narrative balancing act. Duplicate the same stories from the TV show and you’re just wasting space. Offer too much new information on alternate platforms and you risk alienating viewers who now can’t make sense of the show because they didn’t follow all the websites. This was a critical hang-up with the latter two Matrix movies, which glossed over key parts of the story that were addressed through other media channels.

Even Lost has scaled way back in its use of supporting media. In early seasons there was not only a fake website for the mysterious Hanso Foundation that was referenced in the show, there was also a television commercial and even a late-night TV interview with a “spokesperson” of the fictional foundation. But as the program winds down to the season finale May 23, it has made the show itself the conduit for answering all of Lost’s questions.

And maybe that’s where supporting media falls short. It’s wonderful at building intrigue and mystery by fleshing out fictional worlds. But when it comes time to close a narrative, the entire audience must be on the same page. A comedy can get away with supplemental content online because it won’t ever be necessary for casual fans to get the jokes. But a serialized drama can’t give away big secrets to only the fans who search for clues outside the television screen. Even the most elaborate stories need a single place to conclude.

Tweets have more historical value than you might think

If every communications platform were judged by the standard we impose on Twitter, they’d all be rendered trivial.

Newsprint would just be a forum for gossip and sleazy scandal. Projectors would just be the vehicle for brain-dead dialogue and needless explosions. Telephones would just be a conduit for endless teenage chatter on the superficial.

All those mediums get used for such trivial purposes. A lot. It doesn’t make the medium itself trivial. Instead we recognize all the groundbreaking journalism, innovative films and meaningful conversations that are facilitated as a result.

Twitter too often doesn’t get that appreciation. Because it’s used so often to broadcast meaningless minutiae, the general public renders it silly before stopping to see all the benefits of good tweets.

And there’s plenty of good. Twitter is the most effective way to share links and information with colleagues. It is unrivaled in delivering instant reactions to breaking news. It establishes a direct connection between public figures and their fans that bypasses all the old filters. Even the worst cases of narcissism it inspires can have value as pure entertainment.

Given Twitter’s often under-appreciated status in the public sphere, it’s nice to see our government recognizing its potential. The National Archives announced last month that it will start saving each and every tweet sent from a public account (those with privacy settings won’t be gathered) and preserving them for posterity. It’s the type of thing that has inspired instant snickering. Even my iMedia class, which has produced some outstanding research presentations on the value of social media marketing, reacted to the news with many a smirk.

But block out the mundane quality of boring tweets and take the macro view for a moment. When in history have we ever had a larger collection of public views, opinions, thoughts and feelings all recorded? Sociologists and historians no longer have to extrapolate from anecdotal evidence on how certain publics reacted to critical events. There’s a wealth of primary source data that is easily searchable. It doesn’t represent all of society, to be sure, but for certain demographics it provides the kind of in-depth look into our culture and collective pulse that expensive surveys could only hope to reach. That group even includes prominent political figures. What scholar wouldn’t want to preserve their reactions to critical events in the country’s history?
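To see why searchability matters, here is a minimal sketch of the kind of query a historian might run against such an archive. Everything here is invented for illustration: the tiny corpus, the field names and the `reactions` helper are assumptions, not the archive’s actual interface.

```python
from datetime import date

# Hypothetical miniature corpus; a real query would run against the
# National Archives' full collection of public tweets.
tweets = [
    {"user": "voter_in_boston", "date": date(2010, 1, 19), "text": "Just voted in the Senate race"},
    {"user": "newsjunkie", "date": date(2010, 1, 19), "text": "Scott Brown takes the lead"},
    {"user": "foodblogger", "date": date(2010, 1, 19), "text": "Best sandwich ever"},
]

def reactions(corpus, keyword, on_date):
    """Pull every public post mentioning a keyword on a given day."""
    return [t for t in corpus
            if t["date"] == on_date and keyword.lower() in t["text"].lower()]

# How did people react to the Massachusetts Senate race on election day?
hits = reactions(tweets, "senate", date(2010, 1, 19))
print(len(hits))  # 1
```

The point isn’t the code; it’s that a question like “what were people saying the day X happened” becomes a one-line filter instead of a survey project.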

Don’t discount Twitter’s potential for political change either. The protests in Iran last year organized through Twitter are the most prominent example, but you don’t have to go across the world to see an impact. Now that tweets show up in real-time search engine results, they have the potential to instantly shape what information (or misinformation) is spread. A research paper on how this may have been a crucial factor in Scott Brown’s upset victory in this year’s Massachusetts Senate race was presented at last week’s WWW2010 Conference in Raleigh. Of all the papers from researchers across the globe featured at that conference, it was the one World Wide Web founder Tim Berners-Lee immediately cited when asked what study most interested him.

Of course this post comes to you from an unabashed Twitter fan since 2008 who now has two accounts, one for my professional interests and one for personal interests. Am I overselling the medium? Are there ways that Twitter falls far short of other mass communications platforms?

If you like Apple, don’t dismiss Flash

Asked the doozy of a question “what advice would you give young developers?” at the international WWW2010 Conference last week in Raleigh, Tim Berners-Lee gave a fairly generic response about joining up with organizations seeking to improve the world online.

Then Berners-Lee, the inventor of the web, raised his voice with an addendum: “especially those focused on building with HTML and CSS.”

If you read that comment out of context, it’s not very striking. But if you were with me in the audience that afternoon, then the remark immediately connected with the mantra uttered by numerous speakers that week. Open source code = beneficial for the public good. Proprietary code = detriment to society.

It’s a sentiment that isn’t just playing out at tech conferences. The belief in open source is the rallying cry that has the multimedia world buzzing about the supposed death of Flash. For a web purist, anything other than a universally accepted code puts a commercial roadblock between developers and their audiences. In 1,000 years, as Internet pioneer and Google VP Vint Cerf remarked during the conference’s keynote, the information held in that code might be unreadable on the devices of that era and thus lost forever.

But it’s not open source purists who have struck the severe commercial blow to Flash. It’s Apple, a company that exerts the strictest control over its products. The company’s iPhone and iPad have rigorous standards for apps that make Apple the sole arbiter over what can run on the devices, raising antitrust concerns. Locked out entirely is Flash, which Apple founder Steve Jobs sharply criticized as obsolete in a public statement issued last week.

Jobs has some valid arguments on why Flash shouldn’t run on the iPad or iPhone: it drains battery power, has a greater tendency to crash compared to HTML websites, and wasn’t developed with touch screens in mind. But Jobs also takes the high and mighty route by railing against Flash for not adopting open web standards. That’s highly hypocritical for a company that takes draconian measures when its own proprietary information is leaked.

Lost in all this posturing is the simple fact that Flash does things you can’t do with open source software … at least not yet. HTML5 offers the promise of driving the kind of dynamic video and animation that Flash is known for, but that doesn’t mean it will be easy for non-computer-science types (designers, photographers, journalists) to code. Flash built a following for the relative ease it injects into an otherwise complex coding process, allowing developers to direct their energy towards content and design.

Even Jobs concedes this point at the end of his statement, calling for Adobe to “focus more on creating great HTML5 tools for the future.” Software that topples the coding barrier of HTML would indeed be welcome, opening up web creation to billions of people who never took a computer science course. But will any company produce that software without the profit motive that drove Adobe? Apple’s proprietary products made it possible for millions to create multimedia for the first time. If we’re willing to concede Apple’s self-interest in exchange for the great hardware it creates, why must the web community judge Adobe software by such a harsh standard?

Will movie rentals see the death of the disc?

Forget Facebook. For my money Netflix is the most sure-fire service for turning technophobes onto the potential of web-based content. In the last four years, Netflix has tripled its subscriber base to 12.3 million despite the availability of movies on iTunes and abundance of free content online.

Count among those new users the respective mothers of my girlfriend and me. Neither has ever embraced using the web for much of anything, but both enthusiastically renewed the gift subscriptions we recently gave them. The appeal for them, as for many in their demographic, is the seemingly endless list of obscure titles unavailable at nearby stores. It’s no wonder the company is held up by bestselling technology author Chris Anderson as a shining example of the Long Tail, a business model in which companies profit not just by offering the biggest hits, but rather the vast quantity of niche titles that can now be stored and distributed at a relatively minuscule cost.
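The Long Tail arithmetic is easy to sketch. The numbers below are entirely invented, but they illustrate the principle Anderson describes: a huge catalog of slow sellers can out-earn a handful of hits.

```python
# Toy Long Tail model: a few blockbusters each rent a lot, while
# thousands of niche titles each rent a little. All figures invented.
hits = [10_000] * 20    # 20 blockbuster titles, 10,000 rentals apiece
niche = [15] * 50_000   # 50,000 obscure titles, 15 rentals apiece

print(sum(hits))   # 200000 rentals from the "head"
print(sum(niche))  # 750000 rentals from the "tail"
```

The tail only wins if storing and shipping title number 50,000 costs almost nothing, which is exactly what a centralized warehouse (or a streaming server) provides and a neighborhood video store cannot.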

Yet there’s a gaping inaccuracy in the profile of Netflix as a web-based business, perhaps the same “flaw” that makes it instantly appealing to the boomers reluctant to migrate online. The Netflix model of subscribers sorting, selecting and ordering their titles online is purely digital. But the primary method of delivery — The United States Postal Service — is something out of the 20th century. Or even the 19th.

That’s a problem, given that the postal service is losing billions annually. Any of the fixes being considered by Congress, from postal rate hikes to eliminating Saturday deliveries, would have a severe impact on Netflix’s bottom line.

The solution for Netflix is simple. Instead of mailing out discs, get consumers to stream movies instantly via the web, a process that has virtually zero distribution costs. The company has already tried to do this by offering a number of major titles for instant streaming and selling a special device for around $100 that plays these titles on your TV.

Now the company is making some new moves in this direction. It has recently signed deals with major studios, agreeing not to distribute DVDs that have just been released in exchange for the rights to instantly stream more titles. Netflix has also put out programs that allow video game console owners to stream movies via their Xbox, PlayStation, and as of this past weekend, Wii. I tried out the new Wii program last night, with the excellent dark comedy Big Fan as my choice, and was pleased with the results.

But forget about my tastes. What about my Mom’s? Will the boomer demographic raised on physical products really embrace digital delivery? Or will they flee back to the old days via companies like RedBox and the still-kicking Blockbuster? The result could say a lot about the value of producing information and entertainment in a physical form. Maybe we’re not ready to go all digital all the time just yet.

Stand back and watch the iPad war play out

Photo illustration by Brook R. Corwin. Original photo licensed by Creative Commons for commercial use with modification

The iPad hits stores today, and in making the choice to buy one, you’re not just picking out a product.

You’re picking a side.

The $500-$800 tablet computer is perhaps the most polarizing device to hit the communications industry in the past decade. It has spawned both adoration and disgust among veteran tech writers. Journalists are having a field day speculating on how it might redefine the industry, with views highly mixed. Web content developers have either cheered or cried foul at the iPad’s inability to run Flash, depending on what they think of the software program that runs the vast majority of animated and multimedia content on the Internet.

(Full disclosure: most of the websites I design, including my portfolio site, have heavy Flash components. So it’s hard for me to get too excited about a product that, if successful, would force me to redesign my work or pay money to another company to convert it for an iPad audience.)

But these are debates for tech heads and newsies. How will mainstream consumers pick their side? It won’t come down to whether it helps the journalism industry. It certainly won’t come down to love/hate of Flash.

Design and functionality will play a huge role. If society is ready to abandon the click culture that has conditioned our computer use for decades, then the iPad’s slick use of touchscreen will forever change how we access and share information. The iPad will never replace a mobile phone (it’s simply too big to take everywhere) but it could replace a laptop.

That gets us to the big question, a cultural query that ultimately decides whether the iPad redefines all media or fails miserably. How exactly do people want to use the web? Are they looking for an open-ended, creative experience without expectations? Or do they want a simplified, controlled environment streamlined to their established tastes?

Those falling in the first category will be frustrated by the iPad’s limitations. It runs only one application at a time, can’t handle complex software and doesn’t even have a USB port. Applications must be purchased from the Apple app store to make the device useful. In short, it’s pretty lousy for those who want to build their own content, ironic since the initial appeal of Mac computers was all the built-in software that let you start creating right out of the box.

An iPad dominated world also severely restricts the audience for amateur content. Applications must be approved by Apple. The company’s rejection of Flash, one of the world’s most popular software tools, shows just how far it can flex its muscle to restrict what gets seen on its devices. Is this the next generation of the AOL model, which relies on herding consumers to partner websites rather than the web’s wild frontiers?

Then again, maybe this is just what the general public is looking for after a decade of exhaustion trying to navigate the web’s limitless choices. If all you want out of your computer is the ability to send email, share photos and get information online, then the iPad makes life easier. Its company-approved apps will tailor web content to meet your daily needs. Information will be personalized to the extent that you’ll never feel the need to wander. And how many people really need to use a built-in webcam anyway?

I can’t predict which camp attracts the majority of consumers. But the iPad will do a wonderful job of sorting, and from there we can start tailoring our communications strategies to fit the revamped model of Internet consumption. Hedge your bets and stay neutral for now. We’ll know soon enough which side wins out, and how they’ll rule the new era.

Will the Internet remain an “all you can download” buffet?

Last week I wrote about the incoming FCC plan for universal broadband access across the U.S., an expensive and somewhat controversial initiative that raises the question of whether high-speed Internet access is not just a privilege but a right.

The plan has been hit by fierce pushback from major cable companies since it was announced Tuesday. No surprise there. A major provision of the FCC’s plan is to auction off spectrum once used for broadcast in order to expand wireless networks to rural areas not serviced by the private sector. The broadcasters who now own that spectrum aren’t about to just give it up for nothing.

Ironically enough, I found myself out to dinner with a group of Elon faculty and communications professionals that included a Time Warner Cable regional executive on the day the FCC’s plan was announced. Naturally, he didn’t see the FCC’s current plan as the answer to the issue of broadband access. Nor was he crazy about Google’s initiative to poke its head into the ISP business by laying out “dark fiber” cables that can offer much faster access in urban areas.

But he didn’t dismiss the issue either, instead proposing that the problem with high-speed Internet isn’t one of access, but of demand. ISPs can extend broadband to rural areas but aren’t likely to get everyone to sign up, either because they can’t afford it or don’t see the value in the service. Even some in favor of universal access acknowledge, as Slate tech writer Farhad Manjoo does at the end of this piece, that only about two-thirds of those with access to broadband actually sign up. If the government can find a way to subsidize the cost through some form of tax credits, it would be easier for private ISPs to bring broadband to more of the country.

This concept is similar in spirit to one proposed in Jonathan Zittrain’s “The Future of the Internet: And How to Stop It,” a book I’m now reading for class that offers a thorough look at how online access has been shaped through the decades. Zittrain is highly critical of non-generative models that limit what a consumer can gather online (think AOL 15 years ago or perhaps the iPad today). But he does point out that ISPs who wall off parts of the Internet could subsidize the cost of access by charging websites or search engines for a spot in their “walled gardens” of content accessible to customers. The revenue stream would allow the ISP to offer access at reduced rates, or maybe even for free.

Proponents of net neutrality probably cringe at this thought. The Internet should be open and free for all, they argue, without restrictions on content. That’s laudable in spirit, but the reality is that someone has to pay for the high-speed access. If the government can’t afford to pay for the infrastructure, and it’s not profitable for the private sector, then the only solution is charging websites themselves for distribution. As long as consumers know they’re only getting a la carte portions of the Internet, they might be OK not having the all-you-can-eat model. It’s better than having no access at all, which is the status quo for far too many in this country right now.

High-speed Internet: a right or a privilege?

Americans have long accepted K-12 education as a right for all. Debate now rages on whether basic health care also qualifies in that category.

But what about high-speed Internet?

That’s the question that will leap to the forefront with the F.C.C. poised to announce this week an ambitious plan to spread broadband access to the entire country. This proposal has been eagerly anticipated for several months, and it came up frequently during my research last semester on the Digital Divide as a sign of hope on bridging the technology gap in this country.

It’s not as simple as a government patch, however. The involvement of the public sector in the business of Internet service providers not only costs taxpayers billions, but it tiptoes into the realm of free-market meddling that arouses intense criticism for hampering business growth. Already major cable companies are lining up in opposition to the F.C.C.’s plans to auction off areas of the broadcast spectrum to allow more space for wireless networks.

Yet if governments don’t take some action, rural Americans will almost certainly keep lagging behind in connection speeds. It simply doesn’t make good business sense to invest in Internet infrastructure servicing areas with a low density of potential consumers. That’s why the F.C.C. wants to offer subsidies to companies that offer high-speed access to rural America. It’s the same principle that has caused nations like South Korea and Finland to roll out expensive initiatives to wire their entire countries so high-speed Internet access is a universal right.

Some more competition among the private sector could also help. Most ISPs have a near monopoly on their markets, leaving them little incentive to improve the speed and breadth of their networks. That’s a big reason why Google is entering the industry through buying up “dark fiber” cables capable of delivering Internet 50 times faster than what most customers are used to receiving. Some have speculated this is a bid not so much to break into the ISP industry but instead to force cable companies to improve their connection speeds, since a faster Internet directly benefits Google’s core products.

Yet even if Internet connections get blindingly fast in the city, they’ll still lag way behind in the country without some government intervention. That changes the way students in rural areas learn, what services rural businesses can offer and what kind of digital content rural residents are capable of receiving. A PR or advertising firm might have to deliver one set of heavy multimedia content (videos, interactive microsites) for one audience while simultaneously running a different campaign for rural areas that avoids files too heavy to download on slower connections.

There’s no question that life on the slow end of the Internet is drastically different, and the divide will only grow more pronounced as more and more of society’s business and social functions migrate online. But are the consequences of no high-speed access severe enough to merit major intervention by the public sector? Is it a right we must establish for all no matter where they live? The answers to those questions will ultimately determine whether the digital divide narrows in this country or turns into a chasm.

Unmasking our privacy, one tweet at a time

Illustration licensed through Creative Commons for transformative use. Find the original at: http://www.flickr.com/photos/carrotcreative/ / CC BY 2.0

It was a brutal loss, the kind that sends UNC basketball fans into an emotional tailspin. Georgia Tech walloped the Tar Heels two weeks ago, adding insult to a season already full of physical and emotional injury.

There was no shortage of reactions. Commentators howled. Fans scowled. And the coaching staff seemed on the verge of exasperation.

The next day, two of the players found time to laugh at the misfortune.

It’s not a response you could have heard at a press conference or read in a newspaper article. The laughs came from the boisterous Twitter account of freshman guard Dexter Strickland, who posted a goofy pic of how he and teammate John Henson attempt to attend class incognito on days after losses.

Such a lighthearted tone fits right in with Strickland’s twitter stream, no matter how rough the season gets. He’s joked about the embarrassment of being stared at by the bus stop after a loss, or in hearing ESPN commentators criticize the team. That’s when he’s not tweeting about class, his pet snake, boring hotel rooms or accidentally walking out of the grocery store without paying for bottled water.

Strickland is no anomaly. His account is merely the most lively example of an entire team (notably Henson, Will Graves, Ed Davis, and Larry Drew) that has embraced Twitter as the method of choice to broadcast the daily details that never make it into on-the-record interviews. Following them this season has softened my perspective on Tar Heel basketball. As a UNC grad I still get emotional watching games, but the losses have been much easier to take after hearing unfiltered accounts from the players on the court.

We forget they’re just kids, barely out of high school. Twitter makes that fact abundantly clear, and instantly puts their growing pains on the hardcourt into perspective. It personalizes the players and makes it more enjoyable to root for them even during bad seasons.

It also takes fan voyeurism to a whole new level. Pro athletes have learned to filter their Twitter accounts somewhat, pressured by the leagues and franchises who write their multi-million dollar checks not to post anything offensive. But in the college ranks, even players for a program that has millions of devoted fans garner just a few thousand Twitter followers. It’s just far enough under the radar that their every word won’t be scrutinized.

But I wonder, is it only a matter of time before one errant post is torn apart online?

Twitter is no stranger to sports. I blogged several weeks ago on the BCS’ attempts to use it for PR purposes. Georgia Tech’s coach, following that lopsided win against UNC, even used it to lash back at critics and defend the performance of his team.

But when you’re dealing with youngsters, privacy becomes an issue and a concern. We’ve all said stupid things in college. We probably said stupid things last week. What if they got out into the blogosphere for the world to see? It’s an almost inevitable result once you’re accustomed to immediately broadcasting every thought. The nightmare happened just last month for University of Oregon wide receiver Jamere Holland, whose profanity-laced Facebook status updates cost him his collegiate football career.

The Internet is littered with such horror stories of the private error becoming an unerasable public humiliation. Many of the most horrifying are chronicled in The Future of Reputation, an excellent account by law professor Daniel Solove on the dangers of spreading too much information online. Discussing the book in class this week has served as an interesting counterpoint to the ethos of The Cluetrain Manifesto (another book assigned in my curriculum), which preaches the virtue of letting employees communicate directly to consumers without restrictions or filters.

Ideally we’d all like to hear an unedited account from the companies we buy from, the colleagues we spend time with or the athletes we cheer for. The tweets from UNC’s players have proven a wonderful way to lighten up an otherwise gloomy season. I hope they continue. But there’s a risk that can’t be forgotten with social media. As effectively as it can connect in an instant, it has the power to decimate reputations just as fast. That kind of power demands some degree of caution, or else victory in real life could be forever tarnished by stupidity in cyberspace.

Online: where advertisers no longer fear to tread

Imagine for a moment that you’re a small business owner. If you’ve survived this economic downturn, you’ve no doubt got passion, work ethic and a real talent in delivering what your company offers. What you probably don’t have is much of a marketing budget, or a lot of experience in conducting advertising campaigns.

Ten years ago advertising options would have been limited. You wouldn’t have the cash to be on the radar of major media companies, and there was no easy model to advertise online. It was pretty much Yellow Pages or billboards.

But now you’ve got two substantial choices. Those big media companies are desperate for ad dollars since car dealerships and retail chains are no longer thriving. So they’ve dispatched their attractive, charismatic ad reps to your office with sweet talk about the “magic” of having a place in an established media brand. You probably grew up reading the newspaper or watching the TV program they’re selling space for, so the prospect is alluring.

On the other side is a nameless, faceless, utterly lifeless computer interface. In static pixels it promises to pair your ad to whatever specific audience you want to reach online. You decide where the audience will live and what they’re looking to find. You set your budget, and you only pay when potential customers click on your ads.
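The billing logic behind that second option is simple enough to sketch. This is a hypothetical model, not Google’s actual accounting; the budget, rate and click counts below are invented for illustration.

```python
# Pay-per-click in miniature: impressions cost nothing, each click is
# billed at a fixed rate, and billing stops once the self-set budget
# is exhausted. All numbers are invented.
def run_campaign(budget, cost_per_click, clicks):
    """Return (clicks charged, money spent) under a fixed budget."""
    affordable = int(budget // cost_per_click)  # clicks the budget covers
    charged = min(clicks, affordable)           # never bill past the cap
    return charged, charged * cost_per_click

clicks_charged, spent = run_campaign(budget=50.00, cost_per_click=0.25, clicks=180)
print(clicks_charged, spent)  # 180 45.0
```

That cap is the part that matters to a small business owner: unlike a billboard or a print ad, the downside is bounded in advance, and a day with no clicks costs nothing.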

The verdict’s in, and it’s option two that is overwhelmingly proving the favored choice. That’s the approach taken by Google through its AdWords and AdSense programs, which pair online ads with relevant search results or place them directly onto websites whose content matches the advertiser’s target demographic (my colleague Cory Morrison has a nice post summarizing how these programs work). Those programs now account for more than 95 percent of Google’s revenues, the cash cows for a company that rakes in several billion dollars of profit annually. For all the campaigning by legacy media that advertising can’t work online and nothing competes with the magic of a TV commercial, legions of business owners have voted otherwise with their marketing budgets.

This seismic shift in ad dollars is chronicled with precision in the fantastic book Googled. Written by Ken Auletta, a veteran in covering media issues, the book tracks the meteoric rise of the search engine giant, with the key turning point when the company figures out how to monetize its free search service by selling targeted ads.

For all the ways Google has upended the media pecking order, it’s in advertising where I feel it has made the most profound impact. Traditional media always depended on ads to survive. Newspaper circulation only accounts for around 20 percent of revenues. TV networks broadcast for free. Music labels and movie studios produce hits only if backed by expensive marketing campaigns.

But Google trashed that model by proving how a massive data network could more efficiently deliver ads. No more middlemen were needed. No need even for a big budget as long as you knew your target market. This single-handedly rocked the foundation of old media’s business model, and sent them scrambling to get online whether they liked it or not.

This has major implications for journalists and PR professionals, since the platforms where they deliver content are financed by ads. For too long their bosses haven’t embraced online platforms for fear they won’t produce ad revenue. In the pay-per-click model Google uses, it’s a self-fulfilling prophecy. If a website isn’t filled with good content it will get little traffic and thus produce little revenue.

Today, most legacy media companies say they can charge only 5 to 10 percent as much for online ads as for comparable print ones. But it’s hard to see that continuing. As the book points out, people on average consume 20 percent of their media online, yet only 9 percent of advertising spending is placed there. For all the billions the search giant has raked in by making online advertising simple, there are billions more to be claimed if someone figures out how to follow Google’s lead.
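The scale of that opportunity is back-of-envelope arithmetic. Only the two percentages come from the book; the total market figure below is invented for illustration.

```python
# If ad dollars eventually track where audiences spend their time,
# the gap between the two shares is the money still up for grabs.
total_ad_spend = 150e9       # hypothetical annual ad market, in dollars
time_share_online = 0.20     # share of media time spent online (Auletta)
spend_share_online = 0.09    # share of ad spending placed online (Auletta)

gap = total_ad_spend * (time_share_online - spend_share_online)
print(f"${gap / 1e9:.1f}B")  # $16.5B left on the table in this scenario
```

Even if the real market is half or double that invented figure, the conclusion is the same: the dollars chasing online audiences have room to more than double.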

The objections to objectivity

At the dawn of my journalism career, I sat in a college classroom and listened as a central tenet of my new profession was ripped to shreds.

Ripped apart by a Pulitzer Prize-nominated journalist, no less.

That man was Allister Sparks, a distinguished reporter and editor from South Africa whose front-line work exposing the corruption of his nation’s apartheid government had won him international respect and acclaim. Sparks was now a visiting professor at UNC-Chapel Hill, teaching a small seminar course to a handful of undergraduate journalism majors that included me.

Just a sophomore at the time, I had reporting experience limited to a single internship and a bunch of articles for the Daily Tar Heel. Yet that was enough for the idea of objectivity to become firmly entrenched in my code of ethics, to the extreme that I would measure the column inches I gave each speaker at a debate, regardless of whether one had more interesting or insightful things to say, to make sure my coverage was equal and no one could accuse me of bias.

But the notion of objectivity, so ingrained in the fundamentals of my intro journalism classes, was promptly tossed out the window by Sparks. Over the course of several class periods we debated how merely providing balanced coverage of both sides does readers a disservice. Sparks argued that only subjective choices about what is honest, morally sound and relevant produce reporting on controversial issues that makes a difference. Had he taken the neutral route, Sparks said, his reporting would have merely propped up the injustices of apartheid rather than pushed the nation toward democracy.

That class didn’t instantly make me a convert. But it planted a seed that grew over the next eight years as I worked in print journalism. Time and time again the “objective” reports simply transcribed two opposing views without any insight into which had merit, while the meaningful investigative pieces carried a fact-based viewpoint that highlighted the important information while dismissing the trivial. Over time I came around to the idea that true objectivity in journalism, even if obtainable, could never deliver the change and enlightenment the profession seeks to create.

Others monitoring the industry have also come around to this idea. Among those is Robert McChesney, whose exhaustive critique of the modern media landscape continues to dominate much of my class discussions for graduate school. McChesney derides objectivity as a tool inserted into the standard protocol of “professional” journalism by elites hoping to maintain the status quo. If reporters are shackled from revealing too much information that favors a particular side, then even dishonest positions can maintain credibility in the eyes of the public.

It’s one thing to be fair and accurate in reporting. It’s quite another to ignore obvious truths because doing so would reveal your preconceived beliefs and bias on a topic.

Strangely enough, it’s this desire to be freed from objectivity that has propelled me into public relations, an industry McChesney (and many of my old journalism colleagues) derides as ethically bankrupt for spinning facts to suit an agenda. There’s no doubt that happens in some corners of the industry. But at its heart PR involves spreading information and viewpoints that need to be heard. If I help a courageous non-profit raise its voice to a large audience, as my fly-in project to Panama seeks to do, then I am reporting with a subjective agenda, but one that serves a worthwhile cause.

Objectivity has its merits, especially with regard to research papers and governmental reports. But when it comes to making a difference in communications, often the subjective approach proves more powerful.