YouTube channels McDonald’s: “Billions and billions served.”

In case anyone doubts the significance of YouTube as a media platform … the video sharing service just announced that it is now serving in excess of 2 billion video views per day.

For an entity that’s barely five years old, this statistic is pretty incredible. But it becomes easier to believe when the full extent of YouTube’s video inventory is understood.

In fact, these days nearly 24 hours of video footage is being uploaded to YouTube every minute. That’s more than 34,000 hours of video each day.
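A quick back-of-the-envelope check of that daily figure:

```python
# 24 hours of footage uploaded every minute, times 1,440 minutes in a day
hours_per_minute = 24
print(hours_per_minute * 60 * 24)  # 34,560 hours of new video per day
```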

Plus, there appears to be no end in sight to the growth of YouTube’s video library, as the rate of uploading has increased by nearly 20% over the past year.

The fact that the vast majority of YouTube videos are hardly worth the time it takes to watch them makes little difference. Far more than Yahoo Video or Hulu, this site has become the “go-to” place for finding everything from old TV commercials to short clips from movies or shows. Or to engage in the guilty pleasure of browsing around and viewing everything from news anchor bloopers to boring college commencement speeches and embarrassingly bad student dance recitals.

Actually, the number of people who visit YouTube to “channel surf” is astonishingly large. It’s become the new pastime that TV watching once was.

And the “social” aspect of YouTube is important as well, as people love to pass links to their favorite videos on to their friends. Or to “broadcast yourself,” as the site’s tagline states. YouTube makes that process easy and effortless, contributing to the burgeoning inventory of new video material.

When Google acquired YouTube back in 2006, more than a few industry observers wondered about the rationale behind the purchase and questioned the effectiveness of YouTube’s business model. Looking back now, it’s hard to understand what the fuss was all about!

And just yesterday, Google announced ambitious new plans for YouTube. It has begun converting its entire library of videos to the new WebM format, which is built around VP8, a video compression codec designed to deliver smooth, high-quality video images.

In yet another swipe at its rivals, Google is offering VP8 royalty-free, in a bid to knock Adobe’s Flash and the Apple-backed H.264 format off their current top perch. Will it be successful? Well, based on history …

Craigslist riding high … but clouds on the horizon?

Now here’s an interesting statistic about Craigslist, the online classified advertising phenomenon and bane of newspaper publishers across the country. Online publishing consulting firm AIM Group is forecasting that Craigslist will generate nearly $125 million in revenues this year.

But here’s the real kicker: Craigslist is on track to earn somewhere between $90 million and $100 million in profits on that revenue. That kind of profit margin is basically unheard of – in any industry. And the fact that it’s happening in the publishing industry is even more amazing.
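For the skeptics, here’s the quick arithmetic behind that claim, using AIM Group’s own numbers:

```python
# Implied profit margin: $90-100 million in profit on ~$125 million in revenue
revenue = 125e6
profit_low, profit_high = 90e6, 100e6
print(f"{profit_low / revenue:.0%} to {profit_high / revenue:.0%}")  # 72% to 80%
```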

What’s contributing to these stratospheric results? After all, Craigslist bills itself as a “free classified” site. That may be, but the publisher derives a huge portion of revenue – more than 50% – from paid recruitment advertising, much of it coming straight out of the pockets of the newspaper industry.

And the rest? Chalk up most of that to advertising we’ll euphemistically label “adult services.” (AIM Group calls it something else: “Thinly disguised advertising for prostitutes.”)

Of course, these lucrative revenues and profits have come at a price. Craigslist has developed a reputation – not wholly undeserved – of being a virtual clearinghouse for anonymous hook-ups and other forms of vice. Complaints of Craigslist becoming a haven for scam artists, thieves – even the occasional murderer – have become more common as the site has expanded its reach into more cities and regions — now in excess of 500 communities.

And here’s another interesting finding from AIM Group: it reports that Craigslist’s traffic peaked in August of last year (~56 million unique visitors that month) but has fallen since then, plateauing at ~48 million monthly visitors since February.

Why? AIM speculates it’s the result of an “antiquated” user interface, along with a proliferation of “spam & scam” advertising. You start getting a lot of that … and you’re bound to start driving some people away.

Still, it’s pretty hard to argue with profit margins hovering around 75%.

Newspapers crash … Online news soars.

The latest annual News Users report from Outsell, Inc. predicts additional declines in print newspaper circulation as consumers continue to gravitate to online news. It is the third annual report issued by the marketing and communications research firm, developed from findings gathered in consumer surveys.

Outsell projects that Sunday newspaper readers will drop to ~43 million by 2012. That would represent a decline of some 20 million readers from Sunday papers’ circulation heights in the 1990s.

But what’s even more noteworthy is the continuing evolution in online activities. Today, nearly 60% of consumers report that they go online for “news right now.” That’s up from 33% just a few years ago.

And where are people going for their online news? By a large margin, it’s to aggregator sites like Google News, Yahoo and the Drudge Report rather than to newspaper sites. As an example, 44% of the people who go to Google News scan the headlines there without ever clicking through to the newspapers’ individual sites.

Other key findings from the Outsell survey:

One in five consumers now goes to online news aggregators for their “first in the day” news, up from 10% three years ago. TV/cable still leads with 30%, but that margin has been shrinking dramatically.

Paid online content is not picking up the slack for newspapers, with participation rates of no more than 10% of consumers.

Newspapers retain strengths in reporting local topics (e.g., local news, sports and entertainment), even as national topics have gone pretty much all-digital.

That said, if a valued local online news site were to put up a pay wall – or require a paid subscription to the print paper in order to gain free online access – three out of four respondents claimed they would go somewhere else to find the news free of charge. (That’s despite the fact that good alternative news sources at the local level are usually not so numerous.)

The Outsell study found that consumers continue to believe printed news is worth paying for … but they expect the news they get online to be free of charge.

The big problem: it looks like it’s too late for publishers to “transition” readers’ willingness to pay for print news into a willingness to pay for that same content online.

Nope, that train’s already left the station.

Stanford Weighs In on Web Credibility

With more than 200 million web sites in cyberspace these days, what makes the difference between the good ones and the bad ones? That’s what Stanford University has sought to find out through a multi-year research project.

The Stanford Persuasive Technology Lab conducted research with a cross-section of more than 4,500 web consumers over a three-year period. Distilling the mountain of information gleaned from this sample, the Lab issued its Stanford Guidelines for Web Credibility, boiling the research findings down to ten basic guidelines that, if followed, help ensure a web site will be viewed as credible and authoritative.

In reviewing the Stanford guidelines, some of them stand out to me as ones that are too often missed by web developers. In particular:

Show there’s a “legitimate” organization behind the web site. Listing a physical address – not a P.O. Box – is one way to do this.

Make it easy to verify the accuracy of information. Third-party citations and data references, along with links to other respected web sites, are ways to accomplish this.

Make sure the site is “intuitive” and easy to navigate. People’s tolerance level for a web site that doesn’t follow a clear, logical thought path is low.

Update site content often. People put less credibility in sites that look like they’re informationally static or stale.

And especially important: Stanford’s research found that content errors should be fixed, no matter how inconsequential they might seem. It turns out that broken links, bad spelling, poor punctuation and other typos negatively affect the credibility of web sites more strongly than many other factors — and yet they’re among the easiest items to fix.
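Broken links, for one, can be caught automatically before any visitor stumbles on them. Here’s a minimal sketch of a link checker in Python – the target URL is a placeholder, and a production tool would add retries, rate limiting and a GET fallback for servers that reject HEAD requests:

```python
# Minimal broken-link checker (illustrative sketch only).
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import Request, urlopen

class LinkCollector(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    html = urlopen(page_url, timeout=10).read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    for link in parser.links:
        absolute = urljoin(page_url, link)  # resolve relative links
        if not absolute.startswith("http"):
            continue                        # skip mailto:, javascript:, etc.
        try:
            urlopen(Request(absolute, method="HEAD"), timeout=10)
        except (HTTPError, URLError) as err:
            print(f"BROKEN: {absolute} ({err})")

check_links("https://example.com/")  # placeholder URL
```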

When you look at the overall web credibility guidelines from Stanford, they aren’t particularly challenging – and they shouldn’t be hard to put into practice.

But when you consider how lame many web sites actually are, it’s another reminder of how – in web design as in so much else in business and government – “best practices” too often fall victim to expediency or just plain slipshod execution.

Online Customer Review Sites: Who’s Yelping Now?

The news this week that social networking and user review web site Yelp® will now de-couple the presentation of reviews from advertising programs comes as a rare victory for businesses that have been feeling more than a little pressured (blackmailed?) by the company’s strong-arm revenue-raising tactics.

The web has long had something of a “Wild West” atmosphere when it comes to reviews of businesses helping or (more likely) hurting the reputation of merchants.

Yelp is arguably the most significant of these sites. Since its inception in 2004 as a local search resource covering businesses in the San Francisco metro area, Yelp has expanded to include local search and reviews of establishments in nearly 20 major urban markets. With its branding tagline “Real people. Real reviews®,” Yelp is visited by ~25 million people each month, making it one of the most heavily trafficked Internet sites in America.

Yelp solicits and publishes user ratings and reviews of local stores, restaurants, hotels and other merchants (even churches and doctors’ offices are rated), along with basic information on each entry’s location, hours of operation, and so forth – with nearly 3 million reviews submitted at last count.

Predictably, user ratings can have a great deal of influence over the relative popularity of the businesses in question. While most reviews are positive (ratings are on a 5-point scale), Yelp also employs a proprietary algorithm – some would say “secret formula” – to rank reviews based on a selection of factors ostensibly designed to give greater credence to “authentic” user reviews as opposed to “ringers” or “put-up jobs.”

Not surprisingly, Yelp hasn’t disclosed this formula to anyone.

So far, so good. But Yelp began to raise the ire of companies when its eager and aggressive advertising sales team began pitching paid promotional (sponsorship) programs to listed businesses – programs that looked suspiciously like tying advertising expenditures to favorable treatment of reviews, as a sort of quid pro quo.

Purchase advertising space on Yelp … and positive reviews miraculously start appearing at the top of the page. Decide against advertising … and watch the tables turn as those reviews drop to the bottom – or out of sight altogether.

Concerns are so strong that three separate lawsuits have been filed this year already, including a class-action suit filed in February that accuses Yelp of “extortion” – claiming, among other things, that Yelp ad sales reps have offered to hide or bury a merchant’s negative customer reviews in exchange for signing them up as Yelp sponsors.

“The conduct is an offer to manipulate content in exchange for payment,” Jared Beck, an attorney for one of the plaintiffs, states bluntly.

As for whether Yelp’s announcement of new standards will now curb the rash of lawsuits, it seems clear that this is the intent. But so long as Yelp offers to do any sort of manipulation or reshuffling of reviews in exchange for advertising, the lawsuits will probably continue – even if there’s only the appearance of impropriety.

Oh, and don’t look for Yelp to provide any additional revelations regarding how reviews are sequenced to appear on the page. Too much transparency, and it’ll only make it easier for people to figure out how to “game” the ratings.

Newspapers Turn on Each Other

Last week, the Associated Press reported that U.S. newspaper advertising revenues declined dramatically in 2009, bringing ad receipts to the lowest level recorded in nearly 25 years.

In fact, newspaper publishers’ total advertising revenues last year came in below $28 billion, down $10 billion from 2008. According to the Newspaper Association of America, annual ad revenues have now fallen by nearly $22 billion – a whopping 44% — since 2006.

And now, amid this toxic environment comes word that The Wall Street Journal has declared an all-out war on The New York Times for local advertising. In mid-April, the Journal — up to now focused almost exclusively on national and international news — is set to introduce a New York-focused section as part of its paper. Outside observers believe this will put as much as ~20% of the New York Times’ retail advertising revenues at risk.

And this isn’t a minor foray on the part of the WSJ, either. It will be spending upwards of $15 million to produce the new 12-page section, which will cover local business, real estate, sports and cultural events. The financial outlay includes salaries for ~35 editorial writers – surely one of the few instances of new editorial jobs actually becoming available.

The WSJ action couldn’t come at a worse time for the Times, which has experienced sharper ad revenue declines than the industry average. It’s responding by launching a major trade marketing campaign of its own, touting its audience strength with female readers and “high culture” aficionados.

But just how effective this countermove will be is debatable, as recent moves by the paper haven’t exactly telegraphed a continuing commitment to the local news scene. In the last few years alone, the Times has consolidated weekly sections covering specific regions of the New York metro area (Long Island, Westchester, Northern New Jersey), as well as axing its stand-alone “City” and “Metro” sections.

Over the coming months, it’ll be interesting to see how effective the WSJ is with its new local-focused section – whether or not it’ll land a major blow on its rival.

Either way, the vision of two venerable newspapers engaged in a Herculean struggle, fighting over an ever-shrinking advertising pie, isn’t exactly a pretty sight.

It reminds me of the famous scenes in the Disney movie Fantasia of huge dinosaurs furiously going after one another – even as the world’s changing ecosystem renders them extinct.

The End of Privacy

An article by technology author Steve Lohr published last week in The New York Times caught my eye. Titled “How Privacy Vanishes Online,” it explores how conventional notions of “privacy” have become obsolete over the past several years as more people engage in cyber/social interaction and web e-commerce.

What’s happening is that seemingly innocuous bits of information are being collected, “read” and reassembled by computers to build a picture of a person’s identity – without anyone needing direct access to the private information itself.

In effect, technology has provided the tools whereby massive amounts of information can be collected and crunched to establish patterns and discern all sorts of “private” information.

The proliferation of activity on social networking sites such as Flickr, Facebook and LinkedIn is making it easier than ever to assemble profiles that are uncanny in their accuracy.

Pulling together disparate bits of information helps computers establish a “social signature” for an individual, which can then be used to determine any number of characteristics such as marital status, relationship status, names and ages of children, shopping habits, brand preferences, personal hobbies and other interests, favorite causes (controversial or not), charitable contributions, legal citations, and so on.

One of the more controversial experiments was conducted by MIT researchers last year, dubbed “Project Gaydar.” In a review of ~4,000 Facebook profiles, computers were able to use the information to predict male sexual preference with nearly 80% accuracy – even when no explicit declaration of sexual orientation was made on the profiles.
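The basic mechanism here isn’t magic – it’s homophily, the tendency of friends to resemble one another. A toy sketch of the idea in Python (the data is invented, and a crude majority vote stands in for the MIT team’s actual statistical models):

```python
# Toy attribute inference via homophily -- invented data, deliberately simplistic.
friends = {
    "alice": ["bob", "carol", "dave"],
    "bob": ["alice", "erin"],
}

# Attributes some users have declared publicly; absent users are undisclosed.
declared = {"bob": True, "carol": True, "dave": False, "erin": False}

def infer(user):
    """Guess an undisclosed attribute from the majority of a user's friends."""
    votes = [declared[f] for f in friends[user] if f in declared]
    if not votes:
        return None  # no friends with declared attributes; nothing to infer
    return sum(votes) > len(votes) / 2

print(infer("alice"))  # True -- two of alice's three friends declare the attribute
```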

Others, however, have pointed to positive benefits of data mining and how it can benefit consumers. For instance, chain grocery stores can utilize data collected about product purchases made by people who use store loyalty cards, enabling the chains to provide shoppers relevant, valuable coupon offers for future visits.

Last year, media company Netflix awarded a substantial prize to a team of computer specialists who were able to develop software capabilities to analyze the movie rental behavior of ~500,000 Netflix subscribers … and significantly improve the predictive accuracy of product recommendations made to them.
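The prize-winning entries blended hundreds of models, but a core ingredient was matrix factorization: learn a small “taste” vector for each subscriber and each movie so that their dot product approximates the observed ratings, then use it to fill in the blanks. A stripped-down sketch with invented ratings:

```python
# Bare-bones matrix factorization via stochastic gradient descent --
# a toy version of one core Netflix Prize technique (ratings invented).
import random

ratings = [  # (user, movie, stars)
    (0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0),
    (1, 2, 1.0), (2, 1, 4.0), (2, 2, 5.0),
]
n_users, n_movies, k = 3, 3, 2  # k = number of latent "taste" factors
lr, reg = 0.05, 0.02            # learning rate and regularization strength

random.seed(0)
P = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]   # user factors
Q = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_movies)]  # movie factors

def predict(user, movie):
    return sum(P[user][f] * Q[movie][f] for f in range(k))

for _ in range(200):  # a few hundred SGD passes over the tiny data set
    for user, movie, stars in ratings:
        err = stars - predict(user, movie)
        for f in range(k):  # nudge both factor vectors toward the observed rating
            pu, qm = P[user][f], Q[movie][f]
            P[user][f] += lr * (err * qm - reg * pu)
            Q[movie][f] += lr * (err * pu - reg * qm)

print(round(predict(0, 2), 2))  # predicted stars for a movie user 0 never rated
```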

To some, the Netflix program is hardly controversial. To others, it smacks of the “big brother” snooping of an earlier era, when, during the Supreme Court confirmation hearings for Robert Bork and Clarence Thomas, over-zealous Senate staffers got their hands on video store rental records to determine what kind of fare the nominees and their families were watching.

Indeed, last week Netflix announced that it will not be moving forward with a subsequent similar initiative. (In all likelihood, this decision was influenced by pending private litigation more than any sort of altruism.)

Perhaps the most startling development on the privacy front comes courtesy of Carnegie Mellon University, where two researchers have run an experiment wherein they have been able to correctly predict the Social Security numbers for nearly 10% of everyone born between 1989 and 2003 – almost 5 million people.

How did they do it? They started by accessing publicly available information from various sources including social networking sites to collect two critical pieces of information: birthdate, plus city or state of birth. This enabled the researchers to determine the first three digits of each Social Security number, which then provided the baseline for running repeat cycles of statistical correlation and inference to “crack” the Social Security Administration’s proprietary number assignment system.
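To see why those two data points are so revealing, recall how SSNs were assigned at the time: a three-digit Area Number tied to the state of issuance, a two-digit Group Number issued in a documented order, and a four-digit Serial Number assigned consecutively. Here’s a toy sketch of the first step in Python – the sample area ranges are from the SSA’s published table, but the real attack layered statistical inference over the public Death Master File on top of this:

```python
# Narrowing an SSN search space from birth state (pre-2011 assignment scheme).
# Only a few sample entries from the published SSA area-number table are shown,
# and this omits the Group/Serial inference that did the heavy lifting at CMU.

AREA_BY_STATE = {
    "NH": range(1, 4),   # Area Numbers 001-003
    "ME": range(4, 8),   # Area Numbers 004-007
    "VT": range(8, 10),  # Area Numbers 008-009
}

def candidate_prefixes(state):
    """All plausible first-three-digit Area Numbers for a given birth state."""
    return [f"{area:03d}" for area in AREA_BY_STATE[state]]

# Knowing the birth state alone collapses 1,000 possible prefixes to a handful;
# the birthdate then narrows the remaining six digits via issuance-order patterns.
print(candidate_prefixes("VT"))  # ['008', '009']
```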

So as it turns out, it’s not enough anymore merely to be concerned about what you might have revealed in cyberspace on a self-indulgent MySpace page or in an ill-advised newsgroup post.

Social Security numbers … passwords … account numbers … financial data. Today, they’re all fair game.

The Mobile Web: Great Promise + Growth Pains

It’s clear that the mobile web is a big growth segment these days. Proof of that is found in recent Nielsen statistics, which have charted ~34% annual growth in the U.S. mobile web audience – now some 57 million people who use a mobile device to visit web sites (as of late summer 2009).

And now, a new forecast by the Gartner research firm projects that mobile phones will overtake PCs as the most common web access devices worldwide … as early as 2013. Gartner estimates that by then there will be ~1.82 billion smartphones and browser-equipped phones in use, compared to ~1.78 billion PCs.

Gartner’s forecast is even more aggressive than Morgan Stanley’s, which predicts that the mobile web will outstrip the desktop web by 2015.

So, what’s the problem?

Well … consumer studies also show that web surfing using mobile phones continues to be a frustrating experience for many users. In a recent survey of ~1,000 mobile web users, web application firm Compuware/Gomez found that two out of every three mobile web users report having problems when accessing web sites on their phones.

Because people are so used to fast broadband connections – both at home and at work – it’s only natural that their expectations for the mobile web are similarly high. To illustrate this, Gomez found that more than half of mobile phone users are willing to wait just 6 to 10 seconds for a site to load before moving on.
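That threshold is easy to test against. Here’s a rough sketch of a load timer in Python – the URL is a placeholder, and fetching raw HTML understates what a real phone must download and render:

```python
# Quick-and-dirty page fetch timer (illustrative sketch only).
import time
from urllib.request import urlopen

def fetch_seconds(url):
    start = time.monotonic()
    urlopen(url, timeout=30).read()  # download the raw HTML
    return time.monotonic() - start

elapsed = fetch_seconds("https://example.com/")  # placeholder URL
print(f"{elapsed:.1f}s -", "over budget" if elapsed > 6 else "within mobile tolerance")
```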

And what happens after they give up? Sixty percent say they’d be less likely to visit the site again. More importantly, ~40% report that they’d head over to a competing site. As for what would happen if the mobile web experience was as fast and reliable as on a PC, more than 80% of the respondents in the Gomez study claim they would access web sites more often from their phones.

For marketers, this means that maximizing success in the mobile world requires reformatting web sites to fit the small form factor of handheld devices. And Gartner also notes that “context” – more than just “search” – will be king of the hill in mobile, because it delivers a personalized user experience. New functionalities such as Google’s “Near Me Now” are providing information on businesses, dining and other services in the proximity of a mobile user’s location. These and other innovations are opening up whole new dimensions to “seeking and finding” in the mobile web world.

Facebook Continues on its Merry Way to Social Media (and Web?) Dominance

Here’s a very interesting finding ripped from today’s social media headlines: The Business Insider and other media outlets are reporting that Facebook now accounts for nearly one in four page views on the Internet in the United States.

So claims database marketing consulting firm Drake Direct, which has studied web traffic in the U.S. and the U.K. by analyzing data collected by Compete, a leading aggregator of web statistics.

Just to give you an idea of how significant Facebook’s results are: by comparison, search engine powerhouse Google accounts for only about one in twelve page views.

And Facebook is now closing in on Google when it comes to site visits – with each currently receiving around 2.5 billion visits per month. In fact, studying the trend lines, Drake Direct anticipates that Facebook site visits will surpass Google’s any time now.

Another interesting finding is that the average Facebook visit is now longer than the average YouTube visit (~16 minutes versus ~14 minutes), reversing the pattern charted previously.

These findings underscore Facebook’s continued status as the most successful social media site, even as it has grown to 350+ million users, including more than 100 million in the U.S. – 5 million of them added in January alone. No doubt, it’s on a roll.

Where Does the News Begin? Pew Looks for Answers.

You don’t have to be over 50 years old to be concerned about where the world might be heading when it comes to the generation of news stories and how they are vetted. As newspapers and other publishers have cut the size of their reporting and editorial staffs, the quality and consistency of news reporting has suffered in the eyes of many.

Recently, the Pew Research Center’s Project for Excellence in Journalism decided to take a look at this issue to see how it’s playing out on the ground by studying the “news ecosystem” of a single geographic region. The market chosen for the study – Baltimore, Maryland – just happens to be in my backyard, so I’ve been able to review the results with a good understanding of the dynamics of the region in question.

Pew’s Baltimore study evaluated the news environment during the summer of 2009 and came to some interesting conclusions. While the regional media landscape – print, web, radio and TV – has broadened considerably to include 53 separate outlets that regularly produce and broadcast some form of news content, much of what was truly “new news” came from the traditional news outlets, not from the newer media resources.

Six major local/regional news threads were studied, ranging from the Maryland state budget situation to crime trends, issues affecting the metro transit system, and the sale of the Senator Theater, a local historical landmark. An analysis of those news threads found that:

More than 80% of the news stories were repetitive – rehashes of someone else’s original news content, containing no new information.

Of the ~20% of news stories that did include new information, nearly all of the content came from traditional media, published either in conventional format (e.g., print) or in digital form.

General-audience newspapers like the Baltimore Sun produced roughly half of the news stories, followed by local TV stations such as WBAL-TV, which contributed ~30% of the reporting.

Specialty business or legal newspaper outlets such as the Baltimore Business Journal and the Daily Record contributed just under 15% of the news items, with the remaining news reporting coming primarily from local radio stations such as WYPR-FM.

Interestingly, about one-third of the news coverage generated by newspaper publishers appeared on the Internet rather than in their print editions.

Thus, the Pew study demonstrates that “new news” is coming from the same sources as before, led by the local papers. But another clear picture to emerge from the Baltimore profile is that the scaling back of editorial staffs has resulted in less original reporting, with much heavier reliance on simply republishing stories that have appeared elsewhere.

At the same time, new interactive capabilities are giving “we the people” an unparalleled broadcast platform via the ability to post feedback and commentary, not to mention utilizing Facebook, Twitter and other social media platforms as a megaphone.

In today’s “everyone’s an editor because they can write” environment, no one can stop us from broadcasting our own opinions and analysis to the world. But that’s not the same thing as a properly sourced, properly vetted news story. And that’s what Pew sees falling away.