Search Engine Rankings: Page 1 is Where It’s At

All the hype you continually hear about how important it is to get on Page 1 of search engine result pages turns out to be … right on the money.

In a just-released study from digital marketing company iCrossing, nearly 9 million “non-branded” search queries conducted on Google, Yahoo and Bing were analyzed, with the clickthrough percentages from the first, second and third pages of the search engine results (SERPs) tallied.

It turned out that more than 8.5 million clickthroughs were made from the first page of results – a whopping 95% of the total. The rest was just crumbs: Clicks off the second page came in under 250,000, while third-page clicks clocked in at a paltry ~180,000.
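
For the arithmetic-minded, those percentages fall straight out of the raw counts. A quick back-of-the-envelope sketch in Python, using the rounded figures cited above rather than the study’s exact totals:

```python
# Rounded clickthrough counts cited above (not the study's exact totals).
clicks_by_page = {
    "page 1": 8_500_000,
    "page 2": 250_000,
    "page 3": 180_000,
}

total = sum(clicks_by_page.values())
for page, clicks in clicks_by_page.items():
    print(f"{page}: {clicks / total:.1%} of clickthroughs")
# page 1 works out to roughly 95% of the total, matching the study.
```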

The results were essentially the same for all three major search engines (each at 95% or 96%) – a clean sweep across the board, and clear evidence that this behavior holds across the user spectrum.

What this suggests is that when searching on generic or descriptive terms, most people will not go past the first page of results if they can’t find a site link that interests them. If they don’t hit paydirt on the first page, they’re far more likely to try another search using different keywords or phrases until they find a site on Page 1 that does the trick.

Comparing this newest iCrossing study with earlier research reveals that Page 1 clicks represent an even higher proportion today; studies from a few years back had pegged the figure at 80% to 90%.

The implications of this study are clear: if you’re looking to attract visitors to your site via generic or descriptive subject searches, you’d better make sure your site is designed so that it achieves first-page ranking … or your web efforts will be for naught.

That being said, the recipe for ranking success hasn’t changed much at all. Despite all of the tempting “link juice” tips and tricks out there, the main keys to achieving high rankings continue to be creating loads of good web content … speaking the same “language” as searchers (however imprecise that language might be) … and maintaining lots of good links to and from your site to increase its “relevance” to search engines.

No doubt, it’s getting tougher to achieve Page 1 ranking when there’s so much competition out there, but it’s well worth the effort.

The e-Commerce Hiccup

One of the bigger surprises of business in 2009 was how big a hit U.S. e-commerce took. According to digital marketing intelligence firm comScore in its just-released report 2009 U.S. Digital Year in Review, e-retail spending in America decreased about 2% during the year to come in just under $210 billion.

This represents the first decline in e-commerce spending ever recorded.

Obviously, the economic recession was the culprit. But considering that e-commerce growth had charted above 20% annually in every year leading up to 2009, seeing an actual fall-off has raised more than a few eyebrows.

Where was the e-commerce decline most pronounced? In travel-related services, which saw revenues drop by 5% to ~$80 billion. Not that all sectors saw decline: a few continued to grow during the year, including the books/magazines category, which charted gains of ~12%. Online computer software purchases were also up by about 7%.

What does comScore see on the horizon for U.S. e-commerce? Is continued softness predicted … or a return to robust growth?

Analyzing the last few months of e-commerce activity during 2009 provides clues to the future: Growth looks like it’s returning. In fact, the 2009 holiday season marked a return to positive growth rates when compared against the same period in 2008.

[Granted, this comparison is made against “down” months of November and December in 2008, after the recession had already kicked in. But the pace of e-commerce activity is clearly picking up again.]

But whether it will return to 20%+ annual growth is still an open question.

Get Ready for the Endless Political Campaign …

New forecasts about political advertising have just been released. They confirm what many of us have suspected: The political campaign season, traditionally defined every two years by the presidential and off-year congressional election contests, is morphing into one gigantic mega-campaign that basically is with us all the time.

Instead of the nice breather we used to get from political advertising after the campaign season ended, it’s becoming one long, never-ending experience — some would say nightmare.

And if this surprises you, consider the past year alone in U.S. politics. First, there was the inauguration and the early fight over the economic stimulus package, with many political ads run pro and con.

This was followed by the health care debate which attracted an even bigger volume of advertising – probably because there were even more special interests involved. That initiative also sparked the Tea Party rallies and town hall meetings, which became fodder for still more political posturing (and paid advertising).

In the midst of the health care debate, along came the gubernatorial elections in Virginia and New Jersey as well as the “circus sideshow” in Upstate New York’s special congressional election where the Conservative Party candidate forced the endorsed Republican from the race – another opportunity for all sorts of campaign spending.

And just about the time the health care debate finally came to a vote in Congress … the Christmas Bomber showed up – still more fodder for paid political advertising, this time on national security.

As the year 2009 ended, when we thought we were done with politics for at least a few short months, out of nowhere came the Massachusetts special election for senator, which attracted millions of dollars per day in contributions over the Internet and sparked – you guessed it – beaucoup bucks in paid political advertising.

And this past week, when the exciting Super Bowl and extreme weather events should be dominating the news, what’s prominently on our TV and cable channels as well? The Tea Party convention in Nashville, capped by an announcement that the group is forming a political action committee to raise millions in funds to — of course — run new candidates for office.

More politics … more money … more advertising.

Of course, all of this is great news for local television and cable stations, which can snap out of their torpor and pocket a ton of new dollars in advertising revenues. In fact, media research and analytical firm Borrell Associates is predicting that U.S. political spending of all stripes will hit a record $4.2 billion in 2010.

Helping this along is the recent U.S. Supreme Court ruling that lifts restrictions on corporations and gives them the freedom to buy political advertising. Borrell estimates that this ruling will add ~10% to the total pool of funds this year.

It seems hard to believe that 2010 – a non-presidential election year – is on track to break 2008’s record for political spending, considering the huge amounts of advertising that were done by the McCain and (especially) the Obama campaigns in 2008. But the prognosticators insist 2010 will be the biggest year yet for political spending … to the tune of $1 billion more than in 2008.

What role does online play in all of this? The Internet is expected to account for less than $50 million in advertising revenues in 2010 – a comparative drop in the bucket. But growth in this segment will be very strong – not least because the web does a very good job of bringing in campaign donations! The bottom-line prediction: Internet advertising will likely double to reach $100 million for the presidential campaign in 2012.

So the endless political campaign continues endlessly on … never ending … world without end. What fun for us!

The Mobile Web: Great Promise + Growth Pains

It’s clear that the mobile web is a big growth segment these days. Proof of that is found in recent Nielsen statistics, which have charted ~34% annual growth in the U.S. mobile web audience – now some 57 million people using mobile devices to visit web sites (as of late summer 2009).

And now, a new forecast by the Gartner research firm projects that mobile phones will overtake PCs as the most common web access devices worldwide … as early as 2013. It estimates that the total number of smartphones and/or browser-enhanced phones will be ~1.82 billion, compared to ~1.78 billion PCs by then.

Gartner’s forecast is even more aggressive than Morgan Stanley’s prediction that the mobile web will outstrip the desktop web by 2015.

So, what’s the problem?

Well … consumer studies also show that web surfing on mobile phones continues to be a frustrating experience for many users. In a recent survey of ~1,000 mobile web users, web application firm Compuware/Gomez found that two out of every three mobile web users report having problems when accessing web sites on their phones.

Because people are so used to fast broadband connections – both at home and at work – it’s only natural that their expectations for the mobile web are similarly high. To illustrate this, Gomez found that more than half of mobile phone users are willing to wait just 6 to 10 seconds for a site to load before moving on.

And what happens after they give up? Sixty percent say they’d be less likely to visit the site again. More importantly, ~40% report that they’d head over to a competing site. As for what would happen if the mobile web experience were as fast and reliable as on a PC, more than 80% of the respondents in the Gomez study claim they would access web sites more often from their phones.
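
To put that patience window in engineering terms: if a page hasn’t finished loading within the 6-to-10-second budget, the visitor is probably gone. Here’s a minimal sketch of checking a page against such a load budget – the 8-second threshold and the stand-in URL are illustrative assumptions, not figures or tools from the Gomez study:

```python
import time
import urllib.request

LOAD_BUDGET_SECONDS = 8  # illustrative midpoint of the 6-10 second window

def loads_within_budget(url: str) -> bool:
    """Fetch a page and report whether it loaded within the budget."""
    start = time.monotonic()
    try:
        # Note: the timeout applies per socket operation, so we also
        # check total elapsed time after the body is read.
        with urllib.request.urlopen(url, timeout=LOAD_BUDGET_SECONDS) as resp:
            resp.read()
    except Exception:
        return False  # timed out or failed outright
    return (time.monotonic() - start) <= LOAD_BUDGET_SECONDS

if __name__ == "__main__":
    url = "https://example.com/"  # stand-in URL
    print("within budget" if loads_within_budget(url) else "visitor probably gone")
```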

For marketers, this means that to maximize their success in the mobile world, they should reformat web sites to conform to the small-form factor of handheld devices. And Gartner also notes that “context” will be the king of the hill in mobile – more than just “search” – in that it will deliver a personalized user experience. New functionalities such as Google’s “Near Me Now” are providing information on businesses, dining and other services that are in the proximity of a mobile user’s location. These and other innovations are opening up whole new dimensions to “seeking and finding” in the mobile web world.
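
On the “reformat for handhelds” point, one common server-side tactic is simple user-agent detection that steers mobile visitors to a mobile-formatted version of the site. A rough sketch of the idea – the user-agent substrings and the m.example.com redirect target are illustrative assumptions, not a recipe from Gartner or Google:

```python
# Substrings and redirect target below are illustrative assumptions,
# not an exhaustive or authoritative detection list.
MOBILE_UA_HINTS = ("iphone", "android", "blackberry", "windows ce", "opera mini")

def is_mobile(user_agent: str) -> bool:
    """Crude check: does the user-agent string look like a handheld?"""
    ua = user_agent.lower()
    return any(hint in ua for hint in MOBILE_UA_HINTS)

def choose_site(user_agent: str) -> str:
    """Steer small-screen visitors to the mobile-formatted pages."""
    if is_mobile(user_agent):
        return "https://m.example.com/"   # hypothetical mobile-formatted site
    return "https://www.example.com/"

print(choose_site("Mozilla/5.0 (iPhone; U; CPU iPhone OS 3_0 like Mac OS X)"))
```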

Facebook Continues on its Merry Way to Social Media (and Web?) Dominance

Here’s a very interesting finding ripped from today’s social media headlines: The Business Insider and other media outlets are reporting that Facebook now accounts for nearly one in four page views on the Internet in the United States.

So claims database marketing consulting firm Drake Direct, which has studied web traffic in the U.S. and the U.K. by analyzing data collected by Compete, a leading aggregator of web statistics.

Just to give you an idea of how significant Facebook’s results are: by comparison, search engine powerhouse Google accounts for only about one in twelve page views.

And Facebook is now closing in on Google when it comes to site visits – with each currently receiving around 2.5 billion visits per month. In fact, studying the trend lines, Drake Direct anticipates that Facebook’s site visits will surpass Google’s any time now.

Another interesting finding is that the length of the average Facebook visit now surpasses that of YouTube (~16 minutes versus ~14 minutes per visit), whereas YouTube had charted longer visits prior to now.

These findings underscore the continued success of Facebook as the most successful social media site, even as it has grown to 350+ million users, including more than 100 million in the U.S. with 5 million added in January alone. No doubt, it’s on a roll.

Where Does the News Begin? Pew Looks for Answers.

Pew studies news reporting today ... and who's crafting it.

You don’t have to be over 50 years old to be concerned about where the world might be heading when it comes to the generation of news stories and how they are vetted. As newspapers and other publishers have cut the size of their reporting and editorial staffs, the quality and consistency of news reporting has suffered in the eyes of many.

Recently, the Pew Research Center’s Project for Excellence in Journalism decided to take a look at this issue to see how it’s playing out on the ground by studying the “news ecosystem” of a single geographic region. The market chosen for the study – Baltimore, Maryland – just happens to be in my backyard, so I’ve been able to review the results with a good understanding of the dynamics of the region in question.

Pew’s Baltimore study evaluated the news environment during the summer of 2009 and came to some interesting conclusions. While the regional media landscape – print, web, radio and TV – has broadened considerably to include 53 separate outlets that regularly produce and broadcast some form of news content, much of what was truly “new news” came from the traditional news outlets and not from other media sources.

Six major local/regional news threads were studied, ranging from the Maryland state budget situation to crime trends, issues affecting the metro transit system, and the sale of the Senator Theater, a local historical landmark. An analysis of those news threads found that:

• More than 80% of the news stories were repetitive – just rehashes of someone else’s original news content that contained no new information.

• Of the ~20% of news stories that did include new information, nearly all of the content came from traditional media, published either in conventional format (e.g., print) or digitally.

• General-audience newspapers like the Baltimore Sun produced roughly half of the news stories, followed by local TV stations such as WBAL-TV, which contributed ~30% of the reporting.

• Specialty business or legal newspaper outlets such as the Baltimore Business Journal and the Daily Record contributed just under 15% of the news items, with the remaining news reporting coming primarily from local radio stations such as WYPR-FM.

• Interestingly, about one-third of the news coverage generated by newspaper publishers appeared on the Internet rather than in their print editions.

Thus, the Pew study demonstrates that “new news” is coming from the same sources as before, led by the local papers. But another clear picture to emerge from the Baltimore profile is that the scaling back of editorial staffs has resulted in less original reporting, with much heavier reliance on simply republishing stories that have appeared elsewhere.

At the same time, new interactive capabilities are giving “we the people” an unparalleled broadcast platform via the ability to post feedback and commentary, not to mention utilizing Facebook, Twitter and other social media platforms as a megaphone.

In today’s “everyone’s an editor because they can write” environment, no one can stop us from broadcasting our own opinions and analysis to the world. But that’s not the same thing as a properly sourced, properly vetted news story. And that’s what Pew sees falling away.

Plunging Headlong into the Next Digital Decade

Looking back over the past ten years, so much has happened in the world of digital communications, it’s almost impossible to remember the “bad old days” of slow-loading web pages and clunky Internet interfaces.

And yet, that was the norm for many web users at the beginning of the decade.

So what’s in store for the upcoming decade? When you consider that a 120-minute movie file can be downloaded about as quickly as a single web page these days, how could data processing and download times get any faster than they are already?

Certainly, if the world continues with transistor-based computer chip designs, there’s very little room for further improvement. That’s because today’s chips are still based on the transistor design pioneered at Bell Labs in the late 1940s by William Shockley and his colleagues – except that they contain many more transistors, with circuits engineered smaller and smaller, down practically to the size of an atom.

We can’t get much smaller than that without a radical new design platform. And now, along comes “quantum computing,” which goes well beyond the traditional binary system of using ones and zeros in information processing. The betting is that so-called “qubits” – quantum bits – will become the new design paradigm of the 2010s, dramatically increasing processor speed and power once again.
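
To make the “beyond ones and zeros” idea concrete: a classical bit is always 0 or 1, while a qubit holds a superposition of both and only collapses to one value when measured. A toy simulation of that measurement math – a sketch of the concept, not of real quantum hardware:

```python
import math
import random

# A single qubit in the state a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# Here, an equal superposition: each outcome has probability 0.5.
a = b = 1 / math.sqrt(2)

def measure() -> int:
    """Collapse the superposition: 0 with probability |a|^2, else 1."""
    return 0 if random.random() < a * a else 1

# Repeated measurements reveal the underlying probabilities.
samples = [measure() for _ in range(10_000)]
print("fraction measuring 1:", sum(samples) / len(samples))  # ~0.5
```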

[An alarming side effect of quantum processing, however, is the possibility that computers will become so much more powerful that current methods of encryption will become obsolete. Clearly, new ways of protecting information will need to be developed along with the new speed and capabilities.]

In tandem with the new advancements in data processing power and speed, industry prognosticators such as Wired magazine’s chief editor Chris Anderson are predicting that bandwidth and storage will become virtually free and unlimited in the coming decade. As a result, it’s highly likely that the decade will be one of much greater collaboration between people in “peer to peer” creation and sharing of information and media, online and in real-time. Think Facebook on steroids.

How are things clicking in Internet marketing at the moment?

What’s happening with clickthrough behaviors on online ads these days? According to comScore, Inc., a digital market intelligence and measurement firm, ~50% fewer people are clicking on Internet ads today than in 2007. In fact, fewer than 10% of all Internet users account for ~85% of all ad clicks.

This may call to mind the old adage: “When a tree falls in the forest, does it make a sound … and does anybody know?”

On the other hand, it’s good to remember that banner advertising can have branding value. In fact, comScore research also shows that one in five users who click on an ad go on to conduct a search about the advertiser … and one in three visit the brand’s own web site.

Unfortunately, just how effective online advertising is can be a challenge to measure – reflective to some degree of the “bad old days” of print advertising. One reason for the difficulty is evolving consumer behavior regarding “cookies.” When consumers delete tracking cookies from their computers, they’re counted as “new customers” upon returning to the site. Interestingly, comScore’s latest data find that nearly one-third of web users delete cookies – many as often as five times per month. And with the steady stream of news items warning of “Big Brother”-type information harvesting, it’s hardly a surprise that cookie deletion has grown by ~20% since 2007.

What’s the implication? Not accounting for cookie deletion can lead to an overstatement of unique visitors, reach and frequency – by about 2.5 times. (Relying on IP addresses doesn’t solve the issue either, because the typical computer in the U.S. has multiple IP addresses.)
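
The inflation mechanism is easy to see in a toy simulation: every time a visitor wipes cookies, the next visit mints a fresh “unique” identifier. The visit counts and deletion rate below are illustrative assumptions (chosen so the toy lands near the ~2.5x overstatement mentioned above), not comScore’s actual parameters:

```python
import random

random.seed(42)

TRUE_VISITORS = 10_000
VISITS_PER_VISITOR = 12           # illustrative assumption
DELETE_PROB_PER_VISIT = 0.14      # illustrative; tuned to land near ~2.5x

cookie_ids = 0
for _ in range(TRUE_VISITORS):
    cookie_ids += 1               # first visit sets a fresh cookie
    for _ in range(VISITS_PER_VISITOR - 1):
        if random.random() < DELETE_PROB_PER_VISIT:
            cookie_ids += 1       # cookie wiped, so this visitor now looks "new"

print(f"true visitors:    {TRUE_VISITORS:,}")
print(f"cookie 'uniques': {cookie_ids:,}")
print(f"overstatement:    {cookie_ids / TRUE_VISITORS:.1f}x")
```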

Of course, these hurdles don’t mean that an attempt to measure the effectiveness of online advertising is an exercise in futility. Just as in print advertising, there are clues marketers can home in on that point to whether an online advertising campaign is a success. And prudent companies will discount web traffic statistics by a certain degree in order to paint a more realistic picture … not to mention incorporating conversion tracking triggered by specific actions on the web site, such as a purchase, a customer query, or registering to download an informational document.

Firefox Turns Five: All grown-up now … and with a few grown-up challenges.

Mozilla’s Firefox web browser marked a milestone this past week, celebrating its fifth birthday.

No question about it, the open-source browser has been a big success, with growth that has been impressive by any measure. As of the end of July, Firefox had been downloaded more than 1 billion times.

Indeed, a mainstream site like this one (WordPress) reports that Firefox now represents a larger share of activity than Internet Explorer — 46% versus 39% of traffic.

But now that Firefox has come of age, it’s facing some of the same “grown-up” challenges that other browsers face.

In fact, application security vendor Cenzic has just released its security trends report covering the first half of 2009. Guess what? Firefox led the field of web browsers in terms of total reported vulnerabilities. Here are the stats from Cenzic:

• Firefox: 44% of reported browser vulnerabilities
• Apple Safari: 35%
• Internet Explorer: 15%
• Opera: 6%

Compared to Cenzic’s report covering the second half of 2008, Firefox’s figure is up from 39%, while IE’s number is down sharply from 43%.

Welcome to reality. As Firefox has grown in importance, it’s gained more exposure to vulnerabilities. A significant portion of those vulnerabilities have come via plug-ins.

Mozilla is taking steps to counteract this, including launching a plug-in checker service to ensure that users are running up-to-date versions. It also offers a “bug bounty” to anyone who discovers security holes in Firefox.
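
The idea behind a plug-in checker is straightforward version bookkeeping: compare what the user has installed against the latest known-good releases. A generic sketch of that logic – the plug-in names, version numbers and hard-coded list are hypothetical, and this is not Mozilla’s actual service or API:

```python
# Hypothetical latest-known-good versions. A real checker would fetch
# this list from a trusted server rather than hard-coding it.
LATEST = {"flash": (10, 0, 42), "quicktime": (7, 6, 4)}

def parse(version: str) -> tuple:
    """Turn '9.0.115' into (9, 0, 115) for component-wise comparison."""
    return tuple(int(part) for part in version.split("."))

def out_of_date(installed: dict) -> list:
    """Return the names of plug-ins that lag the latest known versions."""
    return [name for name, ver in installed.items()
            if name in LATEST and parse(ver) < LATEST[name]]

stale = out_of_date({"flash": "9.0.115", "quicktime": "7.6.4"})
print("update needed for:", stale or "nothing – all current")
```

Tuple comparison does the heavy lifting here: Python compares version components left to right, so (9, 0, 115) correctly sorts below (10, 0, 42).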

And the good news is that even though Firefox had the highest number of vulnerabilities, even Cenzic admits this doesn’t necessarily mean Firefox users are more vulnerable to security threats. Plus, those vulnerabilities tend to be patched more quickly than those found in other browsers.

So on this fifth anniversary milestone, Firefox can be justly praised as a major success story in the web browser world.

Newspaper publishers and online news consumers: Still miles apart on paid content.

How can the views and perspectives of newspaper publishers and readers be so out of kilter? It might have something to do with “wishful thinking” on the part of the publishers.

Case in point: The American Press Institute has just released the results of a field research study comparing the opinions of readers and publishers on paying for news content.

Naturally, this issue is of paramount concern to newspapers that are trying to create a new business model that is profitable. In fact, nearly 60% of the publisher respondents in the survey reported that they’re considering requiring paid access for online news — news that is currently provided to readers free of charge. At the same time, these respondents seem to believe that consumers will willingly “pay to play” in a new paid-content environment.

But I wonder about that.

Here’s an example of the disconnect between newspaper publishers and news consumers found in the survey: More than two-thirds of the publishers believe it will be “not very easy” or “not easy at all” for consumers to find similar news content online from alternative free sources once the shift to paid content happens. Do consumers agree? Well … only ~43% think the same way.

And where do newspaper publishers think people will go for news if their paper’s free online information is no longer available to them? Again, we see a big disparity in the results. The top three sources publishers think consumers will turn to are:

• The publisher’s own print newspaper: 75%
• Other local media: 55%
• Television: 53%

For consumers, those alternate sources all rated lower – in two cases, dramatically so:

• The publisher’s own print newspaper: 30%
• Other local media: 17%
• Television: 45%

[For the record, the alternative free news source identified by the most consumers was “other local web sites,” cited by 68% of respondents.]

With such dramatically different views held by newspaper publishers and their consumers, it’s clear that both sides can’t be correct. I’m willing to bet that the consumers’ responses are closer to reality.

For this reason, it would be advisable for publishers to tread very carefully as they attempt a shift to a paid content business model. Does the term “evaporating audiences” mean anything to them?