Search Engine Rankings: Page 1 is Where It’s At

All the hype you continually hear about how important it is to get on Page 1 of search engine result pages turns out to be … right on the money.

In a just-released study from digital marketing company iCrossing, nearly 9 million “non-branded” search queries conducted on Google, Yahoo and Bing were analyzed, with the clickthrough percentages from the first, second and third pages of the search engine results (SERPs) tallied.

It turned out that more than 8.5 million clickthroughs were made from the first page of results – a whopping 95% of the total. The rest was just crumbs: Clicks off the second page came in under 250,000, while third-page clicks clocked in at a paltry ~180,000.
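The arithmetic behind that 95% figure can be sketched quickly. The numbers below are approximations taken from the figures quoted above, not iCrossing's exact data:

```python
# Illustrative only: recomputing the page-1 clickthrough share from the
# approximate figures quoted in the study summary above.
total_clicks = 9_000_000   # ~9 million non-branded queries analyzed
page1_clicks = 8_550_000   # assumed value consistent with "more than 8.5 million"
page2_clicks = 250_000     # "under 250,000"
page3_clicks = 180_000     # "~180,000"

page1_share = page1_clicks / total_clicks
print(f"Page 1 share: {page1_share:.0%}")  # → Page 1 share: 95%
```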

The results were essentially the same across the three major search engines (all at 95% or 96%) – a clean sweep that suggests this behavior holds across the board.

What this suggests is that when searching on generic or descriptive terms, most people will not go past the first page of results if they can’t find a site link that interests them. If they don’t hit paydirt on the first page, they’re far more likely to try another search using different keywords or phrases until they find a site on Page 1 that does the trick.

Comparing this newest iCrossing study with earlier research reveals that Page 1 clicks represent an even higher proportion today; studies from a few years back had pegged the figure at 80% to 90%.

The implications of this study are clear: if you’re looking to attract visitors to your site via generic or descriptive subject searches, you’d better make sure your site is designed to achieve first-page ranking … or your web efforts will be for naught.

That being said, the recipe for success in ranking hasn’t changed much at all. Despite all of the tempting “link juice” tips and tricks out there, the main keys to getting high rankings continue to be creating loads of good web content … speaking the same “language” as searchers (however inaccurate that might be) … and maintaining lots of good links to and from your site to increase its “relevance” to search engines.

No doubt, it’s getting tougher to achieve Page 1 ranking when there’s so much competition out there, but it’s well worth the effort.

The e-Commerce Hiccup

One of the bigger surprises of business in the year 2009 was how big of a hit U.S. e-commerce has taken. According to digital marketing intelligence firm comScore in its just-released report 2009 U.S. Digital Year in Review, e-retail spending in America decreased about 2% during the year to come in just under $210 billion.

This represents the first decline in e-commerce spending ever recorded.

Obviously, the economic recession was the culprit. But considering that e-commerce growth has charted above 20% annually in every year leading up to 2009, seeing an actual fall-off has raised more than a few eyebrows.

Where was the e-commerce decline most pronounced? It was in travel-related services, which saw revenues drop by 5% to ~$80 billion. Not that all sectors saw decline. A few continued to experience growth during the year, including the books/magazines category which charted gains of ~12%. Online computer software purchases were also up by about 7%.

What does comScore see on the horizon for U.S. e-commerce? Is continued softness predicted … or a return to robust growth?

Analyzing the last few months of e-commerce activity during 2009 provides clues to the future: Growth looks like it’s returning. In fact, the 2009 holiday season marked a return to positive growth rates when compared against the same period in 2008.

[Granted, this comparison is made against “down” months of November and December in 2008, after the recession had already kicked in. But the pace of e-commerce activity is clearly picking up again.]

But whether it will return to 20%+ annual growth is still an open question.

Get Ready for the Endless Political Campaign …

New forecasts about political advertising have just been released. They confirm what many of us have suspected: The political campaign season, traditionally defined every two years by the presidential and off-year congressional election contests, is morphing into one gigantic mega-campaign that basically is with us all the time.

Instead of the nice breather we used to get from political advertising after the campaign season ended, it’s becoming one long, never-ending experience — some would say nightmare.

And if this surprises you, consider the past year alone in U.S. politics. First, there was the inauguration and the early fight over the economic stimulus package, with many political ads run pro and con.

This was followed by the health care debate which attracted an even bigger volume of advertising – probably because there were even more special interests involved. That initiative also sparked the Tea Party rallies and town hall meetings, which became fodder for still more political posturing (and paid advertising).

In the midst of the health care debate, along came the gubernatorial elections in Virginia and New Jersey as well as the “circus sideshow” in Upstate New York’s special congressional election where the Conservative Party candidate forced the endorsed Republican from the race – another opportunity for all sorts of campaign spending.

And just about the time the health care debate finally came to a vote in Congress … the Christmas Bomber shows up – still more fodder for paid political advertising, this time on national security.

As the year 2009 ended, just when we thought we were done with politics for at least a few short months, out of nowhere came the Massachusetts special election for senator, which attracted millions of dollars per day in contributions over the Internet and sparked – you guessed it – beaucoup bucks in paid political advertising.

And this past week, when the exciting Super Bowl and extreme weather events should be dominating the news, what’s prominently on our TV and cable channels as well? The Tea Party convention in Nashville, capped by an announcement that this group is forming a campaign political action committee to raise millions in funds to — of course — run new candidates for office.

More politics … more money … more advertising.

Of course, all of this is great news for local television and cable stations, which can snap out of their torpor and pocket a ton of new dollars in advertising revenues. In fact, media research and analytical firm Borrell Associates is predicting that U.S. political spending of all stripes will hit a record $4.2 billion in 2010.

Helping this along is the recent U.S. Supreme Court ruling that lifts restrictions on corporations and gives them the freedom to buy political advertising. Borrell estimates that this ruling will add ~10% to the total pool of funds this year.

It seems hard to believe that 2010 – a non-presidential election year – is on track to break 2008’s record for political spending, considering the huge amounts of advertising that were done by the McCain and (especially) the Obama campaigns in 2008. But the prognosticators insist 2010 will be the biggest year yet for political spending … to the tune of $1 billion more than in 2008.

What role does online play in all of this? The Internet is expected to account for less than $50 million in advertising revenues in 2010 – a comparative drop in the bucket. But growth will be very strong in this segment – not least because the web does a very good job of bringing in more campaign donations! The bottom-line prediction: Internet advertising will likely double to reach $100 million for the presidential campaign in 2012.

So the endless political campaign continues endlessly on … never ending … world without end. What fun for us!

The Mobile Web: Great Promise + Growth Pains

It’s clear that the mobile web is a big growth segment these days. Proof of that is found in recent Nielsen statistics, which have charted ~34% annual growth of the U.S. mobile web audience, now numbering some 57 million visitors using a mobile device to visit web sites (as of late summer 2009).

And now, a new forecast by the Gartner research firm projects that mobile phones will overtake PCs as the most common web access devices worldwide … as early as 2013. It estimates that the total number of smartphones and/or browser-enhanced phones will be ~1.82 billion, compared to ~1.78 billion PCs by then.

Gartner’s forecast is even more aggressive than Morgan Stanley’s, which predicts that the mobile web will outstrip the desktop web by 2015.

So, what’s the problem?

Well … consumer studies also show that web surfing using mobile phones continues to be a frustrating experience for many users. In a recent survey of ~1,000 mobile web users, web application firm Compuware/Gomez found that two out of every three mobile web users report having problems when accessing web sites on their phones.

Because people are so used to fast broadband connections – both at home and at work – it’s only natural that their expectations for the mobile web are similarly high. To illustrate this, Gomez found that more than half of mobile phone users are willing to wait just 6 to 10 seconds for a site to load before moving on.

And what happens after they give up? Sixty percent say they’d be less likely to visit the site again. More importantly, ~40% report that they’d head over to a competing site. As for what would happen if the mobile web experience were as fast and reliable as on a PC, more than 80% of the respondents in the Gomez study claim they would access web sites more often from their phones.

For marketers, this means that to maximize their success in the mobile world, they should reformat web sites to conform to the small-form factor of handheld devices. And Gartner also notes that “context” will be the king of the hill in mobile – more than just “search” – in that it will deliver a personalized user experience. New functionalities such as Google’s “Near Me Now” are providing information on businesses, dining and other services that are in the proximity of a mobile user’s location. These and other innovations are opening up whole new dimensions to “seeking and finding” in the mobile web world.

Facebook Continues on its Merry Way to Social Media (and Web?) Dominance

Here’s a very interesting finding ripped from today’s social media headlines: The Business Insider and other media outlets are reporting that Facebook now accounts for nearly one in four page views on the Internet in the United States.

So claims database marketing consulting firm Drake Direct, which has studied web traffic in the U.S. and the U.K. by analyzing data collected by Compete, a leading aggregator of web statistics.

Just to give you an idea of how significant Facebook’s results are: by comparison, search engine powerhouse Google accounts for only about one in twelve page views.

And Facebook is now closing in on Google when it comes to site visits – with each currently receiving around 2.5 billion visits per month. In fact, studying the trend lines, Drake Direct anticipates that Facebook site visits will surpass Google any time now.

Another interesting finding is that the length of the average Facebook visit now surpasses that of YouTube (~16 minutes versus ~14 minutes per visit), whereas YouTube had charted longer visits prior to now.

These findings underscore the continued success of Facebook as the most successful social media site, even as it has grown to 350+ million users, including more than 100 million in the U.S. with 5 million added in January alone. No doubt, it’s on a roll.

McCormick Place Loses its Luster

Has all the grumbling about Chicago’s vaunted McCormick Place as America’s premier tradeshow venue finally reached critical mass?

For years, corporate exhibitors have groused about government-controlled, money-losing McCormick Place. Stories abound of exhibitors being forced to spend hundreds of dollars for services as mundane as plugging in a piece of machinery, or being charged $1,000 to hang a sign from the ceiling, because of onerous union rules governing “who does this” and “who can’t do that.” It’s been a constant refrain of complaining I’ve heard at every tradeshow I’ve attended at McCormick Place, dating back some 20 years.

Despite all of the criticism about McCormick Place’s high costs and lack of user-friendly service, it remains the largest convention complex in America, with over 2.5 million square feet of exhibit space. But attendance has been declining pretty dramatically, from ~3.0 million in 2001 to ~2.3 million in 2008. While the figures haven’t been released yet for 2009, it’s widely expected that show traffic will be reported as down another 20%.

As the current economic recession has put the most severe strains yet on the tradeshow business, it seems that a rebellion against McCormick Place is now in full swing. According to a recent article in The Wall Street Journal, “a gradual drop-off in business … has turned into a rout as a string of high-profile shows have pulled out.” The deserters include a triennial plastics show (~75,000 attendees), as well as the Healthcare Information & Management Systems Society’s annual conference (~27,500 attendees).

But isn’t tradeshow attendance off at other convention centers as well? Well … yes. But clearly not as much. In truth, tradeshow attendance has been under pressure at a “macro” level ever since 9/11 – and an important reason, beyond the issue of terrorism, is technological innovation: people can now interact through video-conferencing, and companies can demo their equipment and services via the Internet and other forms of digital communication.

Tradeshows were once the only way to gather a community together, but now there are other options. One school of thought holds that large tradeshows are now less effective than small, targeted conferences that provide heightened ability for attendees to interact with one another on a more intimate basis. Some events no longer charge attendees … but they make sure to “vet” them carefully to ensure that the show sponsors who are underwriting the costs are reaching prospects with important degrees of influence or buying authority.

On top of these “macro” trends, the current economic downturn just makes McCormick Place look more and more like a loser when it comes to the tradeshow game. Compared to Chicago’s three most significant competing tradeshow locales – Atlanta, Las Vegas and Orlando – the cost of many items from electricians (union labor) to foodservice (greasy spoon-quality coffee at Starbucks® prices) to hotel accommodations (room fees and surtaxes that won’t quit) ranges two times to eight times higher in Chicago. And in today’s business climate when every cost is scrutinized closely, none of this looks very cost-effective to the corporate bean-counters.

True, Chicago is more centrally located for travel from both coasts: Who wants to take a five-hour flight from New York to Las Vegas or from California to Orlando to attend a meeting?

[On the other hand, no one can honestly say that the weather in Chicago is preferable to sunny Florida or Nevada!]

So it would seem that Chicago’s worthy tradeshow competitors have achieved the upper hand now. I just returned from two national shows this past week – the International Air Conditioning, Heating & Refrigeration Expo and the International Poultry Expo. Where were they held? Orlando and Atlanta – the same cities which are attracting McCormick Place’s erstwhile customers.

Mathematicians and the Meltdown

I can’t wait for the release of The Quants, a new book by Wall Street Journal reporter Scott Patterson about the role of so-called “quant funds” in the financial near-meltdown in September 2008, to be published on February 2. The weekend edition of The Wall Street Journal printed excerpts from the book, a powerful indictment of the mathematicians and computer whizzes who “nearly destroyed Wall Street.”

According to Patterson, “quants” was a name given to “traders and financial engineers who used brain-twisting math and super-powered computers to pluck billions in fleeting dollars out of the market.” In a major departure from traditional trading – evaluating individual companies’ management, performance and competitive positions – the quants used mathematical formulae to wager on which stocks would rise or fall.

Because of breakthroughs in the application of mathematics to financial markets – some of them so novel as to have won their discoverers Nobel Prizes — quant funds had quickly come to dominate Wall Street, with most of them piling up profits day after day. (To the senior brass at the investment houses, who likely knew little if anything about how these funds operated except that they made a lot of money, a hands-off policy seemed just the thing.)

And just as in so many other fields, technology elevated the “nerds” to the position of “stars” – with commensurately stratospheric compensation.

Unfortunately, in September 2008 the quant funds could not anticipate the effect of the collapse of the housing market bubble. In fact, this development turned the quant funds’ mathematical formulae on their head: What should have been declining rose … and what should have been rising dropped.

Patterson’s book promises to go into the details of just how things spun out of control, as seen through the eyes of key Wall Street managers such as the piano-playing, songwriting Peter Muller, founder of Morgan Stanley’s Process Driven Trading (PDT) quant fund, and Cliff Asness, formerly of Goldman Sachs and leader of the Applied Quantitative Research (AQR) quant fund.

In addition to presenting all the facts and all the drama, I’m hoping that Patterson will offer a few observations on how we can prevent a debacle like this from happening again.

Another key question is whether any of the proposed regulations being debated in Congress will address the practices of quant funds – or is it all too complicated for anyone to figure out?

If that’s so, it’s pretty scary.

The Latest Read on e-Readers

The e-reader phenomenon continues to grow. In fact, sales of e-readers have turned out to be one of the brightest spots in the consumer electronics segment during the 2009 holiday season.

And 2010 is starting out with a bevy of new e-reader product introductions from a half-dozen different manufacturers.

“Way back” in August 2008, research firm iSuppli released projections for e-readers that anticipated 3.5 million units to be sold worldwide in 2009. That was up dramatically from 1.1 million units sold in 2008 – almost all of them Kindle or Sony e-readers.

Those projections were considered highly optimistic by some observers. But now that the year has passed, it’s looking like the prediction was on the low side; iSuppli’s revised sales figures for 2009 are closer to 5 million units. And Forrester Research estimates that 2009 e-reader sales in the U.S. were very strong, with ~30% of the sales occurring during the holiday season in November and December.

In fact, Amazon has reported that its Kindle e-reader emerged as the most-gifted item ever from its web site.

Now, hard on the heels of the recent Nook e-reader introduction by Barnes & Noble comes news from the International Consumer Electronics Show (CES) of a host of new entrants in the e-reader game. Ranging in price from under $200 to nearly $800, each new entrant is aimed at meeting the needs of different target groups – from those wanting business news to people who wish to read full-length books. Not surprisingly, many of the new enhancements are centered on making the e-reader experience as “easy on the eyes” as possible.

Among the more interesting introductions at CES:

• Que (made by Plastic Logic), which incorporates advanced polymer technology to create a shatter-proof screen.

• Skiff (Hearst Corporation), which offers a store for digital newspaper/magazine subscriptions.

• eDGe (from Entourage Systems), which provides two screens that fold up like a book. (One offers color display and the other a b/w display for newspaper reading.)

Not to be left on the sidelines, the granddaddy in this business – Amazon – is introducing an international version of the Kindle DX. Amazon now offers both larger- and smaller-sized Kindle units in prices ranging from $250 to $500.

With all of these new options in e-readers, what’s in store for 2010 volume? Observers are now predicting that unit sales will be twice as many as in 2009 … which certainly qualifies e-readers as the latest “rage” in the consumer electronics world.

The “Greening” of Corporate America: Fact or Fad?

Considering the cold winter season we’re having – not to mention the equally cold economic and business environment – it’s not hard to imagine that the “corporate green” trend, so popular and prevalent only a year or two ago, might have stalled out in a major way.

Add to this the recent flap over climate change data fudging by some over-enthusiastic scientists, and it seems the perfect recipe for “corporate green” being a movement that’s on the wane.

But a just-completed market research study on “green” marketing provides interesting clues that this might not be the case. A group of U.S. commercial/industrial firms was surveyed for MediaBuyerPlanner, an arm of Watershed Publishing, to determine the extent of green marketing that is occurring. Among the key findings:

• ~70% of the firms surveyed consider themselves to be “somewhat green” or “very green” … but they suspect that customers think of them as less green than they actually are. Perhaps related to this concern, ~80% of the respondents expect to spend more on green marketing in the future – and that percentage approaches 90% among the manufacturers contained in the survey sample.

• For those who currently feature “green” marketing themes in their promotional efforts, the most popular medium is the web (~74% of respondents), followed by print promotion (~50%) and direct marketing (~40%).

• More than half of the companies reported that they are taking concrete steps to become “greener” in their operations. The most popular actions are conserving energy in their operations (~60%) and changing products to reflect greener characteristics, such as altering product ingredients, packaging, or intended uses (~54%).

And here’s another interesting survey finding: Quite a few respondents believe their green marketing efforts are more effective than their normal marketing efforts. (One third of them felt this way, compared to just 7% who felt regular marketing activities are more effective than their green messaging. The remaining 60% have not observed a measurable differentiation and/or did not feel knowledgeable enough to make a judgment.)

The survey also found that the commitment senior management makes to sustainability and other green principles in the form of specific actions is what comes first … followed later by “green” marketing efforts. In other words, there is a lower incidence of companies creating green marketing campaigns just out of a desire to appear “green.”

This suggests that green marketing depends first on company management buying into the ideological principles of environmentalism.

Certainly, the “soft economy” as well as the controversy over “soft science” could be acting as a damper on the potency of green messaging. But this field research suggests that “corporate green” continues to be a trend as opposed to just a passing fad … and that its significance as a marketing platform for companies will grow stronger in the coming years.

Plunging Headlong into the Next Digital Decade

Looking back over the past ten years, so much has happened in the world of digital communications, it’s almost impossible to remember the “bad old days” of slow-loading web pages and clunky Internet interfaces.

And yet, that was the norm for many web users at the beginning of the decade.

So what’s in store for the upcoming decade? When you consider that a 120-minute movie file can be downloaded about as quickly as a single web page these days, how could data processing and download times get any faster than they are already?

Certainly, if the world continues with transistor-based computer chip designs, there’s very little room for further improvement. That’s because today’s chips are still based on the transistor design that Shockley and his colleagues pioneered in the late 1940s – except that they contain many more transistors, along with circuits that have been engineered smaller and smaller — down practically to the size of an atom.

We can’t get much smaller than that without a radical new design platform. And now, along comes “quantum computing,” which goes well beyond the traditional binary system of using ones and zeros in information processing. The betting is that so-called “qubits” – quantum bits – will become the new design paradigm in the 2010s that will dramatically increase processor speed and power once again.
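A rough way to see why qubits matter: n classical bits occupy exactly one of 2^n states at any moment, while describing n qubits requires tracking 2^n amplitudes simultaneously. The short sketch below just illustrates that exponential growth; it is not a quantum simulation:

```python
# Illustrative only: the state space a quantum computer works with grows
# exponentially with the number of qubits (2**n amplitudes for n qubits).
for n in (1, 10, 50):
    print(f"{n} (qu)bits -> state space of size 2**{n} = {2**n:,}")
```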

[An alarming side effect of quantum processing, however, is the possibility that computers will become so much more powerful that current methods of encryption will become obsolete. Clearly, new ways of protecting information will need to be developed along with the new speed and capabilities.]

In tandem with the new advancements in data processing power and speed, industry prognosticators such as Wired magazine’s chief editor Chris Anderson are predicting that bandwidth and storage will become virtually free and unlimited in the coming decade. As a result, it’s highly likely that the decade will be one of much greater collaboration between people in “peer to peer” creation and sharing of information and media, online and in real-time. Think Facebook on steroids.