What people dislike most about B-to-B websites …

Too many business-to-business websites remain the “poor stepchildren” of the online world even after all these years.

So much attention is devoted to all the great ways retailers and other companies in consumer markets are delighting their customers online.

And it stands to reason:  Those sites are often intrinsically more interesting to focus on and talk about.

Plus, the companies that run those sites go the extra mile to attract and engage their viewers.  After all, consumers can easily click away to another online resource that offers a more compelling and satisfying experience.

Or, as veteran marketing specialist Denison ‘Denny’ Hatch likes to say, “You’re just one mouse-click away from oblivion.”

By comparison, buyers in the B-to-B sphere often have to slog through some pretty awful website navigation and content to find what they’re seeking.  But because their mission is bigger than merely viewing a website for the fun of it, they’ll put up with the substandard online experience anyway.

But this isn’t to say that people are particularly happy about it.

Through my company’s longstanding involvement with the B-to-B marketing world, I’ve encountered plenty of the “deficiencies” that keep business sites from connecting with their audiences in a more fulfilling way.

Sometimes the problems we see are unique to a particular site … but more often, it’s the “SOS” we see across many of them (if you’ll pardon the scatological acronym).

Broadly speaking, issues of website deficiency fall into five categories:

  • They run too slowly.
  • They look like something from the web world’s Neanderthal era.
  • They make it too difficult for people to locate what they’re seeking on the site.
  • Worse yet, they actually lack the information visitors need.
  • They look horrible when viewed on a mobile device — and navigation is no better.

Fortunately, each of these problems can be addressed – often without having to do a total teardown and rebuild.

But corporate inertia can (and often does) get in the way.

Sometimes big changes come along – like Google’s recent “Mobilegeddon” mobile-friendly directives – that nudge companies into action. Those are often the moments when other needed adjustments and improvements get made as well.

But then things can easily revert to near-stasis mode until the next big external pressure point comes down the pike and stares people in the face.

Some of this pattern of behavior is a consequence of the commonly held (if erroneous) view that B-to-B websites aren’t ones that need continual attention and updating.

I’d love for more people to reject that notion — if for SEO relevance issues alone.  But after nearly three decades of working with B-to-B clients, I’m pretty much resigned to the fact that there’ll always be some of that dynamic at work.  It just comes with the territory.

Information, advertising and eye-tracking: Continuity among the chaos.

In the Western world, humans have been viewing and processing information in the same basic ways for hundreds of years. It’s a subconscious process that entails making the most judicious use of time and effort to forage for information.

Because of how we Westerners read and write – left-to-right and top-to-bottom – the way we search for information has evolved to mirror that same behavior.

And today we have eye-scanning research and mouse click studies that prove the point.

In conducting online searches, where we land on information is known variously as the “area of greatest promise,” the “golden triangle,” or the “F-scan diagram.”

However you wish to name it, it generally looks like this on a Google search engine results page (SERP):

F-scan diagram

It’s easy to see how the “area of greatest promise” exists. We generally look for information by scanning down the beginning of the first listings on a page, and then continue viewing to the right if something seems to be a good match for our information needs, ultimately resulting in a clickthrough if our suspicions are correct.

Heat maps also show that quick judgments of information relevance on a SERP occur within this same “F-scan” zone; if we determine nothing is particularly relevant, we’re off to do a different keyword search.

This is why it’s so important for websites to appear within the top 5-7 organic listings on a SERP – or within the top 1-3 paid search results in the right-hand column of Google’s SERP.

In recent years, Google and other search engines have been offering enhancements to the traditional SERP, ranging from showing images across the top of the page to presenting geographic information, including maps.

To what degree is this changing the “conditioning” of people who are seeking out information today compared to before?

What new eye-tracking and heat map studies are showing is that we’re evolving to look at “chunks” of information first for clues as to the most promising areas of results. But then, within those areas, we revert to the same “F-scan” behaviors.

Here’s one example:

Eye-tracking Update

And there’s more:  The same eye-tracking and heat map studies are showing that this two-step process is actually more time-efficient than before.

We’re covering more of the page (at least on the first scan), but are also able to zero in on the most promising information bits on the page.  Once we find them, we’re quicker to click on the result, too.

So while Google and other search engines may be “conditioning” us to change time-honored information-viewing habits, it’s just as much that we’re “conditioning” Google to organize its SERPs in ways that are easiest and most beneficial to us in the way we seek out and find relevant information.

Bottom line, it’s continuity among the chaos. And it proves yet again that the same “prime positioning” on the page favored for decades by advertisers and users alike – above the fold and to the left – still holds true today.

Smartphones and Tablets have Doubled Our Time Spent Online

What a difference a few years makes.

Back in February 2010, Americans over the age of 18 spent a total of ~451 billion minutes on the Internet, according to comScore’s Media Metrix research.

By comparison, in February 2013, the total time spent online had nearly doubled to ~890 billion minutes.

The vast majority of the increase is attributable to tablet computers and smartphones rather than PCs:

  • PC minutes rose from ~388 billion to ~467 billion (+24%).
  • Smartphone minutes grew from ~63 billion to a whopping ~208 billion (+230%).
  • Tablet minutes grew from zero to ~115 billion (tablets didn’t exist in February 2010).
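The growth percentages above can be sanity-checked with a quick calculation. Here’s a minimal sketch using the rounded minute totals cited in the text; because the inputs are rounded, the recomputed percentages can differ slightly from comScore’s published ones (the PC figure, for instance, comes out closer to +20% than +24%):

```python
# Recompute platform growth from the rounded comScore figures cited above
# (billions of minutes online, Feb 2010 vs. Feb 2013).
usage_billions = {
    "PC": (388, 467),
    "Smartphone": (63, 208),
    "Tablet": (0, 115),  # tablets didn't exist in Feb 2010
}

for platform, (feb_2010, feb_2013) in usage_billions.items():
    if feb_2010 == 0:
        # No base value, so a percentage is undefined -- report the raw jump
        print(f"{platform}: new category, ~{feb_2013}B minutes by 2013")
    else:
        growth_pct = (feb_2013 - feb_2010) / feb_2010 * 100
        print(f"{platform}: ~{feb_2010}B -> ~{feb_2013}B minutes ({growth_pct:+.0f}%)")
```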

In fact, taken together, smartphones and tablets now account for nearly 60% of the time online spent by people age 18 to 24.  On the other hand, smartphones account for a relatively small 25% of time spent online by Americans age 50 or older.

This age divide is also clearly evident in comScore’s estimated breakdown of platform adoption:

All American Adults

  • PC only:  ~30%
  • “Screen jumpers” (PC + mobile):  ~63%
  • Mobile platforms only:  ~7%

Young Adults (age 18-24)

  • PC only:  ~22%
  • Screen jumpers:  ~65%
  • Mobile only:  ~13%

Older Adults (age 50+)

  • PC only:  ~48%
  • Screen jumpers:  ~51%
  • Mobile only:  ~1%

The comScore analysis also provides some interesting stats pertaining to online share of minutes by the type of content being accessed.

Most online time spent on PCs:

  • Business/Finance (~68%)
  • TV (~68%)
  • News/Information (~62%)
  • Sports (~62%)
  • Health (~54%)
  • Retail (~49%)

Most online time spent on smartphones:

  • Radio (~77%)
  • Social Media (~58%)
  • Weather (~55%)
  • Games (~48%)

Tablets don’t lead in any single category, but score particularly well in these two:

  • Games (~34% of time online is spent on tablets)
  • TV (~20% of time online is spent on tablets)

More details and insights from the comScore report can be found here.

Are e-Readers Changing our Reading Habits?

E-readers have become all the rage. That’s clear from how many people are now using them.

A Harris Interactive survey of ~2,180 consumers in July 2011 found that ~15% of Americans over age 18 are using an e-reader device. That’s about double the percentage from the previous year’s poll.

Beyond this, another ~15% reported that they’re likely to buy one within the next six months.

The Harris research found some interesting regional differences in e-reader usage. I was quite surprised to learn that e-readers haven’t taken off nearly as strongly in the Midwest as compared to the other three regions of the country:

  • Westerners: ~20% have an e-reader
  • Easterners: ~19%
  • Southerners: ~14%
  • Midwesterners: ~9%

What are the characteristics of those who own e-readers, besides where they live? It turns out they’re far more active readers than the rest of the population.

For example, about one third of all survey respondents reported that they read more than 10 books during the year. But for those who own an e-reader, that percentage was nearly 60%.

And just because someone owns an e-reader doesn’t mean they’ve stopped purchasing actual books. While one-third of all survey respondents reported that they hadn’t purchased any books in the past year, among e-reader owners that figure was only 6%.

The common criticism that e-readers may be the death knell for traditional books – the notion being that people download fewer books than they would purchase in physical form – may not carry much weight, if the Harris survey results are to be believed.

On the contrary, the e-reader phenomenon appears to be making some people even more voracious readers than before. About one third of the e-reader respondents in the survey reported that they read more now than before – and not just on their e-readers.

Clearly, e-readers represent a phenomenon that’s taken firm hold and is here to stay. But whether it’s radically changing the reading habits of its users … that remains an open question. The early signs suggest “no.”

What about your experience? Have your habits changed with the advent of e-readers? How so?

Dumb and Dumber: Internet Explorer Users, the Media and the AptiQuant Hoax

The AptiQuant "Dumb IE Users" Research has more than one news organization with egg on its face.
You may have read the news reports a few weeks ago of a study conducted by a Canadian research company that “concluded” that Internet Explorer users have lower IQs than users of other browsers like Firefox and Safari.

The “news release” was peppered with authentic-looking charts and graphs that supposedly provided backup for the conclusions, which purportedly came from online IQ testing of ~100,000 individuals found randomly through searches and Internet advertising.

Not surprisingly, the news that IE users are somehow “dumber” than the “bright bulbs” who use the supposedly more “hip” Chrome, Opera, Safari and Firefox browsers hit the newswires like a brick.

I saw reports on the research all over the place – from the BBC and Huffington Post to Yahoo, the New York Times, Forbes and MediaPost.

… And then, a week or so after the story burst on the scene, it began to fall apart.

One intrepid BBC reporter dug deeper into the story, and discovered in the process that AptiQuant – the Vancouver-based organization billed as a “psychometric consulting company” that supposedly conducted the survey – was an entity with no traceable presence going back more than a few weeks prior to the release of its research findings.

Not only that, the AptiQuant website had been set up only a few days prior … and the site’s professional-looking photos had been lifted wholesale from a legitimate Paris-based consulting firm.

Curious, I trawled around online to find the original research brief released by AptiQuant. It seems to me that anyone reading the study’s conclusions would smell a rat.

True, the statistical data appear impressive enough. But no research organization worth its salt is going to make comments such as the following a part of its research conclusions, and I quote:

The study showed a substantial relationship between an individual’s cognitive ability and their choice of web browser. From the test results, it is a clear indication that individuals on the lower side of the IQ scale tend to resist a change/upgrade of their browsers. This hypothesis can be extended to any software in general ….

It is common knowledge that Internet Explorer Versions 6.0 to 8.0 are highly incompatible with modern web standards. In order to make websites work properly on these browsers, web developers have to spend a lot of unnecessary effort. This results in an extra financial strain on web projects, and has over the last decade cost millions of man-hours to IT companies. Now that we have a statistical pattern on the continuous usage of incompatible browsers, better steps can be taken to eradicate this nuisance.

Nuisance? I mean, really!

A few days later, the perpetrators of the phony research report came forward and admitted as much. AptiQuant’s purported CEO, a person using the moniker “Leonard Howard,” laid the cards on the table:

“The main purpose behind this hoax was to create awareness about the incompatibilities of IE6 and how it is pulling back innovation. So, if you are still using IE6, please update to a newer browser. We got this idea when adding some features to our comparison shopping site … we found out that IE6 was highly incompatible with web standards. IE 7.0 and 8.0, though better than 6.0, are still incompatible with not only the standards, but with each other, too.”

It was also noted that “telltale signs that should have uncovered the hoax in less than five minutes” included the following red flags:

  • The AptiQuant domain was registered only on July 14, 2011
  • The IQ test referenced in the report (the Wechsler Adult Intelligence Scale IV) is copyrighted and cannot be administered online
  • The company address listed on the report does not exist
  • Much of the content on the website was ripped from other sites – including the photo images
  • The website was developed using the WordPress platform, which would be highly unusual for any credible consulting firm

Beyond the fact that “Mr. Howard & Co.” must have had way too much spare time on their hands … I think it’s interesting – and sobering – to witness how many reputable news organizations took this report and ran with it without so much as a minute of fact-checking – or even picking up the phone to get an additional quote from an AptiQuant company spokesperson.

… Especially since the topic and the conclusions drawn – namely, that some people are dumber than others – were bound to be controversial.

In the “old days,” a story like this would be lucky to be published in a single outlet, if at all. But in today’s “brave new world” of online news, it took mere hours for the story to bound about the Internet and show up on dozens of legitimate news sites … thereby enabling it to take on a “legitimate” life of its own.

I wonder what’ll be next. Because it’s sure to happen again.

The Twitter Machine: Keeping Hype Alive

I’ve blogged before about Twitter’s seeming inability to break out of its “niche” position in communications. We now have enough time under our belt with Twitter to begin to draw some conclusions rather than simply engage in speculation.

Endlessly hyped – though sometimes correctly labeled a revolutionary communications tool (see the North African freedom movements) – Twitter hasn’t been adopted by the masses the way we’ve witnessed with Facebook.

The Pew Research Center’s Internet & American Life Project estimates that fewer than 10% of American adults who are online are Twitter users. That equates to about 15 million Americans, which is vastly lower than Twitter’s own claims of ~65 million users.

But whether you choose to believe the 15 million or the 65 million figure, it’s a far cry from the 150+ million Americans who are on Facebook – which represents about half of the entire American population.

You can find a big reason for Pew’s discrepancy by snooping around on Twitter a bit. It won’t take you long to find countless Twitter accounts that are bereft of any tweet activity at all. People may have set up their account at one time, but long ago lost interest in using the platform – if indeed they ever had any real Twitter zeal beyond “follow-the-leader.” (“Everybody’s going on Twitter … shouldn’t I sign up, too?”)

This is the purest essence of hype: generating a flurry of interest that quickly dissipates as the true value (or lack thereof) is discerned by users.

Of course, Twitter does have its place. Some brands find the platform to be a good venue for announcing new products and sales deals. And it doesn’t take long for the best of those deals promoted on Twitter to find their way into the rest of the online world.

Other companies – although far fewer – are using Twitter as a kind of customer service discussion board.

And as we all know, celebrities l-o-v-e their Twitter accounts. What a great, easy way to generate an endless stream of sound-bite information about their favorite topic: themselves.

Analyses of active Twitter accounts have shown that a sizable chunk of the activity is made up of media properties and brands tweeting at each other … a lot of inside baseball.

What’s missing from the equation is the level of “real people” engagement one can find on Facebook in abundance … and maybe soon on Google+ as well. That’s real social interaction – in spades.

Actually, you mightn’t be too far off the mark if you deduced that Twitter is the digital equivalent of a bunch of industry insiders at a cocktail party … saying little of real importance while trying to appear “impressive” and “hip” at the same time.

But who’s being fooled by that?

Big Branding News on the Internet Domain Name Front

It was only a matter of time. Internet domain names are now poised to move to a new level of branding sophistication.

This past week, the Internet Corporation for Assigned Names and Numbers (ICANN) decided to broaden domain name suffixes to encompass pretty much anything. Instead of being restricted to suffixes like .com and .net that we’re so used to seeing, beginning in January 2012, companies will be able to apply for the use of any suffix.

At one level, there’s a practical reason for the change in policy. Much as telephone numbers were strained in an earlier era when a host of new fax lines and cellphones came on stream, the inventory of available web addresses under the original system of .com, .edu, .gov and .org has been drying up. Recent moves to authorize the use of .biz, .us and .xxx have been merely stopgap measures that have done little to alleviate the pending inventory crunch.

But the latest ICANN move will likely have ripple effects that go well beyond the practical issue of available web addresses. Industry observers anticipate that the new policies will unleash a flurry of branding activity as leading companies apply for the right to use their own brand names as suffixes.

In fact, Peter Dengate Thrush, chairman of ICANN’s board of directors, believes the move will “usher in a new Internet age.”

It’s expected that major consumer brands like Coca-Cola and Toyota will be among the first to nab new domain suffixes like .coke or .toyota.

It’s a natural tactic for companies to employ as a defensive step against unscrupulous use of their brand names by other parties. But it’s also an effective way to gain more control over their overall online web presence via the ability to send visitors more directly to various portions of their world in cyberspace.

Of course, we can’t expect these new suffixes to be acquired on the cheap. Gone are the days when someone could purchase an address like “weather.com” for just a few dollars … and then sell it later on for hundreds of thousands.

In fact, it’s being reported by the Los Angeles Times that the cost to secure a new domain will be in the neighborhood of $185,000 – hardly chump change. At that price tag, only well-established organizations will be in a position to apply – and those applications must also be able to show that they have the technical capabilities to keep the domain running. So no cyber-squatters need apply.

Bloomberg Businessweek predicts that leading companies may invest upwards of $500,000 each to secure their brand identities online and to prevent them from being “hijacked” by others. It certainly gives a fresh new meaning to the term “eminent domain”!