An Address: The Next Human Right?

The flag of the Universal Postal Union, the UN agency that coordinates postal policies for member nations.
Food, shelter, clothing. These are considered basic human rights. But what about a personal address? Is that a human right, too?

That’s the contention of Charles Prescott, a former Direct Marketing Association official who is in the midst of forming a new transnational organization to promote universal “addressability.”

“The consequences of existence, or lack thereof, of an address – especially in the developing world – are dire,” Prescott says. The new group he is organizing (the name isn’t finalized yet – the International Address Data Association is being considered) will promote the adoption of a Universal Postal Union resolution calling for all countries to adopt formal address systems, including change-of-address systems that are available at an affordable cost.

Prescott notes that the cost of change-of-address systems ranges widely at present, from just a few cents per name in the United States to as high as ~80 cents per name in the Netherlands. If the costs are too great, businesses won’t bother using them.

In the U.S., addressability isn’t that big an issue, although it’s probably true that a great many catalogues and other printed materials end up in landfills because they’re incorrectly addressed and never find their way to their intended recipients.

But it’s a huge issue in the developing world. “Address coordinates which can be associated with an individual are extraordinarily important for fostering economic, social and political development,” Prescott emphasizes. But how does this play out in parts of the world where the address descriptions are vague … or non-existent?

I recall when I visited Mumbai, India in the late 1970s, I stayed in a dwelling known as Motiwala Mansion in the Mahim District of the city. Its address was simply “Opposite Shree Cinema.” I asked about the address of the movie house across the street and was told that it was “Opposite Motiwala Mansion.” How’s that for confusion in a city of millions if you don’t know where either one of these buildings is located?

Even more challenging are the slum districts that pepper the world, where addresses basically don’t exist. Today, with the explosion of mobile technology, cell phones reach into every nook and cranny of the world. In fact, cell phones are ubiquitous in the favelas, back alleys and other transient communities that have sprung up in and around every major third-world city. These phones can track the coordinates where someone is living, thereby tagging him or her with an “address” of sorts.

Sound far-fetched? It’s actually happening today, such as with one Brazilian department store chain that is extending store credit to people residing in these localities – once their physical location is known.
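To illustrate the idea of deriving an “address” from phone coordinates, here’s a toy sketch – my own illustration, not how any carrier or the Brazilian retailer actually works – that buckets latitude/longitude into a stable grid-cell code:

```python
def grid_code(lat, lon, cell_deg=0.001):
    """Map coordinates onto a coarse grid cell (~100 m wide near the equator)
    and return a stable code that can serve as an address-like identifier."""
    row = int((lat + 90) / cell_deg)   # shift so values are non-negative
    col = int((lon + 180) / cell_deg)
    return f"G{row:06d}-{col:06d}"

# The same dwelling always yields the same code...
home = grid_code(-23.5505, -46.6333)   # coordinates in Sao Paulo
# ...while a location a few blocks away falls in a different cell.
nearby = grid_code(-23.5600, -46.6400)
```

Real systems such as Google’s Open Location Code tackle much harder problems (cell boundaries, human readability), but the principle is the same: raw coordinates can be reduced to a repeatable identifier.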

But Prescott contends that mobile phone coordinates represent only a partial solution – one that doesn’t allow for the delivery of physical mail. It also doesn’t remove other barriers in some countries that have a direct bearing on the economic opportunities of poorer people. One example: children can’t be registered for school without a birth certificate … and that birth certificate can’t be obtained without a formal address.

Obviously, any new push for a universal “address” initiative faces challenges. As illustrated by the Mumbai example above, addresses will need to be developed using systematic logic that is consistently applied from community to community inside a country. That’s a lot easier said than done.

But as an evangelist for providing every person on the planet an address as a basic human right, Prescott is clearly serious in his endeavor. The advisory board he’s formed includes key thought leaders from business organizations, academia and government in the United States and Europe. This initiative bears watching.

Stanford Weighs In on Web Credibility

With more than 200 million web sites in cyberspace these days, what makes the difference between the good ones and the bad ones? That’s what Stanford University has sought to find out through a multi-year research project.

The Stanford Persuasive Technology Lab conducted research with a cross-section of more than 4,500 web consumers over a three-year period. Distilling the mountain of information gleaned from this sample, the Lab issued its Stanford Guidelines for Web Credibility. It boils the research findings down to ten basic guidelines that, if followed, mean that a web site will be viewed as credible and authoritative.

In reviewing the Stanford guidelines, some of them stand out to me as ones that are too often missed by web developers. In particular:

Show there’s a “legitimate” organization behind the web site. Listing a physical address – not a P.O. Box – is one way to do this.

Make it easy to verify the accuracy of information. Third-party citations and data references, along with links to other respected web sites, are ways to accomplish this.

Make sure the site is “intuitive” and easy to navigate. People’s tolerance level for a web site that doesn’t follow a clear, logical thought path is low.

Update site content often. People put less credibility in sites that look like they’re informationally static or stale.

And especially important: Stanford’s research found that content errors should be fixed, no matter how inconsequential they might seem. It turns out that broken links, bad spelling, poor punctuation and other typos negatively affect the credibility of web sites more strongly than many other factors — and yet they’re among the easiest items to fix.
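Broken links, at least, are easy to catch automatically. A minimal sketch using only Python’s standard library (the URLs you’d feed it are placeholders, and note that a few servers reject HEAD requests):

```python
import urllib.request
import urllib.error

def check_links(urls, timeout=10):
    """Return a list of (url, problem) pairs for links that fail to load."""
    broken = []
    for url in urls:
        # HEAD avoids downloading page bodies; most servers support it.
        req = urllib.request.Request(url, method="HEAD",
                                     headers={"User-Agent": "link-checker"})
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                if resp.status >= 400:
                    broken.append((url, f"HTTP {resp.status}"))
        except urllib.error.HTTPError as e:
            broken.append((url, f"HTTP {e.code}"))
        except (urllib.error.URLError, OSError) as e:
            broken.append((url, str(e)))
    return broken

# Example (placeholder URL):
# for url, problem in check_links(["https://example.com/about"]):
#     print(url, "->", problem)
```

Running something like this against a site map on a schedule catches dead links long before a visitor does.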

When you look at the overall web credibility guidelines from Stanford, they aren’t particularly challenging – and they shouldn’t be hard to put into practice.

But when you consider how lame many web sites actually are, it’s another reminder of how – in web design as in so much else in business and government – “best practices” too often fall victim to expediency or just plain slipshod execution.

Online Customer Review Sites: Who’s Yelping Now?

The news this week that social networking and user review web site Yelp® will now de-couple the presentation of reviews from advertising programs comes as a rare victory for businesses that have been feeling more than a little pressured (blackmailed?) by the company’s strong-arm revenue-raising tactics.

The web has long had something of a “Wild West” atmosphere when it comes to reviews of businesses helping or (more likely) hurting the reputation of merchants.

Yelp is arguably the most significant of these sites. Since its inception in 2004 as a local site search resource covering businesses in the San Francisco metro area, Yelp has expanded to include local search and reviews of establishments in nearly 20 major urban markets. With its branding tagline “Real people. Real reviews®,” Yelp is visited by ~25 million people each month, making it one of the most heavily trafficked Internet sites in America.

Yelp solicits and publishes user ratings and reviews of local stores, restaurants, hotels and other merchants (even churches and doctor offices are rated), along with providing basic information on each entry’s location, hours of operation, and so forth – with nearly 3 million reviews submitted at last count.

Predictably, user ratings can have a great deal of influence over the relative popularity of the businesses in question. While most reviews are positive (ratings are on a 5-point scale), Yelp also employs a proprietary algorithm – some would say “secret formula” – to rank reviews based on a selection of factors ostensibly designed to give greater credence to “authentic” user reviews as opposed to “ringers” or “put-up jobs.”

Not surprisingly, Yelp hasn’t disclosed this formula to anyone.

So far, so good. But Yelp began to raise the ire of companies when its eager and aggressive advertising sales team began pitching paid promotional (sponsorship) programs to listed businesses that looked suspiciously like tying advertising expenditures to favorable treatment on reviews as a sort of quid pro quo.

Purchase advertising space on Yelp … and positive reviews miraculously start appearing at the top of the page. Decide against advertising … and watch the tables turn as they drop to the bottom or out of sight altogether.

Concerns are so strong that three separate lawsuits have been filed this year already, culminating in a class-action lawsuit filed in February that accuses Yelp of “extortion,” including the claim that Yelp ad sales reps have offered to hide or bury a merchant’s negative customer reviews in exchange for signing them up as Yelp sponsors.

“The conduct is an offer to manipulate content in exchange for payment,” Jared Beck, an attorney for one of the plaintiffs, states bluntly.

As for whether Yelp’s announcement of new standards will now curb the rash of lawsuits, it seems clear that this is the intent. But so long as Yelp offers to do any sort of manipulation or reshuffling of reviews in exchange for advertising, the lawsuits will probably continue – even if there’s only the appearance of impropriety.

Oh, and don’t look for Yelp to provide any additional revelations regarding how reviews are sequenced to appear on the page. Too much transparency, and it’ll only make it easier for people to figure out how to “game” the ratings.

U.S. consumers: More comfortable than ever making online purchases.

Have U.S. consumers finally gotten over their skittishness about making purchases over the Internet? A newly released study from Javelin Strategy & Research suggests that they have.

The 2010-2014 Online Retail Payments Forecast report draws its findings from data collected online in November 2009 from a randomly selected panel of nearly 3,300 U.S. consumers forming a representative cross-section by age, gender and income.

Based on the Javelin sample, nearly two-thirds of American consumers are now either “comfortable” or “very comfortable” with shopping online.

On the other end of the scale, ~22% of U.S. consumers continue to be wary of online purchasing; these people haven’t made an online purchase within the past year … or in some cases, never.

These figures suggest that the consumer comfort level with making online purchases is as high as it’s ever been. And how are consumers making their online payments? The Javelin study reports that among those respondents reporting online activities, the five most popular payment methods are:

 Major credit card: 70%
 Major debit card: 55%
 Online payment service such as PayPal®: 51%
 Gift card (good at one specific merchant): 41%
 Store-branded credit card (good at one specific merchant): 27%

Even with more than half of consumers using a debit card for online purchases, the total dollar volume of online sales attributable to debit cards is less than 30%. Javelin forecasts debit card share to continue climbing in the short-term, however, due to tighter consumer credit standards now in force.

Bottom line, the Javelin report suggests that despite the periodic horror stories that have been published about credit card information and other financial data being captured or mined off the Internet, the convenience and price/selection benefits of online shopping are winning the day with consumers. Not surprising at all, really.

Search Engine Rankings: Page 1 is Where It’s At

All the hype you continually hear about how important it is to get on Page 1 of search engine result pages turns out to be … right on the money.

In a just-released study from digital marketing company iCrossing, nearly 9 million “non-branded” search queries conducted on Google, Yahoo and Bing were analyzed, with the clickthrough percentages from the first, second and third pages of the search engine results (SERPs) tallied.

It turned out that more than 8.5 million clickthroughs were made from the first page of results – a whopping 95% of the total. The rest was just crumbs: Clicks off the second page came in under 250,000, while third-page clicks clocked in at a paltry ~180,000.

The results were essentially the same across the three major search engines (all at 95% or 96%) – a clean sweep that points to consistent behavior across the board.
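The reported percentages check out with a quick bit of arithmetic, using the approximate clickthrough counts cited above:

```python
# Clickthrough counts as reported in the iCrossing study (approximate).
clicks = {
    "page 1": 8_500_000,
    "page 2": 250_000,   # "under 250,000"
    "page 3": 180_000,
}

total = sum(clicks.values())
shares = {page: round(100 * n / total) for page, n in clicks.items()}
print(shares)  # page 1 dominates at roughly 95%
```

Pages 2 and 3 combined account for only about 5% of all clicks – the “crumbs” described above.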

What this suggests is that when searching on generic or descriptive terms, most people will not go past the first page of results if they can’t find a site link that interests them. If they don’t hit paydirt on the first page, they’re far more likely to try another search using different keywords or phrases until they find a site on Page 1 that does the trick.

Comparing this newest iCrossing study with earlier research reveals that Page 1 clicks represent an even higher proportion today; studies from a few years back pegged the figure at 80% to 90%.

The implications of this study are clear: if you’re looking to attract visitors to your site via generic or descriptive subject searches, you’d better make sure your site is designed so that it achieves first-page ranking … or your web efforts will be for naught.

That being said, the recipe for success in ranking hasn’t changed much at all. Despite all of the tempting “link juice” tips and tricks out there, the main keys to getting high rankings continue to be creating loads of good web content … speaking the same “language” as searchers (however inaccurate that might be) … and maintaining lots of good links to and from your site to increase its “relevance” to search engines.

No doubt, it’s getting tougher to achieve Page 1 ranking when there’s so much competition out there, but it’s well worth the effort.

The e-Commerce Hiccup

One of the bigger surprises of business in the year 2009 was how big of a hit U.S. e-commerce has taken. According to digital marketing intelligence firm comScore in its just-released report 2009 U.S. Digital Year in Review, e-retail spending in America decreased about 2% during the year to come in just under $210 billion.

This represents the first decline in e-commerce spending ever recorded.

Obviously, the economic recession was the culprit. But considering that e-commerce growth has charted above 20% annually in every year leading up to 2009, seeing an actual fall-off has raised more than a few eyebrows.

Where was the e-commerce decline most pronounced? It was in travel-related services, which saw revenues drop by 5% to ~$80 billion. Not that all sectors saw decline. A few continued to experience growth during the year, including the books/magazines category which charted gains of ~12%. Online computer software purchases were also up by about 7%.

What does comScore see on the horizon for U.S. e-commerce? Is continued softness predicted … or a return to robust growth?

Analyzing the last few months of e-commerce activity during 2009 provides clues to the future: Growth looks like it’s returning. In fact, the 2009 holiday season marked a return to positive growth rates when compared against the same period in 2008.

[Granted, this comparison is made against “down” months of November and December in 2008, after the recession had already kicked in. But the pace of e-commerce activity is clearly picking up again.]

But whether it will return to 20%+ annual growth is still an open question.

The Mobile Web: Great Promise + Growth Pains

It’s clear that the mobile web is a big growth segment these days. Proof of that is found in recent Nielsen statistics, which have charted ~34% annual growth of the U.S. mobile web audience, now numbering some 57 million visitors using a mobile device to visit web sites (as of late summer 2009).

And now, a new forecast by the Gartner research firm projects that mobile phones will overtake PCs as the most common web access devices worldwide … as early as 2013. It estimates that the total number of smartphones and/or browser-enhanced phones will be ~1.82 billion, compared to ~1.78 billion PCs by then.

Gartner’s forecast is even more aggressive than Morgan Stanley’s prediction that the mobile web will outstrip the desktop web by 2015.

So, what’s the problem?

Well … consumer studies also show that web surfing using mobile phones continues to be a frustrating experience for many users. In a recent survey of ~1,000 mobile web users, web application firm Compuware/Gomez found that two out of every three report having problems when accessing web sites on their phones.

Because people are so used to fast broadband connections – both at home and at work – it’s only natural that their expectations for the mobile web are similarly high. To illustrate this, Gomez found that more than half of mobile phone users are willing to wait just 6 to 10 seconds for a site to load before moving on.

And what happens after they give up? Sixty percent say they’d be less likely to visit the site again. More importantly, ~40% report that they’d head over to a competing site. As for what would happen if the mobile web experience was as fast and reliable as on a PC, more than 80% of the respondents in the Gomez study claim they would access web sites more often from their phones.
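Checking whether a page comes back within that 6-to-10-second window is easy to automate. A minimal sketch using only Python’s standard library (the URL is a placeholder, and note this times only the raw HTML fetch, not full page rendering on a handset):

```python
import time
import urllib.request

def load_time_seconds(url, timeout=15):
    """Fetch a URL and return (elapsed_seconds, bytes_received)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)

BUDGET = 10.0  # seconds -- the upper bound of user patience per the Gomez survey

# Example (placeholder URL):
# elapsed, size = load_time_seconds("https://example.com/")
# print(f"{size} bytes in {elapsed:.1f}s -- "
#       f"{'OK' if elapsed <= BUDGET else 'too slow'}")
```

A fetch that blows the ten-second budget on a fast connection is almost certain to lose mobile visitors on a slow one.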

For marketers, this means that to maximize their success in the mobile world, they should reformat web sites to conform to the small-form factor of handheld devices. And Gartner also notes that “context” will be the king of the hill in mobile – more than just “search” – in that it will deliver a personalized user experience. New functionalities such as Google’s “Near Me Now” are providing information on businesses, dining and other services that are in the proximity of a mobile user’s location. These and other innovations are opening up whole new dimensions to “seeking and finding” in the mobile web world.

Facebook Continues on its Merry Way to Social Media (and Web?) Dominance

Here’s a very interesting finding ripped from today’s social media headlines: The Business Insider and other media outlets are reporting that Facebook now accounts for nearly one in four page views on the Internet in the United States.

So claims database marketing consulting firm Drake Direct, which has studied web traffic in the U.S. and the U.K. by analyzing data collected by Compete, a leading aggregator of web statistics.

Just to give you an idea of how significant Facebook’s results are: by comparison, search engine powerhouse Google accounts for only about one in twelve page views.

And Facebook is now closing in on Google when it comes to site visits – with each currently receiving around 2.5 billion visits per month. In fact, studying the trend lines, Drake Direct anticipates that Facebook site visits will surpass Google any time now.

Another interesting finding is that the length of the average Facebook visit now surpasses that of YouTube (~16 minutes versus ~14 minutes per visit), whereas YouTube had charted longer visits prior to now.

These findings underscore the continued success of Facebook as the most successful social media site, even as it has grown to 350+ million users, including more than 100 million in the U.S. with 5 million added in January alone. No doubt, it’s on a roll.

McCormick Place Loses its Luster

Has all the grumbling about Chicago’s vaunted McCormick Place as America’s premier tradeshow venue finally reached critical mass?

For years, corporate exhibitors have groused about government-controlled, money-losing McCormick Place. Stories abound of exhibitors being forced to spend hundreds of dollars for services as mundane as plugging in a piece of machinery, or being charged $1,000 to hang a sign from the ceiling, because of onerous union rules governing “who does this” and “who can’t do that.” It’s been a constant refrain of complaining I’ve heard at every tradeshow I’ve attended at McCormick Place, dating back some 20 years.

Despite all of the criticism about McCormick Place’s high costs and lack of user-friendly service, it remains the largest convention complex in America, with over 2.5 million square feet of exhibit space. But attendance has been declining pretty dramatically, from ~3.0 million in 2001 to ~2.3 million in 2008. While the figures haven’t been released yet for 2009, it’s widely expected that show traffic will be reported as down another 20%.

As the current economic recession has put the most severe strains yet on the tradeshow business, it seems that a rebellion against McCormick Place is now in full swing. According to a recent article in The Wall Street Journal, “a gradual drop-off in business … has turned into a rout as a string of high-profile shows have pulled out.” The deserters include a triennial plastics show (~75,000 attendees), as well as the Healthcare Information & Management Systems Society’s annual conference (~27,500 attendees).

But isn’t tradeshow attendance off at other convention centers as well? Well … yes, but clearly not as much. In truth, tradeshow attendance has been under pressure at a “macro” level ever since 9/11. Beyond the issue of terrorism, an important reason is technological innovation: people can now interact through video-conferencing, and companies can demo their equipment and services via the Internet and other forms of digital communication.

Tradeshows were once the only way to gather a community together, but now there are other options. One school of thought holds that large tradeshows are now less effective than small, targeted conferences that provide heightened ability for attendees to interact with one another on a more intimate basis. Some events no longer charge attendees … but they make sure to “vet” them carefully to ensure that the show sponsors who are underwriting the costs are reaching prospects with important degrees of influence or buying authority.

On top of these “macro” trends, the current economic downturn just makes McCormick Place look more and more like a loser when it comes to the tradeshow game. Compared to Chicago’s three most significant competing tradeshow locales – Atlanta, Las Vegas and Orlando – the cost of many items from electricians (union labor) to foodservice (greasy spoon-quality coffee at Starbucks® prices) to hotel accommodations (room fees and surtaxes that won’t quit) ranges two times to eight times higher in Chicago. And in today’s business climate when every cost is scrutinized closely, none of this looks very cost-effective to the corporate bean-counters.

True, Chicago is more centrally located for travel from both coasts: Who wants to take a five-hour flight from New York to Las Vegas or from California to Orlando to attend a meeting?

[On the other hand, no one can honestly say that the weather in Chicago is preferable to sunny Florida or Nevada!]

So it would seem that Chicago’s worthy tradeshow competitors have achieved the upper hand now. I just returned from two national shows this past week – the International Air Conditioning, Heating & Refrigeration Expo and the International Poultry Expo. Where were they held? Orlando and Atlanta – the same cities which are attracting McCormick Place’s erstwhile customers.

Mathematicians and the Meltdown

I can’t wait for the release of The Quants, a new book by Wall Street Journal reporter Scott Patterson about the role of so-called “quant funds” in the financial near-meltdown in September 2008, to be published on February 2. The weekend edition of The Wall Street Journal printed excerpts from the book, a powerful indictment of the mathematicians and computer whizzes who “nearly destroyed Wall Street.”

According to Patterson, “quants” was the name given to “traders and financial engineers who used brain-twisting math and super-powered computers to pluck billions in fleeting dollars out of the market.” In a major departure from traditional trading – evaluating individual companies’ management, performance and competitive positions – the quants used mathematical formulae to wager on which stocks would rise or fall.

Because of breakthroughs in the application of mathematics to financial markets – some of them so novel as to have won their discoverers Nobel Prize awards — quant funds had quickly come to dominate Wall Street, with most of them piling up profits day after day. (To the senior brass at the investment houses, who likely knew little if anything about how these funds operated except that they made a lot of money, a hands-off policy seemed just the thing.)

And just as in so many other fields, technology elevated the “nerds” to the position of “stars” – with commensurately stratospheric compensation.

Unfortunately, in September 2008 the quant funds could not anticipate the effect of the collapse of the housing market bubble. In fact, this development turned the quant funds’ mathematical formulae on their head: what should have declined rose … and what should have risen dropped.

Patterson’s book promises to go into the details of just how things spun out of control, as seen through the eyes of key Wall Street managers such as the piano-playing, songwriting Peter Muller, founder of Morgan Stanley’s Process Driven Trading (PDT) quant fund, and Cliff Asness, formerly of Goldman Sachs and leader of the Applied Quantitative Research (AQR) quant fund.

In addition to presenting all the facts and all the drama, I’m hoping that Patterson will offer a few observations on how we can prevent a debacle like this from happening again in the future.

Another key question is whether any of the proposed regulations being debated in Congress will address the practices of quant funds – or is it all too complicated for anyone to figure out?

If that’s so, it’s pretty scary.