The End of Privacy

An article by technology author Steve Lohr published last week in The New York Times caught my eye. Titled “How Privacy Vanishes Online,” it explores how conventional notions of “privacy” have become obsolete over the past several years as more people engage in cyber/social interaction and web e-commerce.

What’s happening is that seemingly innocuous bits of information are being collected, “read” and reassembled by computers to piece together a person’s identity, without anyone needing direct access to the private details themselves.

In effect, technology has provided the tools whereby massive amounts of information can be collected and crunched to establish patterns and discern all sorts of “private” information.

The proliferation of activity on social networking sites such as Flickr, Facebook and LinkedIn is making it easier than ever to assemble profiles that are uncanny in their accuracy.

Pulling together disparate bits of information helps computers establish a “social signature” for an individual, which can then be used to determine any number of characteristics such as marital status, relationship status, names and ages of children, shopping habits, brand preferences, personal hobbies and other interests, favorite causes (controversial or not), charitable contributions, legal citations, and so on.

One of the more controversial experiments was conducted by MIT researchers last year, dubbed “Project Gaydar.” In a review of ~4,000 Facebook profiles, computers were able to use the information to predict male sexual preference with nearly 80% accuracy – even when no explicit declaration of sexual orientation was made on the profiles.
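
The MIT researchers haven’t published their code, but the core idea – inferring an undisclosed trait from the declared traits of a person’s friends – is simple enough to sketch. The snippet below is purely illustrative (made-up names, an arbitrary cutoff), not the actual “Gaydar” model:

```python
# Illustrative sketch of friend-based attribute inference -- NOT the MIT code.
# Premise: undisclosed traits tend to correlate with what a person's friends declare openly.
from typing import Dict, List

def friend_share(friends: List[str], declared: Dict[str, bool]) -> float:
    """Fraction of a user's friends who openly declare the attribute."""
    known = [f for f in friends if f in declared]
    if not known:
        return 0.0
    return sum(declared[f] for f in known) / len(known)

def predict(friends: List[str], declared: Dict[str, bool], threshold: float) -> bool:
    """Guess the attribute for a user who never disclosed it.
    The threshold would be calibrated on profiles where the attribute IS declared."""
    return friend_share(friends, declared) >= threshold

# Toy example: nothing from the target user's own profile is consulted -- only friends'.
declared = {"amy": True, "bob": False, "cal": True, "dee": True}
print(predict(["amy", "bob", "cal", "dee", "eve"], declared, threshold=0.5))  # True
```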

Others, however, have pointed to positive benefits of data mining and how it can benefit consumers. For instance, chain grocery stores can utilize data collected about product purchases made by people who use store loyalty cards, enabling the chains to provide shoppers relevant, valuable coupon offers for future visits.

Last year, media company Netflix awarded a substantial prize to a team of computer specialists who developed software to analyze the movie ratings of ~500,000 Netflix subscribers … and significantly improve the predictive accuracy of the recommendations made to them.
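
The winning entry was an elaborate ensemble of statistical models, but the basic mechanics of rating prediction can be illustrated with a toy baseline: overall average, plus how generous a given subscriber is, plus how well-liked a given movie is. The sketch below is that simplified idea only, not the prize-winning algorithm:

```python
# Toy baseline rating predictor -- a simplified illustration, not the Netflix Prize ensemble.
# prediction = overall mean + user's typical deviation + movie's typical deviation
from collections import defaultdict

ratings = [  # (subscriber, movie, stars) -- tiny made-up sample
    ("u1", "Heat", 5), ("u1", "Up", 4),
    ("u2", "Heat", 3), ("u2", "Up", 2), ("u2", "Big", 4),
    ("u3", "Big", 5),
]

mean = sum(r for _, _, r in ratings) / len(ratings)

user_dev, movie_dev = defaultdict(list), defaultdict(list)
for u, m, r in ratings:
    user_dev[u].append(r - mean)
    movie_dev[m].append(r - mean)

def avg(xs):
    return sum(xs) / len(xs) if xs else 0.0

def predict(user, movie):
    return mean + avg(user_dev[user]) + avg(movie_dev[movie])

print(round(predict("u3", "Heat"), 2))  # u3 rates generously and Heat rates above average, so ~5
```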

To some, the Netflix program is hardly controversial. To others, it smacks of the “big brother” snooping that occurred in an earlier time during the Supreme Court confirmation hearings for Robert Bork and Clarence Thomas, when over-zealous Senate staffers got their hands on movie store rental records to determine what kind of fare was being watched by the nominees and their families.

Indeed, last week Netflix announced that it will not be moving forward with a subsequent similar initiative. (In all likelihood, this decision was influenced by pending private litigation more than any sort of altruism.)

Perhaps the most startling development on the privacy front comes courtesy of Carnegie Mellon University, where two researchers ran an experiment in which they correctly predicted the Social Security numbers of nearly 10% of everyone born between 1989 and 2003 – almost 5 million people.

How did they do it? They started by accessing publicly available information from various sources including social networking sites to collect two critical pieces of information: birthdate, plus city or state of birth. This enabled the researchers to determine the first three digits of each Social Security number, which then provided the baseline for running repeat cycles of statistical correlation and inference to “crack” the Social Security Administration’s proprietary number assignment system.
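
To see why those two data points are so valuable, it helps to recall how Social Security numbers have traditionally been assigned: three “area” digits tied to the issuing state, two “group” digits issued in a known order, and four “serial” digits handed out sequentially. The sketch below (with a made-up area-number table and arbitrary ranges) shows the general intuition of how narrowing each piece collapses the search space; it is not the CMU researchers’ actual method:

```python
# Illustration only: why birth state + birthdate shrink the SSN search space.
# Traditional structure: AREA (3 digits, tied to issuing state) -
#                        GROUP (2 digits, issued in a known order) -
#                        SERIAL (4 digits, sequential).
STATE_AREAS = {"MD": ["212"], "VT": ["008"]}   # hypothetical mapping, not real SSA data

def candidate_ssns(state, likely_groups, likely_serials):
    """Enumerate plausible SSNs once area, group and serial ranges are narrowed."""
    for area in STATE_AREAS.get(state, []):
        for group in likely_groups:
            for serial in likely_serials:
                yield f"{area}-{group:02d}-{serial:04d}"

# If the birthdate pins the issuance window down to a few group numbers and a few
# hundred serials, the guessing space drops from a billion possibilities to hundreds:
guesses = list(candidate_ssns("VT", likely_groups=[1, 2, 3], likely_serials=range(4200, 4400)))
print(len(guesses))   # 600 candidates instead of 1,000,000,000
print(guesses[0])     # e.g. 008-01-4200
```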

So as it turns out, it’s not enough anymore merely to be concerned about what you might have revealed in cyberspace on a self-indulgent MySpace page or in an ill-advised newsgroup post.

Social Security numbers … passwords … account numbers … financial data. Today, they’re all fair game.

U.S. consumers: More comfortable than ever making online purchases.

Have U.S. consumers finally gotten over their skittishness about making purchases over the Internet? A newly released study from Javelin Strategy & Research suggests that they have.

The 2010-2014 Online Retail Payments Forecast report draws its findings from data collected online in November 2009 from a randomly selected panel of nearly 3,300 U.S. consumers, a representative cross-section by age, gender and income level.

Based on the Javelin sample, nearly two-thirds of American consumers are now either “comfortable” or “very comfortable” with shopping online.

On the other end of the scale, ~22% of U.S. consumers continue to be wary of online purchasing; these people haven’t made an online purchase within the past year … or, in some cases, ever.

These figures suggest that the consumer comfort level with making online purchases is as high as it’s ever been. And how are consumers making their online payments? The Javelin study reports that among those respondents reporting online activities, the five most popular payment methods are:

 Major credit card: 70%
 Major debit card: 55%
 Online payment service such as PayPal®: 51%
 Gift card (good at one specific merchant): 41%
 Store-branded credit card (good at one specific merchant): 27%

Even with more than half of consumers using a debit card for online purchases, the total dollar volume of online sales attributable to debit cards is less than 30%. Javelin forecasts debit card share to continue climbing in the short-term, however, due to tighter consumer credit standards now in force.

Bottom line, the Javelin report suggests that despite the periodic horror stories that have been published about credit card information and other financial data being captured or mined off the Internet, the convenience and price/selection benefits of online shopping are winning the day with consumers. Not surprising at all, really.

Search Engine Rankings: Page 1 is Where It’s At

All the hype you continually hear about how important it is to get on Page 1 of search engine result pages turns out to be … right on the money.

In a just-released study from digital marketing company iCrossing, nearly 9 million “non-branded” search queries conducted on Google, Yahoo and Bing were analyzed, with clickthroughs tallied from the first, second and third search engine results pages (SERPs).

It turned out that more than 8.5 million clickthroughs were made from the first page of results – a whopping 95% of the total. The rest was just crumbs: Clicks off the second page came in under 250,000, while third-page clicks clocked in at a paltry ~180,000.
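
For those keeping score at home, the 95% figure falls straight out of the reported click counts (approximate values):

```python
# Quick arithmetic check on the "95% of clicks come from Page 1" finding.
clicks = {"page 1": 8_500_000, "page 2": 250_000, "page 3": 180_000}
total = sum(clicks.values())
for page, n in clicks.items():
    print(f"{page}: {n / total:.0%}")   # page 1 -> ~95%; pages 2 and 3 split the crumbs
```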

The results were essentially the same for the three major search engines (all at 95% or 96%) – a clean sweep, and clearly consistent searcher behavior across the board.

What this suggests is that when searching on generic or descriptive terms, most people will not go past the first page of results if they can’t find a site link that interests them. If they don’t hit paydirt on the first page, they’re far more likely to try another search using different keywords or phrases until they find a site on Page 1 that does the trick.

Comparing this newest iCrossing study with earlier research reveals that Page 1 clicks represent an even higher proportion today; studies from a few years back had pegged the figure at 80% to 90%.

The implications of this study are clear: if you’re looking to attract visitors to your site via generic or descriptive subject searches, you’d better make sure your site is designed so that it achieves first-page ranking … or your web efforts will be for naught.

That being said, the recipe for success in ranking hasn’t changed much at all. Despite all of the tempting “link juice” tips and tricks out there, the main keys to getting high rankings continue to be creating loads of good web content … speaking the same “language” as searchers (however inaccurate that might be) … and maintaining lots of good links to and from your site to increase its “relevance” to search engines.

No doubt, it’s getting tougher to achieve Page 1 ranking when there’s so much competition out there, but it’s well worth the effort.

The e-Commerce Hiccup

One of the bigger surprises in business in 2009 was how big a hit U.S. e-commerce took. According to digital marketing intelligence firm comScore in its just-released report 2009 U.S. Digital Year in Review, e-retail spending in America decreased about 2% during the year, coming in just under $210 billion.

This represents the first decline in e-commerce spending ever recorded.

Obviously, the economic recession was the culprit. But considering that e-commerce growth has charted above 20% annually in every year leading up to 2009, seeing an actual fall-off has raised more than a few eyebrows.

Where was the e-commerce decline most pronounced? In travel-related services, where revenues dropped by 5% to ~$80 billion. Not all sectors saw declines, however. A few continued to grow during the year, including the books/magazines category, which charted gains of ~12%. Online computer software purchases were also up by about 7%.

What does comScore see on the horizon for U.S. e-commerce? Is continued softness predicted … or a return to robust growth?

Analyzing the last few months of e-commerce activity during 2009 provides clues to the future: Growth looks like it’s returning. In fact, the 2009 holiday season marked a return to positive growth rates when compared against the same period in 2008.

[Granted, this comparison is made against “down” months of November and December in 2008, after the recession had already kicked in. But the pace of e-commerce activity is clearly picking up again.]

But whether it will return to 20%+ annual growth is still an open question.

Get Ready for the Endless Political Campaign …

New forecasts about political advertising have just been released. They confirm what many of us have suspected: The political campaign season, traditionally defined every two years by the presidential and off-year congressional election contests, is morphing into one gigantic mega-campaign that basically is with us all the time.

Instead of the nice breather we used to get from political advertising after the campaign season ended, it’s becoming one long, never-ending experience — some would say nightmare.

And if this surprises you, consider the past year alone in U.S. politics. First, there was the inauguration and the early fight over the economic stimulus package, with many political ads run pro and con.

This was followed by the health care debate which attracted an even bigger volume of advertising – probably because there were even more special interests involved. That initiative also sparked the Tea Party rallies and town hall meetings, which became fodder for still more political posturing (and paid advertising).

In the midst of the health care debate, along came the gubernatorial elections in Virginia and New Jersey as well as the “circus sideshow” in Upstate New York’s special congressional election where the Conservative Party candidate forced the endorsed Republican from the race – another opportunity for all sorts of campaign spending.

And just about the time the health care debate finally came to a vote in Congress … the Christmas Bomber shows up – still more fodder for paid political advertising, this time on national security.

As 2009 ended, just when we thought we were done with politics for at least a few short months, out of nowhere came the Massachusetts special election for senator, which attracted millions of dollars per day in contributions over the Internet and sparked – you guessed it – beaucoup bucks in paid political advertising.

And this past week, when the exciting Super Bowl and extreme weather events should have been dominating the news, what’s prominently on our TV and cable channels as well? The Tea Party convention in Nashville, capped by an announcement that the group is forming a political action committee to raise millions in funds to – of course – run new candidates for office.

More politics … more money … more advertising.

Of course, all of this is great news for local television and cable stations, which can snap out of their torpor and pocket a ton of new dollars in advertising revenues. In fact, media research and analytical firm Borrell Associates is predicting that U.S. political spending of all stripes will hit a record $4.2 billion in 2010.

Helping this along is the recent U.S. Supreme Court ruling that lifts restrictions on corporations and gives them the freedom to buy political advertising. Borrell estimates that this ruling will add ~10% to the total pool of funds this year.

It seems hard to believe that 2010 – a non-presidential election year – is on track to break 2008’s record for political spending, considering the huge amounts of advertising that were done by the McCain and (especially) the Obama campaigns in 2008. But the prognosticators insist 2010 will be the biggest year yet for political spending … to the tune of $1 billion more than in 2008.

What role does online play in all of this? The Internet is expected to account for less than $50 million in advertising revenues in 2010 – a comparative drop in the bucket. But growth in this segment will be very strong – not least because the web does a very good job of bringing in campaign donations. The bottom-line prediction: Internet political advertising will likely double to reach $100 million for the presidential campaign in 2012.

So the endless political campaign continues endlessly on … never ending … world without end. What fun for us!

The Mobile Web: Great Promise + Growth Pains

It’s clear that the mobile web is a big growth segment these days. Proof of that is found in recent Nielsen statistics, which have charted ~34% annual growth of the U.S. mobile web audience, now numbering some 57 million visitors using a mobile device to visit web sites (as of late summer 2009).

And now, a new forecast by the Gartner research firm projects that mobile phones will overtake PCs as the most common web access devices worldwide … as early as 2013. It estimates that by then there will be ~1.82 billion smartphones and browser-equipped phones in use, compared to ~1.78 billion PCs.

Gartner is even more aggressive than Morgan Stanley, which has predicted that the mobile web will outstrip the desktop web by 2015.

So, what’s the problem?

Well … consumer studies also show that web surfing on mobile phones continues to be a frustrating experience for many users. In a recent survey of ~1,000 mobile web users, web application firm Compuware/Gomez found that two out of every three mobile web users report having problems when accessing web sites on their phones.

Because people are so used to fast broadband connections – both at home and at work – it’s only natural that their expectations for the mobile web are similarly high. To illustrate this, Gomez found that more than half of mobile phone users are willing to wait just 6 to 10 seconds for a site to load before moving on.

And what happens after they give up? Sixty percent say they’d be less likely to visit the site again. More importantly, ~40% report that they’d head over to a competing site. As for what would happen if the mobile web experience was as fast and reliable as on a PC, more than 80% of the respondents in the Gomez study claim they would access web sites more often from their phones.

For marketers, this means that to maximize their success in the mobile world, they should reformat web sites to fit the small form factor of handheld devices. Gartner also notes that “context” – more than just “search” – will be king of the hill in mobile, because it delivers a personalized user experience. New features such as Google’s “Near Me Now” provide information on businesses, dining and other services in the proximity of a mobile user’s location. These and other innovations are opening up whole new dimensions of “seeking and finding” in the mobile web world.

Facebook Continues on its Merry Way to Social Media (and Web?) Dominance

Here’s a very interesting finding ripped from today’s social media headlines: The Business Insider and other media outlets are reporting that Facebook now accounts for nearly one in four page views on the Internet in the United States.

So claims database marketing consulting firm Drake Direct, which has studied web traffic in the U.S. and the U.K. by analyzing data collected by Compete, a leading aggregator of web statistics.

Just to give you an idea of how significant Facebook’s results are: by comparison, search engine powerhouse Google accounts for only about one in twelve page views.

And Facebook is now closing in on Google when it comes to site visits – with each currently receiving around 2.5 billion visits per month. In fact, studying the trend lines, Drake Direct anticipates that Facebook site visits will surpass Google any time now.
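
Putting the two stats together yields an interesting back-of-envelope conclusion: if the two sites draw roughly equal visit counts while Facebook claims about one page view in four versus Google’s one in twelve, then Facebook must be serving roughly three times as many pages per visit. A quick sanity check (rough figures only):

```python
# Back-of-envelope check: pages served per visit, Facebook vs. Google.
fb_share, goog_share = 1 / 4, 1 / 12      # share of U.S. page views, per Drake Direct
visits_fb = visits_goog = 2.5e9           # ~2.5 billion visits per month apiece
pages_per_visit_ratio = (fb_share / visits_fb) / (goog_share / visits_goog)
print(round(pages_per_visit_ratio, 1))    # ~3.0 -- Facebook sessions run much "deeper"
```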

Another interesting finding is that the length of the average Facebook visit now surpasses that of YouTube (~16 minutes versus ~14 minutes per visit), whereas YouTube had charted longer visits prior to now.

These findings underscore the continued success of Facebook as the most successful social media site, even as it has grown to 350+ million users, including more than 100 million in the U.S. with 5 million added in January alone. No doubt, it’s on a roll.

Where Does the News Begin? Pew Looks for Answers.

You don’t have to be over 50 years old to be concerned about where the world might be heading when it comes to the generation of news stories and how they are vetted. As newspapers and other publishers have cut the size of their reporting and editorial staffs, the quality and consistency of news reporting has suffered in the eyes of many.

Recently, the Pew Research Center’s Project for Excellence in Journalism decided to take a look at this issue to see how it’s playing out on the ground by studying the “news ecosystem” of a single geographic region. The market chosen for the study – Baltimore, Maryland – just happens to be in my backyard, so I’ve been able to review the results with a good understanding of the dynamics of the region in question.

Pew’s Baltimore study evaluated the news environment during the summer of 2009 and came to some interesting conclusions. While the regional media landscape – print, web, radio and TV – has broadened considerably to include 53 separate outlets that regularly produce and broadcast some form of news content, most of what was truly “new news” came from traditional news outlets rather than from other media resources.

Six major local/regional news threads were studied, ranging from the Maryland state budget situation to crime trends, issues affecting the metro transit system, and the sale of the Senator Theater, a local historical landmark. An analysis of those news threads found that:

 More than 80% of the news stories were repetitive – just rehashes of someone else’s original news content that contained no new information.

 Of the ~20% of news stories that did include new information, nearly all of the content came from traditional media, published either in conventional format (e.g., print) or digitally.

 General-audience newspapers like the Baltimore Sun produced roughly half of the news stories, followed by local TV stations such as WBAL-TV contributing ~30% of the reporting.

 Specialty business or legal newspaper outlets such as the Baltimore Business Journal and the Daily Record contributed just under 15% of the news items, with the remaining news reporting coming primarily from local radio stations such as WYPR-FM.

 Interestingly, about one-third of the news coverage generated by newspaper publishers appeared on the Internet rather than in their print editions.

Thus, the Pew study demonstrates that “new news” is coming from the same sources as before, led by the local papers. But another clear picture to emerge from the Baltimore profile is that the scaling back of editorial staffs has resulted in less original reporting, with much heavier reliance on simply republishing stories that have appeared elsewhere.

At the same time, new interactive capabilities are giving “we the people” an unparalleled broadcast platform via the ability to post feedback and commentary, not to mention utilizing Facebook, Twitter and other social media platforms as a megaphone.

In today’s “everyone’s an editor because they can write” environment, no one can stop us from broadcasting our own opinions and analysis to the world. But that’s not the same thing as a properly sourced, properly vetted news story. And that’s what Pew sees falling away.

Plunging Headlong into the Next Digital Decade

Looking back over the past ten years, so much has happened in the world of digital communications, it’s almost impossible to remember the “bad old days” of slow-loading web pages and clunky Internet interfaces.

And yet, that was the norm for many web users at the beginning of the decade.

So what’s in store for the upcoming decade? When you consider that a 120-minute movie file can be downloaded about as quickly as a single web page these days, how could data processing and download times get any faster than they are already?

Certainly, if the world sticks with transistor-based computer chip designs, there’s very little room for further improvement. That’s because today’s chips are still based on the transistor design pioneered at Bell Labs by William Shockley and his colleagues in the late 1940s – except that they now pack in vastly more transistors, with circuits engineered smaller and smaller, down nearly to atomic scale.

We can’t get much smaller than that without a radically new design platform. And now, along comes “quantum computing,” which goes well beyond the traditional binary system of ones and zeros in information processing. The betting is that so-called “qubits” – quantum bits – will become the new design paradigm of the 2010s, dramatically increasing processor speed and power once again.
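
For the mathematically inclined, the “beyond ones and zeros” point can be stated in a single line: a qubit holds a weighted blend of 0 and 1 simultaneously, and a register of n qubits carries 2^n of those weights at once, which is where the promised leap in processing power comes from.

```latex
% A qubit is a superposition of both classical states, not a choice between them:
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1,
\]
% and an n-qubit register holds 2^n such amplitudes at once:
\[
  |\psi_{1 \dots n}\rangle \in \mathbb{C}^{2^{n}}.
\]
```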

[An alarming side effect of quantum processing, however, is the possibility that computers will become so much more powerful that current methods of encryption will become obsolete. Clearly, new ways of protecting information will need to be developed along with the new speed and capabilities.]

In tandem with the new advancements in data processing power and speed, industry prognosticators such as Wired magazine’s chief editor Chris Anderson are predicting that bandwidth and storage will become virtually free and unlimited in the coming decade. As a result, it’s highly likely that the decade will be one of much greater collaboration between people in “peer to peer” creation and sharing of information and media, online and in real-time. Think Facebook on steroids.

Firefox Turns Five: All grown-up now … and with a few grown-up challenges.

Mozilla’s Firefox web browser marked a milestone this past week, celebrating its fifth birthday.

No question about it, the open-source browser has been a big success, with growth that has been impressive by any measure. As of the end of July, Firefox had been downloaded more than 1 billion times.

Indeed, a mainstream platform like WordPress – which hosts this very blog – reports that Firefox now represents a larger share of its traffic than Internet Explorer: 46% versus 39%.

But now that Firefox has come of age, it’s facing some of the same “grown up” challenges that other browsers face.

In fact, application security vendor Cenzic has just released its security trends report covering the first half of 2009. Guess what? Firefox led the field of web browsers in total reported vulnerabilities. Here are the stats from Cenzic:

 Firefox: 44% of reported browser vulnerabilities
 Apple Safari: 35%
 Internet Explorer: 15%
 Opera: 6%

Compared to Cenzic’s report covering the second half of 2008, Firefox’s figure is up from 39%, while IE’s number is down sharply from 43%.

Welcome to reality. As Firefox has grown in importance, it’s gained more exposure to vulnerabilities. A significant portion of those vulnerabilities have come via plug-ins.

Mozilla is taking steps to counteract this, including launching a plug-in checker service to ensure that users are running up-to-date versions. It also offers a “bug bounty” to anyone who discovers security holes in Firefox.

And the good news is that even though Firefox had the highest number of vulnerabilities, Cenzic itself acknowledges that this doesn’t necessarily mean Firefox users are more exposed to security threats. Plus, Firefox vulnerabilities tend to be patched more quickly than those found in other browsers.

So on this fifth anniversary milestone, Firefox can be justly praised as a major success story in the web browser world.