Clueless at the Capitol

Lawmakers’ cringeworthy questioning of Facebook’s Mark Zuckerberg calls into question the government’s ability to regulate social media.

Facebook CEO Mark Zuckerberg on Capitol Hill, April 2018. (Saul Loeb/AFP/Getty Images)

With Facebook CEO Mark Zuckerberg’s testimony on Capitol Hill last week, there’s heightened concern about the negative side effects of social media platforms. But as lawmakers questioned Zuckerberg, it became painfully obvious that our federal legislators have next to no understanding of the role of advertising in social media – or even of how social media works in its most basic form.

Younger staff members may have written the questions for their legislative bosses, but it was clear that the lawmakers were ill-equipped to handle Zuckerberg’s alternately pat, platitudinous and evasive responses and to come back with meaningful follow-up questions.

Even the younger senators and congresspeople didn’t acquit themselves well.

It made me think of something else, too. The questioners – and nearly everyone else, it seems – are missing a fundamental point about social media: Facebook and other social media platforms aren’t much different from old-fashioned print media, commercial radio and TV/cable, in that they all generate the vast bulk of their earnings from advertising.

It’s true that in addition to advertising revenues, print publications usually charge subscribers for paper copies of their publications. In the past, this was partly because they could, but also to help defray the cost of paper, ink, printing and physical distribution of their product to news outlets or directly to homes.

Commercial radio and TV never had those costs, but neither did they have a practical way of charging their audiences for broadcasts – at least not until cable and satellite came along – so they made their product available to their audiences at no charge.

The big difference between social media platforms and traditional media is that social platforms can do something that the marketers of old could only dream about: target their advertising based on personally identifiable demographics.

Think about it:  Not so many years ago, the only demographics available to marketers came from census publications, which by law cannot reveal any personally identifiable information.  Moreover, the U.S. census is taken only every ten years, so the data ages pretty quickly.

Beyond census information, advertisers using print media could rely on audit reports from ABC (the Audit Bureau of Circulations) and BPA. For business-to-business publications, some demographic data was available based on subscriber-provided information (freely given in exchange for receiving those magazines free of charge). But in the case of consumer publications, the audit information wouldn’t give an advertiser anything beyond the number of copies printed and sold, plus (sometimes) a geographic breakdown of where mail subscribers lived.

Advertisers using radio or TV media had to rely on researchers like Nielsen — but that research surveyed only a small sample of the audience.

What this meant was that the only way advertisers could “move the needle” in a market was to spend scads of cash on broadcasting their messages to the largest possible audience. As a connecting mechanism, this is hugely inefficient.

The value proposition that Zuckerberg’s Facebook and other social media platforms offer is the ability to connect advertisers with more people for less spend, thanks to these platforms’ ability to use personally identifiable demographics to target advertisements.

Want to find people who enjoy doing DIY projects, but only in areas where your company has local distribution of its products? Through Facebook, you can narrow-cast your reach by targeting consumers involved with particular activities and interests, in addition to geography, age, gender, or whatever other characteristics you might wish to use as filters.

That’s massively more efficient and effective than relying on something like median household income within a zip code or census tract. It also means that your message will be more targeted — and hence more relevant — to the people who see it.
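
To make the mechanics concrete, here is a minimal sketch of interest-plus-geography targeting in Python. The field names and audience data are hypothetical – a simplified stand-in for the kind of targeting spec platforms like Facebook expose to advertisers, not Facebook’s actual API.

```python
# A toy model of demographic ad targeting: filter an audience by region,
# age and declared interests. All names and values here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class User:
    age: int
    region: str                          # e.g., a metro area
    interests: set = field(default_factory=set)


# The advertiser's filter: DIY enthusiasts, but only where the
# company has local distribution of its products.
campaign = {
    "regions": {"Minneapolis", "St. Paul"},
    "age_range": (25, 64),
    "interests": {"DIY", "home improvement"},
}


def matches(user: User, spec: dict) -> bool:
    lo, hi = spec["age_range"]
    return (user.region in spec["regions"]
            and lo <= user.age <= hi
            and bool(user.interests & spec["interests"]))


audience = [
    User(34, "Minneapolis", {"DIY", "gardening"}),
    User(52, "Chicago", {"home improvement"}),   # right interest, wrong region
]
print(sum(matches(u, campaign) for u in audience))  # -> 1
```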

All of this is immensely more efficient for advertisers, which is why social media advertising (in addition to search advertising on Google) has taken off while other forms of advertising have plateaued or declined.

But there’s a downside: Social media is being manipulated (“abused” might be the better term) by “black hats” – people who couldn’t do such things in the past using census information or Nielsen ratings or magazine audit statements.

Here’s another reality: Facebook and other social media platforms have been so fixated on their value proposition that they failed to conceive of — or plan for — the behavior inspired by the evil side of humanity or those who feel no guilt about taking advantage of security vulnerabilities for financial or political gain.

Facebook COO Sheryl Sandberg interviewed on the Today Show, April 2018.

Now that that reality has begun to sink in, it’ll be interesting to see how Mark Zuckerberg and Sheryl Sandberg of Facebook — not to mention other social media business leaders — respond to the threat.

They’ll need to do something – and it’ll have to be something more compelling than CEO Zuckerberg’s constant refrain at the Capitol Hill hearings (“I’ll have my team look into that.”) or COO Sandberg’s litany (“I’m glad you asked that question, because it’s an important one.”) during her parade of TV/cable interviews.  The share prices of these companies will continue to be pummeled until investors understand what changes are going to be made that will actually achieve concrete results.

Comfortable in our own skin: Consumers embrace biometrics for identification and authentication purposes.

Perhaps it’s the rash of daily reports about data breaches. Or one too many compromises of people’s passwords.

Whatever the cause, it appears that Americans are becoming increasingly interested in the use of biometrics to verify personal identity or to enable payments.

And the credit card industry has taken notice. Biometrics – the descriptive term for body measurements and calculations – is becoming more prevalent as a means to authenticate identity and enable proper access and control of accounts.

A recent survey of ~1,000 American adult consumers, conducted in fall 2017 by AYTM Marketing Research for Visa, revealed that two-thirds of the respondents are now familiar with biometrics.

What’s more, among respondents who understand what biometrics entails, more than 85% expressed interest in using biometrics for identity authentication.

About half of the respondents think that adopting biometrics would be more secure than using PINs or passwords. Even more significantly, ~70% think that biometrics would make authentication faster and easier – whether done via voice recognition or fingerprint recognition.

Interestingly, biometrics are viewed as “easier” than traditional methods even though fewer than one-third of the survey respondents use unique passwords for each of their accounts.

As a person who does use unique passwords for my various accounts – and who has the usual “challenges” managing so many different ones – I would have thought that people who use only a few passwords might find traditional methods of authentication relatively easy to manage. Despite this, the “new world” of biometrics seems like a good bet for many of these people.

That stated, it’s also true that people are understandably skittish about ID theft in general. To illustrate, about half of the respondents in the AYTM survey expressed concerns about the risk of a security breach of biometric data – in other words, that the very biometric information used to authenticate a person could be nabbed by others and used for nefarious purposes.

And lastly, a goodly percentage of “Doubting Thomases” question whether biometric authentication will work properly – or even if it does work, whether it might require multiple attempts to do so.

In other words, it may end up being “déjà vu all over again” with this topic …

For an executive summary of the AYTM research findings, click or tap here.

Gord Hotchkiss and the Phenomenon of “WTF Tech”

Gord Hotchkiss

Occasionally I run across an opinion piece that’s absolutely letter-perfect in terms of what it’s communicating.

This time it’s a column by marketing über-specialist Gord Hotchkiss that appeared this week in MediaPost … and he hits all the right notes in a piece he’s headlined simply: WTF Tech.

Here is Hotchkiss’ piece in full:

WTF Tech

By Gord Hotchkiss, Featured Contributor, MediaPost

Do you need a Kuvée?

Wait. Don’t answer yet. Let me first tell you what a Kuvée is: It’s a $178 wine bottle that connects to WiFi.

Ok, let’s try again. Do you need a Kuvée?

Don’t bother answering. You don’t need a Kuvée.

No one needs a Kuvée. The earth has 7.2 billion people on it. Not one of them needs a Kuvée. That’s probably why the company is packing up its high-tech bottles and calling it a day.

A Kuvée is an example of WTF Tech. Hold that thought, because we’ll get back to that in a minute.

So, we’ve established that you don’t need a Kuvée. “But that’s not the point,” you might say. “It’s not whether I need a Kuvée. It’s whether I want a Kuvée.” Fair point. In our world of ostentatious consumerism, it’s not really about need — it’s about desire. And lord knows many of the most pretentious and entitled a**holes in the world are wine snobs.

But I have to believe that, buried deep in our lizard brain, there is still a tenuous link between wanting something and needing something. Drench it as we might in the best wine technology can serve, there still might be a spark of practicality glowing in the gathering dark of our souls. But like I said, I know some real dickhead wine drinkers. So, who knows? Maybe Kuvée was just ahead of the curve.

And that brings us back to WTF tech. This defines the application of tech to a problem that doesn’t exist — simply because it’s tech. There is no practical reason why this tech ever needs to exist.

Besides the Kuvée, here are some other examples of WTF tech:

The Kérastase Hair Coach

This is a hairbrush with an Internet connection. Seriously. It has a microphone that “listens” while you brush your hair, as well as an accelerometer, gyroscope and other sensors. It’s supposed to save you from bruising your hair while you’re brushing it. It retails for “under $200.”

The Hushme Mask

This tech actually does solve a problem, but in a really stupid way. The problem is obnoxious jerks that insist on carrying on their phone conversation at the top of their lungs while sitting next to you. That’s a real problem, right? But here’s the stupid part. In order for this thing to work, you have to convince the guilty party to wear this Hannibal Lecter-like mask while they’re on the phone. Go ahead, buy one for $189 and give it a shot next time you run into a really loud tele-jerk. Let me know how it works out for you.

Denso Vacuum Shoes

“These boots are made for sucking, and that’s just what they’ll do.”

Finally, an invention that lets you shoe-ver your carpet. That’s right, the Japanese company Denso is working on a prototype of a shoe that vacuums as you walk, storing the dirt in a tiny box in the shoe’s sole. As a special bonus, they look just like a pair of circa 1975 Elton John Pinball Wizard boots.

When You’re a Hammer

We live in a “tech for tech’s sake” time. When all the world is a high-tech hammer, everything begins to look like a low-tech nail. Each of these questionable gadgets had investors who believed in them. Both the Kuvée and the Hushme had successful crowd-funding campaigns. The Hair Coach and the Vacuum Shoes have corporate backing.

The dot-com bubble of 2000-2002 has just morphed into a bunch of broader-based — but no less ephemeral — bubbles.

Let me wrap up with a story. Some years ago, I was speaking at a conference and my panel was the last one of the day. After it wrapped, the moderator, a few of the other panelists and I decided to go out for dinner. One of my co-panelists suggested a restaurant he had done some programming work for.

When we got there, he showed us his brainchild. With much pomp and ceremony, our waiter delivered an iPad to the table. Our co-panelist took it and showed us how his company had set up the wine list as an app. Theoretically, you could scroll through descriptions and see what the suggested pairings were. I say theoretically, because none of that happened on this particular night.

Our moderator watched silently as the demonstration struggled through a series of glitches. Finally, he could stay silent no longer. “You know what else works, Dave? A sommelier,” he said. “When I’m paying this much for a dinner, I want to talk to a f*$@ng human.”

Sometimes, there’s just not an app for that.

_______________________

Does Gord Hotchkiss’ column resonate with you as it did with me? Feel free to leave a comment for the benefit of other readers if you wish.

Future shock? How badly is cyber-hacking nibbling away at our infrastructure?

I don’t know about you, but I’ve never forgotten the late afternoon of August 14, 2003, when problems with the North American power grid meant that people in eight states stretching from New England to Detroit suddenly found themselves without power.

Fortunately, my company’s Maryland offices were situated about 100 miles beyond the southernmost extent of the blackout. But it was quite alarming to watch the power outage spread across a map of the Northeastern and Great Lakes States (plus Ontario) in real-time, like some sort of creeping blob from a science fiction film.

According to Wikipedia’s article on the topic, the impact of the blackout was substantial — and far-reaching:

“Essential services remained in operation in some … areas. In others, backup generation systems failed. Telephone networks generally remained operational, but the increased demand triggered by the blackout left many circuits overloaded. Water systems in several cities lost pressure, forcing boil-water advisories to be put into effect. Cellular service was interrupted as mobile networks were overloaded with the increase in volume of calls; major cellular providers continued to operate on standby generator power. Television and radio stations remained on the air with the help of backup generators — although some stations were knocked off the air for periods ranging from several hours to the length of the entire blackout.”

Another (happier) thing I remember from this 15-year-old incident is that rather than causing confusion or bedlam, the massive power outage brought out the best in people. This anecdote from the blackout was typical:  Manhattanites opening their homes to workers who couldn’t get to their own residences for the evening.

For most of the 50 million-plus Americans and Canadians affected by the blackout, power was restored after about six hours.  But for some, restoration took as long as two days.

Upon investigation of the incident, it was discovered that high temperatures and humidity across the region had increased energy demand as people turned on air conditioning units and fans. This caused power lines to sag as higher currents heated the lines.  The precipitating cause of the blackout was a software glitch in the alarm system in a control room of FirstEnergy Corporation, causing operators to be unaware of the need to redistribute the power load after overloaded transmission lines had drooped into foliage.

In other words, what should have been, at worst, a manageable localized blackout cascaded rapidly into a collapse of the entire electric grid across multiple states and regions.
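
Here is a toy Python sketch of that cascade mechanism, under the simplest possible assumption: when a line trips, its load is split evenly among the survivors. The line names, loads and capacities are invented purely for illustration; real grid dynamics are far more complex.

```python
# Toy cascading-failure model: any line loaded past capacity trips,
# and its load is redistributed evenly across the remaining lines,
# potentially overloading them in turn. Illustrative numbers only.

def cascade(loads: dict, capacity: float) -> list:
    tripped = []
    overloaded = [name for name, load in loads.items() if load > capacity]
    while overloaded:
        victim = overloaded[0]
        shed = loads.pop(victim)             # the line trips ...
        tripped.append(victim)
        for name in loads:                   # ... and its load must go somewhere
            loads[name] += shed / len(loads)
        overloaded = [name for name, load in loads.items() if load > capacity]
    return tripped

# Five lines rated at 100 units; a hot day pushes one past its limit.
line_loads = {"A": 105.0, "B": 90.0, "C": 88.0, "D": 92.0, "E": 85.0}
print(cascade(line_loads, capacity=100.0))   # -> ['A', 'B', 'C', 'D', 'E']
```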

But at least the incident was born of human error, not nefarious motives.

That 2003 experience should make anyone hearing last week’s testimony on Capitol Hill about the risks faced by the U.S. power grid think long and hard about what could happen in the not-so-distant future.

The bottom line of the testimony presented in the hearings is that malicious cyberattacks are becoming more sophisticated – and hence more capable of causing damage to American infrastructure. The Federal Energy Regulatory Commission (FERC) is cautioning that hackers are increasingly threatening U.S. utilities ranging from power plants to water processing systems.

Similar warnings come from the Department of Homeland Security, which reports that hackers have been attacking the U.S. electric grid, power plants, transportation facilities and even targets in commercial sectors.

The Energy Department goes even further, reporting in 2017 that the United States electrical power grid is in “imminent danger” from a cyber-attack. To underscore this threat, the Department contends that more than 100,000 cyber-attacks are being mounted every day.

With so many attacks of this kind happening on so many fronts, one can’t help but think that it’s only a matter of time before we face a “catastrophic event” that’s even more consequential than the one that affected the power grid in 2003.

Even more chilling, if a future incident is born of intentional sabotage – as seems quite likely based on recent testimony – it’s pretty doubtful that remedial action could be taken as quickly or as effectively as in response to an accidental incident like the one that happened in 2003.

Put yourself in the saboteurs’ shoes: If your aim is to bring U.S. infrastructure to its knees, why plan for a one-off event?  You’d definitely want to build in ways to cause cascading problems – not to mention planting additional “land-mines” to frustrate attempts to bring systems back online.

Contemplating all the implications is more than sobering — it’s actually quite frightening. What are your thoughts on the matter?  Please share them with other readers.

Re-imagining the rules for company leadership: Rajeev Peshawaria’s prescriptions.

As the nature of how companies do business changes, what about time-honored managerial styles? Do they need to change as well?

Open Source Leadership is a newly published book by business author and former Coca-Cola and Morgan Stanley executive Rajeev Peshawaria.  Published by McGraw-Hill, Peshawaria’s book contends that many of the management practices that persist today are no longer well-aligned with the reality of current workplaces, current employees … or even society in general.

One fundamental change that has happened just in the past generation is what Peshawaria labels “uber-connectivity.” Thanks to the Internet, mobile phones and other communication technologies, people are able to access information on nearly any topic and obtain answers to any question — wherever they are and whenever they want.

According to the author, this near-limitless access to information empowers people to an unprecedented degree – and it narrows the gulf between “experts” and “regular folks.”

As for “guru-worship” – the inclination of at least some people to seek out and learn from the soothsayers in the business world … that’s yesterday’s bread.

Lest Peshawaria be accused of being what he himself declares irrelevant, he remarks, “The guru is dead. Long live the Google.”

Rajeev Peshawaria

Couple uber-connectivity with increasing world population plus the concentration of that population in urban areas, and the result is companies that are now able to source talent and knowledge from wherever they exist.

How do these changes affect the theory and practice of business management?

In Peshawaria’s view, company leaders are still called upon to provide steadfast leadership about “purpose and values,” while at the same time acting with “compassion, humility and respect for people.”

Some of this may sound something like the “autocratic” management style that was prevalent in business until the 1980s – but not exactly. At the same time, it’s different from the “all-inclusive” democratic style that became ascendant in the world of business during the past three decades.  Let’s call it a hybrid.

One other important factor addressed by Peshawaria in his book is that employee motivation remains a nettlesome issue for companies – and far more complex than most management theories and stratagems account for.

One prescription from Peshawaria is for managers to dump the notion of giving “stretch goals” to all employees in an attempt to foster high performance. He argues that stretch goals work only for “the small percentage of employees [who] have the creativity, innovation and drive to truly relish and achieve stretch goals at any one point in time.”

According to Peshawaria, for the majority of employees stretch goals end up “causing stress, anxiety, or poorly thought-out behavior.”

Open Source Leadership is a book that’s worth a read – and it’s readily available from Amazon and other online retailers.  For those who have read about Rajeev Peshawaria’s theories in this new book or in his earlier volume Too Many Bosses, Too Few Leaders – or if you have years of experience working in business organizations, what do you think about the author’s perspectives and prescriptions?  Are they on point … or off-base?  Please share your views with other readers.

Blockchain Technology: Hype … or Hope?

Recently I read a column in MediaPost by contributing writer Ted McConnell (The Block Chain [sic] Gang, 3/8/18) that seemed pretty wide of the mark in terms of some of its (negative) pronouncements about blockchain technology.

Not being an expert on the subject, I shared the column with my brother, Nelson Nones, who is a blockchain technology specialist, to get his “read” on the article’s POV.

Nelson cited four key areas where he disagreed with the positions of the column’s author – particularly the fallacy of conflating blockchain technology (a system of keeping records that are all but impossible to falsify) with cryptocurrencies (systems for trading assets).
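
To illustrate the distinction, here is a minimal Python sketch of the record-keeping idea behind blockchain: each block carries a cryptographic hash of its predecessor, so altering any past record breaks every later link. It’s an illustrative toy, not a production implementation.

```python
# Minimal hash-chained ledger: tampering with any past record is
# immediately detectable, because every block commits to its predecessor.
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash a canonical JSON serialization of the block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, record: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"record": record, "prev_hash": prev})

def verify(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger: list = []
append_block(ledger, "Alice pays Bob 5")
append_block(ledger, "Bob pays Carol 2")
print(verify(ledger))                        # True: chain is consistent
ledger[0]["record"] = "Alice pays Bob 500"   # falsify history ...
print(verify(ledger))                        # False: ... and it shows
```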

But the bigger aspect, Nelson pointed out, is that the MediaPost column reflects a general degree of negativity that has crept into the press recently about blockchain technology and its potential for solving business security challenges.  He noted that blockchain technology is in the midst of going through the various stages of Gartner’s well-known “hype cycle” – a model that charts the maturity, adoption and social application of emerging technologies.

Interested in learning more about this larger issue, I asked Nelson to expand on this aspect. Presented below is what he reported back to me.  His perspectives are interesting enough, I think, to share with others in the world of business.

Hyperbole All Over Again?

Most folks in the technology world – and many business professionals outside of it – are familiar with the “hype cycle.” It’s a branded, graphic representation of the maturity, adoption and social application of technologies from Gartner, Inc., a leading IT research outfit.

According to Gartner, the hype cycle progresses through five phases, starting with the Technology Trigger:

“A potential technology breakthrough kicks things off. Early proof-of-concept stories and media interest trigger significant publicity. Often no usable products exist and commercial viability is unproven.”

Next comes the Peak of Inflated Expectations, which implies that everyone is jumping on the bandwagon. But Gartner is a bit less sanguine:

“Early publicity produces a number of success stories — often accompanied by scores of failures. Some companies take action; most don’t.”

There follows a precipitous plunge into the Trough of Disillusionment.  Gartner says:

“Interest wanes as experiments and implementations fail to deliver. Producers of the technology shake out or fail. Investment continues only if the surviving providers improve their products to the satisfaction of early adopters.”

If the hype cycle is to be believed, the Trough of Disillusionment cannot be avoided; but from a technology provider’s perspective it seems a dreadful place to be.

With nowhere to go but up, emerging technologies begin to climb the Slope of Enlightenment. Gartner explains:

“More instances of how the technology can benefit the enterprise start to crystallize and become more widely understood. Second- and third-generation products appear from technology providers. More enterprises fund pilots; conservative companies remain cautious.”

Finally, we ascend to the Plateau of Productivity. Gartner portrays a state of “nirvana” in which:

“Mainstream adoption starts to take off. Criteria for assessing provider viability are more clearly defined. The technology’s broad market applicability and relevance are clearly paying off. If the technology has more than a niche market, then it will continue to grow.”

Gartner publishes dozens of these “hype cycles,” refreshing them every July or August. They mark the progression of specific technologies along the curve – and predict the number of years to mainstream adoption.

Below is an infographic I’ve created that plots Gartner’s view of cloud computing, overlaid with global public cloud computing revenues over the same period as reported by Forrester Research, another prominent industry observer.

Cloud Computing Hype Cycle

This infographic provides several interesting insights. For starters, cloud computing first appeared on Gartner’s radar in 2008. In hindsight that seems a little late — especially considering that Salesforce.com launched its first product all the way back in 2000. Amazon Web Services (AWS) first launched in 2002 and re-launched in 2006.

Perhaps Gartner was paying more attention to Microsoft, which announced Azure in 2008 but didn’t release it until 2010. Microsoft, Amazon, IBM and Salesforce are the top four providers today, holding ~57% of the public cloud computing market between them.

Cloud computing hit the peak of Gartner’s hype cycle just one year later, in 2009, but it lingered at or near the Peak of Inflated Expectations for three full years. All the while, Gartner was predicting mainstream adoption within 2 to 5 years. Indeed, they have made the same prediction for ten years in a row (although I would argue that cloud computing has already gained mainstream market adoption).

It also turns out that the Trough of Disillusionment isn’t quite the valley of despair that Gartner would have us believe. In fact, global public cloud revenues grew from $26 billion during 2011 to $114 billion during 2016 — roughly half the 64% compound annual growth rate (CAGR) during the Peak of Inflated Expectations, but a respectable 35% CAGR nonetheless.
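
For readers who want to check the arithmetic, here is the calculation using the Forrester figures quoted above:

```python
# CAGR check: $26B (2011) grown to $114B (2016) spans five compounding years.
cagr = (114 / 26) ** (1 / 5) - 1
print(f"{cagr:.1%}")   # -> 34.4%, i.e. the ~35% figure cited above
```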

Indeed, there’s no evidence here of waning market interest – nor did an industry shakeout occur. With the exception of IBM, the leading producers in 2008 remain the leading producers today.

All in all, it seems the hype curve significantly lagged the revenue curve during the plunge into the Trough of Disillusionment.

… Which brings us to blockchain technology. Just last month (February 2018), Gartner analyst John Lovelock stated, “We are in the first phase of blockchain use, what I call the ‘irrational exuberance’ phase.”  The chart below illustrates how Gartner sees the “lay of the land” currently:

Blockchain Hype Cycle

This suggests that blockchain is at the Peak of Inflated Expectations, though it appeared ready to jump off a cliff into the Trough of Disillusionment in Gartner’s 2017 report for emerging technologies. (It wasn’t anywhere on Gartner’s radar before 2016.)

Notably, cryptocurrencies first appeared in 2014, just past the peak of the cycle, even though the first Bitcoin network was established in 2009. Blockchain is the distributed ledger which underpins cryptocurrencies and was invented at the same time.

It’s also interesting that Gartner doesn’t foresee mainstream adoption of blockchain for another 5 to 10 years. Lovelock reiterated that point in February, reporting that “Gartner does not expect large returns on blockchain until 2025.”

All the same, blockchain seems to be progressing through the curve at three times the pace of cloud computing. If cloud computing’s history is any guide, blockchain provider revenues are poised to outperform the hype curve and multiply into some truly serious money during the next five years.

It’s reasonable to think that blockchain has been passing through a phase of “irrational exuberance” lately, but it’s equally reasonable to believe that industry experts are overly cautious and a bit late to the table.

________________________

So that’s one specialist’s view.  What are your thoughts on where we are with blockchain technology? Please share your perspectives with other readers here.

In digital retail, there’s Amazon and then there’s … everyone else.

When it comes to online retailing in the United States, Amazon’s been cleaning up for years. And now we have new data from comScore that reveals that Amazon is as dominant online today as it’s ever been.

This chart illustrates it well:

[Chart: Time spent with the Top 10 U.S. online retail properties]
* MediaMetrix Multi-Platform, US, December 2017. Source: comScore 2018 State of the U.S. Online Retail Economy.

The chart shows that when comparing actual time spent by Americans at each of the Top 10 online retailers, Amazon attracts more viewing time than the other nine entities combined.

Even when considering only mobile minutes, where so much of the growth is happening for digital retailers, Amazon’s mobile viewing time exceeds the combined total digital traffic across eBay, Walmart, Wish, Kohl’s and Etsy.

Pertaining to the mobile sphere, comScore has found an interesting twist in consumer behavior. It turns out there’s a considerable disparity between mobile’s share of time spent and its share of dollars spent – a gap of roughly 40 percentage points:

[Chart: Mobile share of time spent vs. share of dollars spent in U.S. online retail]
MediaMetrix Multi-Platform and ecommerce / mCommerce Measurement, Q4 2017. Source: comScore 2018 State of the U.S. Online Retail Economy.

In essence, the data show that whereas mobile represents nearly two-thirds of the time spent with online retail, it accounts for only one-fourth of the dollars spent on goods and services.
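
A quick back-of-the-envelope check of that gap, using the approximate shares quoted above:

```python
# "Nearly two-thirds" of minutes vs. "one-fourth" of dollars spent on mobile.
gap = 2 / 3 - 1 / 4
print(f"{gap:.0%}")   # -> 42%, the roughly 40-point disparity comScore flags
```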

But this difference is easy to explain:  As the largest player in the field, Amazon fulfills a role similar to the one Expedia or Trivago play in the travel industry.

Amazon gives consumers a way to scan the marketplace not only for product details but also for prevailing prices, giving them a sense of the expected price ranges for products or services — even if they ultimately choose to purchase elsewhere.