In copywriting, it’s the KISS approach on steroids today.

… and it means “Keep It Short, Stupid” as much as it does “Keep It Simple, Stupid.”

Regardless of the era, most successful copywriters and ad specialists have always known that short copy is generally better-read than long.

And now, as smaller screens essentially take over the digital world, the days of copious copy flowing across a generous preview pane area are gone.

More fundamentally, people don’t have the screen size – let alone the patience – to wade through long copy. These days, the “sweet spot” in copy runs between 50 and 150 words.

Speaking of which … when it comes to e-mail subject lines, the ideal length keeps getting shorter and shorter. Research performed by SendGrid suggests that it’s now down to an average length of about seven words for the subject line.

And the subject lines that get the best engagement levels are a mere three or four words.

So it’s KISS on steroids: keeping it short as well as simple.

Note: The article copy above comes in at under 150 words …!

Where’s the Best Place to Live without a Car?

For Americans who live in the suburbs, exurbs or rural areas, being able to live without a car seems like a pipedream. But elsewhere, there are situations where it may actually make some sense.

They may be vastly different in nearly every other way, but small towns and large cities share one trait – being the places where it’s more possible to live without a car.

Of course, within both groups – small towns and large cities – there can be big differences in relative car-free attractiveness, depending on a variety of factors.

For instance, the small county seat where I live can be walked from one side of town to the other in under 15 minutes. This means that, even if there are places where a sidewalk would be nice to have, it’s theoretically possible to take care of grocery shopping and trips to the pharmacy or the cleaners or the hardware store on foot.

Visiting restaurants, schools, the post office and other government offices is quite easy as well.

But even slightly bigger towns pose challenges because of distances that are much greater – and there’s usually little in the way of public transport to serve inhabitants who don’t possess cars.

At the other end of the scale, large cities are typically places where it’s possible to move around without the benefit of a car – but some urban areas are more “hospitable” than others based on factors ranging from the strength of the public transit system and neighborhood safety to the climate.

Recently, real estate brokerage firm Redfin took a look at large U.S. cities (those with over 300,000 population) to come up with its listing of the 10 cities it judged the most amenable to living without a car. Redfin compiled rankings to determine which cities have the best composite “walk scores,” “transit scores” and “bike scores.”
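
The article doesn’t spell out exactly how Redfin combines the three scores into its overall ranking, but the general approach is easy to sketch. Here’s a minimal illustration in Python that assumes a simple equal-weight average – the city scores shown are placeholders for illustration only, not Redfin’s published figures:

    # Minimal sketch of ranking cities by a composite "car-free" score.
    # Assumptions: equal weighting of the three scores, and placeholder
    # values -- neither comes from Redfin's actual methodology or data.
    cities = {
        "San Francisco": {"walk": 86, "transit": 80, "bike": 75},
        "New York":      {"walk": 85, "transit": 84, "bike": 70},
        "Boston":        {"walk": 81, "transit": 74, "bike": 70},
    }

    def composite(scores):
        return sum(scores.values()) / len(scores)

    ranked = sorted(cities, key=lambda c: composite(cities[c]), reverse=True)
    for rank, city in enumerate(ranked, start=1):
        print(f"#{rank}: {city} ({composite(cities[city]):.1f})")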

Here’s how the Redfin Top 10 list shakes out. Topping the list is San Francisco:

  • #1: San Francisco
  • #2: New York
  • #3: Boston
  • #4: Washington, DC
  • #5: Philadelphia
  • #6: Chicago
  • #7: Minneapolis
  • #8: Miami
  • #9: Seattle
  • #10: Oakland, CA

Even within the Top 10 there are differences, of course. A chart in the Redfin report shows how each of these cities does relatively better (or worse) in the three categories scored.

Redfin has also analyzed trends in urban residential construction, finding that parking spaces are increasingly being left out of new residential properties – thereby making the choice of opting out of automobile ownership a more important consideration than in the past.

What about your own experience? Do you know of a city or town that’s particularly good at accommodating residents who don’t own cars? Or just the opposite? Please share your observations with other readers.

More Trouble in the Twittersphere

With each passing day, we see more evidence that Twitter has become the social media platform that’s in the biggest trouble today.

The news is replete with articles about how some people are signing off from Twitter, having “had it” with the politicization of the platform. (To be fair, that’s a knock on Facebook as well these days.)

Then there are reports of how Twitter has stumbled in its efforts to monetize the platform, with advertising strategies that have failed to generate the kind of growth to match the company’s optimistic forecasts. That bit of bad news has hurt Twitter’s share price pretty significantly.

And now, courtesy of a new analysis published by researchers at Indiana University and the University of Southern California, comes word that Twitter is delivering misleading analytics on audience “true engagement” with tweets.  The information is contained in a peer-reviewed article titled Online Human-Bot Interactions: Detection, Estimation and Characterization.

According to findings as determined by Indiana University’s Center for Complex Networks & Systems Research (CNetS) and the Information Sciences Institute at the University of Southern California, approximately 15% of Twitter accounts are “bots” rather than people.

That sort of news can’t be good for a platform that is struggling to elevate its user base in the face of growing competition.

But it’s even more troubling for marketers who rely on Twitter’s engagement data to determine the effectiveness of their campaigns. How can they evaluate social media marketing performance if the engagement data is artificially inflated?

Fifteen percent of all accounts may seem like a rather small proportion, but in the case of Twitter that represents nearly 50 million accounts.
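
For a rough sense of the arithmetic: Twitter was reporting on the order of 319 million monthly active users around that time (a figure I’m supplying here as a back-of-the-envelope assumption, not one taken from the study), so a 15% bot share works out to nearly 50 million accounts – and engagement tallies would be inflated by a similar proportion if bots interact at roughly the same rate as humans. A quick sketch:

    # Back-of-the-envelope check of the "nearly 50 million" figure and what a
    # 15% bot share could mean for engagement metrics. The monthly active user
    # count is an assumption, not a number from the research paper.
    monthly_active_users = 319_000_000
    bot_share = 0.15

    bot_accounts = monthly_active_users * bot_share
    print(f"Estimated bot accounts: ~{bot_accounts / 1e6:.0f} million")   # ~48 million

    # If bots engage at the same rate as humans (a simplifying assumption),
    # reported engagement overstates human engagement by the same share.
    reported_engagements = 10_000
    human_engagements = reported_engagements * (1 - bot_share)
    print(f"Of {reported_engagements:,} reported engagements, only about "
          f"{human_engagements:,.0f} would come from actual people")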

To add insult to injury, the report notes that even the 15% figure is likely too low, because more sophisticated and complex bots could have appeared as “humans” in the researchers’ analytical model, even though they aren’t.

There’s actually an upside to some social media bots – examples being automatic alerts about natural disasters or automated customer service responses. But there’s also growing evidence of nefarious applications.

Here’s one that’s unsurprising even if irritating: bots that emulate human behavior to manufacture “faux” grassroots political support.  But what about the delivery of dangerous or inciting propaganda thanks to bot “armies”?  That’s more alarming.

The latest Twitter-bot news is more confirmation of the deep challenges faced by this particular social media platform.  What’s next, I wonder?

The downside dangers of IoT: Overblown or underestimated?

In recent weeks, there has been an uptick in articles appearing in the press about the downside risks of the Internet of Things (IoT). The so-called “Weeping Angel” technique, which essentially allows hackers to turn a smart television into a microphone, is one eyebrow-raising example from the CIA files recently released by WikiLeaks. Another is the potential for hacking into the systems of autonomous vehicles, enabling cargo to be stolen or the vehicles themselves to be held for ransom.

Some of it seems like the stuff of science fiction – or at the very least a modern form of cloak-and-dagger activity. Regular readers of the Nones Notes blog know that when we’re in the midst of a “collective angst” about a topic of this nature, I like to solicit the views of my brother, Nelson Nones, who has worked in the fields of IT and operations management for decades.

I asked Nelson to share his perspectives on IoT, what he sees are its pitfalls, and whether the current levels of concern are justified. His comments are presented below:

Back in 1998, I was invited to speak about the so-called “millennium bug” (also known as the “Y2K bug”) at a symposium in Kuching, Malaysia. It was a hot topic at that time, because many computer systems then in use hadn’t been designed or built to deal with calendar dates beyond the end of the 20th century.  

The purpose of my presentation was to educate the audience about the nature of the problem, and how to mitigate it. During the question-and-answer session which followed, a member of the audience rose and began to speak rather hysterically of the threat which the millennium bug posed to civilization as we knew it.  

His principal concern was the millions of embedded sensors and controllers in use throughout industry which were not programmable and would therefore need to be replaced. In his view, very few people knew which of those devices were susceptible to the millennium bug, or where they were running.  

As a result, he felt that many flawed devices would go undetected, causing critical infrastructures such as power generation plants, electricity grids and aircraft to fail.  

Needless to say, his dire predictions did not come to pass and humankind sailed into the 21st century with barely a murmur. This isn’t to say that the millennium bug wasn’t a real threat – it certainly was – but rather that providers and users of information technology (IT) mostly did what was necessary to prepare for it.  As Britain’s Guardian newspaper reported in April 2000, “In truth, there have been bug incidents … none of this, however, adds up to global recession, or infrastructure collapse, or accidental nuclear war, as the most heated prophets were anticipating.”  

It is for similar reasons that I take much of today’s hype over security vulnerabilities of IoT with more than a pinch of salt. 

It’s worth noting that, technologically speaking, IoT isn’t really very new at all. As the prophet of doom at my 1998 symposium (correctly) observed, sensors, software, actuators and electronic controllers have been integral components of automated industrial systems for the past thirty years at least.   

What’s new is that these technologies have begun to be accepted and deployed by consumers. I say “begun” because I don’t know anyone who has actually rigged a “smart home” to work in the all-encompassing way breathlessly envisioned by purveyors of home automation technology; but I do know people who use the technology for specific purposes such as home security, thermostat control and recording TV programs.  

Just last week I spoke with someone who is beta testing a self-driving Tesla automobile, but he confessed that he still won’t take his hands off the wheel because he doesn’t really trust the self-driving technology yet.  

What’s also new is that businesses are extending their use of sensors and controllers well beyond the confines of plants, factories and warehouses. For example, trucking companies routinely use global positioning system (GPS) sensors to monitor fleet locations in real-time.  

Aircraft engine makers such as Rolls-Royce and GE rely on monitoring and management systems that transmit information from on-engine sensors to ground stations for real-time analysis during flight. Many problems detected in this manner can be corrected immediately, in flight, by relaying instructions back to controllers and actuators installed on the engine.
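
Conceptually, that feedback loop is simple: readings stream off the engine, are checked against limits on the ground, and corrective instructions are relayed back. The sketch below is purely illustrative – the parameter names and thresholds are invented for this example and don’t describe any actual Rolls-Royce or GE system:

    # Purely illustrative sketch of an in-flight monitoring loop: sensor
    # readings are compared against limits, and a corrective instruction is
    # relayed back to the engine controller. Names and thresholds are invented.
    LIMITS = {"exhaust_gas_temp_c": 950, "vibration_mm_s": 25}

    def analyze(reading):
        """Return corrective instructions for any reading that exceeds a limit."""
        actions = []
        for parameter, limit in LIMITS.items():
            if reading.get(parameter, 0) > limit:
                actions.append(f"reduce load: {parameter} at {reading[parameter]} exceeds {limit}")
        return actions

    telemetry = {"exhaust_gas_temp_c": 968, "vibration_mm_s": 18}   # one sample frame
    for instruction in analyze(telemetry):
        print("Relay to engine controller:", instruction)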

The common denominator for what’s new is the use of existing Internet infrastructure; hence the “I” in “IoT.”  

In earlier times, sensors, software and electronic controllers could communicate only through local area networks (LANs) which were physically isolated and therefore impermeable to external attacks. But when those devices are connected to the public Internet, in theory anyone can access them — including cyber-criminals and governments engaged in sabotage or espionage, or who want to hold things for ransom, surreptitiously watch live feeds, or deploy botnets for distributed denial of service (DDoS) attacks.  

It is clear, therefore, that the root causes of privacy and security concerns arising from increasing IoT usage are mainly network security lapses, and not the things themselves.

Ensuring the highest possible degree of network security is no easy task. Above and beyond arcane technical details such as encryption, network firewalls, and the opening and closing of ports, it means deploying multiple layers of defenses according to specific policies and controls – and that requires skills and knowledge which most consumers, and even many businesses, do not possess.

Still, one doesn’t have to be a network geek to implement basic security mechanisms that far too many people overlook. In search of easy pickings, cyber-criminals usually prefer to exploit the huge number of unlocked doors begging for their attention, rather than wasting time trying to penetrate even slightly stronger defenses.   

For example, many people install wireless networks in their homes but forget to change the default router password and default network name (SSID) – or they pick a password that’s easy to guess. In addition, many people leave their network “open” to anyone having a wireless card by failing to implement a security key such as a WPA, WPA2 or WEP key, or by choosing a weak security key.   

An attacker can discover those lapses in a matter of seconds, or less, giving them full administrative authority and control over the compromised network with little risk of detection. This, in turn, would give the attacker immediate access to, and remote control over, any device on the network which is switched on but does not require authentication; for example, network printers, data storage devices, cameras, TVs and personal computers (PCs) which are not configured to require a user logon. 
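
To illustrate just how little effort that discovery takes, here is a minimal sketch – assuming Python and a typical home subnet of 192.168.1.0/24, both of which are assumptions for illustration – that probes each address for a few services which commonly ship with no authentication. It should only be run against a network you own:

    # Minimal sketch: probe a home subnet for services that often run without
    # authentication (Telnet, HTTP admin pages, RTSP cameras, raw printing).
    # The subnet and port list are illustrative assumptions; adjust as needed.
    import socket

    SUBNET = "192.168.1."           # typical home /24
    PORTS = [23, 80, 554, 9100]     # telnet, http, rtsp (cameras), raw printing

    for host in range(1, 255):
        address = f"{SUBNET}{host}"
        for port in PORTS:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(0.2)   # fail fast on silent hosts
                if s.connect_ex((address, port)) == 0:
                    print(f"{address}:{port} accepts connections")

Even done naively like this, one address and one port at a time, the whole sweep finishes within a few minutes on a typical home network – which is precisely the point: it is just as quick and easy for an attacker.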

Plugging those security holes doesn’t require specialist knowledge and shouldn’t take more than an hour for most home networks. Recognizing the security concerns, an increasing number of hardware and software vendors are preconfiguring their products in “full lockdown” mode, which provides basic security by default and requires users to apply specialist knowledge in order to open up their networks as necessary for greater convenience.  

This is precisely what Microsoft did over a decade ago, with great success, in response to widely publicized security vulnerabilities in its Windows® operating system and Internet Explorer browser. 

It’s all too easy to imagine the endgames of hypothetical scenarios in which the bad apples win by wresting control over the IoT from the good guys. But just like the millennium bug nearly two decades ago, it is wiser to heed the wisdom of Max Ehrmann’s Desiderata, published back in 1927:  

“Exercise caution in your business affairs, for the world is full of trickery … but do not distress yourself with dark imaginings.”  

Going forward, I’m confident that a healthy dose of risk intelligence, and not fear, will prove to be the key for successfully managing the downside aspects of IoT.

_________________________

So those are Nelson’s views on the Internet of Things. What about you?  Are you in agreement, or are there aspects about which you may think differently?  Please share your thoughts with other readers.

B-to-B content marketers: Not exactly a confident bunch.

In the world of business-to-business marketing, all that really matters is producing a constant flow of quality sales leads.  According to Clickback CEO Kyle Tkachuk, three-fourths of B-to-B marketers cite their most significant objective as lead generation.  Pretty much everything else pales in significance.

This is why content marketing is such an important aspect of commercial marketing campaigns.  Customers in the commercial world are always on the lookout for information and insights to help them solve the variety of challenges they face on the manufacturing line, in their product development, quality assurance, customer service and any number of other critical functions.

Suppliers and brands that offer a steady diet of valuable and actionable information are often the ones that end up on a customer’s “short-list” of suppliers when the need to make a purchase finally rolls around.

Thus, the role of content marketers continues to grow – along with the pressures on them to deliver high-quality, targeted leads to their sales forces.

The problem is … a large number of content marketers aren’t all that confident about the effectiveness of their campaigns.

It’s a key takeaway finding from a survey conducted for content marketing software provider SnapApp by research firm Demand Gen.  The survey was conducted during the summer and fall of 2016 and published recently in SnapApp’s Campaign Confidence Gap report.

The survey revealed that more than 80% of the content marketers queried reported being just “somewhat” or “not very” confident regarding the effectiveness of their campaigns.

Among the concerns voiced by these content marketers is that the B-to-B audience is becoming less enamored of white papers and other static, lead-gated PDF documents to generate leads.

And yet, those are precisely the vehicles used most often to deliver informational content.

According to the survey respondents, B-to-B customers not only expect to be given content that is relevant, they’re also less tolerant of resources that fail to speak to their specific areas of interest.

For this reason, one-third of the content managers surveyed reported that they are struggling to come up with effective calls-to-action that capture attention and drive action instead of being just “noise.”

The inevitable conclusion is that traditional B-to-B marketing strategies and similar “seller-centric” tactics have become stale for buyers.

Some content marketers are attempting to move beyond these conventional approaches and embrace more “content-enabled” campaigns that can address interest points based on a customer’s specific need and facilitate engagement accordingly.

Where such tactics have been attempted, content marketers report somewhat improved results, including higher open rates and an increase in clickthrough rates.

However, the degree of improvement doesn’t appear to be all that impressive. Only about half of the survey respondents reported experiencing improved open rates.  Also, two-thirds reported experiencing an increase in clickthrough rates – but only by 5% or less.

Those aren’t exactly eye-popping improvements.

But here’s the thing: Engagement levels with traditional “static” content marketing vehicles are likely to actually decline … so if content-enabled campaigns can arrest the drop-off and even notch improvements in audience engagement, that’s at least something.

Among the tactics content marketers are considering for creating more robust content-enabled campaigns are:

  • Video
  • Surveys
  • Interactive infographics
  • ROI calculators
  • Assessments/audits

The hope is that these and other tools will increase customer engagement, allow customers to “self-qualify,” and generate better-quality leads that are a few steps closer to an actual sale.

If all goes well, these content-enabled campaigns will also collect data that helps sales personnel accelerate the entire process.

Where the Millionaires Are

Look to the states won by Hillary Clinton in the 2016 presidential election.

Proportion of “millionaire households” by state (darker shading indicates a higher proportion of millionaires).

Over the past few years, we’ve heard a good deal about income inequality in the United States. One persistent narrative is that the wealthiest and highest-income households continue to do well – and indeed are improving their relative standing – while many other families struggle financially.

The most recent statistical reporting seems to bear this out.

According to the annual Wealth & Affluent Monitor released by research and insights firm Phoenix Marketing International, the total tally of U.S. millionaire households is up by more than 800,000 in recent years.

And if we go back to 2006, before the financial crisis and subsequent Great Recession, the number of millionaire households has increased by ~1.3 million since that time.

[For purposes of the Phoenix report, “millionaire households” are defined as those that have $1 million or more in investable assets. Collectively, these households possess approximately $20 trillion in liquid wealth – nearly 60% of the entire liquid wealth in America.]

Even with a growing tally, so-called “millionaire households” still represent around 5% of all U.S. households, or approximately 6.8 million in total. That percentage is nearly flat (up only slightly to 5.1% from 4.8% in 2006).

Tellingly, there is a direct correlation between the states with the largest proportion of millionaire households and how those states voted in the most recent presidential election. Every one of the top millionaire states is located on the east or west coasts – and all but one of them was won by Hillary Clinton:

  • #1  Maryland
  • #2  Connecticut
  • #3  New Jersey
  • #4  Hawaii
  • #5  Alaska
  • #6  Massachusetts
  • #7  New Hampshire
  • #8  Virginia
  • #9  DC
  • #10  Delaware

Looking at the geographic makeup of the states with the highest share of millionaires helps explain how “elitist” political arguments had a degree of resonance in the 2016 campaign that may have surprised some observers.

Nearly half of the jurisdictions Hillary Clinton won are part of the “Top 10” millionaire grouping, whereas just one of Donald Trump’s states can be found there.

But it’s when we look at the tiers below the “millionaire households” category that things come into even greater focus. The Phoenix report shows that “near-affluent” households in the United States – the approximately 14 million households having investable assets ranging from $100,000 to $250,000 – actually saw their total investable assets decline in the past year.

“Affluent” households, which occupy the space in between the “near-affluents” and the “millionaires,” have been essentially treading water. So it’s quite clear that things are not only stratified; they aren’t improving, either.

The reality is that the concentration of wealth continues to deepen, as the Top 1% wealthiest U.S. households possess nearly one quarter of the total liquid wealth.

In stark contrast, the ~70% of non-affluent households own less than 10% of the country’s liquid wealth.

Simply put, the past decade hasn’t been kind to the majority of Americans’ family finances. In my view, that dynamic alone explains more of 2016’s political repercussions than any other single factor.  It’s hardly monolithic, but often “elitism” and “status quo” go hand-in-hand. In 2016 they were lashed together; one candidate was perceived as both “elitist” and “status quo,” and the result was almost preordained.

The most recent Wealth & Affluent Monitor from Phoenix Marketing International can be downloaded here.

Cross-Currents in the Minimum Wage Debate

This past November, measures to increase the minimum wage were on the ballot in four states – Arizona, Colorado, Maine and Washington. They were approved by voters in every instance.

But are views about the minimum wage actually that universally positive?

A survey of ~1,500 U.S. consumers conducted by Cincinnati-based customer loyalty research firm Colloquy around the same time as the election reveals some contradictory data.

Currently, the federal minimum wage rate is set at $7.25 per hour. The Colloquy research asked respondents for their views in a world where the minimum wage would be $15 per hour – a figure at the upper limit of where a number of cities and counties are now pegging their local minimum wage rates.

The survey asked consumers if they’d expect to receive better customer service and have a better overall customer experience if the minimum wage were raised to $15 per hour.

Nearly 60% of the respondents felt that they’d be justified in expecting to receive better service and a better overall experience if the minimum wage were raised to that level.  On the other hand, nearly 70% believed that they wouldn’t actually receive better service.

The results show pretty clearly that consumers don’t see a direct connection between workers receiving a substantially increased minimum wage and improvements in the quality of service those workers would provide to their consumers.

Men are even more skeptical than women: more than 70% of men said they wouldn’t expect to receive better service, versus around 65% of women.

Younger consumers in the 25-34 age group, who could well be among the workers more likely to benefit from an increased minimum wage, are just as likely to expect little or no improvement in service quality. Nearly 70% responded as such to the Colloquy survey.

One concern some respondents had was the possibility that a dramatic rise in the minimum wage to $15 per hour could lead retailers to add more automation, resulting in an even less satisfying overall experience. (For men, it was ~44% who feel that way, while for women it was ~33%.)

Along those lines, some stores are finding that labor-saving alternatives such as self-service checkout lanes have negative ramifications to such a degree that any labor savings are more than offset by instances of merchandise “leaving the store” without having been paid for.

Significant numbers of consumers aren’t particularly thrilled with the “forced march” to self-serve checkout lines at some retail outlets, either.

Perhaps the most surprising finding of all in the Colloquy research was that only a minority of the survey respondents were actually in support of raising the minimum wage to $15 per hour. In stark contrast to the state ballot measures which were supported by clear majorities of voters, the survey found that just ~38% of the respondents were in favor.

The discrepancy is likely due to several factors. Most significantly, the November ballot measures were not stipulating such a dramatic monetary increase, but rather minimum wage rates that would increase to only $12 or $13 – and only by the year 2020 rather than immediately.

That, coupled with concerns about automation and little expectation of improved service quality, means that this issue isn’t quite as “black-and-white” as some might presume.