Are we now a nation of “data pragmatists”?

Do people even care about data privacy anymore?

You’d think that with the continuing cascade of news about the exposure of personal information, people would be more skittish than ever about sharing their data.

But this isn’t the case … and we have a 2018 study from marketing data foundation firm Acxiom to prove it. The report, titled Data Privacy: What the Consumer Really Thinks, is the result of survey information gathered in late 2017 by Acxiom in conjunction with the Data & Marketing Association (formerly the Direct Marketing Association).

The research, which presents results from an online survey of nearly 2,100 Americans age 18 and older, found that nearly 45% of the respondents feel more comfortable with data exchange today than they have in the past.

Among millennial respondents, well over half feel more comfortable about data exchange today.

Indeed, the report concludes that most Americans are “data pragmatists”: people who are open to exchanging personal data with businesses when the benefits they receive in return are clearly stated.

Nearly 60% of Americans fall into this category.

On top of that, another 20% of the survey respondents report that they’re completely unconcerned about the collection and usage of their personal data. Among younger consumers, it’s nearly one-third.

When comparing Americans’ attitudes to consumers in other countries, we seem to be a particularly carefree bunch. Our counterparts in France and Spain are much more wary of sharing their personal information.

Part of the difference in views may be related to feelings that Americans have about who is responsible for data security. In the United States, the largest portion of people (~43%) believe that responsibility for data security lies entirely with consumers themselves, versus only ~6% who believe it resides solely with brands or the government.  (The balance think the responsibility is shared among all parties.)

To me, the bottom-line finding from the Acxiom/DMA study is that people have become so conditioned to the benefits of data exchange that they’re pretty inured to the potential downsides.  Indeed, many probably can’t even fathom going back to the days of true data privacy.

Of course, no one wishes for their personal data to be used for nefarious purposes, but who is willing to forgo the benefits (whether money, convenience or comfort) that come from companies and brands knowing their personal information and preferences?

And how forgiving would these people be if their personal data were actually compromised? From Target to Macy’s, quite a few Americans have already had a taste of this, but what is it going to take for such “data pragmatism” to seem not so practical after all?

I’m thinking, a lot.

For more findings from the Acxiom research, click or tap here.

Clueless at the Capitol

Lawmakers’ cringeworthy questioning of Facebook’s Mark Zuckerberg calls into question the government’s ability to regulate social media.

Facebook CEO Mark Zuckerberg on Capitol Hill, April 2018. (Saul Loeb/AFP/Getty Images)

With the testimony on Capitol Hill last week by Facebook CEO Mark Zuckerberg, there’s heightened concern about the negative side effects of social media platforms. But listening to lawmakers question Zuckerberg made it painfully obvious that our federal legislators have next to no understanding of the role of advertising in social media – or even of how social media works in its most basic form.

Younger staff members may have written the questions for their legislative bosses, but it was clear that the lawmakers were ill-equipped to handle Zuckerberg’s alternately pat, platitudinous and evasive responses, and to come back with meaningful follow-up questions.

Even the younger senators and congresspeople didn’t acquit themselves well.

It made me think of something else, too. The questioners – and nearly everyone else, it seems – are missing a fundamental point about social media:  Facebook and other social media platforms aren’t much different from old-fashioned print media, commercial radio and TV/cable, in that they all generate the vast bulk of their earnings from advertising.

It’s true that in addition to advertising revenues, print publications usually charge subscribers for paper copies of their publications. In the past, this was partly because they could – but also to help defray the cost of paper, ink, printing and physical distribution of their product to news outlets or directly to homes.

Commercial radio and TV never had those costs, but neither did they have a practical way of charging their audiences for broadcasts – at least not until cable and satellite came along – and so they made their product available to their audiences at no charge.

The big difference between social media platforms and traditional media is that social platforms can do something that the marketers of old could only dream about: target their advertising based on personally identifiable demographics.

Think about it:  Not so many years ago, the only demographics available to marketers came from census publications, which by law cannot reveal any personally identifiable information.  Moreover, the U.S. census is taken only every ten years, so the data ages pretty quickly.

Beyond census information, advertisers using print media could rely on audit reports from the Audit Bureau of Circulations (ABC) and BPA.  For a business-to-business publication, some demographic data was available from subscriber-provided information (freely given in exchange for a free subscription).  But for consumer publications, the audit information gave an advertiser little beyond the number of copies printed and sold, and (sometimes) a geographic breakdown of where mail subscribers lived.

Advertisers using radio or TV media had to rely on researchers like Nielsen — but that research surveyed only a small sample of the audience.

What this meant was that the only way advertisers could “move the needle” in a market was to spend scads of cash on broadcasting their messages to the largest possible audience. As a connecting mechanism, this is hugely inefficient.

The value proposition that Zuckerberg’s Facebook and other social media platforms provide is the ability to connect advertisers with more people for less spend, thanks to those platforms’ ability to use personally identifiable demographics to target the advertisements.

Want to find people who enjoy doing DIY projects, but who live in just the areas where your company has local distribution of its products? Through Facebook, you can narrow-cast your reach by targeting consumers involved with particular activities and interests in addition to geography, age, gender, or whatever other characteristics you might wish to use as filters.
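
To make that narrow-casting concrete, here’s a toy sketch in Python of the kind of filter at work. Every field name and data point below is invented for illustration – this is not any real ad platform’s API, just the shape of the idea:

```python
# Toy illustration of interest-plus-geography ad targeting.
# All field names and sample data are invented -- not any platform's real API.
AUDIENCE_SPEC = {
    "interests": {"home improvement", "woodworking"},  # DIY-related signals
    "regions": {"MN", "WI", "IA"},                     # where we have distribution
    "age_range": (25, 64),
}

def in_audience(user, spec):
    """Return True if a user profile falls inside the targeting spec."""
    lo, hi = spec["age_range"]
    return (
        user["region"] in spec["regions"]
        and lo <= user["age"] <= hi
        and bool(user["interests"] & spec["interests"])  # any shared interest
    )

users = [
    {"age": 41, "region": "MN", "interests": {"woodworking", "fishing"}},
    {"age": 33, "region": "FL", "interests": {"home improvement"}},
]
audience = [u for u in users if in_audience(u, AUDIENCE_SPEC)]
print(audience)  # only the Minnesota DIYer makes the cut
```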

That’s massively more efficient and effective than relying on something like median household income within a zip code or census tract. It also means that your message will be more targeted — and hence more relevant — to the people who see it.

All of this is immensely more efficient for advertisers, which is why social media advertising (in addition to search advertising on Google) has taken off while other forms of advertising have plateaued or declined.

But there’s a downside: Social media is being manipulated (“abused” might be the better term) by “black hats” – people who couldn’t do such things in the past using census information or Nielsen ratings or magazine audit statements.

Here’s another reality: Facebook and other social media platforms have been so fixated on their value proposition that they failed to conceive of — or plan for — the darker side of humanity: people who feel no guilt about exploiting these platforms’ vulnerabilities for financial or political gain.

Facebook COO Sheryl Sandberg interviewed on the Today Show, April 2018.

Now that that reality has begun to sink in, it’ll be interesting to see how Mark Zuckerberg and Sheryl Sandberg of Facebook — not to mention other social media business leaders — respond to the threat.

They’ll need to do something — and it’ll have to be something more compelling than CEO Zuckerberg’s constant refrain at the Capitol Hill hearings (“I’ll have my team look into that.”) or COO Sandberg’s litany (“I’m glad you asked that question, because it’s an important one.”) in her parade of TV/cable interviews.  These companies’ share prices will continue to be pummeled until investors understand what changes are going to be made that will actually achieve concrete results.

Future shock? How badly is cyber-hacking nibbling away at our infrastructure?

I don’t know about you, but I’ve never forgotten the late afternoon of August 14, 2003, when problems with the North American power grid left people in eight states stretching from New England to Detroit suddenly without power.

Fortunately, my company’s Maryland offices were situated about 100 miles beyond the southernmost extent of the blackout. But it was quite alarming to watch the outage spread across a map of the Northeastern and Great Lakes states (plus Ontario) in real time, like some creeping blob from a science fiction film.

According to Wikipedia’s article on the topic, the impact of the blackout was substantial — and far-reaching:

“Essential services remained in operation in some … areas. In others, backup generation systems failed. Telephone networks generally remained operational, but the increased demand triggered by the blackout left many circuits overloaded. Water systems in several cities lost pressure, forcing boil-water advisories to be put into effect. Cellular service was interrupted as mobile networks were overloaded with the increase in volume of calls; major cellular providers continued to operate on standby generator power. Television and radio stations remained on the air with the help of backup generators — although some stations were knocked off the air for periods ranging from several hours to the length of the entire blackout.”

Another (happier) thing I remember from this 15-year-old incident is that rather than causing confusion or bedlam, the massive power outage brought out the best in people. This anecdote from the blackout was typical:  Manhattanites opening their homes to workers who couldn’t get to their own residences for the evening.

For most of the 50 million-plus Americans and Canadians affected by the blackout, power was restored after about six hours.  But for some, restoration took as long as two days.

Upon investigation, it was discovered that high temperatures and humidity across the region had increased energy demand as people turned on air conditioners and fans, causing power lines to sag as higher currents heated them.  The precipitating cause of the blackout was a software glitch in the alarm system of a FirstEnergy Corporation control room, which left operators unaware of the need to redistribute the power load after overloaded transmission lines drooped into foliage.

In other words, what should have been, at worst, a manageable localized blackout cascaded rapidly into a collapse of the entire electric grid across multiple states and regions.

But at least the incident was born of human error, not nefarious motives.

That 2003 experience should make anyone hearing last week’s testimony on Capitol Hill about the risks faced by the U.S. power grid think long and hard about what could happen in the not-so-distant future.

The bottom line of the testimony presented in the hearings is that malicious cyberattacks are becoming more sophisticated – and hence more capable of causing damage to American infrastructure. The Federal Energy Regulatory Commission (FERC) is cautioning that hackers increasingly threaten U.S. utilities ranging from power plants to water processing systems.

Similar warnings come from the Department of Homeland Security, which reports that hackers have been attacking the U.S. electric grid, power plants, transportation facilities and even targets in commercial sectors.

The Energy Department goes even further, reporting in 2017 that the United States electrical power grid is in “imminent danger” from a cyber-attack. To underscore this threat, the Department contends that more than 100,000 cyber-attacks are being mounted every day.

With so many attacks of this kind happening on so many fronts, one can’t help but think that it’s only a matter of time before we face a “catastrophic event” that’s even more consequential than the one that affected the power grid in 2003.

Even more chilling, if such an event were born of intentional sabotage – as recent testimony suggests is quite likely – it’s pretty doubtful that remedial action could be taken as quickly or as effectively as in response to an accidental incident like the one in 2003.

Put yourself in the saboteurs’ shoes: If your aim is to bring U.S. infrastructure to its knees, why plan for a one-off event?  You’d definitely want to build in ways to cause cascading problems – not to mention planting additional “land-mines” to frustrate attempts to bring systems back online.

Contemplating all the implications is more than sobering — it’s actually quite frightening. What are your thoughts on the matter?  Please share them with other readers.

Legislators tilt at the digital privacy windmill (again).

In the effort to preserve individual privacy in the digital age, hope springs eternal.

The latest endeavor to protect individuals’ privacy in the digital era is legislation introduced this week in the U.S. Senate that would require law enforcement and government authorities to obtain a warrant before accessing the digital communications of U.S. citizens.

Known as the ECPA Modernization Act of 2017, it is bipartisan legislation introduced by two senators known for being polar opposites on the political spectrum: Sen. Patrick Leahy (D-VT) on the left and Sen. Mike Lee (R-UT) on the right.

At present, only a subpoena is required for the government to gain full access to Americans’ e-mails that are over 180 days old. The new ECPA legislation would mean that access couldn’t be granted without showing probable cause, along with obtaining a judge’s signature.

The ECPA Modernization Act would also require a warrant for accessing geo-location data, while setting new limits on metadata collection. If the government did access cloud content without a warrant, the new legislation would make that data inadmissible in a court of law.

There’s no question that the original ECPA (Electronic Communications Privacy Act) legislation, enacted in 1986, is woefully out of date. After all, it stems from a time before the modern Internet.

It’s almost quaint to realize that the old ECPA legislation defines any e-mail older than 180 days as “abandoned” — and thereby accessible to government officials.  After all, we now live in an age when many residents keep the same e-mail address far longer than their home address.

The fact is, many individuals have come to rely on technology companies to store their e-mails, social media posts, blog posts, text messages, photos and other documents — and to do it for an indefinite period of time. It’s perceived as “safer” than keeping the information on a personal computer that might someday malfunction for any number of reasons.

Several important privacy advocacy groups are hailing the proposed legislation and urging its passage – among them the Center for Democracy & Technology and the Electronic Frontier Foundation.

Sophia Cope, an attorney at EFF, notes that the type of information individuals have entrusted to technology companies isn’t very secure at all. “Many users do not realize that an e-mail stored on a Google or Microsoft service has less protection than a letter sitting in a desk drawer at home,” Cope maintains.

“Users often can’t control how and when their whereabouts are being tracked by technology,” she adds.

The Senate legislation is also supported by the likes of Google, Amazon, Facebook and Twitter.

All of which makes it surprising that this type of legislation – different versions of which have been introduced in the U.S. Senate every year since 2013 – has had such trouble gaining traction.

The reasons for those prior-year failures are many and varied – and quite revealing of how crafting legislation is akin to sausage-making.  Which is to say, not very pretty.  But this year, the odds look more favorable than ever before.

Two questions remain on the table: First, will the legislation pass?  And second, will it really make a difference in terms of protecting the privacy of Americans?

Any readers with particular opinions are encouraged to weigh in.

When it comes to city parklands, the Twin Cities of Minneapolis-St. Paul rule.

Minneapolis, Minnesota

Although I lived in five states prior to going away to college, I spent the most time in those formative years of my life residing in the Twin Cities of Minneapolis and St. Paul in Minnesota.

The city parks in both towns are major amenities. Indeed, you could say that the entire fabric of life in the two cities is interwoven with the park systems; they’re that special.  And my family was no exception in taking advantage of everything the parks had to offer.

So it wasn’t much of a surprise to find that both cities are at the very top of the list of U.S. cities with the best parks.

The evaluation is done annually by The Trust for Public Land, and covers the 100 most populous cities in the United States.

The major metric studied is the percentage of city residents who live within a 10-minute walk of a park — although other characteristics are also analyzed, such as park size and investment, and the number of playgrounds, dog parks and recreation centers relative to city population.
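
For the technically curious, here’s a back-of-the-envelope Python sketch of that headline metric. The Trust reportedly measures walks along the actual street network; this stand-in uses straight-line distance and the common rule of thumb of roughly 800 meters covered in a 10-minute walk, so treat it as a crude approximation (the coordinates are invented):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))  # Earth radius ~6,371 km

TEN_MIN_WALK_M = 800  # rough rule of thumb: ~800 m walked in 10 minutes

def pct_within_walk(homes, parks):
    """Percent of homes within a (straight-line) 10-minute walk of any park."""
    near = sum(any(haversine_m(*h, *p) <= TEN_MIN_WALK_M for p in parks) for h in homes)
    return 100 * near / len(homes)

# Invented coordinates, purely for demonstration.
homes = [(44.9778, -93.2650), (44.9537, -93.0900)]
parks = [(44.9740, -93.2680)]
print(f"{pct_within_walk(homes, parks):.0f}% of homes are within a 10-minute walk")
```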

In the 2017 evaluation, Minneapolis topped the list of 100 cities, helped by superior park access across all ages and income levels, along with top scores for park investment, senior and recreation centers, and dog parks.

In total, an incredible 15% of Minneapolis’ entire square mileage is dedicated to park space.

St. Paul was right behind Minneapolis in the #2 slot out of 100 cities evaluated. As visitors to the Twin Cities know, Minneapolis is blessed with seven natural lakes within its borders, whereas next-door St. Paul has just two.  Nevertheless, St. Paul’s commitment to parkland is nearly as strong.

Here’s how the Top 10 list of cities shakes out:

  • #1 Minneapolis, MN
  • #2 St. Paul, MN
  • #3 San Francisco, CA
  • #4 Washington, DC
  • #5 Portland, OR
  • #6 Arlington, VA
  • #7 (tie) Irvine, CA and New York, NY
  • #9 Madison, WI
  • #10 Cincinnati, OH

Several of these cities shine in certain attributes. San Francisco, for instance, scores highest for park access, with nearly every resident living within a 10-minute walk of a park.

Three cities (Arlington, Irvine and Madison) achieved a Top 10 ranking for only the second time (all three first made it into the Top 10 in 2016).

What about cities that appear at the bottom of the Trust for Public Land list? They tend to be “car-dominated” cities, where parks aren’t easily accessible by foot for many residents.  For the record, here are the cities that rank lowest in the rankings:

  • #90 (tie) Fresno, CA, Hialeah, FL and Jacksonville, FL
  • #93 (tie) Laredo, TX and Winston-Salem, NC
  • #95 Mesa, AZ
  • #96 Louisville, KY
  • #97 Charlotte, NC
  • #98 (tie) Fort Wayne, IN and Indianapolis, IN

Bottom dweller: A crumbling structure in a padlocked park in Indianapolis, Indiana.

Interestingly, one of these bottom-ranked cities – Charlotte – leads all 100 in median park size (~16 acres). Of course, that likely means residents’ access to parks suffers, because fewer small parks are scattered around the city.

To see the full rankings as well as each city’s score by category evaluated, you can view a comparative chart here.

Based on your experience, do any of the city rankings surprise you? Is there a particular city you think should be singled out for praise (or a pan) of its parklands?

For job seekers in America, the compass points south and west.

Downtown Miami

Many factors go into determining what may be the best cities for job seekers to find employment. There are any number of measures – not least qualitative ones such as where friends and family members reside, and what kind of family safety net exists.

But there are other measures, too – factors that are a little easier to apply across all workers:

  • How favorable is the local labor market to job seekers?
  • What are salary levels after adjusting for cost-of-living factors?
  • What is the “work-life” balance that the community offers?
  • What are the prospects for job security and advancement opportunities?

Seeking clues as to which metro areas represent the best environments for job seekers, the job-posting website Indeed analyzed data gathered from respondents living in the 50 largest metro areas covered by its review database.

Indeed’s research methodology is explained here. Its analysis began by posing the four questions above and applying a percentile score for each one based on the feedback it received, followed by additional analytical calculations to come up with a consolidated score for each of the 50 metro areas.
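
Indeed doesn’t spell out its exact formula, but the mechanics it describes – a percentile score per criterion, combined into a single number – look roughly like the Python sketch below. The metro names and figures are invented, and the simple-average consolidation step is my assumption, not Indeed’s documented method:

```python
def percentile_scores(values):
    """Map each raw value to a 0-100 percentile rank (higher raw = higher score)."""
    order = sorted(values)
    n = len(values)
    return [100 * order.index(v) / (n - 1) for v in values]

# Invented figures for the four criteria across three metros.
metros = ["Metro A", "Metro B", "Metro C"]
criteria = {
    "labor_market_favorability": [7.1, 5.4, 6.2],
    "cost_adjusted_salary":      [55_000, 61_000, 48_000],
    "work_life_balance":         [3.9, 3.5, 4.2],
    "security_and_advancement":  [3.6, 4.1, 3.3],
}

# Assumed consolidation step: a simple average of the four percentile scores.
per_criterion = [percentile_scores(v) for v in criteria.values()]
consolidated = {m: sum(s) / len(s) for m, s in zip(metros, zip(*per_criterion))}
print(sorted(consolidated.items(), key=lambda kv: -kv[1]))  # best metro first
```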

The resulting list shows a definite skew towards the south and west. In order of rank, here are the ten metro areas that scored as the most attractive places for job seekers:

  • #1 Miami, FL
  • #2 Orlando, FL
  • #3 Raleigh, NC
  • #4 Austin, TX
  • #5 Sacramento, CA
  • #6 San Jose, CA
  • #7 Jacksonville, FL
  • #8 San Diego, CA
  • #9 Houston, TX
  • #10 Memphis, TN

Not all metro areas ranked equally strongly across the four measurement categories. Overall leader Miami scored very highly for work-life balance as well as job security and advancement, but its cost-of-living factors were decidedly less impressive.

“Where are the cities of the Northeast and the Midwest?” you might ask. Not only are they nowhere to be found in the Top 10, they aren’t in the second group of ten in Indeed’s ranking, either:

  • #11 Las Vegas, NV
  • #12 San Francisco, CA
  • #13 Riverside, CA
  • #14 Atlanta, GA
  • #15 Los Angeles, CA
  • #16 San Antonio, TX
  • #17 Seattle, WA
  • #18 Hartford, CT
  • #19 Charlotte, NC
  • #20 Tampa, FL

… except for one: Hartford (#18 on Indeed’s list).

Likely, the scarcity of Northeastern and Midwestern cities correlates with the loss of manufacturing jobs, which have typically been so important to those metro areas.  Many of these markets have struggled to become more diversified.

If there’s a common thread among the top-scoring cities besides geography, it’s that they’re high-tech bastions, highly diversified economies or – very significantly – seats of state government.

In fact, three of the Top 10 metro areas are state capitals, and the next group of ten contains two more.  Not surprisingly, those cities ranked higher than others for job security – and their salaries also compared quite favorably with the local cost of living.

So much for the adage that a government paycheck is low but the job security is high; it turns out both are high.

For more details on the Indeed listing, how the ranking was derived, and individual scores by metro area for the four criteria shown above, click here.

The U.S. Postal Service unveils its Informed Delivery notification service – about two decades too late.

Earlier this year, the U.S. Postal Service decided to get into the business of e-mail. But the effort is seemingly a day late and a dollar short.

Here’s how the scheme works: The USPS notifies customers by e-mail, complete with scanned images, of the postal mail that will be delivered to them that day.

It’s called Informed Delivery, and it’s being offered as a free service.

Exactly what is this intended to accomplish?

It isn’t as if receiving an e-mail notification of postal mail that’s going to be delivered within hours is particularly valuable.  If the information were that time-sensitive, why not receive the actual original item via e-mail to begin with?  That would have saved the sender 49 cents on the front end as well.

So the notion that this service would somehow stem the tide of mass migration to e-mail communications seems pretty far-fetched.

And here’s another thing: The USPS is offering the service free of charge – so it won’t even generate revenue to recoup the cost of running the program.

That doesn’t seem to make very good business sense for an organization that’s already flooded with red ink.

Actually, I can think of one constituency that might benefit from Informed Delivery – rural residents who aren’t on regular delivery routes and who must travel a distance to pick up their mail at a post office. For those customers, I can see how they might choose to forgo a trip to town if the day’s mail isn’t anything to write home about — if you’ll pardon the expression.

But what portion of the population fits that description? I’m not sure, but it’s likely far less than 5%.

And because the USPS is a quasi-governmental entity, it’s compelled to offer the same services to everyone.  So even the notion of offering Informed Delivery as a “niche product” to just certain customers is a non-starter.

I suppose the USPS deserves credit just for trying to come up with new ways to stay relevant in a changing communications world. But it’s very difficult to come up with anything worthwhile when the entire foundation of the USPS’s mission has been so eroded over the past generation.