The States Where Your Dollar Goes a Good Deal Further

People have long suspected that many of America’s “richest” areas, based on salaries and other income, also happen to be where the cost of living is significantly higher.

Silicon Valley, along with the New York City, Boston and Washington, DC metro areas, are some of the obvious examples, notorious for their out-of-sight housing and real estate prices.

But other factors are at work in high-cost areas as well, such as the expense of delivering goods to places well-removed from the nation’s major trunk transportation arteries (think Alaska, Hawaii, Washington State and Minnesota).

And then there are state and local taxes. There appears to be a direct relationship between higher costs of living and higher taxation, too.

It’s one thing to go on hunches. But helpfully, all of these perceptions have been confirmed by the Bureau of Economic Analysis, using Personal Consumption Expenditure and American Community Survey data.  Rolling the data up, the BEA has published comparative cost-of-living figures for all 50 states plus DC.

The approach was simple: consolidate the data to come up with a dollar figure in each state that represents how much $100 can purchase locally compared to the national average.  To get there, average price levels in each state have been calculated for household consumption, including rental housing costs.
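
To see the arithmetic in action, here’s a minimal sketch of that calculation, assuming the published figure is simply the national-average price level divided by the state’s own price level. (The index values used below are illustrative ones chosen to reproduce two of the published figures, not official BEA numbers.)

```python
# Minimal sketch: how far $100 stretches in a state whose overall price
# level is expressed as an index where the national average = 100.
# The 86.4 and 118.2 below are illustrative values, not official BEA figures.

def value_of_100(state_price_index: float, national_index: float = 100.0) -> float:
    """Return how many dollars' worth of goods and services $100 buys locally."""
    return 100.0 * national_index / state_price_index

print(round(value_of_100(86.4), 2))   # -> 115.74, roughly Mississippi's figure
print(round(value_of_100(118.2), 2))  # -> 84.6, roughly DC's figure
```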

Based on 2014 data, the figures have been mapped and are shown below:

[Map: the relative value of $100 in each state, based on 2014 BEA data]

So, just how far does $100 go?

The answer to that question is this: quite a bit further if you live in the mid-Continent region of the country compared to the Pacific Coast or the Northeast U.S.

In fact, $100 will get you upwards of 10% to 15% more goods and services in quite a few states. Here are the Top 10 states, ranked by how much $100 will actually buy there:

  • Mississippi: $115.74 worth of goods and services
  • Arkansas: $114.16
  • Alabama: $113.51
  • Missouri: $113.51
  • South Dakota: $113.38
  • West Virginia: $112.87
  • Ohio: $112.11
  • Iowa: $111.73
  • Kansas: $111.23
  • Oklahoma: $111.23

At the other end of the scale, $100 buys roughly 20% to 30% fewer goods and services in the “Bottom 10” states compared to the “Top 10” (a quick check of that arithmetic follows the list). Here’s how it looks state-by-state:

  • DC: $84.60
  • Hawaii: $85.32
  • New York: $86.66
  • New Jersey: $87.64
  • California: $88.57
  • Maryland: $89.85
  • Connecticut: $91.41
  • Massachusetts: $93.28
  • Alaska: $93.37
  • New Hampshire: $94.16
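
And here’s that quick check, using nothing but the figures in the two lists above:

```python
# Quick arithmetic check of the "20% to 30% fewer" claim at the extremes:
# Mississippi ($115.74 of goods per $100) versus DC ($84.60 per $100).
ms, dc = 115.74, 84.60
shortfall = 1 - dc / ms       # ~0.27, i.e. roughly 27% fewer goods and services
print(f"{shortfall:.1%}")     # -> 26.9%
```

Comparing the least-extreme pair instead — Oklahoma versus New Hampshire — yields a gap of roughly 15%, so the size of the spread depends on which pairs you line up.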

Which states are closest to the $100 reference figure? Those would be Illinois at $99.40, and Oregon at $101.21.

I must say that those last two figures surprised me a bit … as I would have expected $100 to go less far in Illinois and Oregon.

Which of the state results surprise you? If any of them do, please share your observations with other readers.

For many people, what’s “breaking news” isn’t breaking on traditional news media outlets.

First it was Jon Stewart. Now it’s social media. 


If you suspect that Americans are increasingly getting their news from someplace other than the standard TV/cable, print and online news outlets, you’re right on the money.

In fact, research conducted by the Pew Center in association with the Knight Foundation during 2015 reveals that the share of people for whom Facebook and Twitter serve as a source of news is continuing to rise.

More specifically, nearly two-thirds of the 2,000+ Americans age 18 and older surveyed by Pew (~63%) reported that they get news from Facebook.

A similar percentage reported receiving news from Twitter as well.

That compares with ~52% reporting that they received news from Twitter back in 2013 … and ~47% from Facebook.

Although the same portion of people now get news from each of these two social networks, the Pew research discovered some nuanced differences in their strengths.

A far bigger portion of people follow “breaking news” on Twitter compared to Facebook (~59% versus ~31%), which underscores Twitter’s strength in providing immediate “as-it-happens” coverage and commentary on live events.

Seeing such behaviors, it shouldn’t come as any surprise that both social networks have been implementing initiatives that strengthen their positions as news sources even further:

  • Facebook has launched Instant Articles, a functionality that allows media companies to publish stories directly to the Facebook platform instead of linking to outside websites.
  • Facebook has also introduced a new Trending sidebar that allows users to filter news by major topic categories such as sports, entertainment, politics, technology and science.
  • Twitter has introduced live events to its roster, thanks to its purchase of the live video-streaming app Periscope.
  • A related Twitter initiative, dubbed Moments (aka: Project Lightning), allows anyone – even a person without a Twitter account – to view ongoing feeds of tweets, images and videos pertaining to live events.

According to Pew, news exposure on social media is roughly equal across demographic factors including gender, ethnicity and income. The one exception, of course, is age.

All of these developments underscore the fact that the “traditional” TV, print and online outlets are no longer dominant when it comes to news consumption. And it’s highly unlikely that the trend will ever be reversed, either.

For authenticity in advertising … perhaps it’s time to stop making it “advertising.”

[Chart: Nielsen data on the influence of recommendations and advertising formats across generations]

Take a look at the interesting data in the chart above, courtesy of Nielsen.

Among the things it tells us is this: If there’s one thing that’s universally consistent across all age ranges – from Gen Z and Millennials to the Silent Generation – it’s that nothing has a more positive impact on buying decisions than the recommendation of a family member, a friend or a colleague.

Not only is it true across all age ranges, it’s equally true in business and consumer segments.

The chart also shows us that, broadly speaking, younger people tend to be more receptive to various advertising formats than older age segments.

This isn’t too surprising, because with age comes experience – and that also means a higher degree of cynicism about advertising.

Techniques like the “testimonials” from so-called “real people” (who are nonetheless still actors) can’t get past the jaundiced eye of veteran consumers who’ve been around the track many more times than their younger counterparts.

Someone from the Boomer or Silent Generation can smell these things out for the fakery they are like nobody else.

But if friends and colleagues are what move the buy needle the best, how does advertising fit into that scenario? What’s the best way for it to be in the mix?

One way may be “influencer” advertising. This is when industry experts and other respected people are willing to go on record speaking positively about a particular product or service.

Of course, influencers have the best “influence” in the fields where they’re already active, as opposed to endorsements from famous people who don’t have a natural connection to the products they are touting. Such celebrity “testimonials” rarely pass the snicker test.

But think about other kinds of people, such as:

  • An industry thought leader
  • A prominent blogger or social networker in a particular field or on a particular topic
  • A person with a genuine passion for interacting with a particular product or service

… and you have someone who can advocate for your brand in a proactive way.

That’s the most genuine form of persuasion aside from hearing recommendations from those trusted relatives, friends and colleagues.

Of course, none of that will happen without the products and services inspiring passion and advocacy at the outset. If those fundamental factors aren’t part of the mix, we’re back to square one with ineffective faux-testimonials that feel about as genuine as AstroTurf® … and the (lack of) results to match.

Sea change: Today, Americans are receiving their political news in vastly different ways.

Where are Americans getting their political news in this very intensive Presidential election year?

There’s no question that the season is turning out to be a news bonanza, beginning with the string of debates featuring interesting and entertaining candidates and continuing on with near-nonstop political coverage on the cable networks.

And then there’s the endless chatter over on talk radio …

Recently, the Pew Research Center asked Americans where they’re receiving their political news. According to its just-issued report, Pew found that nine in ten American adults age 18+ typically consume some sort of news about the presidential election in any given week.

When asked to cite which sources of political news are “most helpful,” here’s how the respondents answered:

  • Cable TV news: ~24% cited as the “most helpful” source of information
  • Local TV news: ~14%
  • Social media: ~14%
  • Website or apps: ~13%
  • Radio: ~11%
  • Network nightly news broadcasts: ~10%
  • Late-night comedy TV: ~3%
  • Local newspapers: ~3%
  • National newspapers: ~2%

Looking at this pecking order, several things stand out:

  • Even a few years ago, I doubt social media would have outstripped network TV or radio as a more helpful source of political news.
  • And look at where cable TV news is positioned — not only at the top of the list, but substantially above any other source of political information.
  • As for newspapers … even accounting for the fact that some websites or apps cited as helpful political news sources may actually be digital outlets for newspapers, newspapers’ position at the bottom of the list underscores their rapid loss of importance (and influence) in the political sphere. Aside from inflating a candidate’s own ego, who really cares about newspaper endorsements anymore?

Not surprisingly, the Pew research finds noticeable differences in the preference of political news sources depending on the age of the respondents. For instance, among respondents age 65+, here are the top four “most helpful” sources:

  • Cable TV news: ~43%
  • Network nightly news broadcasts: ~17%
  • Local TV news: ~10%
  • Local newspapers: ~6%

Contrast this with the very youngest respondents (age 18 to 29), where the two most helpful sources of information are social media (~35%) and websites or apps (~18%).

I’m sure readers have their own personal views as to which of the sources of political news are preferable in terms of their veracity. For some, social media and late-night TV comedy programs illustrate a general decline in the “quality” of the news, whereas others might look at radio programs or cable TV news in precisely the same negative terms.

More details on the Pew Research study can be found here.

A nation of “haves” vs. “have-nots”? Gallup tests the perceptions.

In any presidential political season, there’s always plenty of rhetoric about the American economy, how well it’s performing for the average voter, and people’s perceptions of how they’re doing socioeconomically.

As it turns out, the Gallup Survey has been testing this issue annually for years now — going all the way back to 1988.

The question posed to Americans in Gallup’s surveys is a simple one: Do you consider yourself personally to be part of the “haves” or “have-nots” in America?

Gallup’s latest survey was fielded in July 2015.  Nearly 2,300 U.S. adults aged 18 and older were part of the research.

In response to the “haves vs. have-nots” question, ~58% of respondents considered themselves to be “haves” in U.S. society, while ~38% placed themselves in the “have-nots” segment. (The remaining ~4% see themselves as borderline between the two, or don’t have an opinion.)

Over time, Gallup has found that the percentage of Americans who perceive themselves to be part of the “have-nots” in society rose pretty steadily from 1988 to 1998, but since that time the percentages have leveled off — even during the worst years of the Great Recession from 2009-2011.

And so, the “haves” percentage has fluctuated in a tight band between 57% and 60% in each year since the late 1990s.

It seems that heightened discussions about social inequality in America haven’t resulted in a higher percentage of people thinking that they are on the less fortunate side of the country’s socioeconomic divide.

However, considering that the latest Gallup survey was conducted in July 2015 — and that since that time there have been more news events drawing attention to the issues of social justice — one wonders if we may be on the cusp of some changing thinking on the subject.

Another persistent finding in Gallup’s surveys is this:  Even among families of quite modest means (annual household incomes of $35,000 or lower), only a little more than half in that segment consider themselves to be part of the “have-nots” group.

Education-wise, the survey findings are similar, with fewer than half of the respondents who don’t possess college degrees considering themselves part of the “have-nots” segment.

In reporting on the Gallup survey results, an article published in the November 2015 issue of Quirk’s Marketing Research Review magazine stated:

“The stratification of U.S. society into unequal socioeconomic groups has long been a fixture of philosophic, political and cultural debate. It appears to have remained or even expanded as a fairly dominant leitmotif in the ongoing 2016 election, particularly among the Democratic presidential candidates. 

[Nevertheless,] the results … in this analysis show that a majority of U.S. adults do not think of American society as being divided along economic lines, and a slightly higher percentage say that if society is divided, they personally are on the ‘haves’ side of the equation rather than the ‘have-nots.’”

More information about the Gallup survey results can be viewed here.

What are your thoughts? Do the perceptions Americans have of socioeconomic inequality — or the lack of it — match the reality?  Or are we poised to see some new significant shifts in the way Americans view socioeconomic divisions in this country?

The disappearing American middle class? The Pew Research Center weighs in.

In this political season in the United States — when is it ever not, one wonders? — we hear many of the presidential candidates refer to the so-called “crisis” of the middle class.

No matter the political party or ideological stripe of the candidate, we hear copious references to “the disappearing middle class” … the “middle class squeeze” … and claims that the middle class is “just getting by.”

Considering that the middle class income group represents the single largest bloc of voters in the country, it isn’t at all surprising that the presidential contenders would talk about middle class issues — and to middle class voters — so frequently.

The question is … is the hand-wringing warranted?

Well, if one believes a new Pew Research study on the subject, it may well be the case.

Based on its most recent analysis of government data going back nearly 50 years, Pew reports that there are now fewer Americans in the “middle” of the economic spectrum than at the lower and upper ends.

This is a major development, and it is new.

Pew defines a middle class household as one with annual income ranging from ~$42,000 to ~$126,000 during 2014. Using that definition, Pew calculates that there are now 120.8 million adults living in middle class households, but 121.3 million who are living in either upper- or lower-income households.
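
For illustration only, here’s a minimal sketch of how those cut-offs translate into a tier label. The ~$42,000 and ~$126,000 thresholds are the approximate 2014 figures cited above; Pew’s actual methodology adjusts the cut-offs for household size, so treat this strictly as a sketch:

```python
# Illustrative sketch: bucket a household into Pew's three income tiers using
# the approximate 2014 cut-offs cited above (~$42,000 and ~$126,000).
# Pew's real analysis adjusts the thresholds for household size; this does not.

LOWER_CUTOFF = 42_000
UPPER_CUTOFF = 126_000

def income_tier(annual_household_income: float) -> str:
    if annual_household_income < LOWER_CUTOFF:
        return "lower income"
    elif annual_household_income <= UPPER_CUTOFF:
        return "middle income"
    return "upper income"

print(income_tier(55_000))    # -> middle income
print(income_tier(140_000))   # -> upper income
```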

Pew characterizes this new set of figures as a kind of tipping point. And it helps to underscore the narrative wherein certain presidential candidates — you-all know which ones — are tapping into a collective “angst” about the decline in middle-income families, and the notion that they are falling behind compared to upper-income adults while unable to access many of the support services available to lower-income households.

Looking at things in a bit more depth, however, one can find explanations — as well as other data points that go against the “narrative” to some degree. Consider the following:

  • Senior citizens have done quite well shifting into the upper category since the 1970s — their share increasing by well over 25% in the upper-income bracket.
  • African-Americans have experienced the largest increase in income status over the same period, meaning that their lower-income category share is lower today.
  • The rapid rise in the number of immigrants in the late 20th century has pushed down median incomes because those new arrivals, on average, make less in income.

I suspect the Pew study findings will be fodder for more discussion — and perhaps some additional sloganeering — in the upcoming weeks and months. But you can judge for yourself whether that’s warranted by reviewing more findings from Pew’s report here.

If you have your own perspectives about what’s happening with (or to) the middle class, I’m sure other readers would be quite interested in hearing them.  Please share your comments here.

What’s in a name? When it comes to senior living communities – plenty.

For those of us “of a certain age,” it seems hard to believe that within five years, most of the Baby Boomer generation will be of retirement age.

… This also means that millions of people will be thinking about downsizing, right-sizing, or whatever the applicable term may be.

All sorts of considerations come into play when making such a decision. Climate, social, cultural and recreational opportunities, and proximity to relatives are some of the most common.

But when the dust settles, most people will actually end up “aging in place.”

That’s one key finding from a recent survey of ~4,000 American Baby Boomer households conducted as part of the Demand Institute’s Housing & Community Survey.

Not only do nearly two-thirds of the respondents plan to stay in their current homes; the majority of them also feel that their homes are well-suited for aging – even if they’re multi-story, don’t offer accessibility features, or aren’t particularly low-maintenance structures.

But the survey suggests another interesting dynamic that may also be at work:  the notion that senior living communities are primarily places for people who have serious health issues or who can’t take care of themselves on their own.

Let’s face it: Baby Boomers don’t consider themselves part of that cohort at all, which they equate with people who are substantially older than themselves.

When you think about it, so many of the terms used to describe senior living facilities convey exactly the wrong thing to Baby Boomers.  The names may well be accurate descriptions of the properties in question, but they fairly scream “geriatrics.”


I’ve run across quite a few descriptors.  A good number of them reside in the same wheelhouse – which is to say, distinctly unattractive.  Meanwhile, other names are often too narrowly descriptive, because one important aspect of senior living is access to continuing care if and when that becomes necessary.

Either way, those charged with marketing these properties clearly prefer the word “community” over the word “center” or “home.”  But you can be the judge of how successful these names really are:

  • 55+ communities
  • Active adult communities
  • Age-restricted communities
  • Continuing care retirement communities
  • Elder cohousing communities
  • Independent living communities
  • Leisure communities
  • Mature living communities
  • Senior housing communities
  • Senior living communities

The bottom line on this is pretty fundamental:  Few people – regardless of how old they are – wish to be reminded of the limitations of life on a downward curve.  It’s just not compatible with the positive attributes that are so much a part of human nature.  Anything we can do to avoid being reminded of our mortality, we’ll do.

Obviously, that reluctance to face the reality of aging is of concern to property developers in the housing industry as well.  One of the actions coming out of field research such as the Demand Institute study is a new initiative to establish an alternative “umbrella descriptor” that works across the entire spectrum of senior living facilities.

It will be interesting to see where that exercise will end up.  As for me, I’m guessing it’ll still telegraph “geriatric.”  But perhaps we’ll end up being surprised.

“Immigration Nation”: Pew Research Projects U.S. Population Demographics into the Future

I’ve blogged before about the immigration issue and its potential impact on the U.S. economy and society.

Now the Pew Research Center has released a report that predicts the U.S. becoming a “no ethnic majority” nation within the next 35 years.

When one considers that the United States population was nearly 85% white Anglo in 1965 … and that percentage has dropped to about 62% now, it isn’t that hard to imagine Pew’s prediction coming true.

Here’s the trajectory Pew predicts over the coming ten-year periods:

  • 2015: ~62% estimated U.S. white Anglo population percentage
  • 2025: ~58% projected white Anglo population percentage
  • 2035: ~56% projected
  • 2045: ~51% projected
  • 2055: ~48% projected
  • 2065: ~46% projected

Perhaps what’s more intriguing is that Pew projects the largest future percentage gains will be among Asian-Americans rather than Latino or Black Americans. The Asian share of the American population is expected to double over the period:

  • 2015: ~6% estimated U.S. Asian population percentage
  • 2025: ~7% projected Asian population percentage
  • 2035: ~9% projected
  • 2045: ~10% projected
  • 2055: ~12% projected
  • 2065: ~14% projected

If these projections turn out to be accurate, the Asian population percentage is on track to become the nation’s third-largest group.

By contrast, the Hispanic population, while continuing to grow, looks as if it will level off at about 22% of the country’s population by 2045. For Black Americans, Pew projects the same dynamics at work, but at the 13% level.

According to Pew’s analysis, the biggest driving force for the projected Asian population growth is immigration. By 2055, Pew expects that Asians will supplant Latinos as the largest single source of immigrants — and by 2065 the difference is expected to be substantial (38% Asian vs. 31% Latino immigrants).

In parallel with its projection analysis, Pew conducted an online opinion survey of American adults (age 18 and over) in March and April of this year.

Among the attitudinal findings Pew uncovered were these:

  • “Immigrants in the U.S. are making society better”: ~45% of respondents agree … ~37% disagree
  • “I would like to see a reduction in immigration”: ~50% agree
  • “I would like to see the immigration system changed or completely revamped”: ~80% agree

Again, no great surprises in these figures — although if one paid attention only to news accounts in the “popular media,” one might find it surprising to learn that a plurality of Americans actually consider immigration a net positive for American society …

Additional findings from the Pew survey as well as its demographic projections can be found here.

Surprising statistic? One-third of American adults still aren’t on social media.

For the many people who use social media on a daily basis, it may seem inconceivable that there are a substantial number of other Americans who aren’t on social media at all.

But that’s the case. The Pew Research Center has been tracking social media usage on an annual basis over the past decade or so, and it finds that about one-third of Americans still aren’t using any social networking sites.

To be sure, today’s ~65% participation rate is nearly ten times the paltry ~7% participation rate Pew found the first time it surveyed Americans about their social media usage back in 2005.

According to Pew’s field research findings, here’s how the percentage of social media involvement has risen in selected years in the decade since. (The figures measure the percentage of Americans age 18 or over who use at least one social networking site.)

  • 2006: ~11% using at least one social networking site
  • 2008: ~25%
  • 2010: ~46%
  • 2012: ~55%
  • 2015: ~65%

In more recent years, the highest growth in social media participation rates has been among older Americans (over the age of 65), ~35% of whom are using social media today compared to just 11% five years ago.

That still pales in comparison to younger Americans (age 18-29), ~90% of whom use social media platforms.

While it’s a common perception that women are more avid users of social media than men, Pew’s research shows that the participation rates are actually not that far apart — in fact, the difference isn’t statistically significant: ~68% for women versus ~62% for men.

Similarly, there are more similarities than differences among the various racial and ethnic groups that Pew surveys — and the same dynamics are at work when it comes to differing education levels, too.

Regional differences in social media practice continue to persist, however, with rural residents less likely to use social media than suburban residents by a ten-point margin (58% versus 68%). City dwellers fall in between.

More details on Pew’s most recent field research on the topic of social media participation can be accessed here. See if you notice any surprising findings among them.

Social media data mining: Garbage-in, garbage-out?

It’s human nature for people to strive for the most flattering public persona … while confining the “true reality” only to those who have the opportunity (or misfortune) to see them in their most private moments.

It goes far beyond just the closed doors of a family’s household. I know a recording producer who speaks about having to “wipe the bottoms” of music stars — an unpleasant thought if ever there was one.

In today’s world of interactivity and social platforms, things are amplified even more — and it’s a lot more public.

Accordingly, there are more granular data than ever about people, their interests and their proclivities.

The opportunities for marketers seem almost endless. At last we’re able to go beyond basic demographics and other conventional classifications, to now pinpoint and target marketing messages based on psychographics.

And to do so using the very terms and phrases people are using in their own social interactions.

The problem is … a good deal of social media is one giant head-fake.

Don’t just take my word for it. Consider remarks made recently by Rudi Anggono, one of Google’s senior creative staff leaders. He refers to data collected in the social media space as “a two-faced, insincere, duplicitous, lying sack of sh*t.”

Anggono is talking about information he dubs “declared data.” It isn’t information that’s factual and vetted, but rather data that’s influenced by people’s moods, insecurities, social agenda … and any other set of factors that shape someone’s carefully crafted public image.

In other words, it’s information that’s made up of half-truths.

This is nothing new, actually. It’s been going on forever.  Cultural anthropologist Genevieve Bell put her finger on it years ago when she observed that people lie because they want to tell better stories and to project better versions of themselves.

What’s changed in the past decade is social media, of course.  What better way to “tell better stories and project better versions of ourselves” than through social media platforms?

Instead of the once-a-year Holiday Letter of yore, any of us can now provide an endless parade of breathless superlatives about our great, wonderful lives and the equally fabulous experiences of our families, children, parents, A-list friends, and whoever else we wish to associate with our excellent selves.

Between Facebook, Instagram, Pinterest and even LinkedIn, reams of granular data are being collected on individuals — data which these platforms then seek to monetize by selling access to advertisers.

In theory, it’s a whole lot better-targeted than the frumpy, old-fashioned demographic selects like location, age, income level and ethnicity.

But in reality, the information extracted from social is suspect data.

This has set up a big debate between Google — which promotes its search engine marketing and advertising programs based on the “intent” of people searching for information online — and Facebook and others who are promoting their robust repositories of psychographic and attitudinal data.

There are clear signs that some of the social platforms recognize the drawbacks of the ad programs they’re promoting — to the extent that they’re now trying to convince advertisers that they deserve consideration for search advertising dollars, not just social.

In an article published this week in The Wall Street Journal’s CMO Today blog, Tim Kendall, Pinterest’s head of monetization, contends that far from being merely a place where people connect with friends and family, Pinterest is more like a “catalogue of ideas,” where people “go through the catalogue and do searches.”

Pinterest has every monetary reason to present itself in this manner, of course.  According to eMarketer, in 2014 search advertising accounted for more than 45% of all digital ad spending — far more than ad spending on social media.

This year, the projections are for more than $26 billion to be spent on U.S. search ads, compared to only about $10 billion in the social sphere.

The sweet spot, of course, is being able to use declared data in concert with intent and behavior. And that’s why there’s so much effort and energy going into developing improved algorithms for generating data-driven predictive information that can accomplish those twin goals.
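
To make that idea a bit more concrete, here’s a purely hypothetical sketch of what blending the two kinds of signals might look like. The field names, weights and scoring logic below are all invented for illustration — no actual ad platform’s method or API is being described:

```python
# Purely hypothetical sketch: blend "declared" profile signals with observed
# intent/behavior signals into a single relevance score. Field names and
# weights are invented for illustration; no real ad platform is implied.

def blended_score(declared: dict, behavior: dict,
                  w_declared: float = 0.3, w_behavior: float = 0.7) -> float:
    """Weight self-reported interests less than observed actions."""
    declared_signal = 1.0 if declared.get("stated_interest") else 0.0
    behavior_signal = min(behavior.get("relevant_searches", 0) / 5.0, 1.0)
    return w_declared * declared_signal + w_behavior * behavior_signal

# Someone who *says* they love a topic but never searches for it scores
# lower than someone who quietly searches for it all the time.
print(blended_score({"stated_interest": True},  {"relevant_searches": 0}))  # 0.3
print(blended_score({"stated_interest": False}, {"relevant_searches": 5}))  # 0.7
```

The design point is simply that self-reported (“declared”) signals get discounted relative to observed behavior — which reflects the skepticism about declared data discussed above.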


In the meantime, Anggono’s admonition about data mined from social media is worth repeating:

“You have to prod, extrapolate, look for the intent, play good-cop/bad-cop, get the full story, get the context, get the real insights. Use all the available analytical tools at your disposal. Or if not, get access to those tools. Only then can you trust this data.”

What are your thoughts? Do you agree with Anggono’s position? Please share your perspectives with other readers here.