Not so neat: The rise of the NEET generation.

Millions of Americans age 20 to 24 fall into the NEET category: “Not in Employment, Education or Training.”

The COVID-19 pandemic has exposed some interesting fault lines in the socio-economic fabric of the United States.  One of these relates to young adults — those between the ages of 20 and 24. What we find paints a potentially disturbing picture of an economic and employment situation that may not be easily fixable.

A recent economic report published by the Center for Economic and Policy Research (CEPR) focuses on the so-called “NEET rate” – young Americans who are not employed, not in school, and not in training.

As of the first quarter of 2021, the NEET category represented nearly 4 million Americans between the ages of 20 and 24. This eye-popping statistic goes well beyond the particular circumstances of the pandemic and may turn out to reflect an economically devastating trend with myriad adverse ripple effects.

Look at any business newspaper or website these days and you’ll see regular reports of worker shortages across many sectors — including unfilled jobs at the lower end of the pay scale that are well suited to lower-skilled workers just entering the workforce.

At this moment, pretty much anyone who is willing to look around can easily find employment, schooling, or training of various kinds.  But for millions of Americans in the 20-24 age cohort, those opportunities appear to be going unanswered.  BloombergQuint’s reaction to the CEPR study certainly hits home:

“Inactive youth is a worrying sign for the future of the [U.S.] economy, as they don’t gain critical job skills to help realize their future earnings potential.  Further, high NEET rates may foster environments that are fertile for social unrest.”

… Daily urban strife in Portland, Minneapolis and Seattle, anyone?

It doesn’t much help that younger Americans appear to be less enamored with the basic economic foundations of the country than are their older compatriots.  A recent poll by Axios/Momentive found that while nearly 60% of Americans hold positive views of capitalism, those sentiments are shared by only a little more than 40% of those in the 18-24 age category.

Moreover, more than 50% of the younger group view socialism positively, compared to only around 40% of all Americans who feel the same way.

The coronavirus pandemic may have laid bare these trends, but it would be foolish to think that the issues weren’t percolating well before the first U.S. businesses began to lock down in March 2020. 

And more fundamentally, one could question just how much government can do to reverse the trend; perhaps the best thing to do is to stop “helping” so much … ?

More information about the CEPR report can be viewed here.  What are your thoughts on this issue?  Please share your views with other readers here. 

Changing the “work-live location paradigm” in the wake of the coronavirus pandemic.

As the COVID-19 pandemic grinds on, its long-term implications for how we will live and work in the future are becoming clearer.

Along those lines, a feature article written by urban studies theorist Richard Florida and economist Adam Ozimek that appeared in this past weekend’s Wall Street Journal explores how remote working has the potential to reshape America’s urban geography in very fundamental ways.

Just before the first lockdowns began in April 2020, fewer than 10% of the U.S. labor force worked remotely full-time.  But barely a month later, around half of the labor force was working remotely.  And now, even after the slow easing of workplace restrictions that began to take effect in the summer of 2020, most of the workers who were working remotely have continued to do so.

The longer-term forecast is that perhaps 25% of the labor force will continue to work fully remote, even after life returns to “normal” in the post-COVID era.

For clues as to why the “new normal” will be so different from the “old” one, we can start with worker productivity data.  Stanford University economist Nicholas Bloom has studied such productivity trends in the wake of the coronavirus and finds evidence that the productivity boost from remote work could be as high as 2.5%. 

Sure, there may be more instances of personal work being done on company time, but counterbalancing that is the decline of commuting time, as well as the end of time-suck distractions that characterized daily life at the office.

As Florida and Ozimek explain further in their WSJ article:

“Major companies … have already announced that employees working from home may continue to do so permanently.  They have embraced remote work not only because it saves them money on office space, but because it gives them greater access to talent, since they don’t have to relocate new hires.”

The shift to remote working severs the traditional connection between where people live and where they work.  The impact of that change promises to be significant for quite a few cities, towns and regions.  Smaller urban areas especially can now build their local economies around remote workers, and thus compete more easily against the big-city, high-tech coastal business centers that have dominated the employment landscape for so long.

Whereas metro areas like Boston, San Francisco, Washington DC and New York had become prohibitively expensive from a cost-of-living standpoint, today smaller metro areas such as Austin, Charlotte, Nashville and Denver are able to use their more attractive cost-of-living characteristics to attract newly mobile professionals who wish to keep more of their hard-earned incomes. 

For smaller urban areas and regions such as Tulsa, OK; Bozeman, MT; Door County, WI; and the Hudson Valley of New York, it’s a similar scenario: they are becoming magnets for newly mobile workers whose work relies on digital tools, not physical location.

Pew Research has found that moving spiked in the months following the onset of the coronavirus pandemic, with people suddenly relocating at double the pre-pandemic rate.  As for the reasons why, more than half of newly remote workers who are looking to relocate say they would like a significantly less expensive house.  Their locational choices are far more numerous than before, because they can select a place that best meets their own personal or family needs without worrying about how much they can earn in the local business market.

For many cities and regions, economic development initiatives are likely to shift away from luring companies with special tax incentives or other financial perks, and more toward luring a workforce through civic services and amenities:  better schools, safer streets, and more parks and green spaces.

There’s no question that the “big city” will continue to hold attraction for certain segments of the populace.  Younger workers without children will be drawn to the excitement and edginess of urban living without having to worry about things like quality schools.  Those with a love for the arts will continue to value the kind of convenient access to museums, theatres and the symphony that only a large city can provide.  And sports fanatics will never want to be too far away from attending the games of their favorite teams.

But for families with children, or for people who wish to have a less “city” environment, their options are broader than ever before.  Those people will likely be attracted to small cities, high-end suburbs, exurban environments or rural regions that offer attractive amenities including recreation. 

Getting the short end of the stick will be older suburbs and other run-of-the-mill localities that have little to offer beyond tract housing – and nothing that’s even remotely “unique.”

These are interesting future prospects we’re looking at – and on balance probably good ones for the country and our society, as they promise to smooth out some of the stark regional disparities that have developed over the past several decades.

What are your thoughts on these trends?  Please share your perspectives with other readers.

“You are what you wear.”

Research from Duke University suggests that people who are dressed up buy more and spend more than their casually dressed counterparts.

Ever since the COVID-19 pandemic hit, people have been “dressing down” more than ever.  But recent consumer research suggests that retailers do much better – in items purchased and dollars spent – when their customers are dressed sharp.

Researchers at Duke University’s Fuqua School of Business analyzed the shopping habits of two different groups of consumers.  Smartly dressed shoppers — as in wearing dresses or blazers — put more items in their carts and spent more money compared to casual dressers (as in wearing T-shirts and flip-flops).

The differences between the two groups’ shopping behaviors were significant, too:  18% more items purchased and 6% more money spent by the sharp dressers.

The Duke University research findings were written up in a paper titled “The Aesthetics We Wear: How Attire Influences What We Buy,” which was published in the Journal of the Association for Consumer Research.

According to Keisha Cutright, a Duke University professor of marketing and a co-author of the report, when people are dressed up they tend to have more social confidence, which in turn reduces the anxiety people may feel about making certain purchasing decisions:

“We focus on how your dress affects your own perceptions.  When you’re dressed formally, you believe that people are looking at you more favorably and they believe you are more competent.  If you feel competent, you can buy whatever you want without worrying what other people think, or whether they will be judging you negatively.”

Parallel Duke research also found that retailers can actually prompt would-be shoppers to wear nicer outfits when shopping at their stores by featuring nicely dressed models in their advertising.  “So, there are some practical implications from the research for retailers,” Cutright says.

How about you? What sort of dynamics are in play regarding how you’re dressed and what you buy as a result?  Is there a correlation between what you’re wearing and how you’re shopping?  Please share your observations with other readers here.

Restaurants face their demographic dilemmas.

There’s no question that the U.S. economy has been on a roll the past two years. And yet, we’re not seeing similar momentum in the restaurant industry.

What gives?

As it turns out, the challenges that restaurants face are due to forces and factors that are a lot more fundamental than the shape of the economy.

It’s about demographics.  More specifically, two things are happening: Baby boomers are hitting retirement age … and millennials are having children.  Both of these developments impact restaurants in consequential ways.

Baby boomers – the generation born between 1946 and 1964 – total nearly 75 million people. They’ve been the engine driving the consumer economy in this country for decades.  But this big group is eating out less as they age.

The difference in behavior is significant. Broadly speaking, Americans spend ~44% of their food dollar away from home.  But for people under the age of 25, that share is around 50%, whereas for older Americans it’s just 38%.

Moreover, seniors spend less money on food than younger people. According to 2017 data compiled by the federal government, people between the ages of 35 and 44 spend more than $4,200 each year in restaurants, on average.  For people age 65 and older, the average is just $2,500 (~40% less).

Why the difference? The generally smaller appetites of people who are older may explain some of it, but I suspect it’s also due to lower disposable income.

For a myriad of reasons, significant numbers of seniors haven’t planned well financially for their retirement.  Far too many have saved exactly $0, and another ~25% enter retirement with less than $50,000 in personal savings.  Social Security payments alone were never going to support a robust regime of eating out, and for these people in particular, what dollars they have in reserve amount to precious little.

Bottom line, restaurateurs who think they can rely on seniors to generate sufficient revenues and profits for their operations are kidding themselves.

As for the millennial generation – the 75 million+ people born between 1981 and 1996 – this group just barely outpaces Boomers as the biggest one of all. But having come of age during the Great Recession, it’s also a relatively poorer group.

In fact, the poverty rate among millennials is higher than for any other generation. They’re majorly in debt — to the tune of ~$42,000 per person on average (mostly not from student loans, either).  In many places they’ve had to face crushingly high real estate prices – whether buying or renting their residence.

Millennials are now at the prime age to have children, too, which means that more of their disposable income is being spent on things other than going out to eat.

If there is a silver lining, it’s that the oldest members of the millennial generation are now in their upper 30s – approaching the age when they’ll again start spending more on dining out.  But for most restaurants, that won’t make up for the lost revenues resulting from the baby boom generation hitting retirement age.

Declining DUI arrests: What’s the cause?

Looking back over the past eight years or so, something very interesting has been happening to the arrest rate statistics for people driving under the influence.

DUI arrests have been dropping – pretty steadily and inexorably.

The trend started in 2011, when DUI arrests fell 8% from the prior year. In 2012 the decline was 4.1% … in 2013, it was another 7.2%.

And arrest rates didn’t plateau after that, either. DUI arrests have continued to decline — even as police departments have continued to put plenty of cops on the beat for such purposes.

One of the most dramatic examples of the continued decline is in Miami-Dade County — the highest-population county in the entire Southeastern U.S.  In 2017, the Miami-Dade police force made 65% fewer DUI arrests than it had four years earlier.

Look around the country and you’ll see similar trends in places as diverse as San Antonio, TX, Phoenix, AZ, Portland, OR and Orange County, CA.

There are common threads in what’s being seen across the nation:

  • DUI arrest levels are down in major metro areas — but not necessarily in exurban or rural areas.
  • DUI arrest levels have declined in nearly all of the metro areas where ride-sharing services are prominent.

This last point is a significant factor to consider:  the increasing popularity of ride-sharing services has coincided with the drop in DUI arrests.

A 2017 University of Pennsylvania analysis found that in places where ride-hailing services were readily available, DUI arrests had in most cases declined by 50% or more compared to just a few years earlier.

Ride-hailing services are particularly popular with younger adults, who like the smartphone apps that make them pretty effortless to use.  They’re also popular with people who are looking for more affordable ways to get about town compared to what highly regulated taxi services choose to charge.

Plus, the “cool” factor of ride-sharing leaves old-fashioned taxi services pretty much in the dust.

The few locations where DUI arrest declines haven’t been so pronounced are places like Las Vegas and Reno, NV – tourist destinations that draw out-of-towners who frequently take public transportation, hail taxis, or simply walk rather than rent vehicles to get around town.

With the positive consequences of fewer DUI arrests – which also correlate to reductions in vehicular homicides and lower medical care costs due to fewer people injured in traffic accidents, as well as reductions in the cost of prosecuting and incarcerating the perpetrators – one might think that other urban areas would take notice and become more receptive to ride-sharing services than they have been up to now.

But where taxi services are well-entrenched and “wired” into the political fabric – a situation often encountered in older urban centers like Chicago, St. Louis, Philadelphia and Baltimore — the ancillary benefits of ride-sharing services don’t appear to hold much sway with city councils or city regulators – at least not yet.

One might suppose that overstretched urban police departments would welcome having to spend less time patrolling the streets for DUI drivers.  And if that benefits police departments … well, the police represent a politically important constituency, too.

It seems that some fresh thinking may be in order.

Social Media Mashup — 2018 Edition

One thing you can say about social media platforms – their world is invariably interesting. Or as a colleague of mine likes to say, “With social media, you drop your pencil, you miss a week.”

The Pew Research Center makes it a point to study the topic twice each year in order to stay on top of the latest shifts in social media usage trends. Pew has just completed its latest report, and its findings confirm some longer-term trends while also revealing several evolving new narratives.

One thing hasn’t changed much: Facebook and YouTube continue to dominate the social landscape in the United States.  Facebook remains the primary social media platform for most Americans – with two-thirds of U.S. adults reporting that they use Facebook, and three-fourths of those saying that they access the platform on a daily basis.

What this means is that half of all U.S. adults are going on the Facebook platform every day.
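That figure follows directly from the two percentages just cited. Here is the simple arithmetic, sketched in Python for anyone who wants to check it (the fractions are the approximate shares Pew reports):

```python
# Roughly two-thirds of U.S. adults use Facebook, and roughly
# three-fourths of those users say they visit the platform daily.
facebook_users = 2 / 3   # share of U.S. adults on Facebook
daily_visitors = 3 / 4   # share of Facebook users who visit daily

daily_share_of_all_adults = facebook_users * daily_visitors
print(f"{daily_share_of_all_adults:.0%} of all U.S. adults visit Facebook daily")  # -> 50%
```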

If anything, YouTube is even more ubiquitous – at least in terms of the percentage of people who access the platform (nearly 75% of the respondents in the Pew survey). But the frequency of visits is lower, so one could say that the platform isn’t as “sticky” as Facebook.

No other social media platform is used by more than 35% of American adults, according to the Pew survey:

  • YouTube: ~73% of U.S. adults report that they use this platform
  • Facebook: ~68%
  • Instagram: ~35%
  • Pinterest: ~29%
  • Snapchat: ~27%
  • LinkedIn: ~25%
  • Twitter: ~24%
  • WhatsApp: ~22%

[Chart: Social media usage trends, based on Pew Research studies going back to 2012.]

Taking a closer look at social media behaviors reveals some stark differences by age group, and they portend even greater changes in the social media landscape as time goes on. In terms of being involved in “any” social media usage, Pew finds significant differences by age cohort:

  • Age 18-29: ~88% use at least one form of social media
  • Age 30-49: ~78%
  • Age 50-64: ~64%
  • Age 65+: ~37%

So, as today’s younger, heavier-using cohorts grow older and replace the current older population, overall social media participation should go even higher.

But what about the composition of platform usage? Within the 18-24 age group, Snapchat, Instagram and Twitter are used significantly more than in even the next-oldest age group.  Most dramatically, Snapchat participation is ~78% among the youngest group, compared to just ~54% in that next-oldest cohort.

Other notable differences among groups include:

  • Pinterest is much more popular among women (~41% use the platform) than with men (just ~16%).
  • WhatsApp is particularly popular among American Hispanics (~41%) compared to blacks (~21%) and whites (~14%).
  • LinkedIn’s niche is upper-income households ($75,000+ annual income), which correlates to higher education levels. Half of American adults with college degrees use LinkedIn, compared to fewer than 10% of those with a high school degree or less.

More detailed results from the Pew Research study can be found here.

Working hard … yet hardly getting ahead.

Many college-educated, full-time workers in the 25-35 age group don’t need reminding that they’re struggling to balance student loan payments with decent housing and day-to-day expenses.

I’m not in that age group, but our two children are – and I can see from their friends and work colleagues just how much of a challenge it is for many of them to balance these competing necessities.

One way to deal with the challenge is to settle for the sardine-like living arrangements one encounters in quite a few urban areas, with anywhere from three to six people residing in the same (medium-sized) apartment or (small) house.

Somehow, things just didn’t seem so difficult for me “back in the day.” Of course, the entirety of my student loans following college amounted to a monthly payment of $31.28, with seven years to pay it off.

And my first apartment – a one-bedroom flat in an elegant 1920s building, complete with a beautiful lobby and old-fashioned glam elevator – cost me a mere $185 per month.

Not only that, it was only a five-minute bus ride to my downtown banking job.

Now, a newly released analysis published by the American Consumer’s Newsletter helps quantify the different reality for today’s younger workers.

What the data show is that a college degree does continue to provide higher earnings for younger workers compared to those without one.

But … it also reveals that, adjusted for inflation, their earnings are lower than those of their college-educated counterparts in the past.

According to a National Center for Education Statistics analysis as published by the AC Newsletter, here’s a summary of the median earnings differences for male full-time workers in the 25-34 age cohort, comparing 2016 to the year 2000 in inflation-adjusted dollars:

  • Master’s or higher degree: $71,640 … down 6.4% from 2000
  • Bachelor’s degree: $56,960 … down 8.8%
  • Associate’s degree: $43,000 … down 11.8%
  • Some college, but no degree: $37,980 … down 14.3%
  • High school degree: $34,750 … down 13.6%
  • High school dropout: $28,560 … up 2.8%

Thus, among full-time male workers across all education levels, only high school dropouts have experienced a real increase in earnings between 2000 and 2016.

Among female workers, the trends are a little better, but still hardly impressive – and they also start from lower 2000 income levels to begin with:

  • Master’s or higher degree: $57,690 … down 0.5% from 2000
  • Bachelor’s degree: $44,990 … down 7.5%
  • Associate’s degree: $31,870 … down 12.0%
  • Some college, but no degree: $29,980 … down 13.8%
  • High school degree: $28,000 … down 7.2%
  • High school dropout: $21,900 … up 5.0%
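For readers curious about the mechanics behind “inflation-adjusted dollars” in the tables above, here is a minimal sketch. The CPI-U values are approximate annual averages and the $50,000 salary is a made-up example; NCES may well use a different deflator in its own analysis.

```python
# Minimal sketch of an inflation adjustment like the one behind the tables above.
# The CPI-U annual averages are approximate; NCES may use a different deflator.
CPI_2000 = 172.2   # approximate CPI-U annual average for 2000
CPI_2016 = 240.0   # approximate CPI-U annual average for 2016

def to_2016_dollars(amount_in_2000_dollars: float) -> float:
    """Express a year-2000 dollar amount in 2016 purchasing power."""
    return amount_in_2000_dollars * (CPI_2016 / CPI_2000)

# Hypothetical example: a $50,000 salary earned in 2000 ...
print(f"${to_2016_dollars(50_000):,.0f} in 2016 dollars")  # roughly $69,700
```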

What’s even more challenging for workers carrying student loan debt is that those debt levels are higher than ever – often substantially so.

According to a Brookings Institution comparative study, fewer than 5% of students leaving school in 2000 carried more than $50,000 in student loan debt. In inflation-adjusted terms, by 2014, that percentage had risen to ~17%.

Looked at another way, ~40% of borrowers are carrying student loan debt balances exceeding $25,000. It doesn’t take a finance whiz to figure out how big of a hit that is out of a worker’s paycheck.
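To put a rough number on that “hit,” here is a back-of-envelope monthly payment on a $25,000 balance, using the standard fixed-rate amortization formula. The 10-year term and 5% interest rate are illustrative assumptions on my part, not figures from the Brookings study.

```python
# Back-of-envelope monthly payment on a $25,000 student loan balance.
# The 10-year term and 5% annual rate are illustrative assumptions only.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard fixed-rate amortization: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

print(f"${monthly_payment(25_000, 0.05, 10):,.2f} per month")  # about $265
```

That works out to roughly $3,200 a year coming straight out of take-home pay, before housing or anything else.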

It makes some of today’s realities – people living at home longer after college, sharing frat- or sorority-like living arrangements, putting off plans to purchase a home, or even putting off marriage plans – all the more understandable.

And I’m not exactly sure what the remedy is, either. When it comes to burdensome education debt, it isn’t as if people can go back and rewrite the script very easily.

America’s “Always On” Dynamics

It’s natural to assume that these days, pretty much all Americans go online regularly. And indeed, that is the case.  According to a survey of ~2,000 Americans age 18 and older conducted recently by the Pew Research Center, more than three in four respondents (~77%) reported that they go online at least once each day.

Compare that to the far smaller cohort of people who don’t use the Internet at all, which is only around 10%.

But even more interesting perhaps is another finding from the Pew survey: More than one in four Americans (~26%) report that they are online “almost constantly”.

That proportion is up from one in five just a couple years ago.

Even among people who go online but don’t use a mobile device, nearly 55% report going online at least daily, although just 5% of them report being online continually.

Looking further into the Pew findings, the “always on” population is skewed younger … better educated … ethnically diverse … and with higher incomes:

Gender

  • Men: ~25%
  • Women: ~27%

Age

  • 18-29: ~39%
  • 30-49: ~36%
  • 50-64: ~17%
  • 65 or older: ~8%

Education Level

  • High school degree or less: ~20%
  • Some college: ~28%
  • College degree or more: ~34%

Race

  • Non-white: ~33%
  • White: ~23%

Income Level

  • Less than $30K annual income: ~24%
  • $30-$75K annual income: ~25%
  • $75K or higher annual income: ~35%

Location

  • Living in urban areas: ~32%
  • Living in suburban areas: ~27%
  • Living in rural areas: ~15%

Regarding location, one explanation for the lower “always on” numbers among rural dwellers may be that connectivity isn’t as simple and easy as it is in urban environments.

Or perhaps it’s because rural areas offer more attractive options for people to spend their time doing more fulfilling things than being tethered to the online world 24/7/365 …

Which is it? Your thoughts on this or the other dynamics uncovered by Pew are welcomed.  You can also read more about the survey findings here.

Smartphones go mainstream with all age groups.

Today, behaviors across the board are far more “similar” than they are “different.”

Over the past few years, smartphones have clawed their way into becoming a pervasive presence among consumers in all age groups.

That’s one key takeaway message from Deloitte’s 2017 Mobile Consumer Survey covering U.S. adults.

According to the recently-released results from this year’s research, ~82% of American adults age 18 or older own a smartphone or have ready access to one. It’s a significant jump from the ~70% who reported the same thing just two years ago.

While smartphone penetration is highest among consumers age 18-44, the biggest increases in adoption are coming in older demographic categories.  To illustrate, ~67% of Deloitte survey respondents in the age 55-75 category own or have ready access to smartphones, which is a big increase from the ~53% who reported so in 2015.

That works out to an increase of roughly seven percentage points per year for this age category.
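Adoption growth like this can be expressed either as percentage points added per year or as a compound annual rate. Here is a quick back-of-envelope sketch of both, using the two Deloitte figures quoted above; it is purely illustrative, and Deloitte may compute its own growth metrics differently.

```python
# Two ways to express smartphone-adoption growth for the 55-75 age group,
# using the ~53% (2015) and ~67% (2017) figures quoted above. Illustrative only.

def points_per_year(start_pct: float, end_pct: float, years: int) -> float:
    """Percentage points of adoption added per year."""
    return (end_pct - start_pct) / years

def compound_annual_rate(start_pct: float, end_pct: float, years: int) -> float:
    """Compound annual growth rate implied by the two adoption levels."""
    return (end_pct / start_pct) ** (1 / years) - 1

print(f"{points_per_year(53, 67, 2):.0f} percentage points per year")        # 7 points per year
print(f"{compound_annual_rate(53, 67, 2):.1%} compound annual growth rate")  # ~12.4% per year
```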

The Deloitte research also found that there’s little if any difference in the behaviors of age groups in terms of how they interact with their smartphones. Daily smartphone usage is reported by 9 in 10 respondents regardless of the age bracket.

Similarly consistent across all age groups is how frequently users check their phones during any given day. For the typical consumer, it happens 47 times daily on average.  Fully 9 in 10 report looking at their phones within an hour of getting up, while 8 in 10 do the same just before going to sleep.

At other times during the day, the incidence of smartphone usage is quite high in numerous circumstances, the survey research found:

  • ~92% of respondents use smartphones when out shopping
  • ~89% while watching TV
  • ~85% while talking to friends or family members
  • ~81% while eating at restaurants
  • ~78% while eating at home
  • ~54% during meetings at work

As for the “legacy” use of cellphones, a smaller percentage of respondents report using their smartphones for making voice calls: more than 90% use their smartphone to send and receive text messages, whereas a somewhat smaller ~86% make voice calls.

As for other smartphone activities, ~81% are sending and receiving e-mail messages via their smartphone, ~72% are accessing social networks on their smartphones at least sometimes during the week, and ~30% report making video calls via their smartphones – which is nearly double the incidence Deloitte found in its survey two years ago.

As for the respondents in the survey who use smartwatches, daily usage among the oldest age cohort is the highest of all: Three-quarters of respondents age 55-75 reported using their smartwatches daily, while daily usage for younger consumers was 60% or even a little below.  So, in this one particular category, older Americans are actually ahead of their younger counterparts in adoption and usage.

The Deloitte survey shows pretty definitively that it’s no longer very valid to treat older and younger generations as separate markets. While there may be some slight variations between younger and older consumers, the reality is that market behaviors are far more alike than they are different.  That’s the first time we’ve seen this dynamic playing out in the mobile communications segment.

Additional findings from the Deloitte research can be found in an executive summary available here.

Does “generational marketing” really matter in the B-to-B world?

For marketers working in certain industries, an interesting question is to what degree generational “dynamics” enter into the B-to-B buying decision-making process.

Traditionally, B-to-B market segmentation has been done along the lines of company size, industry, and the location of headquarters and offices, plus the job functions or titles of the most important audience targets within those selection criteria.

By contrast, something like generational segmenting was deemed a far less significant factor in the B-to-B world.

But according to marketing and copywriting guru Bob Bly, things have changed with the growing importance of the millennial generation in B-to-B companies.

These are the people working in industrial/commercial enterprises who were born between 1980 and 2000, which places them roughly between the ages of 20 and 40 right now.

There are a lot of them. In fact, Google reports that there are more millennial-generation B-to-B buyers than any other single age group; they make up more than 45% of the overall employee base at these companies.

Even more significantly, one-third of millennials working inside B-to-B firms are the sole decision-makers for their companies’ B-to-B purchases, while nearly three-fourths are involved in purchase decision-making or influencing to some degree.

But even with these shifts in employee makeup, is it really true that millennials in the B-to-B world go about evaluating and purchasing goods and services all that differently from their older counterparts?

Well, consider these common characteristics of millennials which set them apart:

  • Millennials consider relationships to be more important than the organization itself.
  • Millennials want to have a say in how work gets done.
  • Millennials value open, authentic and real-time information.

This last point in particular goes a long way towards explaining the rise in content marketing and why those types of promotional initiatives are often more effective than traditional advertising.

On the other hand … don’t let millennials’ stated preferences for text messaging over e-mail communications lead you down the wrong path. E-mail marketing continues to deliver one of the highest ROIs of any MarComm tactic – and it’s often the highest by a long stretch.

Underscoring this point, last year the Data & Marketing Association [aka Direct Marketing Association] published the results of a comparative analysis showing that e-mail marketing ROI outstripped social media and search engine marketing (SEM) ROI by a factor of 4-to-1.

So … it’s smart to be continually cognizant of changing trends and preferences. But never forget the famous French saying: Plus ça change, plus c’est la même chose – the more things change, the more they stay the same.