Not so neat: The rise of the NEET generation.

Millions of Americans age 20 to 24 fall into the NEET category: “Not in Employment, Education or Training.”

The COVID-19 pandemic has exposed some interesting fault-lines in the socio-economic fabric of the United States.  One of these relates to young adults — those between the ages of 20 and 24. What we find paints a potentially disturbing picture of an economic and employment situation that may not be easily fixable.

A recent economic report published by the Center for Economic and Policy Research (CEPR) focuses on the so-called “NEET rate” — young Americans who are not employed, not in school, and not in training.

As of the first quarter of 2021, the NEET category represented nearly 4 million Americans between the ages of 20 and 24. This eye-popping statistic goes well beyond the particular circumstances of the pandemic and may turn out to be an economically devastating trend, with myriad adverse ripple effects.

Look at any business newspaper or website these days and you’ll regularly see reports about worker shortages across many sectors — including unfilled jobs at the lower end of the pay scale that are well suited to the capabilities of lower-skilled workers just entering the workplace.

At this moment, pretty much anyone who is willing to look around can easily find employment, schooling, or training of various kinds.  But for millions of Americans in the 20-24 age cohort, those opportunities are going unanswered. BloombergQuint’s reaction to the CEPR study certainly hits home:

“Inactive youth is a worrying sign for the future of the [U.S.] economy, as they don’t gain critical job skills to help realize their future earnings potential.  Further, high NEET rates may foster environments that are fertile for social unrest.”

… Daily urban strife in Portland, Minneapolis and Seattle, anyone?

It doesn’t much help that younger Americans appear to be less enamored of the basic economic foundations of the country than are their older compatriots.  A recent poll by Axios/Momentive found that while nearly 60% of Americans hold positive views of capitalism, those sentiments are shared by only a little more than 40% of those in the 18-24 age category.

Moreover, more than 50% of the younger group view socialism positively, compared to only around 40% of all Americans who feel the same way.

The coronavirus pandemic may have laid bare these trends, but it would be foolish to think that the issues weren’t percolating well before the first U.S. businesses began to lock down in March 2020. 

And more fundamentally, one could question just how much government can do to reverse the trend; perhaps the best thing to do is to stop “helping” so much … ?

More information about the CEPR report can be viewed here.  What are your thoughts on this issue?  Please share your views with other readers here. 

Changing the “work-live location paradigm” in the wake of the coronavirus pandemic.

As the COVID-19 pandemic grinds on, its long-term implications for how we will live and work in the future are becoming clearer.

Along those lines, a feature article written by urban studies theorist Richard Florida and economist Adam Ozimek that appeared in this past weekend’s Wall Street Journal explores how remote working has the potential to reshape America’s urban geography in very fundamental ways.

Just before the first lockdowns began in March 2020, fewer than 10% of the U.S. labor force worked remotely full-time.  But barely a month later, around half of the labor force was working remotely.  And now, even after the slow easing of workplace restrictions that began to take effect in the summer of 2020, most of the workers who went remote have continued to work that way.

The longer-term forecast is that perhaps 25% of the labor force will continue to work fully remote, even after life returns to “normal” in the post-COVID era.

For clues as to why the “new normal” will be so different from the “old” one, we can start with worker productivity data.  Stanford University economist Nicholas Bloom has studied such productivity trends in the wake of the coronavirus and finds evidence that the productivity boost from remote work could be as high as 2.5%. 

Sure, there may be more instances of personal work being done on company time, but counterbalancing that is the decline of commuting time, as well as the end of time-suck distractions that characterized daily life at the office.

As Florida and Ozimek explain further in their WSJ article:

“Major companies … have already announced that employees working from home may continue to do so permanently.  They have embraced remote work not only because it saves them money on office space, but because it gives them greater access to talent, since they don’t have to relocate new hires.”

The shift to remote working severs the traditional connection between where people live and where they work.  The impact of that change promises to be significant for quite a few cities, towns and regions.  Smaller urban areas especially can now build their local economies around remote workers, and thus compete more easily against the big-city, high-tech coastal business centers that have dominated the employment landscape for so long.

Whereas metro areas like Boston, San Francisco, Washington DC and New York had become prohibitively expensive from a cost-of-living standpoint, today smaller metro areas such as Austin, Charlotte, Nashville and Denver are able to use their more attractive cost-of-living characteristics to attract newly mobile professionals who wish to keep more of their hard-earned incomes. 

For smaller urban areas and regions such as Tulsa, OK, Bozeman, MT, Door County, WI and the Hudson Valley of New York it’s a similar scenario, as they become magnets for newly mobile workers whose work relies on digital tools, not physical location.

Pew Research has found that moving spiked in the months following the onset of the coronavirus pandemic, with people suddenly relocating at double the pre-pandemic rate.  As for the reasons why, more than half of newly remote workers who are looking to relocate say that they would like a significantly less expensive house. Their locational choices are far more numerous than before, because they can select a place that best meets their personal or family needs without worrying about how much they can earn in the local business market.

For many cities and regions, economic development initiatives are likely to morph away from luring companies with special tax incentives or other financial perks, and toward luring a workforce through civic services and amenities:  better schools, safer streets, and more parks and green spaces.

There’s no question that the “big city” will continue to hold attraction for certain segments of the populace.  Younger workers without children will be drawn to the excitement and edginess of urban living, without having to worry about things like quality schools.  Those with a love for the arts will continue to value the kind of convenient access to museums, theatres and the symphony that only a large city can provide.  And sports fanatics will never want to be too far away from attending the games of their favorite teams.

But for families with children, or for people who wish to have a less “city” environment, their options are broader than ever before.  Those people will likely be attracted to small cities, high-end suburbs, exurban environments or rural regions that offer attractive amenities including recreation. 

Getting the short end of the stick will be older suburbs and other run-of-the-mill localities with little to offer but tract housing, and nothing about them even remotely “unique.”

These are interesting future prospects we’re looking at – and on balance probably good ones for the country and our society, as they may smooth out some of the stark regional disparities that have developed over the past several decades.

What are your thoughts on these trends?  Please share your perspectives with other readers.

Restaurants face their demographic dilemmas.

There’s no question that the U.S. economy has been on a roll the past two years. And yet, we’re not seeing similar momentum in the restaurant industry.

What gives?

As it turns out, the challenges that restaurants face are due to forces and factors that are a lot more fundamental than the shape of the economy.

It’s about demographics.  More specifically, two things are happening: Baby boomers are hitting retirement age … and millennials are having children.  Both of these developments impact restaurants in consequential ways.

Baby boomers – the generation born between 1946 and 1964 – total nearly 75 million people. They’ve been the engine driving the consumer economy in this country for decades.  But this big group is eating out less as they age.

The difference in behavior is significant. Broadly speaking, Americans spend ~44% of their food dollar away from home.  But for people under the age of 25, the share is ~50%, whereas for older Americans it’s just ~38%.

Moreover, seniors spend less money on food than younger people. According to 2017 data compiled by the federal government, people between the ages of 35 and 44 spend more than $4,200 each year in restaurants, on average.  For people age 65 and older, the average is just $2,500 (~40% less).
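
As a quick check on that “~40% less” figure, here’s a minimal arithmetic sketch using the rounded numbers cited above (Python is used purely for illustration):

```python
# Quick check of the "~40% less" claim, using the rounded
# figures cited above rather than the underlying government data.
mid_career_spend = 4200   # avg. annual restaurant spend, ages 35-44
senior_spend = 2500       # avg. annual restaurant spend, ages 65+

gap = (mid_career_spend - senior_spend) / mid_career_spend
print(f"Seniors spend {gap:.0%} less than 35-44-year-olds")  # ~40%
```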

Why the difference? The generally smaller appetites of people who are older may explain some of it, but I suspect it’s also due to lower disposable income.

For a myriad of reasons, significant numbers of seniors haven’t planned well financially for their retirement.  Far too many have saved exactly $0, and another ~25% enter retirement with less than $50,000 in personal savings.  Social Security payments alone were never going to support a robust regime of eating out, and for these people in particular, what dollars they have in reserve amount to precious little.

Bottom line, restaurateurs who think they can rely on seniors to generate sufficient revenues and profits for their operations are kidding themselves.

As for the millennial generation – the 75 million+ people born between 1981 and 1996 – this group just barely outpaces Boomers as the biggest one of all. But having come of age during the Great Recession, it’s also a relatively poorer group.

In fact, the poverty rate among millennials is higher than for any other generation. They’re majorly in debt — to the tune of ~$42,000 per person on average (mostly not from student loans, either).  In many places they’ve had to face crushingly high real estate prices – whether buying or renting their residence.

Millennials are now at the prime age to have children, too, which means that more of their disposable income is being spent on things other than going out to eat.

If there is a silver lining, it’s that the oldest members of the millennial generation are now in their upper 30s – approaching the age when they’ll again start spending more on dining out.  But for most restaurants, that won’t make up for the lost revenues resulting from the baby boom population hitting retirement age.

Downtown turnaround? In these places, yes.

Downtown Minneapolis (Photo: Dan Anderson)

For decades, “going downtown” meant something special – probably from its very first use as a term to describe the lower tip of Manhattan, which was then New York City’s heart of business, commercial and residential life.

Downtown was literally “where it was at” – jobs, shopping, cultural attractions and all.

But then, beginning in post-World War II America, many downtowns lost their luster, as people were drawn to the suburbs by cheap land and easy means of traveling to and fro.

In some places, downtowns and the areas immediately adjoining them became places of high crime, industrial decay, shopworn appearances and various socio-economic pathologies.

Things hit rock bottom in the late 1970s, as personified by the Times Square area of New York City. But since then, many downtowns have slowly come back from those near-death experiences, spurred by new types of residents with new and different priorities.

Dan Cort, author of the book Downtown Turnaround, describes it this way:  “People – young ones especially – love historical buildings that reintroduce them to the past.  They want to live where they can walk out of the house, work out, go to a café, and still walk to work.”

There are a number of cities where the downtown areas have come back in a big way over the past several decades. Everyone knows which ones they are:  New York, Seattle, San Francisco, Minneapolis …

But what about the latest success stories? Which downtowns are those?

Recently, Realtor.com analyzed the 200 largest cities in the United States to determine which ones have made the biggest downtown turnaround since 2012. To identify the biggest successes, it studied the following factors (a sketch of how such a composite ranking might be computed follows the list):

  • Downtown residential population growth
  • Growth in the number of restaurants, bars, grocery stores and food trucks per capita
  • Growth in the number of independent realtors per capita
  • Growth in the number of jobs per capita
  • Home price appreciation since 2012 (limited to cities where the 2012 median home price was $400,000 or lower)
  • Price premium of purchasing a home in the downtown district compared with the median home price of the whole city
  • Residential and commercial vacancy rates

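Realtor.com doesn’t publish its exact formula, but conceptually a ranking like this boils down to a weighted composite score. Here’s a minimal, hypothetical sketch; the weights and most metric values below are purely illustrative (only the Pittsburgh and Detroit figures echo stats cited later in this post):

```python
# Hypothetical sketch of a composite "downtown turnaround" score.
# Realtor.com's actual weights and data are not published; the
# weights below and most metric values are purely illustrative.
downtown_metrics = {
    # city: (pop growth, food/retail growth, job growth,
    #        home-price appreciation, vacancy rate) since 2012
    "Pittsburgh": (0.32, 0.20, 0.15, 0.31, 0.08),
    "Detroit":    (0.10, 0.25, 0.12, 1.50, 0.15),
}

# Vacancy counts against a downtown, hence the negative weight.
weights = (0.25, 0.20, 0.20, 0.25, -0.10)

def turnaround_score(metrics):
    """Weighted sum of the factors (higher = stronger turnaround)."""
    return sum(w * m for w, m in zip(weights, metrics))

ranked = sorted(downtown_metrics,
                key=lambda city: turnaround_score(downtown_metrics[city]),
                reverse=True)
print(ranked)   # ['Detroit', 'Pittsburgh'] under these toy numbers
```
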
Based on these criteria, here is Realtor.com’s list of the Top 10 cities where downtown is making a comeback:

  • #1 Pittsburgh, PA
  • #2 Indianapolis, IN
  • #3 Oakland, CA
  • #4 Detroit, MI
  • #5 Columbus, OH
  • #6 Austin, TX
  • #7 Los Angeles, CA
  • #8 Dallas, TX
  • #9 Chicago, IL
  • #10 Providence, RI

Some of these may surprise you. But it’s interesting to see some of the stats that are behind the rankings.  For instance, look at what’s happened to median home prices in some of these downtown districts since 2012:

  • Detroit: +150%
  • Oakland: +111%
  • Los Angeles: +63%
  • Pittsburgh: +31%

And residential population growth has been particularly strong here:

  • Pittsburgh: +32%
  • Austin: +25%
  • Dallas: +25%
  • Chicago: +21%

In the coming years, it will be interesting to see if the downtown revitalization trend continues – and spreads to more large cities.

And what about America’s medium-sized cities, where downtown zones continue to struggle?  If you’ve been to Midwestern cities like Kokomo, IN, Flint, MI or Lima, OH, those downtowns look particularly bleak.  Can the sort of revitalization we see in the major urban centers be replicated there?

I have my doubts … but what is your opinion? Feel free to share your thoughts below.

Where the Millionaires Are

Look to the states won by Hillary Clinton in the 2016 presidential election.

Proportion of “millionaire households” by state (darker shades indicate a higher proportion of millionaires).

Over the past few years, we’ve heard a good deal about income inequality in the United States. One persistent narrative is that the wealthiest and highest-income households continue to do well – and indeed are improving their relative standing – while many other families struggle financially.

The most recent statistical reporting seems to bear this out.

According to the annual Wealth & Affluent Monitor released by research and insights firm Phoenix Marketing International, the total tally of U.S. millionaire households is up more than 800,000 over the past year.

And if we go back to 2006, before the financial crisis and subsequent Great Recession, the number of millionaire households has increased by ~1.3 million since that time.

[For purposes of the Phoenix report, “millionaire households” are defined as those that have $1 million or more in investable assets. Collectively, these households possess approximately $20 trillion in liquid wealth, which is nearly 60% of the entire liquid wealth in America.]

Even with a growing tally, so-called “millionaire households” still represent around 5% of all U.S. households, or approximately 6.8 million in total. That percentage is nearly flat (up only slightly to 5.1% from 4.8% in 2006).
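
A quick sanity check on those figures, using the report’s rounded numbers:

```python
# Sanity-checking the rounded figures cited above.
millionaire_households = 6.8e6   # ~6.8 million households
share_of_all_households = 0.051  # ~5.1% of all U.S. households

implied_total = millionaire_households / share_of_all_households
print(f"Implied total U.S. households: ~{implied_total / 1e6:.0f} million")  # ~133 million

liquid_wealth = 20e12            # ~$20 trillion held by this group
avg_wealth = liquid_wealth / millionaire_households
print(f"Avg. liquid wealth per millionaire household: ~${avg_wealth / 1e6:.1f} million")  # ~$2.9M
```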

Tellingly, there is a direct correlation between the states with the largest proportion of millionaire households and how those states voted in the most recent presidential election. Every one of the top millionaire states is located on the east or west coasts – and all but one of them was won by Hillary Clinton:

  • #1  Maryland
  • #2  Connecticut
  • #3  New Jersey
  • #4  Hawaii
  • #5  Alaska
  • #6  Massachusetts
  • #7  New Hampshire
  • #8  Virginia
  • #9  DC
  • #10  Delaware

Looking at the geographic makeup of the states with the highest share of millionaires helps explain how “elitist” political arguments had a degree of resonance in the 2016 campaign that may have surprised some observers.

Nearly half of the jurisdictions Hillary Clinton won are part of the “Top 10” millionaire grouping, whereas just one of Donald Trump’s states can be found there.

But it’s when we look at the tiers below the “millionaire households” category that things come into even greater focus. The Phoenix report shows that “near-affluent” households in the United States – the approximately 14 million households having investable assets ranging from $100,000 to $250,000 – actually saw their total investable assets decline in the past year.

“Affluent” households, which occupy the space in between the “near-affluents” and the “millionaires,” have been essentially treading water. So things are not only stratified; they aren’t improving, either.

The reality is that the concentration of wealth continues to deepen, as the Top 1% wealthiest U.S. households possess nearly one quarter of the total liquid wealth.

In stark contrast, the ~70% of non-affluent households own less than 10% of the country’s liquid wealth.

Simply put, the past decade hasn’t been kind to the majority of Americans’ family finances. In my view, that dynamic alone explains more of 2016’s political repercussions than any other single factor.  It’s hardly monolithic, but often “elitism” and “status quo” go hand-in-hand. In 2016 they were lashed together; one candidate was perceived as both “elitist” and “status quo,” and the result was almost preordained.

The most recent Wealth & Affluent Monitor from Phoenix Marketing International can be downloaded here.

Saving for college: Millennial parents seem to have figured it out better.

Recently, SLM Corp. (aka Sallie Mae) released the results of a survey of parents that asked how they’re saving for their children’s college education. The survey, which was conducted for Sallie Mae by research firm Ipsos Public Affairs, uncovered some pretty interesting stats.

Here’s something that surprised me: When it comes to saving for kids’ college tuition, it turns out that Millennial parents – those age 35 and younger – have already saved significantly more than their GenX counterparts.

Millennial parents reported having saved more than $20,000 toward kids’ college, whereas GenX parents – those between the age of 36 and 51 – have saved only around $18,000.

What’s up with that?

The report lists several possible explanations. First, Millennials are more likely to have started saving earlier for their kids’ college education because they expect to pay a higher share of college costs than older generations did.

Likely, they’re remembering their own (more recent) college experiences.

Here’s another contributing factor: Many GenX parents were hit harder financially than Millennials during the recent recession.  They’re the ones who were more likely to have lost a job further into their careers, when it can be more difficult to bounce back as easily and at the same salary level.

At the same time, it’s often the GenXers who have higher mortgage and other debts already racked up when compared to Millennials.

Under those circumstances, saving for children’s college is a commitment that’s much easier to place on hold until other, more pressing financial matters are dealt with.

On the other hand, the recession caught Millennials at the beginning of their careers, when they had made fewer financial commitments (other than student loans) and still had more flexibility.

It’s easier to roll with the punches when you don’t have a pile of fixed financial obligations already hanging over your head.

Another factor the Sallie Mae/Ipsos report cites is that GenX parents are more likely to have become over-leveraged in their personal finances — a situation exacerbated by income stagnation, declining stock investment values and declining home values (even being underwater on home mortgages in some cases).

It seems that all of these factors have colored GenX attitudes about saving for kids’ college education in ways that go beyond what the raw numbers show. When asked how confident they feel about being able to meet the college financial obligations for their children, 32% of Millennials stated that they felt quite confident.

But among GenXers, it was just 17%.

Overall, this doesn’t paint a very pretty picture for GenX parents.  But it seems that the Millennial generation has figured out the “college cost recipe” a little more successfully.

For more “topline” statistical findings from the survey, click or tap here.

Quick-change artistry: Masculinity goes from “alpha-male” to “alta-male” inside of a generation.

Alpha-male: Venezuelan/Mexican actor Alejandro Nones

Many people contend that changes in society are driven by a variety of influences – not least movies and music. Certainly, the popular arts reflect the current culture, but they also drive its evolution.

This view was underscored recently in the results of field research conducted by the British- and Singapore-based survey firm Join the Dots for Dennis Publishing, which has just launched Coach, a magazine in the U.K.

The magazine’s audience consists of men who are committed to lifestyles that make themselves “healthier, fitter and happier.”

The research aimed to figure out what it means to be “male” today. An in-depth qualitative focus group session with men aged 22 to 60 helped establish the set of questions that was then administered in a quantitative survey of ~1,000 respondents (including women as well as men) between the ages of 25 and 54.  The survey sample represented a diverse mix of family status, sexual preferences, incomes, professions and interests.

The survey questions focused on the habits and aspirations of men … and the results showed how far we’ve come from the heydays of the “alpha-male” barely 25 years ago.

The researchers contrasted good and not-so-good alpha-male stereotypes (self-absorbed … unwilling or unable to talk about insecurities or vulnerabilities) with a new persona they dubbed the “alta-male.”

The alta-male is a man who values work/life balance and finds personal fulfillment as much in self-improvement as in material wealth.

Additionally, the alta-male tends to reject male role models from earlier generations, instead opting to establish his own identity based on a myriad of diverse influences.

Of course, it’s one thing to aspire to these goals and quite another to actually attain them. The study found that two-thirds of the respondents are finding it difficult to achieve the satisfactory work/life balance they desire.

On the other hand, alta-males tend to be more adaptable, and they’re willing to embrace uncertainty more than the alpha-males of yore.

Even more strongly, alta-males are seekers of experiences, which they value over “mere money” – despite recognizing that it takes money to partake in many such life experiences.

Perhaps most surprising, the study found little difference in perspectives between older and younger male respondents.

It turns out that older men are just as likely to have an “alta-male” attitude towards life.  So clearly, the culture has been rubbing off on them, too.

From my own standpoint (as someone who’s been around the track quite a few times over the decades), I sense a similar shift in my own perspectives as well.

What about the rest of you?

Is there an emerging crisis in caregiving?

It surprised me to learn recently that nearly 45 million American adults are providing unpaid care to family members. And in nearly 80% of the cases, the persons being cared for are over the age of 50.

As it turns out, there are tens of millions of parents and spouses who require daily care.  And for many of the folks who take on the role of unpaid caregiver, these tasks are like a second job.

Typically, they require a minimum of 20 hours per week (and often far more than that).

Juggling work and life under such circumstances is more than merely an inconvenience. It can also put caregivers’ own health and emotional well-being at risk – not to mention the added financial burden, as all the little extra expenses can add up to quite a lot.

AARP is asking the question: Is there a coming crisis in senior care?  Looking at this chart, there may very well be one looming on the horizon:

[Chart: AARP projection of the ratio of potential caregivers to care recipients over time]

What this chart tells us is that in 2010, the ratio of caregivers to care recipients was about 7:1.  In just 20 years, that ratio will be down to 4:1.  And it’s going to get even worse after that.

It comes down to simple math. The population of Americans over the age of 80 is expected to grow by nearly 45% between 2030 and 2040, whereas the number of available caregivers won’t even begin to keep up (rising by just 10%).
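
Here’s a minimal sketch of that math. The 2030 baseline is indexed to the 4:1 ratio above; only the growth rates come from the cited projections:

```python
# Minimal sketch of the caregiver-ratio math described above.
# The 2030 baseline is indexed to the 4:1 ratio; the growth
# rates are the ones cited in the text.
caregivers_2030 = 4.0    # indexed: 4 potential caregivers ...
recipients_2030 = 1.0    # ... per 1 care recipient in 2030

caregivers_2040 = caregivers_2030 * 1.10   # caregiver pool grows ~10%
recipients_2040 = recipients_2030 * 1.45   # 80+ population grows ~45%

print(f"2030 ratio: {caregivers_2030 / recipients_2030:.1f}:1")  # 4.0:1
print(f"2040 ratio: {caregivers_2040 / recipients_2040:.1f}:1")  # ~3.0:1
```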

And then there’s the other side of caregiving. As it turns out, there are nearly 8 million children who live in families where a grandparent is the head of household.

The reasons for “grand-families” are varied: military deployment of parents, incarceration, substance abuse or mental illness.  And where these families are found crosses all societal lines.

When you look at it from all the angles, caregiving is a big challenge with no easy answers.  But considering its ubiquity, it’s an issue that doesn’t get the kind of attention it probably should.

If you have personal observations to share based on your own family’s caregiving experiences, I’m sure other readers will find them worth hearing — so jot them down in the comment section below.

State migration trends: A familiar pattern reasserts itself?

During the “great recession” that began in 2008, the United States saw an interruption in a decades-long pattern in which more Americans moved away from states in the Midwest and Northeast than moved in, while states in the South and West gained residents.

The break in this pattern was good news for those who had bemoaned the loss of political clout based on the relative changes in state population.

To wit:  In the four census periods between the 1960s and the 2000s, the 11 Northeast states plus DC saw their Electoral College percentage plummet by 15 percentage points, so that today they represent barely 20% of the electoral votes required to elect the U.S. president, with a similar lack of clout in the House of Representatives.

For the 12 Midwestern states, the trends haven’t been much better.

But for a few years at least, states in the Midwest and Northeast stopped shedding quite so many of their residents to the Southern and Western regions of the country.

The question was, would it last?  Now we know the answer:  Nope.

Instead, new census data is showing a return to familiar patterns.  In the past few years, the states with the largest net in-migration of people are these predictable ones in the Sunbelt region:

  • Florida: +200,000 net new migrants
  • Texas: +170,000
  • Colorado: +54,000
  • Arizona: +48,000
  • South Carolina: +46,000

By comparison, woebegone Illinois – beset by state fiscal crises, a mountain of bloated pension obligations and a crumbling infrastructure that causes some people major-league heartburn on a daily basis – experienced a net loss of more than 100,000 people.

New York, Pennsylvania, Michigan, New Jersey and other “rust belt” states aren’t far behind.

This map, courtesy of The Washington Post, pretty much says it all:

[Map: net domestic migration by state]

There is one glaring difference today compared to previous decades. Back then, California was always at or near the top in terms of positive net in-migration from other states.  That’s all different now: California posts one of the largest net losses.

What’s particularly telling is that California, which used to share many of the characteristics of other Sunbelt and Western states, now seems much more in line with the Northeast and Industrial Midwest when it comes to fundamental factors like state personal and corporate tax rates, real estate prices, environmental regulations, employment dynamics, unionization rates, and the size of public payrolls as a share of the total labor force.

Based on the entrenched and intractable socio-political dynamics at work, don’t look for the migration patterns to change anytime soon.

What are your own personal perspectives, based on where you live?

Inked in stone: One societal trend that’s going off the charts.

What’s one of the biggest societal trends in America nowadays? Believe it or not, it’s the rapidly growing popularity of tattoos.

Once the province of just a slim slice of the American population, tattoos are now at the center of a dramatic shift in attitudes.

Let’s begin with figures published by the Harris Poll recently, based on its survey of more than 4,000 Americans conducted in late 2015. That survey finds that nearly one in three Americans age 18 or older has at least one tattoo (29%, to be exact).

Not only did that percentage surprise me, but so did the increase it represents over a similar Harris Poll conducted just four years earlier. In that survey, ~21% reported having a tattoo … which means that nearly 40% more people have tattoos today than in 2010.
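
The arithmetic behind that “nearly 40%” figure is simple relative growth:

```python
# Relative growth in tattoo prevalence between the two Harris Polls.
share_now = 0.29    # late-2015 survey
share_then = 0.21   # earlier survey cited above

increase = (share_now - share_then) / share_then
print(f"Relative increase: {increase:.0%}")   # ~38%, i.e. "nearly 40%"
```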

Pretty amazing, I thought.  And the surprises don’t stop there, either. Of those people who are tattooed, more than two-thirds report that they have more than one tattoo.

What isn’t surprising at all is that the tattoo craze is most prevalent among younger Americans:

  • Millennials: ~47% report having at least one tattoo
  • GenX: ~36%
  • Baby Boomers: ~13%
  • Matures: ~10%

There are also some locational differences, with rural and urban Americans somewhat more likely to be tattooed than those people who reside in suburbia:

  • Rural folks: ~35% have at least one tattoo
  • Urban dwellers: ~33%
  • Suburbanites: ~25%

[There’s no discernible difference at all between people of differing partisan/political philosophies, according to Harris.]

With such a big increase in tattooing, the next question is, what’s behind the trend? For clues, we can see how respondents described what their tattoo(s) mean to them personally:

  • Makes me feel more sexy: cited by ~33% of tattooed Americans
  • Makes me feel more attractive: ~32%
  • Makes me feel more non-conformist/rebellious: ~27%
  • Makes me feel more spiritual: ~20%

[Far smaller percentages felt that their tattoo(s) made them feel more intelligent, more employable, more respected or more healthy.]

But what about regrets? Are there people who wish they hadn’t taken the plunge?  The Harris survey found that approximately one in four respondents do feel at least some regrets about having a tattoo.  The reasons why are varied — yet all pretty obvious:

  • Personality changes … doesn’t fit my present lifestyle
  • The tattoo includes someone’s name I’m no longer with
  • The tattoo was poorly done … doesn’t look professional
  • The tattoo is no longer meaningful to me
  • I was too young when I got the tattoo

In the end, I think it’s safe to conclude that tattoos are a generational thing. Those of us north of age 50 mostly don’t have tattoos – and likely never will get one.

But for the younger generations, not only have tattoos gone “mainstream”; for many, they’re a decidedly aspirational thing.  And that, of course, means ever-widening acceptance of tattoos, along with encountering more of them than ever before.

Comments … thoughts anyone?