ESPN: What the heck just happened … and who’s to blame?

Last week, ESPN announced the layoffs of some 100 staffers, most of them on-air talent. This comes after layoffs of ~300 other personnel in 2015, but since those were behind-the-scenes employees, the news didn’t seem as momentous.

There are several factors coming together that make life particularly difficult for the sports network. One big problem is the commitment ESPN has made to pay top-dollar for the right to air professional sports events, particularly NFL and NBA games.

These financial commitments extend well into the future and are essentially set in stone, which means that ESPN is locked into high long-term fixed costs (broadcast rights) in exchange for what’s turning out to be declining variable revenues (viewer subscription fees and advertising).

This isn’t a very good financial model at all.

Which brings us to the second big factor: declining subscribers.

Since 2011, the network has lost ~15 million subscribers. So far in 2017, it has been shedding an average of ~10,000 subscribers per day.

The financial impact of these losses is significant. All of those lost subscribers amount to more than $1.3 billion per year in revenue that’s no longer flowing onto ESPN’s books.
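To see how a figure like that pencils out, here’s a quick back-of-the-envelope calculation in Python. The ~$7.25 monthly affiliate fee is my assumption – it’s in line with widely reported estimates for ESPN around this period – and isn’t a number cited above.

```python
# Back-of-the-envelope check on the revenue impact of subscriber losses.
# The monthly affiliate fee is an assumed figure, not one from this article.

lost_subscribers = 15_000_000   # subscribers lost since 2011
monthly_fee = 7.25              # assumed per-subscriber affiliate fee, USD

annual_revenue_loss = lost_subscribers * monthly_fee * 12
print(f"Annual revenue loss: ${annual_revenue_loss / 1e9:.2f} billion")
# Prints roughly $1.3 billion -- consistent with the figure above.
```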

Sports journalist Clay Travis predicts that if the current trajectory of subscriber losses continues, ESPN will begin losing money in 2021. (And that’s assuming the subscriber base losses don’t accelerate, an assumption that might be a little too rosy.)

The fundamental question is why so many people are no longer subscribing to ESPN. The predictable answer is that on-demand services like Hulu, Netflix and Amazon are squeezing cable/satellite TV and its subscription business model.

One way Disney (ESPN’s parent company) has attempted to maximize viewer subscription revenues over the years has been by bundling the network with other, less lucrative channels it owns outright or in part – Disney Junior, plus the History Channel, Vice and the Lifetime Movie Network through Disney’s stake in A+E Networks. In the Disney constellation of channels, ESPN has been the acknowledged “driver” of subscription revenues all along, with die-hard sports fans being willing to subsidize the other channels – often never watched by these subscribers – as the price of access.

But something else is happening now:  ESPN itself has begun to lose viewers as well.

According to television industry publication Broadcasting & Cable, ESPN’s viewership rating has declined ~7% so far this year.  ESPN2’s rating is down even further – an eye-popping ~34%.

Percentages like those aren’t driven by “sidebar” incidental factors. Instead, they cut to the core of the programming and the content that’s being offered.

If there’s one programming factor that tracks closely with ESPN’s viewership declines, it’s the explosion in “sports-talk” programming versus actual “sports game” programming at the network. As opinion journalist Sean Davis of The Federalist has written:

“If you talk to sports fans and to people who have watched ESPN religiously for most of their lives, they’ll tell you that the problem is the lack of sports and a surplus of shows featuring people screaming at each other. The near-universal sentiment … is that the content provider sidelined actual sports in favor of carnival barkers.”

Davis points out the flaw in ESPN’s shift in colorful terms:

“ESPN went from the worldwide leader in sports to yet another expensive network of dumb people yelling dumb things at other dumb people, all the while forgetting that the most popular entertainment of people yelling about sports stuff for several hours a day – sports talk radio – is free.”

There’s an additional factor in the mix that’s a likely culprit in ESPN’s tribulations – the mixing of sports and politics. That programming decision has turned out to be a great big lightning rod for the network – with more downside than upside consequences.

The question is, why did ESPN even go in that direction?

Most likely, ESPN execs saw the tough numbers on higher costs, subscriber losses and lower ratings, and decided that the network needed a larger content pie to attract more consumers.

The reasoning goes, if some people like sports and others like politics, why not combine the two to attract a larger audience, thereby changing the trajectory of the figures?

But that reasoning flies in the face of how people consume political news. In the era of Obama and now Trump, political diehards gravitate to outlets that reinforce their own worldviews:  conservatives want news from conservatives; liberals want news from liberals.

MSNBC and the Fox News Channel have figured this out big-time.

But if you’re starting with a cross-partisan mass media audience for sports, as the original ESPN audience most certainly was, trying to combine that with politics means running the risk of losing one-half of your audience.

That’s what’s been happening with ESPN. Intertwining sports with coverage about bathrooms in North Carolina, transgender sports stars, gun control laws and proper national anthem etiquette only gets your numbers going in one direction (down).

The question for ESPN is how it plans to recalibrate and refocus its programming to truly defend its position as the worldwide leader in sports broadcasting. However it decides to position itself in terms of the delivery of its content – television, online, subscription, pay-per-view or other methods – it should refocus on covering live sports events.

Not sports talk … not debate … not politics or sociology, but the sports themselves.

At one time, not so long ago, sports were a safe refuge from politics and the news. ESPN would do itself – and its viewers – a favor if it sought to recapture that spirit.

More raps for Google on the “fake reviews” front.

Google is trying to keep its local search initiative from devolving into charges and counter-charges of “fake news” à la the most recent U.S. presidential election campaign – but is it trying hard enough?

It’s becoming harder for the reviews that show up on Google’s local search function to be considered anything other than “suspect.”

The latest salvo comes from search expert and author Mike Blumenthal, whose recent blog posts on the subject question Google’s willingness to level with its customers.

Mr. Blumenthal could be considered one of the premier experts on local search, and he’s been studying the phenomenon of fake information online for nearly a decade.

The gist of Blumenthal’s argument is that Google isn’t taking sufficient action to clean up fake reviews (and related service industry and affiliate spam) that appear on Google Maps search results, which is one of the most important utilities for local businesses and their customers.

Not only that, but Blumenthal also contends that Google is publishing reports which represent “weak research” that “misleads the public” about the extent of the fake reviews problem.


Google contends that the problem isn’t a large one. Blumenthal feels differently – in fact, he claims the problem is growing worse, not better.

In a blog article published this week, Blumenthal outlines how he’s built out spreadsheets of reviewers and the businesses on which they have commented.

From this exercise, he sees a pattern of fake reviews being written for overlapping businesses – telltale signs that have somehow been missed by Google’s algorithms.

A case in point: three “reviewers” — “Charlz Alexon,” “Ginger Karime” and “Jen Mathieu” — have all “reviewed” three very different businesses in completely different areas of the United States:  Bedoy Brothers Lawn & Maintenance (Nevada), Texas Car Mechanics (Texas), and The Joint Chiropractic (Arizona, California, Colorado, Florida, Minnesota, North Carolina).

They’re all 5-star reviews, of course.

It doesn’t take a genius to figure out that “Charlz Alexon,” “Ginger Karime” and “Jen Mathieu” won’t be found in the local telephone directories where these businesses are located. That’s because they’re figments of some spammer-for-hire’s imagination.
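The pattern isn’t hard to check for mechanically, either. Below is a minimal Python sketch of the kind of cross-business overlap test Blumenthal’s spreadsheet exercise implies – the data echoes the example above, while the threshold and logic are purely illustrative assumptions, not Google’s (or Blumenthal’s) actual method.

```python
from collections import defaultdict
from itertools import combinations

# Illustrative (reviewer, business) pairs, echoing the example above.
reviews = [
    ("Charlz Alexon", "Bedoy Brothers Lawn & Maintenance"),
    ("Charlz Alexon", "Texas Car Mechanics"),
    ("Charlz Alexon", "The Joint Chiropractic"),
    ("Ginger Karime", "Bedoy Brothers Lawn & Maintenance"),
    ("Ginger Karime", "Texas Car Mechanics"),
    ("Ginger Karime", "The Joint Chiropractic"),
    ("Jen Mathieu", "Bedoy Brothers Lawn & Maintenance"),
    ("Jen Mathieu", "Texas Car Mechanics"),
    ("Jen Mathieu", "The Joint Chiropractic"),
]

# Map each reviewer to the set of businesses they have reviewed.
profiles = defaultdict(set)
for reviewer, business in reviews:
    profiles[reviewer].add(business)

# Flag reviewer pairs whose histories overlap suspiciously: unrelated
# businesses in different regions rarely share several reviewers.
for a, b in combinations(sorted(profiles), 2):
    shared = profiles[a] & profiles[b]
    if len(shared) >= 3:
        print(f"Suspicious overlap: {a} / {b}: {sorted(shared)}")
```

A real system would also weigh geography, review timing and rating patterns, but even this crude overlap test flags the trio above immediately.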

The question is, why doesn’t Google develop procedures to figure out the same obvious answers Blumenthal can see plain as day?

And the follow-up question: How soon will Google get serious about banning reviewers who post fake reviews on local search results?  (And not just targeting the “usual suspect” types of businesses, but also professional listings such as those of physicians and attorneys.)

“If their advanced verification [technology] is what it takes to solve the problem, then stop testing it and start using it,” Blumenthal concludes.

To my mind, it would be in Google’s own interest to get to the bottom of these nefarious practices. If the general public comes to view online reviews as “fake, faux and phony,” it’s just one step from ceasing to use local search results at all – which would hurt Google in the pocketbook.

Might it get Google’s attention then?

When people think “search,” they still think “Google.”

… And they might say it, too — thanks to the rise of voice search.

Over the years, many things have changed in the world of cyberspace. But one thing seems to be pretty much a constant:  When people are in “search” mode online, most of them are playing in Google’s ballpark.

This behavior has been underscored yet again in a new survey of ~800 consumers conducted by Fivesight Research.

Take a look at these two statistics that show how strong Google’s search popularity remains today:

  • Desktop users: ~79% of searches are on Google
  • Smartphone users: ~86% use Google

The smartphone figure above is even more telling in that the percentage is that high whether users are on an iPhone or an Android device.

But here’s another very interesting finding from the Fivesight survey: Google’s biggest competition isn’t other search engines like Bing or Yahoo.  Instead, it’s Siri, which now accounts for ~6% of mobile search market share.

So what we’re seeing in search isn’t a shift to other providers, but rather a shift into new technologies. To illustrate, nearly three in four consumers are using voice technologies such as Siri, Google Now and Microsoft Cortana to supplement their traditional search activities.

Some marketing specialists contend that “voice search is the new search” – and it’s hard not to agree with them. Certainly, voice search has become easier in the past year or so as more mobile devices, as well as personal home assistants like Amazon’s Alexa, have been adopted by the marketplace.

It also helps that voice recognition technology continues to improve in quality, dramatically reducing the incidence of “machine mistakes” in understanding the meaning of voice search queries.

But whether it’s traditional or voice-activated, I suspect Google will continue to dominate the search segment for years to come.

That may or may not be a good thing for consumers. But it’s certainly a good thing for Google – seeing as how woefully ineffective the company has been in coming up with any other business endeavor even remotely as financially lucrative as its search business.

In copywriting, it’s the KISS approach on steroids today.

… and it means “Keep It Short, Stupid” as much as it does “Keep It Simple, Stupid.”

Regardless of the era, most successful copywriters and ad specialists have always known that short copy is generally better-read than long.

And now, as smaller screens essentially take over the digital world, the days of copious copy flowing across a generous preview pane area are gone.

More fundamentally, people don’t have the screen size – let alone the patience – to wade through long copy. These days, the “sweet spot” in copy runs between 50 and 150 words.

Speaking of which … when it comes to e-mail subject lines, the ideal length keeps getting shorter and shorter. Research performed by SendGrid suggests that it’s now down to an average length of about seven words for the subject line.

And the subject lines that get the best engagement levels are a mere three or four words.

So it’s KISS on steroids: keeping it short as well as simple.

Note: The article copy above comes in at under 150 words …!

More Trouble in the Twittersphere

With each passing day, we see more evidence that Twitter has become the social media platform that’s in the biggest trouble today.

The news is replete with articles about how some people are signing off from Twitter, having “had it” with the politicization of the platform. (To be fair, that’s a knock on Facebook as well these days.)

Then there are reports of how Twitter has stumbled in its efforts to monetize the platform, with advertising strategies that have failed to generate the kind of growth to match the company’s optimistic forecasts. That bit of bad news has hurt Twitter’s share price pretty significantly.

And now, courtesy of a new analysis published by researchers at Indiana University and the University of Southern California, comes word that Twitter is delivering misleading analytics on audience “true engagement” with tweets.  The information is contained in a peer-reviewed article titled Online Human-Bot Interactions: Detection, Estimation and Characterization.

According to findings as determined by Indiana University’s Center for Complex Networks & Systems Research (CNetS) and the Information Sciences Institute at the University of Southern California, approximately 15% of Twitter accounts are “bots” rather than people.

That sort of news can’t be good for a platform that is struggling to elevate its user base in the face of growing competition.

But it’s even more troubling for marketers who rely on Twitter’s engagement data to determine the effectiveness of their campaigns. How can they evaluate social media marketing performance if the engagement data is artificially inflated?

Fifteen percent of all accounts may seem like a rather small proportion, but in the case of Twitter that represents nearly 50 million accounts.
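The arithmetic behind that “nearly 50 million” figure is simple enough to verify. Here’s the calculation, with the caveat that the ~319 million monthly active users is my assumption, based on Twitter’s publicly reported number around this period, not a figure from the study itself.

```python
# Rough arithmetic behind "nearly 50 million accounts."
# The MAU base is an assumed figure from Twitter's public reporting,
# not a number cited in the study or the article.

monthly_active_users = 319_000_000
bot_share = 0.15  # the study's upper-bound estimate

bot_accounts = monthly_active_users * bot_share
print(f"Estimated bot accounts: {bot_accounts / 1e6:.0f} million")
# Prints: Estimated bot accounts: 48 million
```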

To add insult to injury, the report notes that even the 15% figure is likely too low, because more sophisticated and complex bots could have appeared as “human” in the researchers’ analytical model, even though they aren’t.

There’s actually an upside to some social media bots – examples being automatic alerts of natural disasters or customer service responses. But there’s also growing evidence of nefarious applications.

Here’s one that’s unsurprising even if irritating: bots that emulate human behavior to manufacture “faux” grassroots political support.  But what about the delivery of dangerous or inciting propaganda thanks to bot “armies”?  That’s more alarming.

The latest Twitter-bot news is more confirmation of the deep challenges faced by this particular social media platform.  What’s next, I wonder?

B-to-B content marketers: Not exactly a confident bunch.

In the world of business-to-business marketing, all that really matters is producing a constant flow of quality sales leads.  According to Clickback CEO Kyle Tkachuk, three-fourths of B-to-B marketers cite lead generation as their most significant objective.  Pretty much everything else pales in significance.

This is why content marketing is such an important aspect of commercial marketing campaigns.  Customers in the commercial world are always on the lookout for information and insights to help them solve the variety of challenges they face on the manufacturing line, in their product development, quality assurance, customer service and any number of other critical functions.

Suppliers and brands that offer a steady diet of valuable and actionable information are often the ones that end up on a customer’s “short-list” of suppliers when the need to make a purchase finally rolls around.

Thus, the role of content marketers continues to grow – along with the pressures on them to deliver high-quality, targeted leads to their sales forces.

The problem is … a large number of content marketers aren’t all that confident about the effectiveness of their campaigns.

It’s a key takeaway finding from a survey conducted for content marketing software provider SnapApp by research firm Demand Gen.  The survey was conducted during the summer and fall of 2016 and published recently in SnapApp’s Campaign Confidence Gap report.

The survey revealed that more than 80% of the content marketers queried reported being just “somewhat” or “not very” confident regarding the effectiveness of their campaigns.

Among the concerns voiced by these content marketers is that the B-to-B audience is becoming less enamored of white papers and other static, lead-gated PDF documents used to generate leads.

And yet, those are precisely the vehicles that continue to be used most often to deliver informational content.

According to the survey respondents, B-to-B customers not only expect to be given content that is relevant, they’re also less tolerant of resources that fail to speak to their specific areas of interest.

For this reason, one-third of the content managers surveyed reported that they are struggling to come up with effective calls-to-action that capture attention, interest and action instead of being just “noise.”

The inevitable conclusion is that traditional B-to-B marketing strategies and similar “seller-centric” tactics have become stale for buyers.

Some content marketers are attempting to move beyond these conventional approaches and embrace more “content-enabled” campaigns that can address interest points based on a customer’s specific need and facilitate engagement accordingly.

Where such tactics have been attempted, content marketers report somewhat improved results, including higher open rates and an increase in clickthrough rates.

However, the degree of improvement doesn’t appear to be all that impressive. Only about half of the survey respondents reported experiencing improved open rates.  Also, two-thirds reported experiencing an increase in clickthrough rates – but only by 5% or less.

Those aren’t exactly eye-popping improvements.

But here’s the thing: Engagement levels with traditional “static” content marketing vehicles are likely to decline even further … so if content-enabled campaigns can arrest the drop-off and even notch improvements in audience engagement, that’s at least something.

Among the tactics content marketers are considering for creating more robust content-enabled campaigns are:

  • Video
  • Surveys
  • Interactive infographics
  • ROI calculators
  • Assessments/audits

The hope is that these and other tools will increase customer engagement, allow customers to “self-qualify,” and generate better-quality leads that are a few steps closer to an actual sale.

If all goes well, these content-enabled campaigns will also collect data that helps sales personnel accelerate the entire process.