Programmatic ad buying takes a hit.

We’re seeing some interesting new trends in programmatic ad buying. For years, purchasing online ads programmatically instead of directly with specific publishers or media companies has been on a steady increase.  No more.

MediaRadar has just released its latest Consumer Advertising Report covering ad spending, formats and buying patterns. The new report states that programmatic ad buying declined ~12% when comparing the first quarter of 2017 to the same period in 2016.

More specifically, whereas ~45,000 advertisers purchased advertising programmatically in Q1 2016, that figure dropped to ~39,500 for the same quarter this year.

This change in fortunes may come as a surprise to some. The market has generally been bullish on programmatic ad buying because those programs are far less labor-intensive to administer than direct advertising programs.

There have been ongoing concerns about potential fraud, the lack of transparency on ad pricing, and the lack of control over where advertisers’ placements actually appear – but up until now, these concerns weren’t strong enough to reverse the steady migration to programmatic buying.

Todd Krizelman, CEO of MediaRadar, had this to say about the new findings:

“For many years, the transition of dollars from direct ad buying to programmatic seemed inevitable, and impossible to roll back. But the near-constant drumbeat of concern over brand safety and fraud in the first six months of 2017 has slowed the tide.  There’s more buying of direct advertising, especially sponsored editorial, and programmatically there is a ‘flight to quality’.”

Krizelman touches on another major new finding from the MediaRadar report: how much better native advertising performs than traditional ad units. Audiences tend to look at advertorials more frequently than display ads, and the clickthrough rates on mobile native advertising, in particular, are running four times higher than those of mobile display ads.

Not surprisingly, the top market categories for native advertising are ones that lend themselves well to short, pithy stories. Travel, entertainment, home, food and apparel categories score well, as do financial and real estate stories.

The MediaRadar report is based on some pretty exhaustive statistics, with data analyzed from more than 265,000 advertisers covering the buying of digital, native, mobile, video, e-mail and print advertising. More detailed findings are available in the full report.

ESPN: What the heck just happened … and who’s to blame?

Last week, ESPN announced the layoffs of some 100 staffers, most of them on-air talent. This comes after layoffs of ~300 other personnel in 2015, but since those were behind-the-scenes employees, the news didn’t seem as momentous.

There are several factors coming together that make life particularly difficult for the sports network. One big problem is the commitment ESPN has made to pay top-dollar for the right to air professional sports events, particularly NFL and NBA games.

These financial commitments are set in stone and are made well into the future, which means that ESPN is committed to high long-term fixed costs (broadcast rights) in exchange for what’s turning out to be declining variable revenues (viewer subscription fees and advertising).

This isn’t a very good financial model at all.

Which brings us to the second big factor: declining subscribers.

Since 2011, the network has lost ~15 million subscribers. So far in 2017, it has been losing an average of ~10,000 subscribers per day.

The financial impact of these losses is significant. All of those lost subscribers amount to more than $1.3 billion per year that’s no longer going on ESPN’s books.
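
For a rough sense of where that figure comes from, here’s a back-of-the-envelope sketch. The affiliate fee used below is a widely cited industry estimate of what ESPN charges distributors per subscriber, not a number from ESPN itself:

```python
# Rough arithmetic behind the ~$1.3 billion figure. The ~$7.21/month
# affiliate fee is a widely cited industry estimate for ESPN in 2017,
# not a figure from the network.
lost_subscribers = 15_000_000      # subscribers lost since 2011
monthly_fee = 7.21                 # estimated fee per subscriber per month ($)

annual_revenue_loss = lost_subscribers * monthly_fee * 12
print(f"~${annual_revenue_loss / 1e9:.1f} billion per year")  # ~$1.3 billion
```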

Sports journalist Clay Travis predicts that if the current trajectory of subscriber losses continues, ESPN will begin losing money in 2021. (And that’s assuming the subscriber base losses don’t accelerate, an assumption that might be a little too rosy.)

The fundamental question is why so many people are no longer subscribing to ESPN. The predictable answer is because services like Hulu, Netflix and Amazon, with their on-demand services, are squeezing cable/satellite TV and its subscription business model.

One way Disney (ESPN’s parent company) has attempted to maximize viewer subscription revenues over the years has been by bundling the network with other, less lucrative Disney media properties like the History Channel, Vice, Disney Junior and the Lifetime Movie Network. In the Disney constellation of channels, ESPN has been the acknowledged “driver” of subscription revenues all along, with die-hard sports fans being willing to subsidize the other Disney channels – often never watched by these subscribers – as the price of access.

But something else is happening now:  ESPN itself has begun to lose viewers as well.

According to television industry publication Broadcasting & Cable, ESPN’s viewership rating has declined ~7% so far this year.  ESPN2’s rating is down even further – an eye-popping ~34%.

Percentages like those aren’t driven by “sidebar” incidental factors. Instead, they cut to the core of the programming and the content that’s being offered.

If there’s one programming factor that has tracked most closely with ESPN’s viewership declines, it’s the explosion in “sports-talk” programming versus actual “sports game” programming at the network. As Townhall opinion journalist Sean Davis has written:

“If you talk to sports fans and to people who have watched ESPN religiously for most of their lives, they’ll tell you that the problem is the lack of sports and a surplus of shows featuring people screaming at each other. The near-universal sentiment … is that the content provider sidelined actual sports in favor of carnival barkers.”

Davis points out the flaw in ESPN’s shift in colorful terms:

“ESPN went from the worldwide leader in sports to yet another expensive network of dumb people yelling dumb things at other dumb people, all the while forgetting that the most popular entertainment of people yelling about sports stuff for several hours a day – sports talk radio – is free.”

There’s an additional factor in the mix that’s a likely culprit in ESPN’s tribulations – the mixing of sports and politics. That programming decision has turned out to be a great big lightning rod for the network – with more downside than upside consequences.

The question is, why did ESPN even go in that direction?

Most likely, ESPN execs saw the tough numbers on higher costs, subscriber losses and lower ratings, and decided that the network needed a larger content pie to attract more consumers.

The reasoning goes, if some people like sports and others like politics, why not combine the two to attract a larger audience, thereby changing the trajectory of the figures?

But that reasoning flies in the face of how people consume political news. In the era of Obama and now Trump, political diehards gravitate to outlets that reinforce their own worldviews:  conservatives want news from conservatives; liberals want news from liberals.

MSNBC and the Fox News Channel have figured this out big-time.

But if you’re starting with a cross-partisan mass media audience for sports, as the original ESPN audience most certainly was, trying to combine that with politics means running the risk of losing one-half of your audience.

That’s what’s been happening with ESPN. Intertwining sports with coverage about bathrooms in North Carolina, transgender sports stars, gun control laws and proper national anthem etiquette only gets your numbers going in one direction (down).

The question for ESPN is how it plans to recalibrate and refocus its programming to truly defend its position as the worldwide leader in sports broadcasting. However it decides to position itself in terms of the delivery of its content – television, online, subscription, pay-per-view or other methods – it should refocus on covering live sports events.

Not sports talk … not debate … not politics or sociology, but the sports themselves.

At one time, not so long ago, sports were a safe refuge from politics and the news. ESPN would do itself – and its viewers – a favor if it sought to recapture that spirit.

More raps for Google on the “fake reviews” front.

Google’s trying to not have its local search initiative devolve into charges and counter-charges of “fake news” à la the most recent U.S. presidential election campaign – but is it trying hard enough?

It’s becoming harder for the reviews that show up on Google’s local search function to be considered anything other than “suspect.”

The latest salvo comes from search expert and author Mike Blumenthal, whose recent blog posts on the subject question Google’s willingness to level with its customers.

Mr. Blumenthal could be considered one of the premier experts on local search, and he’s been studying the phenomenon of fake information online for nearly a decade.

The gist of Blumenthal’s argument is that Google isn’t taking sufficient action to clean up fake reviews (and related service industry and affiliate spam) that appear on Google Maps search results, which is one of the most important utilities for local businesses and their customers.

Not only that, but Blumenthal also contends that Google is publishing reports which represent “weak research” that “misleads the public” about the extent of the fake reviews problem.

Google contends that the problem isn’t a large one. Blumenthal feels differently – in fact, he claims the problem is growing worse, not better.

In a blog article published this week, Blumenthal outlines how he’s built out spreadsheets of reviewers and the businesses on which they have commented.

From this exercise, he sees a pattern of fake reviews being written for overlapping businesses – telltale signs that Google’s algorithms have somehow missed.

A case in point: three “reviewers” — “Charlz Alexon,” “Ginger Karime” and “Jen Mathieu” — have all “reviewed” three very different businesses in completely different areas of the United States:  Bedoy Brothers Lawn & Maintenance (Nevada), Texas Car Mechanics (Texas), and The Joint Chiropractic (Arizona, California, Colorado, Florida, Minnesota, North Carolina).

They’re all 5-star reviews, of course.

It doesn’t take a genius to figure out that “Charlz Alexon,” “Ginger Karime” and “Jen Mathieu” won’t be found in the local telephone directories where these businesses are located. That’s because they’re figments of some spammer-for-hire’s imagination.
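
As a rough illustration of the kind of overlap analysis Blumenthal describes, here’s a minimal sketch that flags pairs of reviewer accounts whose review histories intersect across multiple unrelated businesses. The data mirrors the example above; the threshold and approach are hypothetical illustrations, not Blumenthal’s actual methodology:

```python
# Flag reviewer accounts whose review histories overlap suspiciously --
# a hypothetical sketch, not Blumenthal's actual method.
from collections import defaultdict
from itertools import combinations

# (reviewer, business) pairs, as might be compiled into a spreadsheet
reviews = [
    ("Charlz Alexon", "Bedoy Brothers Lawn & Maintenance"),
    ("Charlz Alexon", "Texas Car Mechanics"),
    ("Charlz Alexon", "The Joint Chiropractic"),
    ("Ginger Karime", "Bedoy Brothers Lawn & Maintenance"),
    ("Ginger Karime", "Texas Car Mechanics"),
    ("Ginger Karime", "The Joint Chiropractic"),
    ("Jen Mathieu", "Bedoy Brothers Lawn & Maintenance"),
    ("Jen Mathieu", "Texas Car Mechanics"),
    ("Jen Mathieu", "The Joint Chiropractic"),
    ("Real Customer", "Texas Car Mechanics"),
]

businesses_by_reviewer = defaultdict(set)
for reviewer, business in reviews:
    businesses_by_reviewer[reviewer].add(business)

# Reviewer pairs sharing several very different businesses in different
# regions are unlikely to be genuine; 3 shared businesses is an arbitrary cutoff.
OVERLAP_THRESHOLD = 3
for a, b in combinations(businesses_by_reviewer, 2):
    shared = businesses_by_reviewer[a] & businesses_by_reviewer[b]
    if len(shared) >= OVERLAP_THRESHOLD:
        print(f"Suspicious overlap: {a} / {b}: {sorted(shared)}")
```

A production system would also weigh signals like review timing, star-rating uniformity and geographic spread – but even this simple cross-tabulation surfaces the pattern Blumenthal spotted by hand.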

The question is, why doesn’t Google develop procedures to figure out the same obvious answers Blumenthal can see plain as day?

And the follow-up question: How soon will Google get serious about banning reviewers who post fake reviews on local search results?  (And not just for the “usual suspect” types of businesses, but also for professional categories such as physicians and attorneys.)

“If their advanced verification [technology] is what it takes to solve the problem, then stop testing it and start using it,” Blumenthal concludes.

To my mind, it would be in Google’s own interest to get to the bottom of these nefarious practices. If the general public comes to view reviews as “fake, faux and phony,” that’s just one step before people stop using local search results at all – which would hurt Google in the pocketbook.

Might it get Google’s attention then?

When people think “search,” they still think “Google.”

… And they might say it, too — thanks to the rise of voice search.

Over the years, many things have changed in the world of cyberspace. But one thing seems to be pretty much a constant:  When people are in “search” mode online, most of them are playing in Google’s ballpark.

This behavior has been underscored yet again in a new survey of ~800 consumers conducted by Fivesight Research.

Take a look at these two statistics that show how strong Google’s search popularity remains today:

  • Desktop users: ~79% of searches are on Google
  • Smartphone users: ~86% use Google

The smartphone figure above is even more telling in that the percentage is that high whether users are on an iPhone or an Android system.

But here’s another very interesting finding from the Fivesight survey: Google’s biggest competition isn’t other search engines like Bing or Yahoo.  Instead, it’s Siri, which now accounts for ~6% of mobile search market share.

So what we’re seeing in search isn’t a shift to other providers, but rather a shift into new technologies. To illustrate, nearly three in four consumers are using voice technologies such as Siri, Google Now and Microsoft Cortana to supplement their traditional search activities.

Some marketing specialists contend that “voice search is the new search” – and it’s hard not to agree with them. Certainly, voice search has become easier in the past year or so as more mobile devices as well as personal home assistants like Amazon Alexa have been adopted by the marketplace.

It also helps that voice recognition technology continues to improve in quality, dramatically reducing the incidence of “machine mistakes” in understanding the meaning of voice search queries.

But whether it’s traditional or voice-activated, I suspect Google will continue to dominate the search segment for years to come.

That may or may not be a good thing for consumers. But it’s certainly a good thing for Google – seeing as how woefully ineffective the company has been in coming up with any other business endeavor even remotely as financially lucrative as its search business.

In copywriting, it’s the KISS approach on steroids today.

… and it means “Keep It Short, Stupid” as much as it does “Keep It Simple, Stupid.”

Regardless of the era, most successful copywriters and ad specialists have always known that short copy is generally better-read than long.

And now, as smaller screens essentially take over the digital world, the days of copious copy flowing across a generous preview pane area are gone.

More fundamentally, people don’t have the screen real estate – let alone the patience – to wade through long copy. These days, the “sweet spot” in copy runs between 50 and 150 words.

Speaking of which … when it comes to e-mail subject lines, the ideal length keeps getting shorter and shorter. Research performed by SendGrid suggests that it’s now down to an average length of about seven words for the subject line.

And the subject lines that get the best engagement levels are a mere three or four words.

So it’s KISS on steroids: keeping it short as well as simple.

Note: The article copy above comes in at under 150 words …!

More Trouble in the Twittersphere

With each passing day, we see more evidence that Twitter has become the social media platform that’s in the biggest trouble today.

The news is replete with articles about how some people are signing off from Twitter, having “had it” with the politicization of the platform. (To be fair, that’s a knock on Facebook as well these days.)

Then there are reports of how Twitter has stumbled in its efforts to monetize the platform, with advertising strategies that have failed to generate the kind of growth to match the company’s optimistic forecasts. That bit of bad news has hurt Twitter’s share price pretty significantly.

And now, courtesy of a new analysis published by researchers at Indiana University and the University of Southern California, comes word that Twitter is delivering misleading analytics on audience “true engagement” with tweets.  The information is contained in a peer-reviewed article titled “Online Human-Bot Interactions: Detection, Estimation, and Characterization.”

According to findings as determined by Indiana University’s Center for Complex Networks & Systems Research (CNetS) and the Information Sciences Institute at the University of Southern California, approximately 15% of Twitter accounts are “bots” rather than people.

That sort of news can’t be good for a platform that is struggling to elevate its user base in the face of growing competition.

But it’s even more troubling for marketers who rely on Twitter’s engagement data to determine the effectiveness of their campaigns. How can they evaluate social media marketing performance if the engagement data is artificially inflated?

Fifteen percent of all accounts may seem like a rather small proportion, but in the case of Twitter that represents nearly 50 million accounts.

To add insult to injury, the report notes that even the 15% figure is likely too low, because more sophisticated and complex bots could have appeared as “human” in the researchers’ analytical model, even though they aren’t.
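
To put the study’s numbers in context – and to show how a marketer might discount raw engagement figures – here’s a back-of-the-envelope sketch. The user-base figure is Twitter’s own reported number from around this time; the campaign figures and the assumption that bots engage at the same rate as humans are hypothetical simplifications:

```python
# Back-of-the-envelope math on the bot estimate, plus a hypothetical
# engagement adjustment. All campaign numbers are made up for illustration.
monthly_active_users = 319_000_000    # Twitter's reported MAU, circa early 2017
bot_share = 0.15                      # researchers' (possibly conservative) estimate

bot_accounts = monthly_active_users * bot_share
print(f"Estimated bot accounts: ~{bot_accounts / 1e6:.0f} million")  # ~48 million

# Discount a campaign's measured engagements by the bot share, assuming
# (simplistically) that bots engage at the same rate as humans do.
measured_engagements = 10_000         # hypothetical campaign total
human_engagements = measured_engagements * (1 - bot_share)
print(f"Bot-adjusted engagements: ~{human_engagements:.0f}")  # ~8,500
```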

There’s actually an upside to some social media bots – examples being automatic alerts of natural disasters or customer service responses. But there’s also growing evidence that nefarious applications abound.

Here’s one that’s unsurprising even if irritating: bots that emulate human behavior to manufacture “faux” grassroots political support.  But what about the delivery of dangerous or inciting propaganda thanks to bot “armies”?  That’s more alarming.

The latest Twitter-bot news is more confirmation of the deep challenges faced by this particular social media platform.  What’s next, I wonder?

B-to-B content marketers: Not exactly a confident bunch.

In the world of business-to-business marketing, all that really matters is producing a constant flow of quality sales leads.  According to Clickback CEO Kyle Tkachuk, three-fourths of B-to-B marketers cite lead generation as their most significant objective.  Pretty much everything else pales in significance.

This is why content marketing is such an important aspect of commercial marketing campaigns.  Customers in the commercial world are always on the lookout for information and insights to help them solve the variety of challenges they face on the manufacturing line, in their product development, quality assurance, customer service and any number of other critical functions.

Suppliers and brands that offer a steady diet of valuable and actionable information are often the ones that end up on a customer’s “short-list” of suppliers when the need to make a purchase finally rolls around.

Thus, the role of content marketers continues to grow – along with the pressures on them to deliver high-quality, targeted leads to their sales forces.

The problem is … a large number of content marketers aren’t all that confident about the effectiveness of their campaigns.

It’s a key takeaway finding from a survey conducted for content marketing software provider SnapApp by research firm Demand Gen.  The survey was conducted during the summer and fall of 2016 and published recently in SnapApp’s Campaign Confidence Gap report.

The survey revealed that more than 80% of the content marketers queried reported being just “somewhat” or “not very” confident regarding the effectiveness of their campaigns.

Among the concerns voiced by these content marketers is that the B-to-B audience is becoming less enamored of white papers and other static, lead-gated PDF documents as lead-generation vehicles.

And yet, those are precisely the vehicles that continue to be used most often to deliver informational content.

According to the survey respondents, B-to-B customers not only expect to be given content that is relevant, they’re also less tolerant of resources that fail to speak to their specific areas of interest.

For this reason, one-third of the content managers surveyed reported that they are struggling to come up with effective calls-to-action that capture attention, interest and action instead of being just “noise.”

The inevitable conclusion is that traditional B-to-B marketing strategies and similar “seller-centric” tactics have become stale for buyers.

Some content marketers are attempting to move beyond these conventional approaches and embrace more “content-enabled” campaigns that can address interest points based on a customer’s specific need and facilitate engagement accordingly.

Where such tactics have been attempted, content marketers report somewhat improved results, including better open rates and an increase in clickthrough rates.

However, the degree of improvement doesn’t appear to be all that impressive. Only about half of the survey respondents reported experiencing improved open rates.  Also, two-thirds reported experiencing an increase in clickthrough rates – but only by 5% or less.

Those aren’t exactly eye-popping improvements.

But here’s the thing: Engagement levels with traditional “static” content marketing vehicles are likely to decline further … so if content-enabled campaigns can arrest the drop-off and even notch improvements in audience engagement, that’s at least something.

Among the tactics content marketers are considering for creating more robust content-enabled campaigns are:

  • Video
  • Surveys
  • Interactive infographics
  • ROI calculators
  • Assessments/audits

The hope is that these and other tools will increase customer engagement, allow customers to “self-qualify,” and generate better-quality leads that are a few steps closer to an actual sale.

If all goes well, these content-enabled campaigns will also collect data that helps sales personnel accelerate the entire process.

Is the Phenomenon of “Fake News” Overhyped?

In the wake of recent election campaigns and referenda in places like the United States, the United Kingdom, France, Austria and the Philippines, it seems that everyone’s talking about “fake news” these days.

People all across the political and socio-economic spectrum are questioning whether the publishing and sharing of “faux” news items is having a deleterious impact on public opinion and actually changing the outcome of consequential events.

The exact definition of the term is difficult to discern, as some people are inclined to level the “fake news” charge against anyone with whom they disagree.

Beyond this, I’ve noticed that some people assign nefarious motives – political or otherwise – to the dissemination of all such news stories.  Often the motive is different, however: over-hyped headlines – many of them having nothing to do with politics or public policy but instead focusing on celebrities or “freak” news events – serve as catnip-like clickbait for viewers who can’t resist the urge to find out more.

From the news consumer’s perspective, the vast majority of people think they can spot “fake” news stories when they encounter them. A recent Pew survey found that ~40% of respondents felt “very confident” knowing whether a news story is authentic, and another ~45% felt “somewhat confident” of that fact.

But how accurate are those perceptions really? A recent survey from BuzzFeed and Ipsos Public Affairs found that people who use Facebook as their primary source of news believed fake news headlines more than eight out of ten times.

That’s hardly reassuring.

And to underscore how many people are using Facebook versus more traditional news outlets as a “major” source for their news, this BuzzFeed chart showing the Top 15 information sources says it all:

  • CNN: ~27% of respondents use as a “major source” of news
  • Fox News: ~27%
  • Facebook: ~23%
  • New York Times: ~18%
  • Google News: ~17%
  • Yahoo News: ~16%
  • Washington Post: ~12%
  • Huffington Post: ~11%
  • Twitter: ~10%
  • BuzzFeed News: ~8%
  • Business Insider: ~7%
  • Snapchat: ~6%
  • Drudge Report: ~5%
  • Vice: ~5%
  • Vox: ~4%

Facebook’s algorithm change in 2016 to emphasize friends’ posts over publishers’ has turned that social platform into a pretty big hotbed of fake news activity, as people can’t resist sharing even the most outlandish stories to their network of friends.

Never mind Facebook’s recent steps to change the dynamics by sponsoring fact-checking initiatives and banning fraudulent websites from its ad network; by the accounts I’ve read, it hasn’t done all that much to curb the orgy of misinformation.

Automated ad buying isn’t helping at all either, as it’s enabling the fake news “ecosystem” big-time. As Digiday senior editor Lucia Moses explains it:

“One popular method … is tapping the competitive market for native ad widgets. Taboola, Revcontent, Adblade and Content.ad are prominently displayed on sites identified with fake news, while there are a few retargeted and programmatic ads sprinkled in. Publishers install these native ad widgets with a simple snippet of code — typically after an approval process — and when readers click on paid links in the widget, the host publisher makes money.  The ads are made to appear like related-content suggestions and often promote sensational headlines and direct-marketing offers.”

So attempting to solve the “fake news” problem is a lot more complicated than some people might realize – and it certainly isn’t going to improve because of any sort of “political” change of heart. Forrester market analyst Susan Bidel sums it up thus:

“While steps taken by … entities to curb fake news are admirable, as long as fake news generators can make money from their efforts, the problem won’t go away.”

So there we are. Bottom line: fake news is going to be with us for the duration – whether people like it or not.

What about you? Do you think you can spot every fake news story?  Or do you think at least a few of them come in below the radar?

Thanks to IoT, search is morphing into “just-in-time knowledge.”

In today’s world of marketing, it’s been obvious for some time that the pace of technological change is dramatically shortening the life cycle of marketing techniques.

Consider online search. Twenty-five years ago it was hardly a blip on the radar screen.  Picking up momentum, paid search soon began to rival traditional forms of advertising, as companies took advantage of promo programs offered by Google and others that aligned neatly with consumers when they were on the hunt for products, services and solutions.

Google has attracted billions upon billions of dollars in search advertising revenue, becoming one of the biggest corporations in the world, even as entire industries have grown up around optimizing companies’ website presence and relevance so as to rank highly in search query results.

And now, thanks to continuing technology evolution and the emergence of the Internet of Things, the next generation of search is upon us – and it looks likely to make keyboards and touchscreens increasingly irrelevant within a few short years.

Searches without screens are possible thanks to technology like Google Assistant, Amazon Echo/Alexa, and software development kits from providers like SoundHound and Microsoft.

This past October, market forecasting firm Gartner came out with an interesting prediction: within four years, ~30% of all searches will be carried out without a screen.

It’s happening already, actually. In web search, Amazon Echo answers voice queries, while the Bing knowledge and action graph allows Microsoft to provide direct answers to queries rather than the set of candidate links that has been the norm up to now.

Gartner envisions voice interactions overtaking typed search queries because voice is so much easier, faster and more intuitive for consumers. By eliminating the need for people to use their eyes and hands to search and browse, voice interactions keep web sessions useful even while people are multitasking (walking, driving, socializing, exercising and the like).

Related to this, Gartner also predicts that one in five brands will have abandoned offering mobile apps by 2019. Already, many companies have found disappointing levels of adoption, engagement and ROI pertaining to the mobile apps they’ve introduced, and the prognosis is no better going forward; the online consumer is already moving on.

Gartner’s predictions go even further. It envisions ever-higher levels of what it calls “just-in-time knowledge” – essentially trading the act of searching for knowledge for simply getting answers to voice queries.

Speaking personally, this prediction concerns me a little. I think that some people may not fully grasp the implications of what Gartner is forecasting.

To me, “just-in-time knowledge” sounds uncomfortably close to being “ill-educated” (as opposed to “uneducated”).  Sometimes, knowing a little bit about something is more dangerous than knowing nothing at all. Bad decisions often come from possessing a bit of knowledge — but with precious little “context” surrounding it.

With “just-in-time knowledge,” think of how many people could now fall into that kind of trap.