Evidently, America isn’t in IKEA’s manufacturing future …

Going, going, gone …

Over the past several years, the political mantra has been that jobs are now coming back to the United States – particularly manufacturing ones.

That may well be. But this past week we’ve learned that IKEA plans to close its last remaining U.S. production facility.  The iconic home furnishings company has announced that it will be closing its manufacturing plant in Danville, Virginia by the end of the year.

The Danville plant makes wood-based furniture and furnishings for IKEA’s retail store outlets in the United States and Canada.

The reason for the plant closure, as it turns out, is a bit ironic. According to IKEA, high raw materials costs in North America are triggering the move, because those costs are actually significantly lower in Europe than they are here.  Even accounting for other input costs like labor that are higher in Europe, shifting production to Europe will keep product prices lower for U.S. retailers, IKEA claims.

So much for the notion that imports from Europe are overpriced compared to domestically produced ones!

The Danville plant isn’t even that old, either. Far from being some inefficient, multi-story dinosaur left over from a half-century ago, the manufacturing facility opened in 2008, making it only about a decade old.  At its peak the plant employed around 400 people.

IKEA made staff cuts of around 20% earlier in the year, before following up with this latest announcement, which will wipe out 300 more jobs in a community that can scarcely withstand such large economic shocks.

With the closure of Danville, IKEA will still have more than 40 production plants operating around the world. It employs around 20,000 workers in those plants (out of a total workforce of ~160,000, most of which are employed in the company’s vast retail and distribution business activities).

So, it doesn’t appear that IKEA will be exiting the manufacturing sector anytime soon.  It’s just that … those manufacturing activities no longer include the United States.

As a certain well-known U.S. political leader might say, “Sad!”

DMARC’s job of demarcating: How well is it doing?

In the drive to keep the onslaught of fake e-mail communications under control, DMARC’s checks on incoming e-mail are an important weapon in the Internet police’s bag of tricks.  A core tactic of cyber felons is impersonation, which is what catches most unwitting recipients unawares.

So … how is DMARC doing?

Let’s give it a solid C or C+.

DMARC, which stands for Domain-based Message Authentication, Reporting and Conformance, is a procedure that checks on the veracity of the senders of e-mail. Nearly 80% of all inboxes – that’s almost 5.5 billion – conduct DMARC checks, and nearly 750,000 domains apply DMARC as well.

Ideally, DMARC is designed to satisfy the following requirements to ensure as few suspicious e-mails as possible make it to the inbox:

  • Minimize false positives
  • Provide robust authentication reporting
  • Assert sender policy at receivers
  • Reduce successful phishing delivery
  • Work at Internet scale
  • Minimize complexity
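For the curious, a domain asserts its DMARC policy by publishing a DNS TXT record at `_dmarc.<domain>`. Here is a minimal Python sketch of parsing such a record into its tag/value pairs; the example record and report address are illustrative, not taken from any real domain:

```python
def parse_dmarc(record: str) -> dict:
    """Split a DMARC TXT record into its tag=value pairs (RFC 7489 syntax)."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if not part:
            continue
        key, _, value = part.partition("=")
        tags[key.strip()] = value.strip()
    return tags

# A typical record: quarantine failing mail, send aggregate reports.
record = "v=DMARC1; p=quarantine; pct=100; rua=mailto:reports@example.com"
policy = parse_dmarc(record)
print(policy["p"])  # the receiver-side policy tag: quarantine
```

The `p=` tag is what drives the "assert sender policy at receivers" requirement above: `none`, `quarantine`, or `reject` tells receiving mail servers what to do with messages that fail authentication.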

But the performance picture is actually rather muddy.

According to a new study by cyber-security firm Valimail, people are being served nearly 3.5 billion suspicious e-mails each day. That’s because DMARC’s success rate at ferreting out and quarantining the faux stuff runs only around 20%.  And while America has much better DMARC performance than other countries, the United States still accounts for nearly 40% of all suspicious e-mail that makes it through to inboxes, due to the sheer volume of e-mails involved.

In developing its findings, Valimail analyzed data from billions of authentication requests and nearly 20 million publicly accessible DMARC and SPF (Sender Policy Framework) records.  The Valimail findings also reveal that there’s a pretty big divergence in DMARC usage based on the type of entity. DMARC usage is highest within the U.S. federal government and large technology companies, where it exceeds 20% penetration.  By contrast, it’s much lower in other commercial segments.

The commercial sector’s situation is mirrored in a survey of ~1,000 e-mail security and white-collar professionals conducted by GreatHorn, a cloud-native communication security platform, which found that nearly one in four respondents receive phishing or other malicious e-mails daily, and an additional ~25% receive them weekly.  These include impersonations, payload attacks, business services spoofing, wire transfer requests, W2 requests and attempts at credential theft.

The GreatHorn study contains this eyebrow-raising finding as well:  ~22% of the businesses surveyed have suffered a breach caused by malicious e-mail in the last quarter alone.  The report concludes:

“There is an alarming sense of complacency at enterprises at the same time that cybercriminals have increased the volume and sophistication of their e-mail attacks.”

Interestingly, in its study Valimail finds that the government has the highest DMARC enforcement success rate, followed by U.S. technology and healthcare firms (but those two sectors lag significantly behind). It may be one of the few examples we have of government performance outstripping private practitioners.

Either way, much work remains to be done to reduce faux e-mail further.  We’ll have to see how things improve in the coming months and years.

Have KPIs become a crutch for businesses?

Relying on Key Performance Indicators has become the norm in many business operations. And why not?  Properly defined and managed, KPIs help businesses focus on the right priorities and chart progress towards their goals.

But even well-designed KPIs have their limitations. By their nature, they’re not greatly insightful (they’re indicators, after all).  The problem is that very often, KPIs are used as if they are.

One of the attractions of focusing on KPIs is their simplicity. Managers love boiling things down to concise, action-oriented statements and phrases.  We hear it all the time from senior leadership.  “Give us the bottom-line finding,” they emphasize.

“Business by bullet-point,” if you will.

But here’s the thing: Because of their distilled simplicity, KPIs can lure many a businessperson into overestimating the insights that they’re able to provide.

KPIs do provide a jumping-off point, but the underlying “why” is often still conjecture or a hypothesis. It takes discipline to look for deeper insights and corroborating evidence to really understand what KPIs are saying to us.

Addressing this issue, Shiv Gupta, data analytics specialist par excellence and head of Quantum Sight, has noted:

“Anyone who has worked on developing KPIs knows that it is a game of balance and compromise based on business objectives. The need for actionable information battles with the desire for simple metrics.”

Database marketer Stephen Yu of Willow Data Strategy makes another great point when he writes:

“We all have seen many “death by KPI” [situations] when organizations look at things the wrong way. When someone is lost while driving, [to] keep looking at the dashboard of the car won’t get the driver out of trouble. In a time like that, one must turn on a navigator.  Different solutions call for different analytics, and popular KPIs – no matter how insightful they may have been – often do not lead to solutions.”

What have been your experiences in working with KPIs in your business? How have they helped … or not?  Please share your thoughts and perspectives with other readers here.

“By any means necessary”: China’s Huawei Technologies flies close to the sun in its quest to commandeer proprietary technology.

Not all-smiles at the moment … Chinese leader Xi Jinping.

In China, it’s difficult to discern where private industry ends and the government begins. At some level, we’ve been aware of that conundrum for decades.

Still … opportunities for doing business in the world’s largest country have been a tempting siren call for American companies. And over the past 15+ years, conducting that business has seemed like the “right and proper” thing to do — what with China joining the G-8+5 economic powers along with incessant cheerleading by the U.S. Department of Commerce, abetted by proactive endeavors of other quasi-governmental groups promoting the interests of American commerce across the globe.

But it’s 2019 and circumstances have changed. It began with a change in political administrations in the United States several years ago. Since then, a great deal more credence has been given to the undercurrent of unease businesspeople have felt about the way supposedly proprietary engineering and manufacturing technologies have suddenly popped up in China as if by magic, pulling the rug out from under American producers.

Nearly three years into the new presidential administration, we’re seeing evidence of this “new skepticism” begin to play out in concrete ways. One of the most eye-catching developments – and a stunning fall from grace – is Huawei Technologies Co., Ltd. (world headquarters: Shenzhen, China), one of the world’s largest makers of cellphones and high-end telecom equipment.

As recounted by NPR’s Weekend Edition reporter Emily Feng a few days ago, Huawei stands accused of some of the most blatant forms of technology-stealing.  Recently, the Trump administration banned American companies from using Huawei equipment in U.S. 5G infrastructure, and it is planning to implement even more punitive measures that will effectively prevent U.S. companies from doing any business at all with Huawei.

Banning of Huawei equipment in U.S. 5G infrastructure isn’t directly related to the theft of intellectual property belonging to Huawei’s prospective U.S. suppliers.  Rather, it’s a response to the perceived threat that the Chinese government will use Huawei equipment installed in U.S. 5G mobile networks to surreptitiously conduct espionage for military, political or economic purposes far into the future.

In other words, as one of the world’s largest telecom players, Huawei is perceived as a direct threat to non-Chinese interests not just on one front, but two: the demand side and the supply side.  The demand-side threat is why the Trump administration has banned Huawei equipment in U.S. 5G infrastructure, and it has also publicly warned the U.K. government to implement a similar ban.

As for the supply side, the Weekend Edition report recounts the intellectual property theft experience of U.S.-based AKHAN Semiconductor when it started working with Huawei. AKHAN has developed and perfected an ingenious form of diamond-coated glass – a rugged engineered surface perfectly suited for smartphone screens.

Huawei expressed interest in purchasing the engineered glass for use in its own products. Nothing wrong with that … but Huawei used product samples provided by AKHAN under strict usage-and-return guidelines to reverse-engineer the technology, in direct contravention of those explicit conditions – and in violation of U.S. export control laws as well.

AKHAN discovered the deception because its product samples had been broken into pieces via laser cutting, and only a portion of them were returned to AKHAN upon demand.

When confronted about the matter, Huawei’s company officials in America admitted flat-out that the missing pieces had been sent to China.  AKHAN enlisted the help of the FBI, and in the ensuing months was able to build a sufficient case that resulted in a raid on Huawei’s U.S. offices in San Diego.

The supply side and demand side threats are two fronts — but they are related.  One of the biggest reasons Huawei kit has been selected, or is being considered, for deployment on 5G mobile networks worldwide is its low cost. The Chinese government, so the thinking goes, “seduces” telecom operators into buying the Huawei kit by undercutting all competitors, thereby gaining access to countless espionage opportunities. To maintain its financial footing Huawei must keep its costs as low as it can, and one way is to avoid R&D expenses by stealing intellectual property from would-be suppliers.

AKHAN is just the latest – if arguably the most dramatic – example of Huawei’s pattern of technology “dirty tricks” — others being a suit brought by Motorola against Huawei for stealing trade secrets (settled out of court), and T-Mobile’s suit for copying a phone-testing robot which resulted in Huawei paying millions of dollars in damages.

The particularly alarming – and noxious – part of the Huawei saga is that many of its employees in the United States (nearly all of them Chinese) weren’t so keen on participating in the capers, but found that their concerns and warnings went unheeded back home.

In other words – the directive was to get the technology and the trade secrets, come what may.

This kind of behavior is one borne from something that’s far bigger than a single company … it’s a directive that’s coming from “China, Inc.”  Translation: The Chinese government.

The actions of the Trump administration regarding trade policy and protecting intellectual property can seem boorish, awkward and even clumsy at times. But in another sense, they’re a breath of fresh air after decades of the well-groomed, oh-so-proper “experts” who thought they were the smartest people in the room — but were being taken to the cleaners again and again.

What are your thoughts about “yesterday, today and the future” of trade, industrial espionage and technology transfer vis a vis China? Are we in a new era of tougher controls and tougher standards, or is this going to be only a momentary setback in China’s insatiable desire to become the world’s most important economy?  Please share your thoughts and perspectives with other readers here.

Boeing: Late to the reputation recovery party? Or not showing up at all?

Debris field from the Ethiopian Airlines plane crash (March 10, 2019).

It’s been exactly two months since the crash of the Ethiopian Airlines Boeing 737 MAX 8 that killed all 157 passengers and crew on board. But as far as Boeing’s PR response is concerned, it might as well have never happened.

Of course, sticking one’s corporate head in the sand doesn’t make problems go away — and in the case of Boeing, clearly the markets have been listening.

Since the crash, Boeing stock has lost more than $27 billion in market value — or nearly 15% — from its top value of $446 per share.

The problem is, the Ethiopian incident has laid bare stories of whistle-blowers and ongoing maintenance issues with Boeing planes. But the company seems content to let these stories just hang out there, suspended in the air.

With no focused, coherent corporate response, Boeing is casting even greater doubt in the minds of the air-traveling public about the quality and viability of the 737 — and Boeing aircraft in general.

Even if just 20% or 25% of the air traveling public ends up having bigger doubts, that would have (and is having) a big impact on the share price of Boeing stock.

And so the cycle of mistrust and reputational damage continues.  What has Boeing actually done in the past few months to reverse the significant market value decline of the company? Whatever the company may or may not be undertaking isn’t having much of an impact on the “narrative” that’s taken shape about Boeing being a company that doesn’t “sweat the small stuff” with proper focus.

For an enterprise of the size and visibility of Boeing, being reactive isn’t a winning PR strategy. Waiting for the next shoe to drop before you develop and launch your response narrative doesn’t cut it, either.

Far from flying below radar, Boeing’s “non-response response” is actually saying something loud and clear. But in its case, “loud and clear” doesn’t seem to be ending up anyplace particularly good for the Boeing brand and the company’s reputation.

What are your thoughts about the way Boeing has handled the recent news about its model 737 aircraft? What do you think it could have done better?  Please share your thoughts with other readers here.

Ruling the roost: Poultry is poised to become the world’s most consumed protein.

This 2015 projection of protein production published by The Wall Street Journal has been upended by the spread of African swine fever; poultry will overtake pork this year instead.

In the United States it seems hardly news that poultry is the most-consumed protein. In recent years poultry consumption in America has grown while beef consumption has stagnated, weighed down by high prices at the consumer level.

At the same time, the National Pork Board committed an unforced error earlier in this decade when it abandoned its longstanding (and doubtless highly effective) tagline “The Other White Meat” in favor of the mealy-mouthed platitude “Pork: Be inspired” – a slogan that convinces no one of anything.

Persistent reports from the medical community that consuming red meat is less healthy than consuming poultry and fish haven’t helped, either.

Poultry’s prominence in the American market hasn’t necessarily extended to many other parts of the world, however. But that’s now changing.

In fact, according to reporting from a recently-concluded International Poultry Council meeting in the Netherlands, poultry is poised to become the most consumed meat protein in 2019.

The precipitating factor is African swine fever, which is now affecting pig herds in 15 countries on three continents. Pork production losses this year are expected to represent ~14% of the world’s pork supply – and that’s just the minimum forecast; the losses could go higher.

Interestingly, African swine fever’s most significant initial outbreaks were in Russia and Eastern Europe, but now East Asia is being affected most significantly. The first cases in China were found beginning in August 2018, but the disease has since spread rapidly throughout the country.  For a country that is responsible for nearly half of the world’s supply of pigs, that’s a very big deal.

The swine fever is spreading to nearby Vietnam as well – the world’s fifth-largest producer of pork.

The problem for pig growers is that African swine fever is the quintessential death sentence: The disease has a 100% mortality rate, and no vaccine has been developed to guard against its spread.

According to global food and agriculture financing firm Rabobank, China is expected to experience a ~30% drop in pork supplies this year, which in turn will mean a decline in total world protein supplies. The twin results of these developments:  an increase in prices for all proteins … and poultry overtaking pork this year as the world’s most consumed protein.

Until such time when an effective vaccine against African swine fever is developed, we can expect that production of other proteins like poultry, eggs, beef and seafood will rise. So, it seems as though poultry’s presence as the world’s most-consumed protein will likely endure.  Poultry’s position as the protein leader may have stemmed from a different impetus in the United States than in the rest of the world, but everyone has ended up in the same place.

For PCs, a new lease on life.

There are some interesting results being reported so far this year in the world of “screens.” While smartphones and tablets have seen lackluster growth — even a plateauing or a decline of sales — PCs have charted their strongest growth in years.

As veteran technology reporter Dan Gallagher notes in a story published recently in The Wall Street Journal, “PCs have turned out to be a surprising bright spot in tech’s universe of late.”

In fact, Microsoft and Intel Corporation have been the brightest stars among the large-cap tech firms so far this year. Intel’s PC chip division’s sales are up ~16% year-over-year and now exceed $10 billion.

The division of Microsoft that includes licensing from its Windows® operating system plus sales of computer devices reports revenues up ~15% as well, nearing $11 billion.

The robust performance of PCs is a turnaround from the past five years or so. PC sales actually declined after 2011, which was the year when PC unit sales had achieved their highest-ever figure (~367 million).  Even now, PC unit sales are down by roughly 30% from that peak figure.

But after experiencing notable growth at the expense of PCs, tablet devices such as Apple’s iPad and various Android products have proven to be solid replacements for PCs only at the bottom end of the scale — for people who use them mainly for tasks like media consumption and managing e-mail.

For other users — including most of the corporate world that runs on Windows® — tablets and smartphones can’t replace a PC for numerous tasks.

But what’s also contributing to the return of robust PC sales are so-called “ultra-mobile” devices — thin, lightweight laptops that provide the convenience of tablets with all of the functionality of a PC.  Those top-of-the-line models are growing at double-digit rates and are expected to continue to outstrip rates of growth in other screen segments including smartphones, tablets, and conventional-design PCs.

On top of this, the continuing adoption of Windows 10 by companies that will soon face the end of Microsoft’s extended support for the Windows 7 platform (happening in early 2020) promises to contribute to heightened PC sales in 2019 and 2020 as well.

All of this good news is being reflected in the share prices of Intel and Microsoft stock; those shares have gone up following their most recent earnings reports, whereas all of the other biggies in the information tech sector — including Alphabet, Amazon, Apple, Facebook, IBM, Netflix and Texas Instruments — are down.

It’s interesting how these things ebb and flow …

China-bashing is taking its toll.

Over the past year, Americans have been fed a fairly steady stream of news about the People’s Republic of China – and most of it hasn’t been particularly positive.

While Russia may get the more fevered news headlines because of the various political investigations happening in Washington, the current U.S. presidential administration hasn’t shied away from criticizing China on a range of woes – trade policy most recently, but also diverse other issues like alleged unfair technology-transfer policies, plus the building of man-made islands in the South China Sea, which brings Chinese military power closer to other countries in the Pacific Rim.

The drumbeat of criticism could be expected to affect popular opinion about China – and that appears to be the case based on a just-published report from the Pew Research Center.

The Pew report is based on a survey of 1,500 American adults age 18 and over, conducted during the spring of 2018.  It’s a survey that’s been conducted annually since 2012 using the same set of questions (and going back annually to 2005 for a smaller group of the questions).

The newest study shows that the opinions Americans have about China have become somewhat less positive over the past year, after having nudged higher in 2017.

The topline finding is this: today, ~38% of Americans have a favorable opinion of China, which is a drop of six percentage points from Pew’s 2017 finding of ~44%.  We are now flirting with the same favorability levels that Pew was finding during the 2013-2016 period [see the chart above].

Drilling down further, the most significant concerns pertain to China’s economic competition, not its military strength. In addition to trade and tariff concerns, another area of growing concern is about the threat of cyber-attacks from China.

There are also the perennial concerns about the amount of U.S. debt held by China, as well as job losses to China; this has been a leading issue in the Pew surveys dating back to 2012. But even though debt levels remain a top concern, its raw score has fallen pretty dramatically over the past six years.

On the other hand, a substantial and growing percentage of Americans expresses worries about the impact of China’s growth on the quality of the global environment.

Interestingly, the proportion of Americans who consider China’s military prowess a bigger threat than its economic strength has dropped by a statistically significant seven percentage points over the past year – from 36% to 29%. Perhaps unsurprisingly, younger Americans age 18-29 are far less prone to have concerns over China’s purported saber-rattling – differing significantly from how senior-age respondents feel on this topic.

Taken as a group, eight issues presented by Pew Research in its survey revealed the following ranking of factors, based on whether respondents consider them to be “a serious problem for the United States”:

  • Large U.S. debt held by China: ~62% of respondents consider a “serious problem”
  • Cyber-attacks launched from China: ~57%
  • Loss of jobs to China: ~52%
  • China’s impact on the global environment: ~49%
  • Human rights issues:  ~49%
  • The U.S. trade deficit with China: ~46%
  • Chinese territorial disputes with neighboring countries: ~32%
  • Tensions between China and Taiwan: ~21%

Notice that the U.S. trade deficit isn’t near the top of the list … but Pew does find that it is rising as a concern.

If the current trajectory of tit-for-tat tariff impositions continues to occur, I suspect we’ll see the trade issue being viewed by the public as a more significant problem when Pew administers its next annual survey one year from now.

Furthermore, now that the United States has just concluded negotiations with Canada and Mexico on a “new NAFTA” agreement, coupled with recent trade agreements made with South Korea and the EU countries, the administration’s targeting of China as “the last domino” becomes that much more significant.

More detailed findings from the Pew Research survey can be viewed here.

3D printing: The newest market disrupter?

3D printing was in the “popular press” a few weeks back when there was a dust-up about plans to publish specifications online for the manufacturing of firearms using 3D technology.

Of course, that news hit the streets because of the hot-button issues of access to guns, the lack of ability to trace a firearm manufactured via 3D printing, plus concerns about avoiding detection in security screenings due to the composition of the pieces and parts (in this case, polymer materials).

But 3D printing should be receiving more press generally. It may well be the latest market disrupter because of its promise to fundamentally change the way parts and components are designed, sourced and made.

3D printing technologies – both polymer and metal – have been emerging for some time now, and unlike some other technologies, they have already found a highly receptive commercial audience. That’s because 3D printing technology can be used with great efficiency to manufacture production components for applications that are experiencing some of the hottest market demand – like medical instrumentation as well as aerospace, automation and defense products.

One of the baseline benefits of 3D printing is that it can reduce lead times dramatically on custom-designed components. Importantly, 3D printing requires no upfront tooling investment, saving both time and dollars for purchasers.  On top of that, there are no minimum order requirements, which is a great boon for companies that may be testing a new design and need only a few prototypes to start.

Considering all of these benefits, 3D printing offers customers the flexibility to innovate rapidly with virtually no limitations on design geometries or other parameters. Small minimum orders (even for regular production runs) enable keeping reduced inventories along with the ability to rely on just-in-time manufacturing.

The question is, which industry segments will be impacted most by the rise of 3D printing? I can see ripple effects that potentially go well beyond the mortal danger faced by tool and die shops.  How many suppliers are going to need to revisit their capabilities in order to support smaller production runs and über-short lead-times?

And on the plus side, what sort of growth will we see in companies that invest in 3D printing capabilities?  Most likely we’ll be seeing startup operations that simply weren’t around before.

One thing’s for sure – it will be very interesting to look back on this segment five years hence to take stock of the evolution and how quickly it came about.  Some market forecasts have the sector growing at more than 25% per year to exceed $30 billion in value by that time.
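As a quick sanity check on that forecast, here is the compound-growth arithmetic in Python. The ~$10 billion starting figure is an assumption for illustration only (it is not from the cited forecasts); it simply shows what base a market needs today to clear $30 billion in five years at 25% annual growth:

```python
# Compound growth: future = base * (1 + rate) ** years
base = 10.0   # assumed current market size, in $ billions (illustrative)
rate = 0.25   # 25% annual growth, per the forecasts cited above
years = 5

future = base * (1 + rate) ** years
print(round(future, 1))  # ≈ 30.5 ($ billions)
```

In other words, the "more than 25% per year to exceed $30 billion" forecast implicitly pegs today's market at roughly $10 billion.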

Like some other rosy predictions in other emerging markets that ultimately came up short, will those predictions turn out to be too bullish?

Future shock? How badly is cyber-hacking nibbling away at our infrastructure?

I don’t know about you, but I’ve never forgotten the late afternoon of August 14, 2003 when problems with the North American power grid meant that people in eight states stretching from New England to Detroit suddenly found themselves without power.

Fortunately, my company’s Maryland offices were situated about 100 miles beyond the southernmost extent of the blackout. But it was quite alarming to watch the power outage spread across a map of the Northeastern and Great Lakes States (plus Ontario) in real-time, like some sort of creeping blob from a science fiction film.

According to Wikipedia’s article on the topic, the impact of the blackout was substantial — and far-reaching:

“Essential services remained in operation in some … areas. In others, backup generation systems failed. Telephone networks generally remained operational, but the increased demand triggered by the blackout left many circuits overloaded. Water systems in several cities lost pressure, forcing boil-water advisories to be put into effect. Cellular service was interrupted as mobile networks were overloaded with the increase in volume of calls; major cellular providers continued to operate on standby generator power. Television and radio stations remained on the air with the help of backup generators — although some stations were knocked off the air for periods ranging from several hours to the length of the entire blackout.”

Another (happier) thing I remember from this 15-year-old incident is that rather than causing confusion or bedlam, the massive power outage brought out the best in people. This anecdote from the blackout was typical:  Manhattanites opening their homes to workers who couldn’t get to their own residences for the evening.

For most of the 50 million+ Americans and Canadians affected by the blackout, power was restored after about six hours.  But for some, it would take as long as two days for power restoration.

Upon investigation of the incident, it was discovered that high temperatures and humidity across the region had increased energy demand as people turned on air conditioning units and fans. This caused power lines to sag as higher currents heated the lines.  The precipitating cause of the blackout was a software glitch in the alarm system in a control room of FirstEnergy Corporation, causing operators to be unaware of the need to redistribute the power load after overloaded transmission lines had drooped into foliage.

In other words, what should have been, at worst, a manageable localized blackout cascaded rapidly into a collapse of the entire electric grid across multiple states and regions.

But at least the incident was borne out of human error, not nefarious motives.

That 2003 experience should make anyone hearing last week’s testimony on Capitol Hill about the risks faced by the U.S. power grid think long and hard about what could happen in the not-so-distant future.

The bottom-line on the testimony presented in the hearings is that malicious cyberattacks are becoming more sophisticated – and hence more capable of causing damage to American infrastructure. The Federal Energy Regulatory Commission (FERC) is cautioning that hackers are increasingly threatening U.S. utilities ranging from power plants to water processing systems.

Similar warnings come from the Department of Homeland Security, which reports that hackers have been attacking the U.S. electric grid, power plants, transportation facilities and even targets in commercial sectors.

The Energy Department goes even further, reporting in 2017 that the United States electrical power grid is in “imminent danger” from a cyber-attack. To underscore this threat, the Department contends that more than 100,000 cyber-attacks are being mounted every day.

With so many attacks of this kind happening on so many fronts, one can’t help but think that it’s only a matter of time before we face a “catastrophic event” that’s even more consequential than the one that affected the power grid in 2003.

Even more chilling, if it’s borne out of intentional sabotage – as seems quite likely based on recent testimony – it’s pretty doubtful that remedial action could be taken as quickly or as effectively as what would be done in response to an accidental incident like the one that happened in 2003.

Put yourself in the saboteurs’ shoes: If your aim is to bring U.S. infrastructure to its knees, why plan for a one-off event?  You’d definitely want to build in ways to cause cascading problems – not to mention planting additional “land-mines” to frustrate attempts to bring systems back online.

Contemplating all the implications is more than sobering — it’s actually quite frightening. What are your thoughts on the matter?  Please share them with other readers.