
Why We Value Inquiry Visibility Over … Well … Over Everything Else?

This fun discussion on industry analyst craft reminded me of an unfinished post I had sitting in my draft folder … for a year. And now it is finished!

When we create research and decide to include or mention vendors [uh-oh, careful with the topic, Anton :-)], we don’t do it at random or for corrupt reasons. We are guided by various research methods, formal and informal.

Specifically, in our team’s research, one thing often stands out from the crowd – end-user inquiry visibility of the vendor. [Reminder to those not paying full attention: this is ***NOT*** about how many inquiries the vendor has with us if they are a client. This is NOT about how many briefings they book. This is about END USERS asking / talking / sharing about their technology.]

For example, “A Comparison of UEBA Technologies and Solutions” (our 2017 UEBA tech comparison) has this:

“The criteria for inclusion in this comparison document were:

  • […]
  • Vendor is able to provide a list with a significant number of existing clients with the solution in production (paid production deployments, not POCs)
  • Vendor is visible in Gartner inquiry calls, mentioned by Gartner clients during the past year”

In fact, I’d venture a guess that most of our documents that compare vendors have something similar. This probably does NOT apply to Gartner Market Guides, which often feature long, long lists of vendors and seek to be more or less comprehensive, or at least broadly representative in some sense (bla-bla, legal disclaimers apply :-)).

IMHO (and, by god, I mean it this time!), we are not a product discovery engine! We are not a marketing message amplifier! We are not about page views and ads, like journalists!

To me (and again, this is a personal view, not a policy – hence another IMHO), our mission is to advise clients about PRODUCTS THAT WORK WELL IN REAL LIFE. Not about products that are new, innovative, cool (well, occasionally that too), disruptive, widely-marketed, promising, lab-tested, interesting. But those that work. And “work” is defined NOT as “if you compile the code, the tool would run” :-), but as “really run and deliver real value to real clients under real world conditions of real reality.” [This is “real” to the power of 6, no? :-)]

Got it?

So, how do we know they work? Well, because organizations that use those products call us with various questions related to the tools, and we use this data to learn what works for them, and how well.

Contrary to what some people think, this IMHO does NOT introduce a bias in favor of large vendors. Why not? Because customers of large vendors often call to tell us that their large vendor stuff does NOT work and is in fact total shite (“well, sir, this was the Security Product of the Year in 2001, and they are now planning a NG version. WIN!”).

For example, “Vendor C” was very well known to enterprise SOC and CIRT professionals when they were a 15 (!) person company. They are now much larger and everybody knows them. Similarly, “Vendor P” was once called an “800-lb gorilla of <their space>”, even though they had less than $5m in revenue at that time. So, NO, this approach does not favor big over small, but it does favor “real world proven” over “promising.”

Now, it may occasionally not surface a very new approach that is genuinely promising and will prove to work well if/when tried by enough people. Thus, we need to actively correct this approach to avoid the dreaded “past bias” or inertia bias (“well, we always used X and it sort of worked. So why look for anything new?”). As I say about some domains of security: “there are NO best practices here since we collectively didn’t practice this enough to know what is best.”

So, to us, product inquiry visibility [mind you, a measure that is devilishly hard to fake or engineer, if you are a vendor!], reference clients, even competitive mentions matter. A lot. They mean your tech is out there in the real world…

