The internet has taught us to be wary of what we read. Most of us can quickly spot a fake restaurant review on Yelp, a phony news story trending on Facebook, or a customer testimonial that seems a little too enthusiastic to be real.
Sound research is an important part of any buying process, but it’s particularly important with cybersecurity software. IBM Chairman and CEO Ginni Rometty has called cybercrime the greatest threat to every company in the world. Cybercrime is expected to cost the world $6 trillion annually by 2021, and it hits companies of all sizes. In this climate, there’s no excuse to be lax about data breach prevention.
Software reviews are an important component of the research process, but not all reviews are equal – far from it, in fact. The quality of reviews depends heavily on the source, the methodology and the funding.
Here’s how to spot a good one:
The reviewer has credentials
Know who the reviewer is (and not just the name). Find out how he or she is qualified to assess cyber security software. Is he a professional tech writer or analyst with significant experience in the field? Is she a cyber security software developer, an IT manager or a self-taught tech geek with a popular blog?
Reputable websites will make it obvious who the writer is, rather than obscuring or avoiding that detail. Most sites post at least a short bio that makes it easy to assess the writer’s credentials. If the review is anonymous or the writer’s background doesn’t seem all that relevant, keep looking.
He or she wasn’t paid
It’s OK if the reviewer was paid by his or her employer – i.e. the media company or research firm where they work. We all (hopefully) get paid to do our jobs, and reviewers are obviously no exception.
What’s not OK is if they were paid by a vendor. Vendor-sponsored reviews are inherently biased, for obvious reasons. The reviews or testimonials a vendor posts may be real and accurate, but keep in mind that they probably don’t tell the whole story. Unlike independent, third-party reviewers, vendors aren’t likely to include any information that discusses the software’s drawbacks.
The process was fair
Examining the methodology used to review cyber security software is extremely important. Good reviewers will always explain the process in detail. Usually, you’ll find a methodology section in the review itself or elsewhere on the website. If the methodology isn’t explained, there’s no way to know if the process was fair.
Generally, the methodology should describe how the vendors and products were selected, how the software was tested and evaluated (specifically), whether user feedback was gathered, and how the rating or grading system works. Strong review processes should include an actual demo or test period of the product. It’s not fair to judge a product without having used it.
Methodologies vary, and there’s no one process that’s superior to all others. The most important thing is that the process is comprehensive, uniform and fair.
The review was broad
It’s impossible to say a cyber security software product is the best, safest or easiest to use if you’ve only reviewed and tested a handful of products. Steer clear of reviews that use these superlatives without having tested a very wide variety of products.
That’s not to say a review or product recommendation is no good unless every single vendor in the market was included. In most cases, that would be impossible. However, if the review looks at more than one product and makes comparisons, the methodology should include strong reasoning for why those specific products were chosen. The selection process shouldn’t be arbitrary.