Sunday, January 23, 2011

Security Scoreboard - Join the Conversation

This week Security Scoreboard made an exciting announcement - the company received angel funding and Dominique Levin has joined as full-time CEO.

Now that we have an expanded team and some cash (both good things), we would like to share some of our plans with the community. And more importantly, we would like to invite the community to join in and help shape the future of Security Scoreboard.

A bit of background...

Security Scoreboard's mission is to provide unbiased end-user experiences with security solutions in order to help security professionals find the right vendor for their organization's challenges. Almost exactly one year ago, I launched Security Scoreboard out of a need I felt as a security practitioner: I was not happy with the information available about end-user experiences with security solutions. If you tried researching a security solution, you found plenty of product information from the vendors themselves. If you were lucky, you might have found some analyst, third-party, or trade publication reviews. All potentially relevant - but what about actual end-user experiences? There was a lack of information from users who had actually bought, implemented, and used different security solutions.

Security Scoreboard was built to answer this need.

The response within the community was extremely positive and underscored the urgent need for a credible platform for unfiltered end-user voices. It also became clear over time that the Security Scoreboard movement had grown beyond what one person could build and operate in their spare time. I am very excited that Dominique Levin - an industry veteran well known to many of you from her time heading up LogLogic - shares the original vision and has joined Security Scoreboard as its full-time CEO.

Challenges to Building a New Ecosystem

Security Scoreboard seeks to fundamentally change the way CISOs, CIOs and other "security consumers" evaluate vendors. There are four key ingredients to achieve this:

1. TRUST - Users need a way to determine the credibility of reviews
2. PRIVACY - Users need to be able to leave reviews with a reasonable degree of privacy
3. ACTIONABLE INFORMATION - Users need a way to get the information that matters to them quickly and efficiently
4. TRANSPARENCY - Users need to know how the site funds itself and the formula behind any pay-for-play

Consumer review sites like TripAdvisor and Yelp face similar challenges in the consumer space. And while security professionals may be a slightly more skeptical bunch than the average person, the basic challenges Security Scoreboard faces are the same as those of other community-driven review sites. These challenges are make-or-break for Security Scoreboard, so we want to share our thoughts on each one with the community:


1. TRUST - How do you know whether a review is legitimate?

Screening reviews for obvious plugs or badmouthing is a critical challenge. Users need to know how legit each review is. As Hoff, Lenny Zeltser, and others have pointed out, developing a reputation system allowing users to evaluate Security Scoreboard reviews is critical to our success.

We envision Security Scoreboard having tiered reviews – those written by loosely authenticated reviewers should be taken with a grain of salt, while those written by reviewers who have been vouched for by reputable entities should carry more weight. The nature of this reputation system needs to be rooted in the existing security community. We are exploring a number of tools to factor into this reputation system – from transitive tokens (more on this below) to leveraging existing security organizations and communities. At the same time we are studying what has worked and what hasn't in other online communities facing the same challenge.
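To make the tiered idea a little more concrete, here is a rough sketch (in Python) of how reviews might be weighted by reviewer tier. The tier names, weights, and data structures below are purely illustrative assumptions on our part, not a committed design:

```python
from dataclasses import dataclass
from enum import Enum

class ReviewerTier(Enum):
    # Illustrative trust tiers -- not the actual Security Scoreboard levels
    LOOSELY_AUTHENTICATED = 1   # e.g. verified email address only
    COMMUNITY_VOUCHED = 2       # vouched for by a reputable entity or community
    STRONGLY_VERIFIED = 3       # identity vetted directly by the site

# Assumed weights: higher-tier reviewers count for more in aggregate scores.
TIER_WEIGHTS = {
    ReviewerTier.LOOSELY_AUTHENTICATED: 0.5,
    ReviewerTier.COMMUNITY_VOUCHED: 1.0,
    ReviewerTier.STRONGLY_VERIFIED: 2.0,
}

@dataclass
class Review:
    vendor: str
    rating: int          # e.g. 1-5 stars
    tier: ReviewerTier

def weighted_score(reviews: list[Review]) -> float:
    """Average rating, weighted by the reviewer's trust tier."""
    total_weight = sum(TIER_WEIGHTS[r.tier] for r in reviews)
    if total_weight == 0:
        return 0.0
    return sum(r.rating * TIER_WEIGHTS[r.tier] for r in reviews) / total_weight
```

The point is simply that a vouched-for review should move the needle more than a loosely authenticated one; the actual tiers and weights are exactly the kind of thing we want to work out with the community.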

2. PRIVACY

Many security managers do not feel comfortable posting comments about vendors in public forums. Some might even regard their use of a particular solution as confidential information. On the other hand, as discussed above, Security Scoreboard needs to vet that reviews have been posted by legitimate users.

Currently we have an informal and not fully scalable approach to vetting reviews without publishing reviewers' identifying information. As we grow, we are building a more formal structure around reviewer identification. We are also looking into fancier token-based systems, so that a currently trusted user of the site can distribute tokens to trusted colleagues without the site being aware of their identity. This can spill over into privacy overkill, so we intend to restrict ourselves to the reasonable privacy measures that would make typical users comfortable leaving reviews on the site. This is tightly bound to the credibility question, and it is an area where we intend to involve the community on an ongoing basis.
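To give a flavor of what a token-based approach could look like (again, a hypothetical sketch rather than a committed design): the site might issue a batch of single-use random tokens to an already-trusted member, who hands them to colleagues out of band. When a token is redeemed with a new review, the site learns only which member's batch it came from, never who redeemed it:

```python
import secrets

class InviteTokenRegistry:
    """Single-use invite tokens handed out by trusted members (hypothetical design).

    The site records only which trusted member a token batch was issued to;
    it never learns the identity of the colleague who redeems one.
    """

    def __init__(self) -> None:
        self._unused: dict[str, str] = {}   # token -> issuing trusted member

    def issue_batch(self, trusted_member: str, count: int = 5) -> list[str]:
        """Generate random tokens the trusted member can pass along out of band."""
        tokens = [secrets.token_urlsafe(16) for _ in range(count)]
        for token in tokens:
            self._unused[token] = trusted_member
        return tokens

    def redeem(self, token: str) -> str | None:
        """Burn the token and return the vouching member, or None if invalid."""
        return self._unused.pop(token, None)
```

A redeemed token would let a review inherit some of the voucher's credibility without the reviewer ever identifying themselves to the site.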

3. ACTIONABLE INFORMATION

Credible reviews are only valuable if they lead to easily accessible and actionable data. Security Scoreboard strongly believes in openness of data and metrics (check out the analytics data for product categories or register to see the popular keywords associated with each individual vendor). As we gather more reviews and evolve the authentication schemes described above, we plan to build more sophisticated accompanying metrics to slice and dice data according to parameters that are important to end-users. Reviewer credibility will become an important factor in these algorithms.
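As a rough illustration of the kind of slicing we have in mind (the field names and filters below are hypothetical), reviews could carry metadata such as product category, reviewer industry, and company size, and a query would filter on those parameters before computing a credibility-weighted score:

```python
from dataclasses import dataclass

@dataclass
class ReviewRecord:
    vendor: str
    category: str           # e.g. "SIEM", "Web Application Firewall"
    rating: int             # 1-5 stars
    reviewer_weight: float  # produced by the reputation system sketched above
    industry: str           # e.g. "finance", "healthcare"
    company_size: str       # e.g. "SMB", "enterprise"

def sliced_rating(records, category=None, industry=None, company_size=None):
    """Credibility-weighted average rating over a filtered slice of reviews."""
    selected = [
        r for r in records
        if (category is None or r.category == category)
        and (industry is None or r.industry == industry)
        and (company_size is None or r.company_size == company_size)
    ]
    total = sum(r.reviewer_weight for r in selected)
    if total == 0:
        return None  # not enough data in this slice to report a score
    return sum(r.rating * r.reviewer_weight for r in selected) / total

# Example: how do enterprise finance teams rate products in the SIEM category?
# sliced_rating(all_reviews, category="SIEM", industry="finance", company_size="enterprise")
```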

There are some other obvious improvements on our short-term product roadmap. Some of you have noticed that Security Scoreboard currently does not let you rate a vendor's individual products. For small vendors with one main product, rating the product and rating the company are pretty much the same thing. But for large companies like Symantec, McAfee, and Microsoft, there is an obvious need to rate individual products rather than the vendor as a whole. We're on it, and will shortly be introducing changes to allow rating of specific products as well as direct product comparisons, along the lines of the sketch below.
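A minimal sketch of what that data-model change might look like (all names here are illustrative): ratings attach to individual products, and products can be compared side by side rather than whole vendors:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Product:
    name: str
    ratings: list[int] = field(default_factory=list)   # 1-5 stars per review

    def average(self) -> float | None:
        return mean(self.ratings) if self.ratings else None

@dataclass
class Vendor:
    name: str
    products: dict[str, Product] = field(default_factory=dict)

    def rate_product(self, product_name: str, stars: int) -> None:
        """Attach the rating to a specific product, not to the vendor as a whole."""
        self.products.setdefault(product_name, Product(product_name)).ratings.append(stars)

def compare(products: list[Product]) -> list[tuple[str, float | None]]:
    """Side-by-side comparison of individual products for the same use case."""
    return [(p.name, p.average()) for p in products]
```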

4. TRANSPARENCY

Nothing kills credibility faster than backdoor pay-for-play. This lack of transparency affects a large portion of the third-party information available today about IT systems in general.

Right now we are focused on building the community at Security Scoreboard and have not yet decided on a final revenue model. Vendors will play a role in this model, but we intend to be completely open about how the bills are being paid. Sponsored content and objective results are not mutually exclusive; for example, the existence of Google AdWords has not eroded confidence in the organic results produced by the PageRank algorithm. At Security Scoreboard we intend to have a similarly transparent and open revenue model from day one.

Help us build the future of Security Scoreboard

We are looking for community insight and input on all four of these challenges, and especially on building our reputation and privacy systems. The Security Scoreboard cause will stand or fall with the authenticity and credibility of product reviews and ratings.

This is a movement for and by end users, so if you have some time to chime in, we would love to hear from you.

Joining the Discussion

If you want to join the discussion, please just send an email to voice at securityscoreboard dot com with your name and affiliation. Don't worry about spam - we'll be happy to take you off the list whenever you want.

This mailing list is open to anyone in the security community and beyond who is interested in contributing to our discussion - end-users, vendors, academics, and the like. If you think that Security Scoreboard is a useful tool and you are interested in influencing our future direction, please be sure to sign up and join the discussion!

1 comment:

  1. Sounds like a fantastic project.

    As a penetration tester, it's often hard to identify the best solution to a particular challenge. We often find ourselves dealing with vastly differing technical areas on a daily basis - all of which require differing software solutions, some free and others paid for, from a range of different vendors. Any solution that provides end-user reviews of security products is, in our opinion, a great step forward.

    If we can lend a hand, either commercially or non-commercially, please let us know - this sounds like it could be a great thing for the security community.

    One thing we'd love to see implemented is a system that allows for filtering based upon the type of end-user that writes the review. For example, non-technical end users would almost always prefer reviews written by non-technical users of the system, whilst penetration testers would likely prefer reviews written by other penetration testers.

