
Tuesday, June 8, 2010

Proposal Evaluations: Scoring


One of the most often misunderstood parts of the Request for Proposals process is the proposal evaluation scoring matrix. Scoring a proposal is not like marking a school assignment, where you give an A, B, C or 90%, 80%, 70% based on the content. Instead, each criterion in the Request for Proposals is weighted according to its importance and relevance, and each individual question is scored on a scale. Sometimes that scale is 0-4, 0-5, or 0-10; in some cases the scale is A-D or A-F so that evaluation committee members can't 'calculate' the score ahead of the consensus meeting. A 0 (or F) usually means "no information provided". On the 0-5 scale, the scoring would typically be:
0 = No information provided
1 = Meets few requirements
2 = Meets most, but not all requirements
3 = Meets all requirements
4 = Meets and exceeds some requirements
5 = Meets and exceeds all requirements
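To make the weighting arithmetic concrete, here is a minimal sketch in Python. The criteria, weights, and ratings are invented for illustration only; they are not from any actual RFP or guide.

# Illustrative only: criteria, weights, and ratings are invented,
# not taken from any actual RFP.
CRITERIA_WEIGHTS = {  # relative importance, summing to 100
    "Corporate experience": 20,
    "Proposed methodology": 35,
    "Project team": 25,
    "Price": 20,
}

MAX_RATING = 5  # the 0-5 scale shown above

def weighted_score(ratings):
    """Convert one evaluator's 0-5 ratings into a weighted score out of 100."""
    total = 0.0
    for criterion, weight in CRITERIA_WEIGHTS.items():
        rating = ratings[criterion]
        if not 0 <= rating <= MAX_RATING:
            raise ValueError("rating for %s is outside 0-%d" % (criterion, MAX_RATING))
        total += weight * rating / MAX_RATING
    return total

# One evaluator's ratings for a single proposal:
example = {"Corporate experience": 4, "Proposed methodology": 3,
           "Project team": 5, "Price": 2}
print(weighted_score(example))  # 16 + 21 + 25 + 8 = 70.0

Notice that the weights, not the raw ratings, drive the outcome: in this made-up example, a middling 3 on the heavily weighted methodology criterion costs more points than a low 2 on the lighter price criterion.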

You can find publicly available examples in the BC Ministry Guide to the RFP Process. There are similar guides in other public sector organizations (you just need to know where to look!).

The first step as an evaluator is to score each proposal on your own. At this stage, each stakeholder, end-user, or subject matter expert reads and scores individually, without discussing the proposals with anyone else. I find it easier to read a proposal once, then score it on my second reading, but everyone has their own way of doing this. Evaluators can print the evaluation sheets and score by hand, or enter their scores into an evaluation spreadsheet and save each as a separate file.
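Because each evaluator saves a separate file, it helps to pull the individual scores together before the consensus meeting. The sketch below assumes a hypothetical layout where each evaluator saves a CSV named scores_<name>.csv with "criterion" and "score" columns; the file naming and format are my own invention, not a standard.

# Gather each evaluator's saved score file into one side-by-side view.
# Hypothetical layout: scores_<evaluator>.csv with criterion,score columns.
import csv
import glob
from collections import defaultdict

scores = defaultdict(dict)  # criterion -> {evaluator: score}

for path in glob.glob("scores_*.csv"):
    evaluator = path[len("scores_"):-len(".csv")]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row["criterion"]][evaluator] = int(row["score"])

# Flag criteria where evaluators diverge by 2 or more points; those
# are the ones worth spending time on at the consensus meeting.
for criterion, by_evaluator in sorted(scores.items()):
    spread = max(by_evaluator.values()) - min(by_evaluator.values())
    flag = "  <-- discuss" if spread >= 2 else ""
    print(criterion, by_evaluator, flag)

Scores that diverge widely usually mean someone spotted something in a proposal that the others missed, which is exactly what the page-referenced notes described next help resolve quickly.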

For the public sector, anything written down in the proposals or evaluation sheets IS subject to Freedom of Information (FOI) requests, because it is deemed to be leading to a decision. Even so, I advise evaluators not to hold back from writing notes! I beg them to PLEASE keep notes on why they scored things a certain way, and on whether information is missing or unclear in the proposals. I also ask them to reference proposal page numbers in their comments so that our consensus discussion will move along more quickly, and we can easily find the information if we are ever asked to review the evaluations later.

As well, to keep things in context, I advise evaluators to always consider, when scoring, how we would explain the score to the proponent in a debriefing. Debriefings are much easier when we have diligent notes in the scorebooks, e.g. "On pg. 19, the vendor stated they did X but didn't demonstrate it" or "On pg. 24-27 they provide a case study of similar development work and the outcomes of the learning" (demonstration is done through action statements and/or case studies).

As noted in a separate blog post, after the individuals finish scoring, we meet for consensus scoring.
