How many questions should a 360 assessment have?

Clients sometimes ask for our input when designing a 360 assessment.  Two of their favorite questions are:

  • How many questions should an assessment have?
  • How many of those questions should be open-ended?

Let’s tackle the first question.  I have seen assessments with as few as 12 questions and as many as 90.  Most of the assessments we help design run between 45 and 60 questions.

To determine the “right” size for the assessment, you should think about:

  • What is (are) the objective(s) of the assessment?
  • Given the objective(s) of the assessment, what topic areas should be addressed (e.g., leadership, decision-making, communication, negotiation, integrity)?
  • How many questions will it take to cover each topic area?
  • Will raters have to rate more than one participant (person being rated)?  If yes, how many?

Most of our clients use between 7 and 10 topic areas, with 4 – 6 scaled (rating) questions per topic area.  This generally provides enough breadth to cover the objectives and enough depth to provide meaningful feedback.  An assessment of this size can usually be completed in less than 10 minutes, which is considerate of raters who may have to complete multiple assessments.  A word of caution: when writing questions, scrutinize each one against the following guidelines, and keep revising as needed without altering the question’s original purpose.
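The sizing guidance above is simple arithmetic, and a quick sketch can make the trade-offs concrete. In this illustration, the topic-area and question ranges come from the text; the seconds-per-question figure is an assumption for illustration only, not a number from the article.

```python
# Rough sizing sketch for a 360 assessment, using the ranges above:
# 7-10 topic areas, 4-6 scaled questions per topic area.

SECONDS_PER_QUESTION = 10  # assumed average time per scaled question (illustrative)

def assessment_size(topic_areas: int, questions_per_topic: int) -> dict:
    """Return the total question count and a rough completion-time estimate."""
    total = topic_areas * questions_per_topic
    return {
        "questions": total,
        "minutes": total * SECONDS_PER_QUESTION / 60,
    }

# Smallest and largest assessments within the recommended ranges:
small = assessment_size(7, 4)    # 28 questions
large = assessment_size(10, 6)   # 60 questions
```

Even at the high end of the range, the estimate stays around the 10-minute mark, which is the kind of back-of-the-envelope check worth doing before adding topic areas.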

  • Keep each question simple and single-minded
  • Keep each question as short as possible
  • Use understandable and clear language
  • Be specific

Do not try to shorten an assessment by combining several questions into one – the rater may be confused about how to answer the question, and the participant may be confused by the feedback.

As for how many open-ended questions there should be, most clients do one of two things – they place comment questions after each topic area, plus one or two more general questions at the end of the assessment, or they place comment questions only at the end of the assessment.

For the comment questions at the end of the assessment, here are four ideas on how to approach collecting this important information:

The Start/Stop/Continue Method:

  • What should this participant continue doing?
  • What should this participant stop doing?
  • What should this participant start doing?

The Value Provided Method:

  • I value the contributions of this person as a leader because…
  • I believe this person could be an even more effective leader if…

The Do Well/Needs to Improve Method:

  • What are two things that this person does exceptionally well?
  • What are two things that you would like this person to focus on for improvement?

The General Method:

  • This assessment is intended to provide useful information to the person you are rating.  If you have any comments or examples that will help that person understand your feedback, please provide them here:

Some clients put comment areas after each question, but most of our clients feel this makes the assessment too long: it takes raters too much time to complete, and raters may write comments on the first few questions but tire of writing feedback before the end of the assessment.

The length of your survey is governed by your objective(s).  Always keep in mind that you must account for people’s time and attention spans, so be as succinct as possible.  Open-ended questions provide valuable data, so think carefully about how you want to use them.