360s: How many raters, questions?

When deploying a 360 assessment there are many details to consider. Here are some thoughts on two of them:

How many raters should be asked to provide feedback?
The number depends on the participant (the person being rated) and their role in the organization. My general philosophy is: ask the boss, ask all the direct reports, and select 4–6 peers/colleagues. Ask people who can give relevant feedback (e.g., they see the participant more often than at an annual meeting). If a group of participants is going live with their assessments at the same time, make sure there is some awareness of how many ratings any one person will be asked to provide.
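To make the last point concrete, here is a minimal sketch of checking rater load across a cohort. The participant and rater names, and the threshold of two assessments, are illustrative assumptions, not a prescribed rule:

```python
# Sketch: estimate how many assessments each rater would be asked to
# complete when a cohort of participants launches 360s at the same time.
from collections import Counter

# Each participant lists their chosen raters (boss, direct reports, peers).
rater_lists = {
    "participant_a": ["boss_1", "report_1", "report_2", "peer_1", "peer_2"],
    "participant_b": ["boss_1", "peer_1", "peer_3", "peer_4", "report_3"],
    "participant_c": ["boss_2", "peer_1", "peer_2", "report_4", "report_5"],
}

# Count how many assessments each rater appears in.
load = Counter(name for raters in rater_lists.values() for name in raters)

# Flag anyone asked to rate more than, say, two colleagues in one window.
overloaded = {name: n for name, n in load.items() if n > 2}
print(overloaded)  # {'peer_1': 3}
```

A quick check like this before launch lets you swap in alternate peers for anyone who would otherwise be buried in rating requests.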

How many questions should an assessment have?
I’ve seen assessments that are 15 questions long and ones that are 120 questions long. About 80% of the 360 assessments we deploy are 40–60 questions long. For your assessment, first figure out what the objectives are, then determine which topic or competency areas will help you reach those objectives. Many of our deployments cover about 6–9 topic areas, with about 4–6 scalable questions per topic. Then decide whether you want to ask a comment question after each topic area, at the end of the assessment, or not at all. We recommend including some area for comments, since this information is key to explaining the quantitative results of the assessment.
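The sizing arithmetic above can be sketched in a few lines. The function name and parameters are my own; the ranges follow the rules of thumb in the paragraph:

```python
# Sketch: total item count = topics x scalable questions per topic,
# plus optional open-ended comment questions.
def question_count(topics, per_topic, comment_per_topic=True, final_comment=False):
    total = topics * per_topic
    if comment_per_topic:
        total += topics  # one comment question after each topic area
    if final_comment:
        total += 1       # a single comment question at the end
    return total

# Low end: 6 topics x 4 questions, comment after each topic.
print(question_count(6, 4))   # 30
# High end: 9 topics x 6 questions, comment after each topic.
print(question_count(9, 6))   # 63
# Same high end, but one comment question at the end instead.
print(question_count(9, 6, comment_per_topic=False, final_comment=True))  # 55
```

Running the high end past 60 items is a signal to trim topics or questions per topic before launch.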

Please let me know if you have other questions on 360 assessments that you would like me to address.
