For your annual review, I’ll need to see evidence of what you produced this year.
I don’t recall ever hearing reviews described quite this way, but there’s truth in the statement. Reviews are grounded in evidence of your accomplishments, measured against agreed-upon goals.
Both parts of this are important. Back in 2012 I wrote about a (bad) performance review system and highlighted what constitutes a good one:
According to the book The One Minute Entrepreneur, there are three primary parts to an effective performance management system:
- Performance planning. This is where managers and their people get together to agree on goals and objectives to focus on.
- Day-to-day coaching. This is where managers help their people in any way they can so they become successful. It doesn’t necessarily mean meeting up or talking about how things are going every day. Instead, managers should work to support their people, praising them when things go right and correcting course when things go wrong. This is the stage where feedback happens – where real managing is done.
- Performance evaluation. This is where managers and their people sit down and examine performance over time; also called a performance review.
When I wrote that, the company I worked for didn’t have a full performance system in place, and it was frustrating to come up with evidence of what I had produced.
At Promenade Group, things are different. We have a full system in place, and it makes things easier for both the manager and the employee. As a manager, I’ve already seen the evidence accumulate over the course of the year. During our coaching sessions (like 1:1s) we review both goals and accomplishments in real time and update them. By the time performance reviews come around, the evidence is mostly there and it’s just a matter of final edits.
This reduces stress for the employee, who doesn’t have to dig through a year’s worth of notes trying to find something to mention. It also reduces stress for the manager, who can see progress over time and prepare the appropriate rewards.
The Association for Software Testing is crowd-sourcing a book, Navigating the World as a Context-Driven Tester, which aims to provide responses to common questions and statements about testing from a context-driven perspective.