360° feedback with mediators

Albina Popova
New Work Development
15 min read · May 31, 2019


At XING we have tried a number of ways to design the process of soliciting and providing feedback. 360° feedback with mediators is a team-owned process for sharing feedback anonymously via an online survey, in which all participants are equal when it comes to giving and receiving feedback.

This blog post covers the process inside out; in the end, it is more of a manual than a blog post. The reason: we would like to share not only our Aha! moments but also to enable others to run the format and gather their own experience with feedback with mediators. If you are pressed for time, stick to the “long story short” section.

Even though I am the one documenting the format, a number of people were involved in designing the process, with Björn Linder in the lead.

Long story short

It is not always easy to share direct feedback, be it expressing appreciation or framing a request. In some cultures, expressing delight of any sort to a colleague can be seen as sucking up and is therefore avoided. It is even harder to formulate a request, to state that something in the other person’s behavior causes extra work and extra nerves. The risk of hurting a relationship and appearing hard to work with easily outweighs the urge to share. And no one wins.

One way to minimize the worries and hesitation connected with sharing feedback is to rely on a process. Relying on a defined “how”, “when” and “why” removes the need to choose whether to share feedback at all and how to do it.

We wanted the process to serve our needs for fairness, autonomy, and control: to provide a bidirectional way of sharing feedback and to let participants define the process they would be taking part in.

With these challenges in mind, we designed the 360° feedback with mediators process. At its core is an anonymous online survey meant to gather feedback. The process overall, and the content of the survey in particular, is owned by the team: the team decides which questions to include in or exclude from the survey, which fields should be optional, and which must be filled out.

Relying on the process was one way to share the burden of choice and the weight of the risks connected with giving feedback. Relying on a mediator was our way to mitigate the fear of improper feedback being passed on and to simplify the receiving side.

A mediator is a person selected by the feedback receiver who sets up the survey and has access to all of the raw data. The job of a mediator is to review the raw replies, make sure they are not confusing or obnoxiously aggressive, and prepare a summary for the feedback receiver. Once the summary is ready, the mediator schedules a 1on1 meeting, walking the feedback receiver through the summary, playing the role of a coach, and figuring out together what to do.

Our hope was that introducing the concept of a mediator would:

  • Eliminate the constant temptation to check the survey results every five minutes, relying instead on the mediator to provide a summary.
  • Ensure that the feedback provided matched certain standards, for example that it contained no unclear, unactionable generalizations, and deal with obnoxiously aggressive comments.
  • Consolidate and group similar feedback points so the main message stands out.
  • Give the feedback receiver a sparring partner while making sense of all of the feedback provided: what the most important part is, what the next steps could be, how to come up with a plan, and whether a plan is needed at all.
  • Help the feedback receiver avoid over-thinking and analysis-paralysis mode by discussing the results together.

We have run this format with some variations in approximately a dozen teams. The most controversial part of the process turned out to be anonymity: some people were willing to take part only if anonymity was in place. At the same time, anonymity allowed for confusing feedback and an increased level of harshness. The other piece of criticism was that the process was heavyweight.

Nonetheless, we were meticulous when evaluating the results and asked all participants to rate the different parts of the process and how valuable their time investment was overall. The bottom line: 360° feedback with mediators turned out to be quite valuable, and the vast majority of participants were willing to repeat it.

Designing the process and the nitty-gritty details

In our Growth Cluster (a set of teams dedicated to acquiring and activating new users) we had regular Governance meetings, a practice taken from Holacracy. During these meetings, we normally discussed how we could improve the way we worked, and one of the topics was how to organize the next round of 360° feedback. After we picked up this topic, it went through a couple of discussion rounds: did we want to collect information via an online survey or a hand-written/printed one, anonymous or not, with mandatory or voluntary participation, what the criteria for good feedback were, and what to include in the survey. During these meetings we also discussed the concept of a mediator and who would be willing to play that role, and everyone selected a mediator for themselves.

The result of the preparation work was an agreed set of questions to be included in the survey, a map of feedback receivers to their mediators, and a definition of what the mediator is and isn’t responsible for.

Survey

We decided to go for an online survey. There would be a separate survey for every team member. The links to these surveys would be available to everyone within the cluster of teams working together. Anyone who cares to provide feedback would be able to do so.

Survey example

The information collected by the survey would be accessible only to the person’s mediator, who would then review the replies, summarize them, and discuss them with the feedback receiver during a 1on1 meeting.

The survey template consisted of a numeric section with the net promoter score and team contributions, and a free-text one with appreciation and improvement suggestions.

In addition to the sections above, we decided to give the responder an option to de-anonymize the form, which would allow the mediator to clarify any confusing pieces.

Net Promoter Score

Survey responders would need to answer the question: “How likely are you to recommend this person to work on your next team?”

There is an option to select a value from 1 to 10:

  • 10 means the person absolutely must join your next team.
  • 5 shows a neutral opinion, in other words “don’t care”.
  • Anything below 5 says that you would rather not recommend the person.

Contribution Evaluation

The contribution section was inspired by the Valve employee handbook. In Valve’s case, the contributions were used to create an employee ranking that defined compensation. In our case, the motivation was different: no ranking and no compensation. We simply wanted to see whether such information could be of value to the feedback receiver, especially if there was not much the feedback giver could share in the free-text sections.

We decided to ask survey responders to evaluate:

  • team contribution
  • product contribution
  • productivity or output
  • skills for the job

The evaluation itself consisted of selecting one of the categories for every contribution type:

  • watch carefully
  • room for improvement
  • solid run
  • above my expectations

Appreciation

A free-text section that urged responders to be specific when sharing appreciation and to focus on behavior rather than qualities. The reason for such a tip was to avoid vague feedback, be it positive or negative.

Improvement section

A free-text section that was prefaced with a set of tips:

  1. Focus on yourself: however surprising this sounds when giving feedback to someone else, instead of describing qualities you don’t like, focus on the person’s behavior and how it makes you feel.
  2. Be specific: name the situations in which the person displayed such behavior.
  3. Be positive: suggest ways in which this behavior can be improved.
  4. Be rational: target small incremental steps; the person most likely won’t be able to change completely from one day to the next, even if (s)he wants to.

Option to de-anonymise

A result of one of the discussions during the Growth Cluster Governance meetings was to include a responder details field and thereby provide an option to de-anonymize the survey. Filling out the field would be optional and would give the mediator a way to clarify feedback points.
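To make the structure concrete, here is a minimal sketch of how a single survey response could be modeled. The field names and categories follow the sections described above, but the representation itself (and the hypothetical validation helper) are purely illustrative; they are not the survey tool we actually used.

```python
from dataclasses import dataclass
from typing import Optional

# Contribution areas and evaluation levels as agreed on by the team.
CONTRIBUTION_AREAS = ["team contribution", "product contribution",
                      "productivity or output", "skills for the job"]
CONTRIBUTION_LEVELS = ["watch carefully", "room for improvement",
                       "solid run", "above my expectations"]

@dataclass
class SurveyResponse:
    """One anonymous reply to a feedback receiver's survey."""
    nps: int                               # 1-10: likelihood to recommend for your next team
    contributions: dict[str, str]          # contribution area -> selected level
    appreciation: str                      # free text, specific and behavior-focused
    improvement: str                       # free text, framed as a request
    responder_name: Optional[str] = None   # optional de-anonymization field

    def validate(self) -> None:
        # Hypothetical helper to catch malformed replies before the mediator reviews them.
        if not 1 <= self.nps <= 10:
            raise ValueError("NPS must be between 1 and 10")
        for area, level in self.contributions.items():
            if area not in CONTRIBUTION_AREAS or level not in CONTRIBUTION_LEVELS:
                raise ValueError(f"Unknown contribution entry: {area} -> {level}")
```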

Mediator concept

This was not our first 360° feedback experiment. The year before, another cluster of teams had run a 360° feedback survey and passed all of the feedback forms to one person. And even though the trust in that one person was great, the biggest piece of criticism of that previous run was that one person knows too much, especially while being the line manager of many. So this time around we wanted to address that criticism and give our colleagues more control and choice: feedback receivers defined themselves who would be the person in the middle. Through a set of discussions, we ended up agreeing that a mediator

  • would be anyone on the same or a neighboring team who would gather feedback on the feedback receiver’s behalf.
  • could be a line manager if that was the feedback receiver’s choice, as it was in my case.
  • should be willing to gather feedback; in other words, when a feedback receiver selects someone to be her mediator, that person can say no.
  • should be comfortable running a coaching session with the feedback receiver.

The mediator was responsible for:

  1. Setting up the survey, making the link available for everyone to use, and ensuring that submitted responses were only visible to the mediator.
  2. Gathering the results and reminding people to fill out the survey if the number of submissions was small.
  3. Summarising the results, and following up with feedback givers who had provided their names in the survey if their responses were not crystal clear.
  4. Scheduling and running the 1on1 meeting, where the mediator would play the role of a coach.

In retrospect, the most challenging part of the mediator’s job was the coaching session: balancing between summarising the results and not adding your own opinion of the person on top, figuring out which responses should not be included, and designing the discussion in a tense situation when the improvement suggestions significantly outweigh the appreciation section.

Mediators’ guideline

Once we had agreed on the concept and a set of responsibilities, we created a short manual for mediators. The manual’s goal was to give a mediator all of the relevant information in a short and concise way, including how not to mess up the survey and tips on running the coaching session.

Mediator to Feedback Receiver Map

Once all of the prep work was behind us, we had a table listing everyone in the cluster with their mediators and survey links.

Example of a mediator to a feedback receiver map

The mapping table above is a trimmed example. The real one mapped all of the participants (in our case 33 people) to their mediators and their survey links.
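If you keep the map as a simple shared table, something like the following sketch is enough; all names and URLs here are hypothetical placeholders, and a wiki table works just as well as a generated CSV.

```python
import csv

# Hypothetical entries; the real table listed all 33 participants,
# their chosen mediators, and the links to their individual surveys.
mediator_map = [
    {"feedback_receiver": "Receiver A", "mediator": "Mediator X",
     "survey_link": "https://survey.example.com/receiver-a"},
    {"feedback_receiver": "Receiver B", "mediator": "Mediator Y",
     "survey_link": "https://survey.example.com/receiver-b"},
]

# Publish the map where everyone in the cluster can find the right survey
# for each colleague they want to give feedback to.
with open("mediator_map.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["feedback_receiver", "mediator", "survey_link"])
    writer.writeheader()
    writer.writerows(mediator_map)
```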

Train the feedback muscle workshop

One of the important preparation steps was a workshop on how to provide feedback. Sure, there were tips in the survey itself on where to focus and how to frame the message, but this whole experience was meant to improve our feedback-giving skills in general, be it in the survey or face to face.

We designed ourselves a short 1.5-hour workshop based on non-violent communication (NVC) and I-message ideas. It was very much real-world and practice-based, with around a dozen situations prepared upfront. After a 10-minute theory intro to the NVC framework and I-messages, we split into groups and practiced framing feedback based on real-life situations we had had earlier in our cluster. Every group presented to the whole audience how they would frame feedback for a certain situation, and the rest shared how that feedback was perceived: was the message clear, was the request clear, and if they were the receiver of such feedback, would they be likely to act on it?

Even though fairly short, the workshop allowed us to put the emphasis on the request rather than simply sharing criticism, on focusing on yourself and your own needs when framing the request, and on figuring out how others react to the way you frame it.

Feedback² or how did the experiment go?

During the first run of this format, we were quite meticulous about evaluating our progress and the results.

In that first run, there were 33 participants, 10 mediators, and around 7 replies per survey. The NPS rating and the discussions about the experiment were considered the least valuable, while the improvement and appreciation sections brought the most value. The average answer to the question “On a scale from 1 to 10, how valuable was the time investment?” was 7.6.

Revisiting all kinds of survey and process aspects to figure out their value

What was the participant experience like?

I was both a participant and a mediator. As a feedback receiver, I did not have to do much, just wait for my mediator to schedule a 1on1 to go through the survey results together.

My NPS rating, along with the contribution evaluations, was rather high, but that numeric information did not give me much food for thought; it mostly gave me a sense that there were no red flags on the surface. Just as with the people for whom I was a mediator, I jumped directly to the appreciation and improvement sections.

Improvement snippet #1

I regularly interpret your body language and facial expressions as if you are not interested or bored by what I’m saying. If this is not what you’d like to convey, try to observe your body language and facial expressions and find out if other people get the same impression about you.

Improvement snippet #2

Living the ‘agile’ mindset, Albina is moving very quickly into the ‘solution mode’ even when the specifics and complexity of the problem are not 100% clear to her. Statements like “in team xyz we tackled that particular problem in this and that way” are fine in principle, but sometimes they come too quickly.

Appreciation snippet #1

I’m really glad to have you on our team! Your attitude is that of a true team player and you bring lots of great skills and experience — particularly in the area of bringing product and engineering together.

Appreciation snippet #2

What I find most fascinating about working with you is how you manage to always hit the bottom line when saying just 2 sentences and bring the discussion forward when others kept talking for hours.

In almost all improvement suggestions, I had an idea of the situations the feedback givers were referring to, and I was able to figure out how to do better.

Reading through the appreciation section was unexpectedly emotional: the level of detail, the carefully picked examples and language. I was humbled by the awesomeness of my colleagues, by their effort to put all of the appreciation in writing, even the tiny little things. Even now, writing this article and remembering how it felt to read that section gives me a warm feeling of being valued.

What was the mediator experience like?

It was surely more relaxing to be in the feedback receiver role.

I was a mediator for seven people, and some 1on1 sessions went in a true sparring and coaching spirit. Some were quite tricky.

At the beginning of the experiment, we said that we wanted to design it in such a way that the feedback passed on would not be harmful. Unfortunately, in one case we were not able to deliver on that promise, and that case was one of the seven people I was a mediator for. My subjective judgment was that the feedback passed on was well argued, based on examples, and not obnoxiously aggressive. However, the person perceived it as an attack.

Being a mediator was definitely a challenge. Having seen all of the replies, the mediator’s job was to make sure the feedback was understandable, and if not, to clarify it (provided the responder had left her name in the survey), to group similar feedback points, and, once the summary was created, to hold a 1on1 session with the feedback receiver. During that session, the mediator’s role was to help the person absorb the feedback and come up with an actionable plan. I was planning to ask the following questions during the session:

  • What piece of feedback is resonating with you?
  • What would you like to address?
  • What could be the first steps?

In some sessions, my prepared set of questions helped: we discussed the feedback, cheered together over the appreciation section, and came up with a plan for the improvement one. In others, I had the feeling that the person simply needed time to absorb it and was not ready to talk at all. I had the impression they perceived me as someone invading their private space, even though they had selected me as their mediator. In those cases, the 1on1 session was rather quick and limited to me passing on the summary.

Despite the challenging nature of being a mediator, I would volunteer to be one again. I like the “coach” part of being an agile coach, and only by entering challenging situations can I figure out whether I am of any help to my colleagues when they actually need it.

It is also worth mentioning that many mediators had no prior coaching experience; their primary roles were software developer and quality assurance engineer. One of them told me that being a mediator was a cool challenge to have: facing hard-skills challenges day to day, it was a refreshing switch.

The manual. Or what would you need to run a similar experiment?

If you’d like to run a similar facilitated feedback format, here is what you need to get going:

  1. A survey template that mediators can use to set up individual surveys.
  2. An agreed set of responsibilities for the mediator.
  3. A number of people volunteering to act as mediators.
  4. A mediator to feedback receiver map.
  5. A timeline. This kind of format takes a while, and it is useful to agree on the dates and keep the information easily accessible: when the surveys are due, when the 1on1s should happen, and when to recap.

Examples of all of the required artifacts are provided in the article above.

To ease the task of communicating the experiment and its goals, we are also providing visual support material. The images below can be printed in A3 format and put up in the team space as a reminder of key rules and dates.

Visual support material

In the HOW poster, the space near the arrows can be used to list additional rules for this experiment and the mediators.

In the WHEN poster, the white space at the bottom of an arrow can be used to add a date or a period for a certain event to happen. Here are examples of how those two posters can be filled out (using a Sharpie would work as well), plus an empty WHEN poster in case the flags are not working out for you.

Examples of HOW and WHEN posters and empty WHEN poster template

Post Scriptum

When selecting this kind of format, with all of its benefits and the value it brings to participants, beware of the risks. Having access to information others don’t have creates fertile ground for conspiracy theories. It makes sense to reiterate what the mediator is and isn’t responsible for, how you will ensure the privacy and security of the data collected, and how the information will NOT be used. We ran variations of the feedback with mediators format where the names in the surveys were replaced with codes to mitigate concerns about information leaking.
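As an illustration of that code-based variation, here is a minimal sketch of how names could be replaced with stable pseudonymous codes before the surveys are published; the helper function and the code format are our own illustration, not the exact mechanism we used.

```python
import hashlib

def pseudonym(name: str, secret_salt: str) -> str:
    """Derive a short, stable code from a participant's name.

    Only whoever knows the salt (e.g. the person setting up the surveys)
    can recompute which code belongs to which person.
    """
    digest = hashlib.sha256((secret_salt + name).encode("utf-8")).hexdigest()
    return "P-" + digest[:6].upper()

# Hypothetical usage: survey titles and the mapping table show only the codes.
participants = ["Receiver A", "Receiver B"]
codes = {name: pseudonym(name, secret_salt="rotate-me-per-round") for name in participants}
print(codes)  # e.g. {'Receiver A': 'P-3F9A1C', 'Receiver B': 'P-84D0B2'}
```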

Having provided more details than a normal blog post would, we hope that our experience will be helpful to others. If you end up running a similar experiment, let us know how it went in the comments section below.
