Oh No, They Wrecked My Beautiful Design!

“Training – that’s easy. We know the stuff.  We just wing it.” Sound familiar?

Those of us who adhere to principles of Dialogue Education™ and who respect and appreciate adult learners don’t buy that perspective. Whether we’re planning a workshop for personal growth, community education or workplace performance, we design carefully. We have our Steps of Design, we think about what Knowledge, Skills and Attitudes will be learned during the workshop, and we make sure we do not have too much “what” for the “when.” If we can, we connect with some of the learners before the workshop to assess their Learning Needs and Resources. We NEVER wing it!

Usually, we and the learners are delighted by the result, even if they don’t know the time and effort that went into the design. They often mention that they appreciated the tight organization, that they felt listened to, and that they enjoyed the interaction with others in the group. During the course of the workshop, we take the time and effort to listen in, to be sure that the energy for learning stays high and that directions for learning tasks are clear. When it’s not all humming along smoothly, we can and do make adjustments, building on the key principles that guide us.

Last October, I had the pleasure of crafting and critiquing beautiful designs with a group of fellow educators and trainers at the Advanced Learning Design and Evaluation Retreat in Vermont.

But, a thought…what happens when we create the design and someone else is facilitating?  How do we know these facilitators will share our level of commitment and fidelity to the design? While I know that I can be a little obsessive, I also know that learners should be able to trust that the learning objectives will be achieved. Without strict adherence to the design, we cannot know that learners’ knowledge, skills and attitudes were affected by a particular learning event, and cannot justify the resources expended on the event.

Back home, the retreat participants shared stories of how we had applied our learning from Vermont. It turned out that several of us were vexed by the implementation of training (carefully designed by us but facilitated by others). Here are five examples:

SITUATION #1: Workshops on opiate addiction, offered throughout the state to community groups.

With the groups they were leading, facilitators wanted to use resource materials, including videos, other than those offered in the design.

SITUATION #2: A day focused on quality improvement for medical residents.

The facilitators were experienced medical professionals who wanted to keep providing lectures to the residents, as they had done for years. Unfortunately, the lectures were not targeted to the learners, and contained too much “what” for the “when” and the “who.”

SITUATION #3: A county-wide, ten-week program for parents, offering peer-to-peer support and education on topics of children’s emotional and behavioral health.

The program relied on facilitators with minimal experience leading groups, whose expertise was grounded mostly in their own experience as parents rather than in knowledge of the principles of children’s health and development. They valued open group sharing more than focused learning activities.

SITUATION #4: Workshops for participants from civil society organizations in the international budget field, aiming to equip them with the skills to advocate to their governments on budgets.

Facilitators received the workshop design and went in one of two directions – neither of which achieved the desired outcomes of the workshops. Some facilitators were very free with the design, allowing discussions to go down paths that did not stay focused on the outcomes. Other facilitators, feeling constricted by using a design they had not created, would not deviate from it at all, producing a rigid, non-empowering workshop for participants.

SITUATION #5: Two different projects of professional development for early childhood educators (ECE): statewide in-person training engaging 50 trainers, and the production of 40 printed guides with training activities to be selected as needed by managers and supervisors in over 2000 individual ECE settings across the country.

Designers had no control over how facilitators used the materials.

My four colleagues and I shared our worries, frustrations and eventually our insights into how to develop training that has a high likelihood of being facilitated as designed, with the ultimate goal of meeting the needs of the learning and the learners.

We started with the recognition that the facilitators are really part of the “who” we consider in our Eight Steps of Design. Sometimes we were so focused on the ultimate learner that we simply ignored the facilitator and handed over the plan. We realized that we need to think just as clearly about the people who are on the front lines of the training. So, we will now do the following:

  • Share the information from the Learning Needs and Resources Assessment (LNRA) with the facilitators. That will help them to see that we have a reason – based in what the learners needed – for including specific learning activities and audio-visual materials.
  • Respect the expertise of the facilitators. Whether they are medical professionals with many years of formal education or parents venturing into their first facilitation role, they have their own pride in their knowledge and skills, and their own fears about how competent they will appear to the trainees. Increasing communication with the facilitators and, if possible, conducting a thorough Training of Trainers (TOT) go a long way toward assuring that the design is well understood and, therefore, followed.
  • Build in flexibility. Be clear on the learning objectives, so that facilitators can make some adjustments in their style of facilitation, and respond to their participant groups, without losing the essential elements of the learning tasks. In one of the ECE projects, the designers asked that facilitators make changes only if they believed the change was needed to support one of the six principles of adult learning: Safety, Respect, Inclusion, Engagement, Relevance or Immediacy.

We recognized that it would be useful to collect data. For two of our projects we created feedback forms so that facilitators could note where they made a change in the design. When those forms were returned to us, we could see what needs the facilitators saw within their participant groups that required a change, or whether there was a design flaw: something unclear, an activity that ran too long or fell flat, etc.

Finally, we gave some thought to publications. We considered how we could provide facilitators’ manuals that clearly indicate which information is essential (through use of an icon, perhaps) and when an anecdote from the facilitator’s own experience is appropriate.

With the national ECE guides, we offered a variety of learning activities within each module so that facilitators could choose the ones that best fit their situation while still achieving the objectives of the module. For the medical education program, we created a “Faculty Guide” that covered the elements that were essential for learners, some ideas for modifications where appropriate, and ideas for further coaching or exploration on a topic.

In the end, no one wrecked our designs. They gave us the opportunity to make them better.  There may have been some tense moments when the design wasn’t followed: in one situation the designer commented that “…the silence in the room spoke volumes to the lack of appropriateness of her presentation….” Stepping up our efforts to work with facilitators, publish helpful guides and collect data on needed changes will lead to more effective and enjoyable training as we go forward – while we still adhere to all the principles of excellent design.

Other resources of interest include:

  1. A Great Learning Design is Only 50% of the Work – a tip sheet
  2. A Dialogue Approach Transforms Corporate Training: A Spectacular Example – a blog

 

What do you do to bridge the gap between the designer and the facilitator when they are not the same person?

* * * * *

Peggy da Silva, MPH, is a longtime practitioner of adult education in out-of-school settings. She develops public programs and staff training systems, with the overall goal of building and supporting healthy communities. Peggy first discovered Jane Vella’s philosophy and methods over twenty years ago, sharing them with California Women, Infants and Children (WIC) programs. The staff training and certification system she and her team developed has been adopted by WIC programs across the country. Peggy often brings creative and learning-centered approaches to organizations that have not historically invested in high quality training, and sees the joy among learners – and positive results for funders – that result from careful design and evaluation. More information about Peggy’s work is available on her website: www.coheco.net.

Contributors

Aideen Gilmore is Senior Program Officer of Training, Technical Assistance and Networking at International Budget Partnership in Washington, DC.

Bridget Hogan Cole, MPH, is Executive Director at Institute for High Quality Care in Los Angeles, California.

Claudia Marieb is Substance Abuse Prevention Consultant at Vermont Department of Health in Springfield, Vermont.

Jesica Radaelli-Nida, MA, is Early Childhood Program Specialist in Albuquerque, New Mexico.
