Sunday, February 28, 2010

Doing It Right: Instructional Design without Cutting Corners

What a pleasure it is when you are able to do a learning project the "right way." This week, my team and I finished training for a group of managers in one of our business segments. This was a project that we initiated in the fall. It was carried out according to plan, and within the next few weeks we will have our final set of measures of its overall impact.

When I say we did this the "right way," what I mean is that we were able to follow our instructional design process without having to cut corners along the way. You might be thinking, "Well, don't you always do that?" In truth, we are often forced to make compromises on our projects to meet business deadlines, work within budget constraints, or cater to the expectations of a particularly influential business leader. On this project, though, we were not constrained by any of those things.

The project was to provide training to approximately 35 managers, most of them long-tenured and experienced, who had recently had to deal with significant changes to their jobs. Here is how it went:

Analysis - We first approached the head of this business unit to get an understanding of the outcomes that were expected from the changes that had been put in place, and to get his perspective on the impact he thought these changes would have on his managers. Next, we had two rounds of discussions with four managers who were part of the target audience. After the first meeting with them, we drafted an analysis report to feed back to them our understanding of the audience characteristics, the job, and the key tasks that were changing. In our second meeting with the managers, we validated and fine-tuned the information gathered in the first meeting. After that, we presented our analysis findings and a training design proposal, including a draft of the agenda and objectives, to the business unit leader and the Vice Presidents to whom the target training audience reported. They provided some additional insights that we incorporated into our agenda, and we were ready to begin designing the program.

Design & Development - We chose a blended approach: two online assessments and an e-learning module as pre-work, a three-day classroom learning event, and follow-up learning opportunities made available through a SharePoint site set up specifically for this class. The design process for the classroom event was relatively quick and easy. Most of the training needs could be addressed with existing material that had been used for other programs. There were a few key segments that would be new, but they were all on topics that were easy to research, so finding appropriate content was not an issue. Designing learning activities that would be effective at making the learning points was a little more challenging, but that is certainly a part of the job that my team enjoys.

Pilot & Revisions - Since our total audience was relatively small (35 managers), we did not really have the opportunity to conduct a full-blown pilot. Instead, we broke the audience into three delivery groups and treated our first delivery in December as a quasi-pilot. Overall it went well, but as with any new program for a new audience, there was room for improvement. We huddled up afterwards, examined our level one feedback, talked to a few of the participants and observers, updated our design document, and made some adjustments for the second and third deliveries.

Implementation - By the time our second delivery rolled around, we were confident that we had the right program to meet the managers' needs. We were clear on which segments needed the most support and which would meet with resistance, and we prepared ourselves accordingly. For all three classroom events, we had one of the Vice Presidents with us during delivery. We carved out a small but important segment for them to deliver personally, and for the rest of the time they were with us, they were able to provide clarification or join the discussion as we covered the other items on the agenda. Their presence and involvement were a key factor in the program's success.

Evaluation & Follow-up - For this program, we used level one (participant reaction) and level three (behavioral change) measurements. The level one measurements were taken directly at the end of the classroom sessions. For the level three measurements, we used the Friday5s goal management system over a ten-week period after training. Each participant was asked to set two specific goals at the end of their classroom session. These goals were entered into the Friday5s online tool, where class participants can receive online coaching and track their progress. We also continue the momentum created in the classroom by letting participants connect with each other after the event through the SharePoint site set up specifically for this program.

On the whole, this was a very satisfying project. We got to help our managers and our business by doing what we do best: creating a learning opportunity that met specific needs for a specific audience. And we got a chance to do it right!

2 comments:

  1. How satisfying! I can definitely relate to being put in a position to have to cut corners. But the quality that can be produced by "doing it right" is amazing.

  2. Couldn't agree more, Mike. Luckily we have some of these projects from time to time where we can do what we do best. Thanks for sharing.