Wednesday, March 31, 2010

Horizon Report: Technologies to Watch

In my last blog entry, I wrote about the trends described in The Horizon Report (2010 Edition) that are driving the technology adoptions anticipated over the next five years.  In this post, I'd like to share a few thoughts about the six emerging technologies that, according to the report, will be entering the mainstream in that timeframe.

The report places the six technologies along three adoption horizons:  near-term horizon (within one year), second adoption horizon (within two to three years), and far-term horizon (within four to five years).

Technologies to Watch over the Next Year

Mobile Computing - Oh, those smart phones!  The masses can now connect to the internet wirelessly from virtually anywhere.  They are cheaper and easier to carry around than laptops.   These portable devices have tremendous implications for workplace learning.   I know people have been talking about them for a few years now, but I truly believe the report is on the money by predicting this is the year we will start to see mobile learning move beyond a handful of trendsetters and into common usage by training departments.  And of course, within a few days we will start to see people who don't mind shelling out $500 toting around the new heavily hyped iPad.

Open Content - This trend has probably made many college administrators very nervous.  Why enroll in an expensive university program when information is everywhere for the taking?  But since many prestigious institutions are fostering this trend, they must know what they are doing.  The Horizon Report lists this as one of its less-than-one-year-to-adoption technologies, but I don't think it will impact the workplace so quickly.  I think it will take corporate learning departments a little longer to figure out how to harness and repackage all the free content that is out there in ways that will make sense for their workers.

Technologies to Watch over the Next Two to Three Years

Electronic Books - How about these things? I'm a train commuter. I used to love to read over people's shoulders. Books and newspaper articles always seem more interesting when someone else is reading them. But every day more of the train crowd is switching to Kindle readers and the like.  It's just not the same.  You have to be at just the right angle to see the screen.  So, there goes one of my hobbies! But seriously, what a great tool for workers to use to carry around reference material, policy information and anything else they might need for just-in-time performance support.

Simple Augmented Reality - Simple, huh?   A few years ago, I had a training vendor come in to demonstrate their capabilities with augmented reality.  They had done a lot of work creating training programs for the Navy to help people learn various functions on nuclear submarines.  Very impressive stuff!  I was thinking we could use the same approach to create virtual models of our products to train our service department on installation and equipment repair.  It seemed pretty advanced for its time.  Since the Horizon Report lists mainstream use of augmented reality as two to three years away, I guess it was.

Technologies to Watch over the Next Four to Five Years

Gesture-based Computing - Imagine what version 7.0 of the Nintendo Wii will be like.  Imagine us doing away with keyboards and mouses (mice?) as input devices for our computers.   Interfacing with our personal computers will probably be very much like Tom Cruise's experiences in Steven Spielberg's film Minority Report.  The computers will respond to our natural movements and facial expressions.   It gives whole new meaning to the idea of a workplace training simulation.

Visual Data Analysis - According to the Horizon Report, visual data analysis is characterized by "its focus on making use of pattern matching skills that seem to be hard-wired into the human brain" and by the way in which it "facilitates the work of teams working in concert to tease out meaning from complex sets of information."    And,  "it allows for the interactive manipulation of variables in real time."  - Nuff said.  Call me in five years on this one!

Sunday, March 28, 2010

A Look to the Horizon (Report)

I recently picked up a link to The Horizon Report (2010 Edition) from one of the people I follow on Twitter.  Each year the report identifies and describes six emerging technologies that are predicted to have an impact on the academic world and/or the learning industry within the next five years.  The report further divides this adoption period into three phases: near-term horizon (within one year), second adoption horizon (within two to three years), and far-term horizon (within four to five years).  It also identifies key trends that drive adoption of the emerging technologies predicted to go mainstream in the five-year period.  I found this section of the report interesting.  Clearly, the trends they describe are upon us now.  In this blog entry, I'd like to share those trends, and some thoughts on their impact.

Key Trends Driving Technology Adoption over the Next Five Years

The abundance of resources and relationships made easily accessible via the internet is increasingly challenging us to revisit our roles as educators in sense-making, coaching, and credentialing.

This statement was obviously written with the context of academic institutions in mind, particularly when thinking about credentialing.  Getting a college degree has been the baseline requirement for any career-minded individual for decades.  Yet, when graduates arrive in the workplace, they are often in need of additional training.  In the world of instructional design, there is an ongoing debate that pops up from time to time about the value or necessity of having a degree in the field.  I wrote about this issue back in December.  A year earlier in her wonderful blog Learning Visions, Cammy Bean, speaking about a gathering of instructional designers at DevLearn '09, wrote:  "Of the 25 plus IDs in the room, only two had advanced degrees in ID.  Most people found themselves in the role of ID somewhat by accident – by 'discovering that I had a knack,' demonstrating an affinity for ID, by being a good teacher, etc."  Many people who support the position that a degree in instructional design is not necessary make the argument that a motivated individual can learn everything they need through hands-on experience coupled with an informal education provided by books, articles, blogs and other internet sources.  The ability to get that kind of education in almost any field is rapidly increasing.  In the workplace, employees no longer look for a company training catalog when they have knowledge gaps.  They turn to Google or Wikipedia as a jumping-off point to quickly find the resources they need.

People expect to be able to work, learn, and study whenever and wherever they want to.

Telecommuting, virtual teams, and agile worker programs are becoming commonplace. My company has had telecommuters for a long time.  I have been managing a virtual team for a few years now.     Last year, my company began piloting an agile worker program in several places.  This program is expected to grow rapidly over the next few years.  It is only logical that employees who work virtually will expect to learn virtually.  Our focus has to be on creating virtual learning environments to support this need.

The technologies we use are increasingly cloud-based, and our notions of IT support are decentralized.

A few years ago, there was great concern in the corporate world over hackers getting behind our firewalls.  Now, with cloud computing, we don't seem to care where our information is stored as long as it is protected and accessible when we need it.  This drives more people to access information from mobile devices, which in turn drives our need in the learning industry to capitalize on mobile learning.  While this has been talked about for some time, there have only been a handful of "wow" examples of mobile learning in wide use.  I believe this is the year we will move beyond those few examples and start to see some mainstream usage, which will accelerate this trend.

The work of students is increasingly seen as collaborative by nature, and there is more cross-campus collaboration between departments.

I think everyone can agree that the synergy created by collaborative efforts is a great payoff of working in teams.  However, I personally find this trend maddening in the academic world.  When I started my distance learning graduate program a few years ago, I would occasionally have group activities or projects as part of my classes.  Now it seems that each class is one long group project from beginning to end.  I feel like a victim of this trend.  I don't have the flexibility in my life for this type of commitment.  The reason I chose to be a part-time, distance learning student in the first place was so that I could fit in my classwork according to my own schedule and availability.  It is a nightmare trying to coordinate schedules with other working professionals who, like me, have jobs that involve travel and are trying to balance work with school and family obligations.  On the plus side, this has made me more sensitive to how we construct and conduct collaborative learning in our training programs.

Monday, March 15, 2010

What Was the Question?

In my last blog entry, Obsessing over Assessment, I recommended choosing question formats that make sense for the level of learning you need to assess. I then went on to discuss the most popular form of test item: the multiple choice question. However, there are other options, and some of them may be more appropriate for use, depending on what you are trying to assess.

Here are five common question types along with a few guidelines as to how and why you would use them.
  1. Short Answer Questions - also known as fill-in-the-blank - are best used to assess basic information that learners need to commit to memory. They are helpful for testing terminology, facts and simple computations. Short answer completion items should have only one brief correct answer. Typically, the blank for completing the statement is placed at the end of the test item. Blank spaces for all items should be equal in size, and should not be any larger than necessary.
  2. True/False Questions – are second in popularity to multiple choice questions. Like multiple choice questions, they require the participant to select a response. In this case, there are only two options: True or False; Yes or No; Agree or Disagree. They are useful when there is a black and white distinction between two alternatives. There can be no gray areas. Well-written True/False questions are usually stated as declarative sentences that focus on a single idea. A common mistake that test writers make is to put two ideas in one statement, requiring both of them to be true in all cases. Another common error is to tip off the answer by including words such as “always” or “never” in the statement, which usually means it is false. And of course, on the downside, test takers always have a 50-50 shot at the answer, so they may be tempted to venture a guess.
  3. Matching Column Questions - are used to assess content knowledge and associations between ideas. They fit into the category of selection items along with True/False and Multiple Choice. They are constructed by stating the premise for each test item in the first column, and listing options or responses in random order in the second column. Typically, the items in the first column are identified by number, and the response choices in the second column are identified using letters. Some test makers are reluctant to use matching columns because they seem harder to construct than other question types, but they are really very similar to multiple choice questions. The format of premises and responses is very clear for the test taker, and they are easy to score.
  4. Multiple Choice Questions - are everyone's favorite, and rightly so because they are so versatile. They give you the ability to go beyond testing for facts. You can write multiple choice questions to measure learning outcomes that test for knowledge, comprehension, and application and, to a lesser degree, analysis, synthesis and evaluation. A good multiple choice question includes a clear, well-written premise, and a list of reasonable response choices, one of them being the correct answer (which should not always be choice C!). You can also couple them with reading passages, tables or charts that contain the information needed to answer, to assess your test takers' ability to interpret important information.
  5. Essay Questions – are useful for assessing how well test takers can analyze, compare, contrast, evaluate, interpret or integrate ideas. The obvious downsides are that they rely heavily on writing skills and they are a challenge to score. Those issues can be minimized by attending to how the questions are written. If the essay question is too open-ended, it leaves it up to the test taker to decide which direction to go. Instead, write items that pose specific questions. The person participating in the assessment must be able to clearly identify what it is you want to know from them. Give them parameters, by using phrases such as, “provide five ideas on how you would…” or “give three reasons why you would…” You can also provide direction by asking the test taker to “consider the following factors…” when writing a response. Consider identifying word count minimums and maximums so the person will be able to gauge the level of detail expected. Also, for every question you write, make sure you create a model answer to aid scoring.
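One practical payoff of the selection-item types above (short answer, true/false, multiple choice) is that they can be scored automatically. Here is a minimal sketch of what that looks like; the question bank, field names, and matching rules are all invented for illustration, not taken from any particular testing tool:

```python
# Hypothetical question records; a real assessment tool would store much more.
questions = [
    {"type": "short_answer", "prompt": "The capital of France is ____.",
     "answer": "paris"},
    {"type": "true_false", "prompt": "Water boils at 100 C at sea level.",
     "answer": True},
    {"type": "multiple_choice", "prompt": "2 + 2 = ?",
     "choices": ["3", "4", "5"], "answer": "4"},
]

def score(question, response):
    """Return True if the response matches the keyed answer."""
    if question["type"] == "short_answer":
        # Short answer items should have only one brief correct answer,
        # so a case-insensitive exact match is enough here.
        return str(response).strip().lower() == question["answer"]
    return response == question["answer"]

responses = ["Paris", True, "4"]
total = sum(score(q, r) for q, r in zip(questions, responses))
print(f"{total} / {len(questions)} correct")  # prints "3 / 3 correct"
```

Note that essay questions fall outside this kind of automation, which is exactly why the model answer mentioned above matters: a human scorer needs it.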
For more detailed information, check out Tests & Measurement for People Who (Think They) Hate Tests & Measurement by Neil J. Salkind. Part III of his book is called The Tao and How of Testing. It is a great resource for anyone who needs to construct assessments.


Saturday, March 6, 2010

Obsessing over Assessment

The law of attraction is at it again. Testing and assessment is running through my mind and it seems to be running through my life as well. There is a certification project currently being implemented in my company that involves training and testing. I am taking a course on Inquiry and Measurement that also involves testing. And, during this week’s #lrnchat, I was drawn in by a discussion thread in which the pros and cons (well, mostly cons) of assessment were being discussed. Why am I so preoccupied with Level 2 of Kirkpatrick’s Four Levels of Evaluation? It is mostly because of the certification project at work.

Certification is a term that gets thrown around a little too loosely in training departments these days. Put someone through a course and a post-test and BANG, you are certified – or worse, you are not. It is a tricky thing to put together a certification process that is valid and reliable. Very few companies can stand to wait out the validation process, so they jump right in and begin training and “certifying” people.

Certifications do not guarantee that the person being certified has learned more than he or she would in a regular training program, but business leaders often feel it is operationally necessary to validate a level of knowledge or skill required to meet goals and targets. If certification is needed or required, here are some things to consider for the assessment process:

Test items should be directly related to learning objectives, which should be directly derived from performance requirements. This may seem obvious but I have seen many tests that have included filler material alongside valid questions.

Test only on important items, not obscure ones. It is not necessary to test someone on small details unless they are critical. Very few corporate employees are doing life-saving work that needs to be tested at a granular level.

Test items should be straight-forward. Don’t try to be tricky. What is the point? It only serves to confuse the learner and adds no value to the assessment.

Choose question formats that make sense for the level of learning you need to assess.  Multiple choice questions are commonly used on knowledge tests because they are easy to score and easy to tie to outcomes – but they are not always easy to write. Good multiple choice questions will have a clear premise in the stem of the question, a correct answer, and reasonable alternative choices. There should not be any throwaway responses or convoluted choices such as “a and b, but not c” or “a and c only”. If your test item has more than one correct answer, then rethink the question format. Consider short answer questions or a matching column.

If possible, use randomized test questions. Most learning management systems have this capability. They allow you to create a bank of questions that can be drawn upon at random so that test-takers will be deterred from sharing answers. But don’t make the bank of test questions so large that everyone feels like they are taking a completely different test.
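The mechanics of drawing a fixed-length test from a larger bank are simple. This is a rough sketch of what an LMS does behind the scenes; the bank and the function name are invented for illustration:

```python
import random

# Hypothetical bank of 30 question IDs; a real LMS stores full question records.
question_bank = [f"Q{n:03d}" for n in range(1, 31)]

def build_test(bank, test_length, seed=None):
    """Draw a random, non-repeating subset of the bank for one test-taker."""
    rng = random.Random(seed)
    return rng.sample(bank, test_length)

# Two test-takers get equally long, but (usually) different, tests.
test_a = build_test(question_bank, 10, seed=1)
test_b = build_test(question_bank, 10, seed=2)
print(len(test_a), len(test_b))  # prints "10 10"
```

Keeping the bank only a few times larger than the test length, as suggested above, preserves enough overlap that everyone is still being measured against essentially the same content.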

Pilot your test. This is the hard part, because it takes time and patience. You need to let a few people complete the learning experience and take the test to give you the opportunity to analyze the questions. You will want to take a second look at questions that everyone got right, or everyone got wrong, or questions for which many people chose the same incorrect answer.
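The pilot review described above is classic item analysis. A rough sketch of the arithmetic, assuming each test-taker's chosen option is recorded per item (the data, item names, and the 50% flagging threshold are all invented for illustration):

```python
from collections import Counter

# Hypothetical pilot data: each row is one test-taker's chosen option per item.
answer_key = {"item1": "B", "item2": "C", "item3": "A"}
responses = [
    {"item1": "B", "item2": "C", "item3": "D"},
    {"item1": "B", "item2": "C", "item3": "D"},
    {"item1": "B", "item2": "A", "item3": "D"},
    {"item1": "B", "item2": "C", "item3": "A"},
]

for item, key in answer_key.items():
    picks = Counter(r[item] for r in responses)
    difficulty = picks[key] / len(responses)  # proportion who got it right
    common, count = picks.most_common(1)[0]
    if difficulty in (0.0, 1.0):
        # Everyone right or everyone wrong: the item discriminates nothing.
        print(f"{item}: review (all right or all wrong)")
    elif common != key and count / len(responses) >= 0.5:
        # A popular wrong answer suggests a miskeyed or misleading item.
        print(f"{item}: review (many chose wrong answer {common})")
    else:
        print(f"{item}: ok (p={difficulty:.2f})")
```

In this made-up data, item1 is answered correctly by everyone (review it), item2 looks healthy, and item3 draws most people to the same wrong answer (review it).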

Create rubrics for skills assessments.  Skills assessment usually requires direct observation.  It is important that all of your assessors are using the same criteria and weights when judging performance.  Validate the process by having multiple assessors review the same performance.  If they are more than a few points off from each other, either redesign the rubric or re-train your assessors.
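The consistency check above reduces to simple arithmetic on the assessors' totals. A minimal sketch, where the assessor names, scores, and the tolerance of a few points are all assumed values for illustration:

```python
# Hypothetical rubric totals from three assessors observing the same performance.
scores = {"assessor_1": 42, "assessor_2": 44, "assessor_3": 39}

TOLERANCE = 3  # maximum acceptable spread in rubric points (assumed)

spread = max(scores.values()) - min(scores.values())
if spread > TOLERANCE:
    print(f"Spread of {spread} points: redesign the rubric or re-train assessors.")
else:
    print(f"Spread of {spread} points: within tolerance.")
```

With these sample scores the spread is 5 points, over the assumed tolerance, so the process would be flagged. Real inter-rater checks often go further (per-criterion comparisons, agreement statistics), but even this crude spread check catches gross disagreement.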