What responsibilities do you have after the eLearning Project has been deployed?

Designing Digitally


The adage “cart before the horse” seems appropriate for a few of the conversations you can have around eLearning development, but there is purpose and value in these seemingly backwards conversations. In particular, Designing Digitally, Inc. prefers to discuss Return on Investment (ROI) during project initiation. Analyzing ROI before creating an award-winning product means post-deployment data collection and analysis should move along more easily, giving you the ROI answers you are seeking. Two things must happen first, however: 1) an initial conversation about what constitutes ROI for your company, and 2) planning the administrative and logistical aspects of collecting both quantitative and qualitative data.
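To make that initial conversation concrete, the calculation itself is simple: net program benefits divided by program costs. Here is a minimal sketch using hypothetical placeholder figures (your own cost and benefit definitions come out of that first conversation):

```python
# Minimal sketch of the standard training ROI calculation.
# The dollar figures below are hypothetical placeholders.

def roi_percent(benefits, costs):
    """Return ROI as a percentage: net benefits over costs."""
    return (benefits - costs) / costs * 100

program_cost = 50_000       # design, development, and deployment spend
measured_benefit = 80_000   # e.g., value of reduced safety incidents

print(f"ROI: {roi_percent(measured_benefit, program_cost):.1f}%")  # ROI: 60.0%
```

The hard part is not the arithmetic but agreeing, up front, on what counts as a benefit and how it will be measured.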

One of the best ways to drive an early ROI conversation is to look at the objectives of the learning program. The objectives of the eLearning development should culminate in one goal for the learner to achieve. For example, a company may seek to improve safety or yield better production numbers on the manufacturing floor. A learner who cannot reach the objectives will impact the projected ROI, which means the design of the training must ensure the objectives are being met during design and development and after implementation.

To succeed in meeting objectives, strive to create contextually meaningful scenarios and activities. Embedding the learning in a familiar context improves retention and transfer of learning to the actual work environment. It is also important to align the rollout to maximize that transfer: deploying new training on a Friday gives the learner little opportunity to apply it until Monday, and retention rates diminish after 48 hours, so give careful consideration to the timing of deployment.

We would be remiss if we did not discuss the most popular form for gathering data: surveys. However, surveys are not just for post-implementation; they can be used while designing and developing the educational product as well. They do not have to be complex or overly in-depth. They can be targeted and simple and be just as effective.

Let’s use our example of creating contextually meaningful activities. Say we have just provided you with a prototype of your Serious Game and are encouraging you to have some of your SMEs or learners test it out. A basic question such as, “Does this relate to the work you do on a daily basis?” with a scale of agreement (Strongly Agree to Strongly Disagree), plus a follow-up text-based prompt, “Share up to three aspects of this training that relate to your job,” will give insight into whether the objectives are being met through the design, and whether learners are, in turn, identifying with it.
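Summarizing a question like that is straightforward once responses come in. As a rough sketch (the responses and the five-point scale mapping below are hypothetical examples, not data from any real survey):

```python
# Sketch: summarizing a simple Likert-scale prototype survey question.
# Responses and scale values are hypothetical examples.
from collections import Counter

SCALE = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
         "Agree": 4, "Strongly Agree": 5}

responses = ["Agree", "Strongly Agree", "Neutral", "Agree", "Strongly Agree"]

counts = Counter(responses)                               # tally each answer
mean_score = sum(SCALE[r] for r in responses) / len(responses)

print(counts)
print(f"Mean agreement: {mean_score:.2f} / 5")            # Mean agreement: 4.20 / 5
```

A low mean on “Does this relate to your work?” at the prototype stage is exactly the kind of signal that is cheap to act on before full development.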

The eLearning design itself can incorporate pre- and post-surveys into the training to gather information on perception, value to the learner, established versus newly learned knowledge, and so on. Just remember, surveys do not have to look like surveys; they can be part of a storyline or take the form of a quiz or game, among other things.

Obviously, a vital area to examine for ROI is skill development. After all, this is one of the main reasons for the investment: to improve or enhance the learner’s ability to perform their job. We could have all of our metrics revolve around creation, implementation, and immediate post-learning results, but we would then be missing the mid- to long-term metrics that may tell us more than the immediate data we collect.

However, much as we encourage starting early with gathering metrics, this aspect only yields as much useful data as was collected before the training. That’s right: we need baseline data on our learners so we can see the difference in performance after the learning intervention, not just immediately afterward but months later and even up to a year out.
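The comparison itself can stay simple once a baseline exists. A minimal sketch, using a hypothetical error-rate metric and made-up checkpoint values, might look like:

```python
# Sketch: comparing baseline performance to post-training checkpoints.
# The metric name and all values are hypothetical placeholders.

baseline = {"error_rate": 0.12}   # measured before the training

checkpoints = {
    "1 month":   {"error_rate": 0.09},
    "6 months":  {"error_rate": 0.07},
    "12 months": {"error_rate": 0.08},
}

for period, metrics in checkpoints.items():
    change = ((metrics["error_rate"] - baseline["error_rate"])
              / baseline["error_rate"] * 100)
    print(f"{period}: error rate changed {change:+.0f}% vs. baseline")
```

The point of the 6- and 12-month rows is the one made above: the long-horizon checkpoints often tell you more than the measurement taken the week after deployment.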

Lastly, as you begin to create your plan for ROI analysis, don’t forget other variables that may greatly impact your numbers. Some of the key factors to consider are:

  • the time of the year when the training was deployed,
  • any major changes the company, or the learning group receiving the training, has undergone,
  • the company’s culture and attitude,
  • the dynamics of teams and groups within the learning audience, and lastly,
  • the economy.

These factors can all have a positive or negative impact on the learning experience and on the ROI. For example, if the ROI was intended to measure sales growth and the economy is weak in that particular market, it may be hard to achieve the intended results. Timing of the training deployment is another factor: disseminating training after the high season for sales may show only marginal results compared with the projected ROI. Even tension between colleagues and team members, or uncertainty about a company’s stability amid changes, can affect the learner’s intrinsic and extrinsic motivation and ultimately the ROI bottom line. So, even if you have your ROI cart selected, just be sure to pick the right team of horses to pull it through to success!