Lack of measurement in corporate training after initial release.

As an instructional designer, I develop training for Qlik Sense, a data visualization product. There are enterprise versions that can be hosted by customers, as well as SaaS versions. Regardless, the general concept is connecting a database or uploading a spreadsheet, associating like data records, and then building apps with that data. The apps can have one or multiple sheets with a variety of interactive charts. There are built-in toolboxes with an array of charts and features, as well as scripting abilities that allow companies to customize an app to their specific needs, branding, and desired visual aesthetics. The apps can then easily be consumed by business users to make data-driven decisions. This is a brief description; the capabilities are vast, and setup requires more than a couple of steps.
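
To make the "associating like data records" step concrete, here is a minimal sketch of a Qlik Sense load script; the connection folder, file names, and fields are all hypothetical. Qlik associates tables automatically when they share a field name, so loading these two spreadsheets links each sale to its customer without an explicit join:

    // Load sales records from a spreadsheet (hypothetical names throughout)
    Sales:
    LOAD
        OrderID,
        CustomerID,   // shared field name drives the association
        Amount
    FROM [lib://DataFiles/sales.xlsx]
    (ooxml, embedded labels);

    // Load customer records; CustomerID associates with the Sales table above
    Customers:
    LOAD
        CustomerID,
        CustomerName,
        Region
    FROM [lib://DataFiles/customers.xlsx]
    (ooxml, embedded labels);

Once the script reloads, any chart built on these fields is interactive across the association, which is what lets a business user slice sales by region or customer directly in a sheet.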

In my role, I work on several different projects, including custom training. Here I meet with our customers to learn how they use our product, then determine the direction for their training. Our process includes a meeting where they present their app and we identify which use cases are most appropriate; this can be collaborative, or I may simply suggest a direction. Once we all agree on the direction, I work in their app to develop storyboards that detail the script, screens, actions, and any animation. We may go through one or several revisions to answer questions, and eventually we complete the storyboard. Final production includes synced screen captures with edited voice-over and all the bells and whistles (callout graphics, transitions, animation, interactions, company branding, etc.). The customer is provided with their desired output and source files, and we move on to the next project.

Another project that I work on, and own, is developing a small but growing digital learning library of short, task-based how-to videos. For this, I typically explore our app and identify tasks that aren't obvious or that are vital, and lately, with all the rapid product updates, I have been trying to highlight new features. We offer a full curriculum with online learning paths, but it takes time to get those courses updated. By addressing new features in the task-based videos, we can at least have something available in the meantime.

Another project I work on is developing a curriculum for a Data Analytics learning path we created for our academic program. We have been pecking away at it for several years and now have college-level courses published, and being used, covering introductory and advanced data analytics as well as data visualization. We created the content and lecture slide decks, as well as a fully asynchronous component with videos, guides, assignments, and a cumulative interactive use case.

I have also worked on numerous one-off projects over the years. Regardless of the project, here or in prior work, I find a common theme: get it done as quickly as possible and never expect to hear anything about it again, unless, of course, it requires version updates. When I and/or a team put so much thought and work into something, does anyone learn from it? Does anyone even use our materials? Overall, corporate training implements a system with no measurement of whether the training materials influenced knowledge or skill development, which is the whole point of creating these materials.

I can take this a step further: the hard work that goes into planning and developing learning materials is not valued within the company, or even within the smaller business unit. This could easily veer off into an entirely new challenge, so to stay on point: fixing the system to include evaluation of learning will not only help improve what is produced, but will also highlight the tangible value of the training and of those who create it.
