Tuesday, January 17, 2023

How I failed an Information Literacy Assessment

I often "check out the competition," so to speak. This time it was NorthStar, a St. Paul, MN-based literacy company that offers assessments covering a range of topics, from information literacy to operating systems, software packages, and career search skills.

Their information literacy assessment consists of 32 performance-based and multiple-choice items woven around the stories of three individuals engaged in information literacy tasks. The assessment is quite easy to take, assisted by audio storytelling. I thought I had done pretty well, but the report at the end informed me I had failed, with a 74% accuracy rate.

So I took the assessment again.

Not all the items seem specifically linked to what I'd call information literacy. Several depend on having lived circumstances similar to those of the case studies. I did fine on these, having experienced financial deprivation, for example. Nonetheless, answers that might make sense are counted wrong if they violate an implicit principle such as "don't go deeper into debt by taking out a loan if you are already in debt." That lesson has to be learned by reading or listening to sage advice, or the hard way, by accumulating debts. It's not an information literacy skill, yet it is assessed as one.

Another item resembles an information literacy skill: knowing what to search for. Provided with a list of criteria for finding a job, the task is essentially to click synonyms that match the criteria. Research demonstrates that this is one of the key failures students make when searching: knowing what to search for. However, the assessment uses these synonyms only as indicators of whether one has found matching information. Knowing how to find answers in the first place is usually the real challenge and where students tend to stumble.

Among other items that seem removed from information literacy are project management, reading, and a basic understanding of careers in healthcare. Without a doubt, information literacy depends on fundamental skills like knowing a language well enough to use it, thinking methodically, being persistent, learning from failures, and a host of others. But these are all primary skills and dispositions. Information literacy is a secondary skill that builds on them. If a student fails in such primary tasks, the solution is not information literacy training.

The assessment does contain some good examples of information literacy:

  • Identifying optimal keywords that match one's search criteria
  • Distinguishing between ads and other content
  • Using search engine filters
  • Reading search results
  • Navigating a Web page
  • Knowing where to search for relevant information
  • Evaluating the "fit" of information found

The second time I took the assessment, I was more careful and I passed. I still missed three items, though none that I consider fundamental to information literacy.

Questions that remain:

  • Is knowing how to create a spreadsheet or how to bookmark a page an information literacy skill?
  • In what ways are information literacy or fluency skills distinct from computer or software proficiencies? One answer to this is the Digital Information Model found here.
  • What is a passing score for information literacy? Failing with 74% the first time and passing with 87% the second reminds me that a numerical cutoff for this cluster of secondary skills is really hard to justify. No one performs at 100% all the time as an effective, efficient, accurate, and ethical consumer of online information. We strive to be better than 50%, however. That's why the threshold is set low on our assessments and 75% is considered mastery. That number is borne out in search results from our studies. Being right 3 out of 4 times is a pretty decent accomplishment in the online Wild West.
