
Tuesday, January 17, 2023

How I failed an Information Literacy Assessment

 I often "check out the competition" so to speak. This time it was NorthStar, a St. Paul, MN-based literacy company that offers assessments covering a range of topics from information literacy to operating systems, software packages and career search skills.

Their information literacy assessment consists of 32 performance-based and multiple-choice items woven around the stories of three individuals engaged in information literacy tasks. The assessment is easy to take, aided by audio storytelling. I thought I did pretty well, and then a report at the end informed me I had failed with 74% accuracy.

So I took the assessment again.

Not all the items seem specifically linked to what I'd call information literacy. Several depend on having lived through circumstances similar to the case studies. I did fine on these, having experienced financial deprivation, for example. Nonetheless, answers that might make sense are counted wrong if they violate an implicit principle such as 'don't go deeper into debt by taking out a loan if you are already in debt.' That lesson has to be learned by reading, by listening to sage advice, or the hard way, by accumulating debt. It's not an information literacy skill, yet it is assessed as one.

Another item resembles an information literacy skill: knowing what to search for. Provided with a list of criteria for finding a job, the task is essentially to click synonyms that match the criteria. Research shows that knowing what to search for is one of the key points where students fail when searching. The assessment, however, uses these keywords only as indicators that one has found matching information. Knowing how to find answers in the first place is usually the real challenge and where students tend to stumble.

Among other items that seem removed from information literacy are project management, reading, and a basic understanding of careers in healthcare. Without a doubt, information literacy depends on fundamental skills: knowing a language well enough to use it, thinking methodically, being persistent, learning from failures, and a host of others. But these are all primary skills and dispositions. Information literacy is a secondary skill that builds on them. If a student fails at such primary tasks, the solution is not information literacy training.

The assessment does contain some good examples of information literacy:

  • Identifying optimal keywords that match one's search criteria
  • Distinguishing between ads and other content
  • Using search engine filters
  • Knowing how to read results
  • Knowing how to navigate a Web page
  • Knowing where to search for relevant information
  • Evaluating the "fit" of information found

The second time I took the assessment, I was more careful and I passed. I still missed three items, though I don't consider them fundamental to information literacy.

Questions that remain:

  • Is knowing how to create a spreadsheet or how to bookmark a page an information literacy skill?
  • In what ways are information literacy or fluency skills distinct from computer or software proficiencies? One answer is the Digital Information Fluency Model found here.
  • What is a passing score for information literacy? Failing with 74% the first time and passing with 87% the second reminds me that a numerical cutoff for this cluster of secondary skills is really hard to justify. No one performs at 100% all the time as an effective, efficient, accurate, and ethical consumer of online information. We strive to be better than 50%, however. That's why the threshold is set low on our assessments and 75% is considered mastery. That number is borne out by search results from our studies. Being right 3 out of 4 times is a pretty decent accomplishment in the online Wild West.

Thursday, May 5, 2022

Beyond Information Literacy?

 

The differences between illiteracy, literacy and fluency are fuzzy, at best, when it comes to digital information competencies.

The Spring 2022 Feature article in the Full Circle Kit examines the lines between incompetence and fluency using the results of a study conducted by 21cif at Northwestern University's Center for Talent Development. 

The data suggests that a minimum competency for someone to be identified as 'literate' is a 60% success rate on search and retrieval tasks. The point at which fluency starts is less clear.

Read the whole article here.

Thursday, September 23, 2021

Implicit Bias

 


If you think you aren't biased, you're mistaken.

Everyone who has personal preferences or a sense of right vs. wrong is biased. Bias isn't always blatantly racist, sexist, political, or religious. It can be implicit; that is, a person with an implicit bias may not be aware of it.

Implicit biases shape how we think and act. We--I include myself--choose to read certain types of online authors, publishers and content and avoid others. 

This fall's Full Circle articles spotlight implicit bias and why it's not enough to teach students to recognize bias in what they read; they also need to recognize it in themselves. Undetected bias is a filter that keeps out disagreeable content, letting in only what is agreeable. The big danger in never being challenged by contrary beliefs is that the things we hold to be true remain uninformed and hard to defend.

Full Circle Fall 2021 Table of Contents

Friday, February 14, 2020

Information Researcher is up and running



A refresh of Information Researcher is now available. This assessment and tutorial package identifies weaknesses and strengthens skills in information fluency.

The subscription package consists of a six-item Pretest, followed by a nine-unit set of interactive tutorials covering all information fluency skills. A ten-item Certification Exam concludes the learning experience. The package was originally developed for the Center for Talent Development at Northwestern University and has been revised based on user feedback.

A free preview of the Tutorials is available here.

To test your information fluency skills, try the Three Free Search Challenges, adapted from the Pretest.

Saturday, August 25, 2018

Keyword Targets

As a follow-up to yesterday's post, here's a glimpse inside the Fall 2018 Full Circle Resource Kit. As an instructional and assessment technique, create a wall poster of an archery target, or just buy one. Use a sticky note for each word in a student's query. Some words are bull's-eyes: proper nouns and numbers. Others land on the target, coming close: common nouns. The rest are likely to miss altogether, unless they are accompanied by a noun or number: verbs, adjectives, adverbs, pronouns, articles, conjunctions, prepositions, and exclamations.

A graphic way to make this point is to place the sticky notes on the target. The target can also be used to give students feedback on the effectiveness of their queries. For example, in the query Who is the Latina Bronx Tarzan, the first three words miss the target. The proper nouns, used together, pinpoint the desired information, as shown below:


When one of the effective keywords is removed, however, the two remaining words come close but miss the information.


Next time your students try searching with complete sentences, the target exercise can get the point across that more words are often less effective.
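For readers who like to tinker, the word classification behind the target can be sketched in a few lines of Python. This is a toy of my own, not something from the Kit: the capitalization heuristic stands in for a real part-of-speech tagger, and the noun list is invented.

```python
# Classify each word in a query the way the archery target does:
# proper nouns and numbers are bull's-eyes, common nouns come close,
# everything else misses. A crude sketch; a real tagger (e.g., NLTK)
# would identify parts of speech far more reliably.

def classify(word: str, position: int, known_nouns: set) -> str:
    """Rate one query word: bullseye, close, or miss."""
    # Sentence-initial capitals don't signal a proper noun.
    if (word[0].isupper() and position > 0) or word.isdigit():
        return "bullseye"  # proper nouns and numbers pinpoint results
    if word.lower() in known_nouns:
        return "close"     # common nouns land on the target
    return "miss"          # articles, verbs, pronouns rarely help alone

KNOWN_NOUNS = {"contest", "prize", "winner"}  # invented for illustration

query = "Who is the Latina Bronx Tarzan"
for i, word in enumerate(query.split()):
    print(f"{word:8} -> {classify(word, i, KNOWN_NOUNS)}")
# Who, is, and the miss; Latina, Bronx, and Tarzan score bull's-eyes.
```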

Access all the Kit resources with an annual school subscription for $249 (any number of students and staff can enroll for the same price).

Wednesday, June 17, 2015

Out of the Library, Into the Classroom

What's happening in Kansas isn't unique.

Wichita Public Radio's feature, "The School Librarian is Expendable in many Kansas School Districts," documents a large-scale shift in responsibility for information literacy instruction. As the number of school librarians dwindles, information literacy is being integrated into classroom curriculum for teachers to address. Time will tell how effective this will be. It's another case of teachers being asked to do more.

Neighboring Colorado is similarly affected, with a nine percent decline in the number of school librarians between 2007 and 2011.

In Illinois, Chicago Public Schools reduced its librarian staff by 44 percent in just two years. Librarians are being reassigned to classrooms as teachers. For a district faced with a teacher shortage, it's a move that makes sense. But part of the problem of considering librarians a luxury comes down to this:
"There's no required amount of minutes for library instruction (in Illinois), so schools won't face any repercussions if they don't have a librarian or a school library." Source
For the time being, other things are just more important. It's hard to argue that digital research skills are as important as learning how to learn when the only reason to learn them is that they are good skills to have. When they come to be considered essential skills, the tide may start to change.

In the meantime, policy makers should see how proficient students are, from elementary through high school. This means assessment. The Information Fluency assessments we've tested show that students can't research challenging assignments and consistently make poor choices, selecting information that is inaccurate, irrelevant, out of date, biased, or not held in high regard by trusted sources. If you are a librarian concerned about your job, or a classroom teacher who just doesn't have the time to teach one more thing, request a free test for your students. The results could be eye-opening.

Monday, May 14, 2012

College Ready Information Fluency


Over the past three months I've been working on Information Researcher, our newest self-paced course for middle school and high school students.

This package will be put to the test soon by 1,000 students enrolled in Northwestern University's Center for Talent Development summer programs. The goal is to strengthen these students' digital research skills and improve their performance in demanding coursework--in short, to achieve college readiness.

What is college readiness? Answers vary from institution to institution. We've based our definition on the Digital Information Fluency Model, focusing on competencies individuals need to "get it right" most of the time. The "it" is online research, and it has multiple facets.

The course consists of three parts: a 5-item practice test, 14 tutorials, and 10 certification test items. Each item is performance-based and involves the kind of live searching and/or evaluation found in representative school assignments.

The practice test gives students an opportunity to test out of the course. A passing score is 80%, a level that most individuals who have mastered search strategies and techniques can attain. It's not easy. No one passes it without training. The skills assessed are 1) learning how to use an unfamiliar search engine, 2) using backlinks to evaluate the authority of an unknown source, 3) tracking down the owner of an unknown Web site, 4) fact checking the accuracy of content and the authority of a source, and 5) determining the freshness of information that lacks a publication date.

In addition to these, the tutorials involve students in the following tasks: 6) browsing links to home in on information, 7) using keywords effectively with a search engine, 8) truncating URLs to reveal hidden information, 9) triangulating information to fact check accuracy, 10) using advanced operators to retrieve information, 11) detecting bias, 12) tracking down missing information for reports and citations, 13) deep web searching, 14) finding Red Flags and 15) applying search strategies effectively in a variety of challenges. The posttest incorporates the same competencies.
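To give a flavor of the tutorials, one of the techniques, truncating URLs (number 8), is simple enough to sketch in code: strip the URL back one path segment at a time to surface parent pages, which often reveal a publisher or author. This is a minimal illustration of the idea, not code from the course, and the example URL is invented.

```python
# Generate successively truncated versions of a URL, each one path
# segment shorter, ending at the host. Visiting these parent pages
# often reveals who publishes the content.

from urllib.parse import urlsplit, urlunsplit

def truncations(url: str) -> list:
    """Return shorter and shorter versions of a URL, down to the host."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    return [
        urlunsplit((parts.scheme, parts.netloc, "/" + "/".join(segments[:i]), "", ""))
        for i in range(len(segments) - 1, -1, -1)
    ]

for u in truncations("https://example.org/reports/2012/summer/results.html"):
    print(u)
# https://example.org/reports/2012/summer
# https://example.org/reports/2012
# https://example.org/reports
# https://example.org/
```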

The target average score for middle school students is 65%; 75% for high schoolers. Before training, average scores are ~40% for middle schoolers and ~50% for high schoolers. Repeated exposure to training leads to even greater improvement.

I'd like to hear your thoughts on what constitutes college readiness in terms of information fluency. And if you'd like to preview a bite-sized portion of the course and give some feedback before we put the course online, let me know by writing to carl@21cif.com.




Friday, August 26, 2011

Elementary Fluency


I've been asked by a program director at Northwestern University's Center for Talent Development to think about creating online tutorials and assessments for students in grades 3 - 5. 
This is a challenging project: what do students in elementary grades need to know, and what skills should they possess, to prepare them for middle school information fluency?
A couple of years ago I created an elementary workshop to address these needs. That workshop may be found here.
Creating a bridge that builds on those ideas and connects with Information Investigator 3.0 and 3.1 could make an interesting project. Information Investigator 3.x represents our latest thinking about the information skills middle school and high school students should possess. There is surprisingly little difference between the capabilities of middle school and high school students. I'll outline that in an upcoming report based on a study of over 900 students conducted this summer.
If you are interested in having conversations about materials and activities that position learners in elementary grades to be more fluent upon entering middle school, I'd like to work with you. 
Let me know if you are interested! We will move the conversation to CoolHub.IMSA, where we develop projects like this. Email me directly at carl@21cif.com.

Sunday, May 9, 2010

Poop for Power

My Call for Reviewers was a little too successful last week. Before I knew it, 50 people had applied and I only needed 15. In case you went to the site to apply and it was already taken down, my apologies. Everyone who applied was well-qualified. It was a painful selection process--I hated to turn anyone away.

So here's a glimpse of one of the items that was originally developed for Information Investigator 2.0 that didn't quite make it (because it was too hard).

First, a little about Information Investigator. Part of the package consists of a pretest and posttest designed to measure information fluency, in particular investigative competencies. Those competencies include the knowledge and skills to use techniques like truncation, browsing, skimming, querying, and special operators to help determine the credibility of information.

The pretest consists of 10 performance items. The posttest has 10 different items, measuring the same sets of skills. It's hard to guess the answers, and as a result, students tend to score in the 45-55% range on the pretest. After training, scores go up by about 15 points on average.

Here's one of the performance items that I developed but didn't use for the posttest. I felt it was too challenging. But it's a good challenge for this blog, nonetheless--one that really brings out the reference librarian in a person.

The back story is related to the use of animal waste to produce energy. There are lots of examples: L'Oreal powering a cosmetics plant with cow manure, the Dutch recycling chicken waste to power nearly 100,000 homes, and then there are some stories that seem a little harder to believe.

One of these is a story dating back to 2006 about San Francisco exploring the possibility of turning dog poop into methane to power households in that city. Here's a sample news report about it. The people of San Francisco have a lot of dogs. Dogs produce a lot of waste. Waste can be turned into methane. Did they ever do it? 

The challenge is:  
Fact check to see if San Francisco is using dog poop for power in 2010.  

Rather than just make it a yes or no question, here are some possibilities (multiple choice):


1. This is a hoax. There is no evidence that San Francisco ever considered using dog poop as a power source. 

2. This was never more than a proposal. Development never started. 

3. San Francisco has not yet started the program but still plans to do so.  

4. San Francisco started to collect dog poop for methane but later discontinued the practice.

5. San Francisco continues to use dog poop as a power source today. 


What do you conclude?  (If you live in San Francisco, this might be easier)

Thursday, May 6, 2010

Reviewers Wanted

I am seeking some Internet Search Challenge followers to help review Information Fluency's latest release: Information Investigator 2.0.  This is an online assessment and self-paced training package for middle school and high school students to raise their skills in searching and evaluating online information.

Interested reviewers may complete an online application here. Depending on the response--we don't need a large number of reviewers--I will send a link to a selection of the learning objects in the training portion of the package. The entire review process should take about an hour, which you can complete at your convenience before May 15. After May 15, your feedback will be evaluated and will inform a refresh of the items before they are released.

As an incentive, reviewers and their students will be permitted to access the training materials for the coming school year.

The training package includes three interactive tutorial games (Citation, Evaluating Ownership, and Evaluating References (links)) and 11 one-page QuickHelp Guides on information fluency competencies found in the tutorial games, including how to query effectively, how to browse effectively, how to use truncation, how to evaluate authority, and others.

The complete assessment (pretest and posttest plus the materials above) and training package will be released this summer and available for school licensing.

If you are interested, I hope to receive your application!

Wednesday, July 29, 2009

Adults Do Better

Need proof that adults search and evaluate better than youth? These charts show how students in middle school and high school compare to teachers and librarians. The assessment is the pretest from a course we call "Investigative Searching 20/10."

To date, 449 middle schoolers, 414 high schoolers, and 28 adults have taken the 10-item pretest, which measures the ability to find critical information and evaluate its credibility. Several differences really stand out.

Middle School Results
Mean = 44.3%

Due to the greater number of respondents, the charts for middle school and high school resemble normal distributions. On average, high schoolers do better, owing to more experience and more advanced reading abilities.

High School Results
Mean = 50.8%

There is less of a range for adults--none of the extremes. A normal distribution would likely be more apparent if the adult sample were as large as the youth samples. But the adults in this sample tend to be search experts: librarians and media specialists. Most of the current group is clustered above 50%, with fewer grouped lower. It would be interesting to see how other adults would do. Does just being an adult provide the advantage? Or is it one's profession in the world of information?

Adult Results
Mean = 58.2%

Are these the results you would expect? Do you think they are artificially low or about right? That's hard to say without seeing the pretest. Without disclosing specific items (in case you want to take the test), the 10 items focus on skills described in previous posts, requiring the application of appropriate techniques to find the authors of articles, the names of publishers, dates of publication, other instances of the content on the Internet, and references to web pages. This is difficult because it's not always obvious which technique is needed (several must be tried)--and that requires knowing the techniques in the first place. Most people browse when faced with a search challenge. When browsing doesn't work, can they think of another technique to try?
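One technique behind several of these items, finding other instances of content on the Internet, boils down to an exact-phrase query: pick a distinctive string from the page and search for it verbatim. Here's a minimal sketch, assuming a standard Google-style q parameter; the phrase itself is invented for illustration.

```python
# Build a search URL that hunts for a distinctive phrase verbatim.
# Quoting the phrase asks the engine for exact matches, which helps
# reveal where else (and how widely) the same content appears.

from urllib.parse import quote_plus

def phrase_query_url(snippet: str) -> str:
    """Wrap a snippet in quotes and encode it as a search query."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{snippet}"')

print(phrase_query_url("turning dog waste into methane to power households"))
```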

We believe this is a fair test of one's search abilities. It reveals deficiencies in technique--or at least in the ability to apply techniques appropriately. If you'd like to try it out, the 'course' is open through the end of August. There is a fee ($25), but if you find the assessment useful and want to use it with students, we're open to talking about group rates. Here's the link: http://21cif.mrooms.net/course/view.php?id=72 (sign in as a guest).

Monday, July 20, 2009

Yes, There's a Need - Part 2


Using performance assessments to study how people search is very revealing. Several years ago, we identified five major things students don't know how to do that keep them from achieving information fluency. In case you're unfamiliar with our work, these are 1) turning questions into queries, 2) selecting appropriate databases, 3) recognizing relevant results and finding better keywords, 4) evaluating the credibility of information, and 5) using information ethically (creating citations).

Upon closer examination, there are five problem areas related to evaluation alone--the facet of information fluency where students' shortcomings are most evident. These have to do with 1) reading carefully, 2) understanding different types of queries and when to apply them, 3) navigating by browsing effectively, 4) truncating URLs, and 5) checking for page information. Not knowing about a technique or how to apply it is one thing. But even when students (and adults) know the techniques, going too fast tends to erase any advantage the techniques might provide.

Speed is likely the biggest factor responsible for ineffective, inefficient searching. In terms of efficiency, it is ironic that the slower you go, the better you perform.

We've consistently noticed this "slower-is-better" relationship while conducting research. The faster a person goes, the more likely it is that important clues will be overlooked. Where speed really hurts information fluency is in (not) finding better keywords in snippets and (not) tracking down leads that can be used to determine the credibility of an author or an author's work. It is possible to find good clues by scanning, but the speed at which students scan a page makes understanding what they read unlikely. This is why we advise students taking our assessments to take their time.

Speed also tends to explain why a student sometimes scores higher on a pretest than on a posttest. Our current assessment work uses a pair of 10-item tests.* Students who take less than 20 minutes tend not to score very well. That's 2 minutes per item, most of which require submitting a query or navigating to find answers, reading content to find clues, and checking those clues with a subsequent search. It's very hard to do all that in 2 minutes. Deliberately slowing down and taking time to look for clues (because you know they are there) makes a big difference.
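The pacing arithmetic is simple to encode. Here's a hypothetical sketch; the two-minute floor comes from the reasoning above, not from any actual reporting tool of ours.

```python
# Flag attempts whose average pace suggests rushing: under about
# two minutes per item on a 10-item test, scores tend to suffer.

def flags_rushing(total_minutes: float, items: int = 10,
                  floor_per_item: float = 2.0) -> bool:
    """True when the average time per item falls below the floor."""
    return (total_minutes / items) < floor_per_item

print(flags_rushing(18))  # True: under 2 minutes per item
print(flags_rushing(35))  # False: a more deliberate pace
```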

Because people have other things they want to do with their time, knowing when to slow down and conduct a careful search is paramount. My advice is, slow down when the stakes start to get high. For some, this will be when a grade or money is on the line, or a performance review or a job. Not every search demands the same careful consideration. But when it does, you need to take the time.

So time yourself. After becoming familiar with the instructions, give yourself two minutes to try the following search activity. Then, without looking at the possible answers (click 'give me a do-over' rather than 'show me the words'), allow yourself five minutes and do it again. Do you notice a difference in your score?

In what contest is one of the prizes a Hasbro action figure of the winner?

http://21cif.com/rkitp/challenge/v1n3/snippetsleuth/ActionFigure/SS_ActionFigure.swf

--

*If you'd like to try the 10-item tests yourself, sign up for Investigative Searching 20/10 (sign in as guest for access). The assessments are featured in that course.

Friday, July 17, 2009

Yes, There's a Need


Take a good look at this chart. What do you see? A normal distribution with a negative skew? A test that's too hard? A need for improvement?

This chart depicts the performance of high school students attempting to apply investigative techniques to Web pages. Not too good, is it? The average score is 52%. That means the vast majority of students in this sample (n > 350) really struggle to find and evaluate online information and determine its credibility. The average for middle school students is 45%. Very few reach 80%--what we would consider mastery. (By the way, these are gifted students--the top 1-2%.)

If information fluency were a typical school subject, the majority of students would fail. Fortunately for them, this is not a typical school subject; it is neither widely addressed nor widely practiced.

The result above is sampled from a new assessment we developed at 21cif. The results aren't surprising; these are the same results we've been getting for years. The computer-savvy generation consistently underperforms when asked to locate and evaluate information found on the Internet. They have real difficulty finding authors, publishers, dates of publication, salient facts and claims, and references on web pages and blogs. And if they can't find those things, they have an even harder time evaluating their credibility.

After a few hours of targeted training in investigative techniques, students improve by 10% on average. With more practice, their gains would be even greater. By the way, adults tend to score 10-15 percentage points higher than high school students--not because they are better with computers, but because they take the time to read critically and have better vocabulary skills. Despite all the new-age bells and whistles, search mastery still comes down to careful reading and thinking.

The need for information fluency is why we're in this business. Research merely informs what training is needed; having an opportunity to train students is the challenge of this business. If we continue to say there are more important things to study, we will continue to see results like those above. A generation that cannot research proficiently online will lack the ability to use information resources wisely and profit from them.

Preachy, I admit, but what else can one say about the state of information fluency in schools today? It needs work.

If you would like to learn more about the assessment, I'll share more about it over the next few posts.