Monday, July 20, 2009
Yes, There's a Need - Part 2
Using performance assessments to study how people search is very revealing. Several years ago, we identified five major things that students don't know how to do that keep them from achieving information fluency. In case you're unfamiliar with our work, these are 1) turning questions into queries, 2) selecting appropriate databases, 3) recognizing relevant results and finding better keywords, 4) evaluating the credibility of information, and 5) using information ethically (creating citations).
Upon closer examination, there are five problem areas related to evaluation alone--the facet of information fluency where students' shortcomings are most evident. These have to do with 1) reading carefully, 2) understanding different types of queries and when to apply them, 3) navigating by browsing effectively, 4) truncating URLs and 5) checking for page information. Not knowing about a technique or how to apply it is one thing. But even when students (and adults) know the techniques, going too fast tends to erase any advantage the techniques might provide.
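For readers unfamiliar with URL truncation (technique 4), the idea is to peel path segments off a page's address, one at a time, to reach parent pages where information about the author or site may live. A minimal sketch, using a made-up URL for illustration:

```python
from urllib.parse import urlsplit

def truncation_steps(url):
    """Return progressively shorter URLs, trimming one path
    segment at a time back to the site's home page."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    steps = []
    for i in range(len(segments) - 1, -1, -1):
        path = "/".join(segments[:i])
        steps.append(f"{parts.scheme}://{parts.netloc}/{path}".rstrip("/") + "/")
    return steps

# Each step is a page worth visiting to look for author or site info.
for step in truncation_steps("http://example.com/reports/2009/findings.html"):
    print(step)
```

Checking each truncated address by hand is exactly what we ask students to do when evaluating an unfamiliar page.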
Speed is likely the biggest factor behind ineffective, inefficient searching. In terms of efficiency, it is ironic that the slower you go, the better you perform.
We've consistently noticed "the-slower-the-better" relationship while conducting research. The faster a person goes, the more likely it is that important clues will be overlooked. Where this hurts information fluency most is in (not) finding better keywords in snippets and (not) tracking down leads that could establish the credibility of an author or an author's work. It is possible to find good clues by scanning, but the speed at which students scan a page makes understanding what is read unlikely. This is why we advise students taking our assessments to take their time.
Speed tends to be the factor that explains why a student scores higher on a pre-test than a post-test. Our current assessment work utilizes a pair of 10-item tests.* Students who take less than 20 minutes tend not to score very well. That's 2 minutes per item, most of which require submitting a query or navigating to find answers, reading content to find clues and checking on those clues with a subsequent search. It's very hard to do all that in 2 minutes. Deliberately slowing down and taking time to look for clues (because you know they are there) makes a big difference.
Because people have other things they want to do with their time, knowing when to slow down and conduct a careful search is paramount. My advice: slow down when the stakes start to get high. For some, that will be when a grade, money, a performance review or a job is on the line. Not every search demands the same careful consideration, but when it does, you need to take the time.
So time yourself. After becoming familiar with the instructions, give yourself two minutes to try the following search activity. Without looking at the possible answers (click 'give me a do-over' rather than 'show me the words'), allow yourself five minutes and do it again. Do you notice a difference in your score?
In what contest is one of the prizes a Hasbro action figure of the winner?
http://21cif.com/rkitp/challenge/v1n3/snippetsleuth/ActionFigure/SS_ActionFigure.swf
--
*If you'd like to try the 10-item tests yourself, sign up for Investigative Searching 20/10 (sign in as guest for access). The assessments are featured in that course.
Labels:
assessment,
critical reading,
information fluency,
scanning,
speed
Saturday, February 28, 2009
Snippets: Another key to searching
While federated searching brings together a broader field of information for single-point searching, there's nothing especially intelligent about the results. The idea behind federated or meta searching is that it makes the Deep Web more accessible, by enabling searchers to tap into numerous databases at once. The Deep Web is only deep because the information you need is not in the database where you are searching. Knowing WHERE to look for information is critical and federated searching makes that easier (although there is no engine I know of that looks everywhere).
By looking in more databases, your chances of finding what you are looking for ought to improve. But unless you are new to searching and think the top results are always the best ones, pulling results from 10 databases is really no better than pulling them from one. One reason is the keywords you start with. You typically have to speculate about the words to use. Most of the time there are better words that could be used, but you don't know them yet.
To a limited extent, federated searching takes care of speculating where to look for information. But it doesn't eliminate a critical skill you need once you obtain results.
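Mechanically, federated searching amounts to sending one query to several databases and merging what comes back. A minimal sketch of that idea, with two stand-in "databases" that are entirely hypothetical:

```python
def federated_search(query, sources):
    """Send one query to several sources and merge the results,
    dropping duplicate URLs. Each source is a function that takes
    a query string and returns a list of (url, snippet) pairs."""
    seen, merged = set(), []
    for source in sources:
        for url, snippet in source(query):
            if url not in seen:
                seen.add(url)
                merged.append((url, snippet))
    return merged

# Two made-up sources for illustration only.
def db_a(q):
    return [("http://a.example/1", f"{q} result from A")]

def db_b(q):
    return [("http://a.example/1", "duplicate of A's result"),
            ("http://b.example/2", f"{q} result from B")]

results = federated_search("bison population", [db_a, db_b])
print(len(results))
```

Note that nothing in the merge step judges relevance; that job still falls to the searcher, which is the point of the next paragraph.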
Checking the relevance of results obtained remains extremely important. Effective searchers check how their keywords are used in the snippets, or abstracts, that are returned. If you know what you are looking for, the language of the snippet usually makes it clear whether the information is relevant--whether it makes sense.
The other thing scanning snippets provides is better keywords. Since the words you start with are not always the words you actually need, being observant for better words pays off in terms of search efficiency. A classic example is searching for 'buffalo', a term with many meanings. If you are looking for the number of buffalo alive today in North America, the term 'bison' actually shows up in the snippets. Bison is a better keyword that improves the initial query, because it has only one meaning.
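What a searcher does by eye--noticing words that keep recurring in snippets but weren't in the query--can be sketched as a simple frequency count. The snippets below are invented stand-ins echoing the buffalo/bison example, not real search results:

```python
from collections import Counter

def candidate_keywords(query, snippets, top_n=5):
    """Count words in result snippets that were NOT in the
    original query; frequent ones are candidate better keywords."""
    query_words = set(query.lower().split())
    counts = Counter()
    for snippet in snippets:
        for word in snippet.lower().split():
            word = word.strip(".,;:()'\"")
            if len(word) > 3 and word not in query_words:
                counts[word] += 1
    return [w for w, _ in counts.most_common(top_n)]

# Made-up snippets for illustration.
snippets = [
    "An estimated 500,000 bison live in North America today.",
    "Bison, often called buffalo, once roamed the plains.",
    "Conservation herds keep wild bison populations growing.",
]
print(candidate_keywords("buffalo north america", snippets))
```

In this toy example, 'bison' rises to the top for the same reason it catches a careful reader's eye: it appears in snippet after snippet even though it wasn't in the query.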
Here are a couple of queries to try. It doesn't matter much which search engine you use (I used Google). Look for relevant results and better keywords in the top ten snippets. If you want to, submit the answers you find (not just the ones you think of) by commenting to this blog.
Relevancy Challenge: Other than the fastest time from bottom to top, what is another record for climbing the Sears Tower?
Suggested query: sears tower climb record
Scan the snippets for an answer other than the fastest time.
Better Keyword Challenge: What types of shoes have been used for climbing the Sears Tower?
Suggested query: sears tower shoe climb
Scan the top ten snippets for names and types of shoes. How many can you find? Are all your findings relevant to the question?
Labels:
federated searching,
meta searching,
relevancy,
scanning,
snippets