
Thursday, November 10, 2011

Baloney Detection Kit

The Baloney Detection Kit is described in a 14-minute video from the producers of Skeptic Magazine.

According to the authors, the kit is a 'scientific' guide to encountering new information. Here are the suggested questions to guide an investigation before acting on information.

  1. How reliable is the source of the claim?
  2. Does the source make similar types of claims?
  3. Have the claims been verified by someone else?
  4. Does this fit with the way the world works?
  5. Has anyone tried to disprove the claim?
  6. Where does the preponderance of evidence point?
  7. Is the claimant playing by the rules of science?
  8. Is the claimant providing positive evidence (or just negative evidence)?
  9. Does the new theory account for as many phenomena as the old theory?
  10. Are personal beliefs driving the claim?
These all add up to the need to look thoughtfully at information from a variety of objective perspectives. 

I find that there are other ways to group these questions:

Source: 
  • Who is the author?
  • Has the author written on this topic before?
  • Are personal beliefs driving the claim?  (evidence of bias)
Content:
  • How does this information fit with the way the world works (and the rules of science)?
  • Where does the majority of evidence point?
  • Is the evidence all negative?
  • Does the new theory explain as much as the old theory?
External References:
  • Have the claims been verified by someone else?
  • Has anyone tried to disprove the claim?
If you are examining information that is not purporting to be a new theory or scientific claim--e.g., a request for money from a friend who's allegedly been robbed while traveling in a foreign country--then most of these questions are no longer relevant (or helpful).

In the case of the email scam, two questions remain important:

Who is the alleged author (identified by name), and where does most of the evidence point?  You have to know the individual pretty well in this case, including how likely it is that the person is out of the country.

The question about the way the world works isn't helpful since there's always a first time for everything.  If this is the first time you've heard from this friend about this type of situation, then question two--by itself--would help support the request for funds.  I'll leave you to think about the remaining questions.

Simply asking questions is insufficient. Some research is required. When I got a similar message from a daughter-in-law, I did two things: I wrote to her to see if she would confirm the message, and (not waiting for an answer) I searched online for similar emails (question 3 above: has anyone else made the same claim?). I got the answer to the latter before I heard back from my daughter-in-law. Plenty of people were getting similar emails from friends and relatives.

Friday, January 28, 2011

Semantic Searching

[Image: The crew of Challenger's last mission]
Improvements to search engines make searching easier.

Take the semantic application hakia.com, for example. According to their site information, hakia is a semantic search technology company whose mission is to "deploy semantic search solutions to meet the challenges of elevated user expectation, business efficiency and lowest cost."  [source]

To no small extent, "elevated user expectation" stems from the frustration users experience when they can't quite figure out the right keywords to use or right databases to search in Web 1.0 and 2.0 worlds. Remember when search engines ONLY performed literal searches? There are still some of those around, but a new generation of engines is getting to the place where natural language (or free text) searches start to be meaningful to machines.

With improvements in programming and programming languages, it is now possible to type in a question as a query and have a search engine interpret its meaning and return relevant results. There will continue to be advances, but this is already well beyond the capabilities of the old Ask Jeeves engine.
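To make the contrast concrete, here is a toy sketch in Python. It is my own illustration, not hakia's technology: the documents, the stop-word list, and the hand-built map of related terms are all invented for the example, standing in for relationships a real semantic engine would infer from language models or ontologies.

```python
# Toy illustration of "interpreting" a query versus matching it literally.
# Everything below (documents, stop words, related-term map) is made up.

DOCUMENTS = {
    "doc1": "The shuttle broke apart 73 seconds after liftoff",
    "doc2": "An O-ring seal failed in the right solid rocket booster",
    "doc3": "The crew trained for months before the mission",
}

STOP_WORDS = {"what", "the", "a", "an", "of", "in", "is"}

# A real semantic engine would learn these relationships automatically;
# here they are simply written out by hand.
RELATED_TERMS = {
    "caused": {"failed", "failure", "because"},
    "disaster": {"broke", "explosion", "accident"},
}

def content_words(text: str) -> set[str]:
    """Lowercase the text and drop stop words."""
    return {w for w in text.lower().split() if w not in STOP_WORDS}

def literal_search(query: str) -> list[str]:
    """Return documents sharing at least one query word, verbatim."""
    words = content_words(query)
    return [doc_id for doc_id, text in DOCUMENTS.items()
            if words & content_words(text)]

def interpreted_search(query: str) -> list[str]:
    """Expand each query word with related terms before matching."""
    words = content_words(query)
    for word in list(words):
        words |= RELATED_TERMS.get(word, set())
    return [doc_id for doc_id, text in DOCUMENTS.items()
            if words & content_words(text)]

print("literal:    ", literal_search("what caused the disaster"))      # []
print("interpreted:", interpreted_search("what caused the disaster"))  # ['doc1', 'doc2']
```

The literal search comes back empty because none of the question's words appear in the documents; the "interpreted" search finds both relevant documents because the expanded query knows that a disaster involves things breaking and failing.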

This has obvious implications for teaching students how to turn questions into queries, which has always been a challenge.  In time, questions could become the standard form for queries.  Keyword searching will still work, but the user won't have to figure out what the important concepts are, which keywords might need to be replaced by more powerful search terms, and which stop words to leave out.

These advances don't solve the problem of whether the information returned can be trusted, but I have a feeling that isn't too far away. In the not-too-distant future, search engines will be able to provide a credibility rating based on authorship information, publication date, inbound links, and a host of other factors. The semantic web makes it possible to establish and check these kinds of data connections behind the scenes. Authors could have a "credibility score." I'm not sure how that would be determined, but the technology exists to do it.
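Just to make the idea concrete, here is a purely hypothetical sketch of how such a score might be assembled. The signals, weights, and scaling are my own invention for illustration; as I said, nobody yet knows how such a rating would actually be determined.

```python
from datetime import date

# Hypothetical "credibility score": blend a few signals into a 0-1 value.
# The factors and weights are invented for illustration only.

def credibility_score(has_named_author: bool,
                      publication_date: date,
                      inbound_links: int) -> float:
    """Combine authorship, recency, and link signals (weights arbitrary)."""
    author_signal = 1.0 if has_named_author else 0.0
    age_years = (date.today() - publication_date).days / 365.25
    recency_signal = max(0.0, 1.0 - age_years / 10.0)  # fades to 0 over ~10 years
    link_signal = min(1.0, inbound_links / 100.0)      # saturates at 100 links
    return 0.4 * author_signal + 0.3 * recency_signal + 0.3 * link_signal

# Example: a signed article published today with 42 inbound links.
print(round(credibility_score(True, date.today(), inbound_links=42), 2))  # 0.83
```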

A challenging thought, to be sure.

Despite "information crunching" advances that will make (supposedly reliable) information retrieval easier and easier, something tells me being a critical reader will never go out of fashion.

In the meantime, try a question search using hakia.com!  (e.g., what caused the Challenger disaster -- what do the results tell you?)

Friday, July 10, 2009

Searching and Asking Better Questions


An early mistake made in the search process is looking for the wrong thing. From our research, this happens around 10% of the time: no matter how you phrase a question, about one in ten people will misinterpret it. Maybe more.

Not only is looking for the wrong thing often the first mistake one can make, it is THE unforgivable one. Using poor keywords, looking in the wrong database, and browsing ineffectively are all costly in terms of time, but they can lead to a good answer with persistence and a little luck. Looking for the wrong thing, well, that's a different story.

I recall the time we asked students to research the World War II perspectives of people living in the Middle West. Sure enough, some students turned their attention to the Middle East. A fairly easy mistake, but fatal in terms of search success. (By the way, these were highly gifted high school students--no one is immune.)

Yesterday I led a workshop for grad students at a local university. I used the Declaration of Independence search challenge that I posted on July 4 (two posts ago) as a warm-up activity to get an idea of their search skills. It confirmed a reservation I've had about search questions for several years: there is no perfect question.

Unless the question reads like a legal document--and who really wants that?--there is almost always a way to misinterpret it. This may be particularly problematic in the case of search challenge questions. It takes a bit of work to come up with a question that doesn't give away the answer. By design, one or more powerful keywords for a search are left out of the question. That's to replicate the reality of what we call the "1 in 5 rule": on average, there are four words other than the one you used in a query that may be more effective. Finding those better keywords is the most common search challenge.

It is enlightening to see what a group will do with a question. Every time I try this, I think of a better way to pose the question. And I've had a lot of experience designing questions. That's an unintended consequence of using a search challenge, but it matters a great deal for teachers who use questioning in the classroom. It's a mistake to think you've asked a good question until you see what students do with it. If you don't catch the misunderstandings at the beginning, a lot of effort can be wasted, both students' and teachers'.

Here's an illustration using the question: "How many original copies of the Declaration of Independence are known to exist?"

If you take 'original' without considering 'copies', you might think this is a trick question. Of course, there's only one original: the one in the National Archives with all the original signatures on it. 'Copies' is a bit misleading, too. What the search is really about is 'printed copies', and there is a better keyword than that--one that can be discovered by searching with the word 'copy' or 'copies'. The better keyword is 'Dunlap broadsides': Dunlap was the printer, and the term now names the roughly 200 copies of the Declaration of Independence he printed soon after the Declaration was adopted on July 4, 1776.

There was also confusion among the students about the meaning of 'known to exist'. Many of the grad students found the right answer despite having to overcome these hurdles. Some found a wrong answer. Some were still searching when I called time. If I had included the term 'printed' in the question, or 'Dunlap broadsides', more would have found the answer. Of course, that would have made it a pretty weak challenge.

Here's my point: if you're a teacher and want to improve the questions you give your students, try giving one to them first as a search challenge. See what they do. What keywords do they use in a query? What words do they think are unimportant? What do they find as they search? Do they get stuck? In most cases, I think you'll find a better way to ask your question.