
Tuesday, October 9, 2012

New Release of Publisher Challenge

I spent some time this week revising and refreshing the Publisher Challenge, a tutorial to help learners track down the publishers of online information.

Periodic maintenance is needed due to link migration: users get those nasty 404 errors (which are not usually a dead end, but that's another challenge). The Publisher tutorial was originally designed in ActionScript 2; I wanted the functionality of ActionScript 3, and that required rebuilding the code from scratch. My apologies to iPad and iPhone users: the tutorial is still Flash, which your devices can't run.

In the tutorial you'll find three sections: a techniques practice page (methods you'll need to solve the challenges), a "find the publishers" page and an "investigate the publishers" page. Together, these require the type of investigation involved in determining whether content is trustworthy based on who published it. This fill-in-the-blank, click-the-appropriate-button tutorial is paired with a MicroModule about the Publisher, which gives background and explains why it's important to know about the publisher. That has also been refreshed.

Give them a try. Use them with students as part of a lesson on Web evaluation, the ownership of ideas, or one of these specific cases/themes you'll find in the tutorial: poetry publishing, gun laws or school health.

Publisher Challenge

Friday, June 29, 2012

Is it a Hoax or not?

Earlier this week, Dennis O'Connor and I offered a 'model lesson' at the ISTE Conference in San Diego. The session was full, indicating interest in seeing how information fluency may be embedded in a lesson.

We probably attempted too much for a 60-minute session; we never did get to the fourth mini-lesson. In case you want to see our lesson plans for all four segments, you will find them here. Feel free to use them in your teaching.

The target for the lessons is Genochoice.com, a site that allows parents to create designer babies online. Fact checking claims and information on the site quickly turns up Red Flags. Most external references state that the site is a hoax, the claims are bogus.

I agree it's bogus, but I don't think it's really a hoax. The majority of the information is made up. You can't "read" DNA using a thumb scan that is nothing more than a Flash movie embedded in a page. You can't determine whether my medical insurance will cover the cost of genetic "improvements" based solely on my name. There's no evidence that you can "fix" genetic disorders. Someday that may be possible, but not yet.

What is a hoax? Most definitions boil down to "a deliberate deception." Some of the more malicious ones also attempt to defraud, which is not the case with Genochoice.  But while the information is deliberate, does it also aim to deceive?  I think it has a different purpose.

The profile of the author is the crux of the matter. It doesn't take long to determine that Virgil Wong is responsible for the content; he also owns the domain name. But is he a deceiver?

The inconsistency comes when you stop to consider why an artist-medical keynote speaker-PhD candidate-hospital webmaster would create a popular hoax site. Wouldn't that harm his reputation?


Bogus and Hoax sites present layers of challenges. Figuring out if they are bogus or a hoax is one layer. Deeper is: why does this site exist?


Thursday, December 15, 2011

Misinformation on Twitter

With over 200 million Twitter contributors*, misinformation is bound to happen.

It would be pretty interesting if there were a study to determine the frequency of misinformation created by authors in the world of Web 2.0. Maybe there is such a study. I should look.

Regardless, misinformation is regularly created in the form of rumors, honest mistakes and malinformation (intentionally misleading facts and opinions).  Tweets are a good example.

UK's The Guardian has collected seven rumors that attracted a following on Twitter, spread and then died out. These include: 'Rioters attack London zoo and release animals,' 'Rioters cook their own food in McDonalds' and 'Army deployed in Bank.' Here's the complete list and an interactive app to explore the rumors and their trajectories.

All of the rumors are what you could classify as 'breaking news.' Twitter became famous as a source of breaking news during events such as the US Airways flight that ended in the Hudson River. Twitter often scoops other news sources because of eyewitnesses who tweet what they happen to see. Sometimes the stories turn out to be true. Other times they don't.

How do you tell the difference between a 'truth' and a 'rumor?' This would be a great conversation to have with middle school, high school and college students. What can you do not to fall prey to rumors?



* FYI -- Finding the current number of Twitter users makes a pretty good search challenge.

Thursday, November 10, 2011

Baloney Detection Kit

The Baloney Detection Kit is described in a 14 minute video from the producers of Skeptic Magazine.

According to the authors, the kit is a 'scientific' guide to encountering new information. Here are the suggested questions to guide an investigation before acting on information.

  1. How reliable is the source of the claim?
  2. Does the source make similar type claims?
  3. Have the claims been verified by someone else?
  4. Does this fit with the way the world works?
  5. Has anyone tried to disprove the claim?
  6. Where does the preponderance of evidence point?
  7. Is the claimant playing by the rules of science?
  8. Is the claimant providing positive evidence (or just negative evidence)?
  9. Does the new theory account for as many phenomena as the old theory?
  10. Are personal beliefs driving the claim?
These all add up to the need to look thoughtfully at information from a variety of objective perspectives. 

I find that there are other ways to group these questions:

Source: 
  • Who is the author?
  • Has the author written on this topic before?
  • Are personal beliefs driving the claim?  (evidence of bias)
Content:
  • How does this information fit with the way the world (and rules of science) works?
  • Where does the majority of evidence point?
  • Is the evidence all negative?
  • Does the new theory explain as much as the old theory?
External References:
  • Have the claims been verified by someone else?
  • Has anyone tried to disprove the claim?
If you are examining information that is not purporting to be a new theory or scientific claim--e.g., a request for money from a friend who's allegedly been robbed while traveling in a foreign country--then most of these questions are no longer relevant (or helpful).

In the case of the email scam, two questions remain important:

Who is the alleged author (identified by name) and where does most of the evidence point?  You have to know the individual pretty well in this case, including the likelihood that person is out of the country.  

The question about the way the world works isn't helpful since there's always a first time for everything.  If this is the first time you've heard from this friend about this type of situation, then question two--by itself--would help support the request for funds.  I'll leave you to think about the remaining questions.

Simply asking questions is insufficient. Some research is required. When I got a similar message from a daughter-in-law, I did two things: I wrote to her to see if she would confirm  the message, and (not waiting for an answer) I searched online for similar emails (question 3 above: has anyone else made the same claim?). I got the answer to the latter before I heard back from my daughter-in-law. Plenty of people were getting similar emails from friends and relatives.

Tuesday, May 24, 2011

Spam Stupidity

Some spam is way too obvious--except for my spam filters.

Here's an example that made it past my email filters a few days ago:


From: Paypal Customer Service serviices@paypall.com
Subject: Your Paypal Account Will Be Limited
Date: May 22, 2011 9:43:41 PM CDT


Hello Paypal Customer,
Your Account Will Be Limited Until You Submitted Some Information.
If you didn't submit them, your account will be limited after 2 days.


This was followed by a link to click.

Right.

Just reading the header is enough to raise an eyebrow. The odd capitalization in the body and the grammatical mistakes provide a nice finish.

I've documented this to emphasize that reading is unparalleled as an investigative skill. If your language skills are weak, your vulnerability increases.

I'd use an example like this with students--along with some legitimate messages--to have them identify potential red flags.
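Careful reading remains the key skill, but the header check itself can even be mechanized. Here's a minimal sketch of the idea (the function and the domain list are my own illustration, not a real spam filter); it flags a sender whose domain isn't one the organization actually uses:

```python
def sender_red_flags(sender, real_domains):
    """Compare the sender's domain against an organization's real
    domains; a near-miss like 'paypall.com' is a classic phishing tell."""
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain not in real_domains:
        return [f"sender domain '{domain}' is not one of {sorted(real_domains)}"]
    return []

# The address below is the (fake) one from the message above.
print(sender_red_flags("serviices@paypall.com", {"paypal.com"}))
```

Students don't need to write code to apply the same test: look at the part after the @ and ask whether it really belongs to the organization named in the message.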

Do you have examples of similar messages to share?

Thursday, August 19, 2010

Kitchen Myths

When I saw this blog on using Kitchen Myths as the objects of information literacy challenges, I thought this could be useful for getting students to practice search skills.


The more I think about it, however, the less sure I am what skills would be needed or how hard these challenges might be. Take, for instance, "Lobsters scream with pain when boiled." Is that true or false? Kitchen mythology would indicate it is true.


But to verify the claim, what would you search for? The statement suggests the query
lobsters scream boil
Try that and you'll get something like this in the results (top five from Google):

Do Lobsters Really Scream When You Put Them in Boiling Water ...

WikiAnswers - Do lobsters really scream when you put them in ...

Is it true that chefs can hear a lobster screaming when dumped in ...

Lobster Tales - Do Lobsters Scream?? (a sequel)

being Boiled Hurts - Lobster Lib.com

The positions of these five resources agree that lobsters cannot scream (they lack vocal cords), but as for experiencing pain, the answers (in order) are No, Yes-but-No, No, Yes (kind of), Yes. Taken as a whole, that's 3 no's, 1 yes and 1 YouTube video that's a dramatization of a guy screaming while three lobsters are being boiled.

Finding results is easy using the keywords in the myth. Whether lobsters scream is settled pretty conclusively too. But whether they feel pain is conjecture based on the neurology of invertebrates. No one knows for sure what a lobster feels.

This particular challenge requires reading the results, interpreting them and doing some evaluation. The proponent "for" lobsters feeling pain is David Wallace. He could be the object of further investigation. What are his credentials?

I'll leave it at that. Maybe you'd like to explore some of the other kitchen myths and see what kind of search challenges they make.

Any conclusions that can be reached about lobsters and pain?

Wednesday, March 24, 2010

Vampire for President?

During the 2008 presidential election, Susie Flynn, a 10 year old, made a bid for the highest office in the land. Except she wasn't 10. And she wasn't a girl.

Now a vampire is making a bid in the 2012 campaign, supposedly as a Republican candidate. He is known as John Sharkey, The Impaler. His campaign site is: http://www.theimpalerformngovernor.us/ You might want to take a look. He is also interested in being governor of Minnesota.

The challenge: Who is John Sharkey? What evidence is there that the Republican party is actually taking him seriously? Is this campaign a front for some other cause (as was the case with Susie Flynn)? If so, what?

Sample Web 2.0 posts about him and his campaign. Look for clues about the nature of the campaign. Who is taking him seriously? 

Monday, March 8, 2010

How high? The San Jacinto Challenge

On my trip to the Palm Springs area--love the desert--I learned that, due to seismic activity in the area, many of the peaks surrounding the Coachella Valley are continuously being pushed up.

That means many of the posted elevations are inaccurate.

Thinking that could make a good search challenge for freshness, I searched for information on what I read was the second tallest peak in Southern California. It turns out there's disagreement about that and, not surprisingly, the same peak is listed with several different elevations. It kind of drives home the point that, according to some, the area is in a state of upheaval.

So here's the challenge for you or your students:

What is the elevation of San Jacinto Peak? Is Wikipedia currently right or wrong? How can you tell? (what information do you need in order to verify the accuracy of the information?)

Post your answer to the challenge. How did you determine the answer is accurate?

Monday, February 22, 2010

TOPSY, a Search Engine for Evaluating Credibility

I came across a new search engine while preparing for a workshop at ICE. Topsy.com retrieves twitter posts and does so in a way that may help to inform the credibility of information being evaluated.

For example, if I wanted to see who is tweeting about 21st Century Information Fluency, I would enter that string in Topsy. The results (click here) show me the way in which the search terms were referenced and the number of related tweets. The largest category indicates there are 24 tweets. By clicking that I can see who tweeted about 21cif and what they said, whether good, bad or otherwise. It's who said what that may inform my evaluation of 21cif. In this case, a typical tweet is:
"a great collection of tutorials, podcasts, wizards etc."
If I didn't know anything about 21cif, this at least would help me see some people's opinions of it. Who those people are is possibly more important. It turns out that the person quoted above describes herself as a knowledge librarian. Sampling the others who tweeted reveals a statewide tech integration mentor, a consultant, an eLearning director and online instructor.

It's a search that takes less than 5 minutes and tells me that people who work in relevant information fields value 21cif enough to write about it and recommend it to others they know.

I suspect that if an author or organization were very popular, it would be harder to sift through the results. Nonetheless, it's a novel approach to evaluation. The strength of practitioners' comments may help to vet information about which you may have little knowledge.

Try Topsy to find and evaluate the comments of tweeters for the following authors and organizations:
  • martinlutherking.org  (check out the descriptions of tweeps who favor the site,  compared to those who oppose it)
  • Lyle Zapato
Always pay attention to the person who says the positive or negative thing. Why should that person's opinion count? What picture starts to form about who is for and who is against? What does this say about the subject being evaluated?

Can you think of other searches? Share them here.

Tuesday, February 16, 2010

Changing Course

When I first started studying information fluency, I thought most of the "good stuff" pertained to finding information efficiently, namely, using keywords and operators optimally, finding the 'right' database to search, choosing links effectively, etc. Most of the activities we developed at 21cif addressed searching and, to a lesser extent, evaluation. There was a lot of energy around keyword effectiveness and power, the number of terms that usually works best, and so on.

I'd say that interest has shifted; now the majority of our work centers on evaluation because that's the greatest need. It's not hard to find information. It's harder to tell if it can be trusted.  Many people are satisfied with their search skills because most of the time they find things they are looking for. Sure, they waste a lot of time getting there and miss a lot of relevant information in the process, but they get the job done.

My own thinking about evaluation has changed.  The 21st Century Information Fluency Project still maintains that determining credibility depends on the source and content of information and that knowing about the author, publisher, date and who links to a site are really important.  Lately I've been concentrating on investigative skills that serve to reveal information about authors and their objectivity, publishers, dates, linking sites and the accuracy of evidence taken from the content. One of these is truncation; others include searching domain registries, examining page information, using the links operator, browsing carefully and checking facts. Few students or adults use these investigative skills without training and as a result, sometimes they mistake fiction for fact.
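One of those skills, truncation, lends itself to a short illustration: step back through a URL one path segment at a time until only the host remains, since the host usually identifies the publisher. A minimal sketch of the idea (the URL is a made-up example):

```python
from urllib.parse import urlparse

def truncation_steps(url):
    """Step back through a URL one path segment at a time,
    ending at the host -- the likely publisher of the page."""
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    base = f"{parts.scheme}://{parts.netloc}/"
    # Drop the last segment, then the next, until only the host remains.
    return [base + "/".join(segments[:i]) for i in range(len(segments) - 1, -1, -1)]

for step in truncation_steps("https://example.edu/dept/faculty/page.html"):
    print(step)
```

Students can do the same thing by hand in the address bar: delete everything after the last slash, load the page, and repeat until they reach the site's front door.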

An emphasis on searching is still pervasive but I hope that will change. Today I took an online challenge that was all about finding but not about qualifying the information. I got 100% and had about half the time left. But in truth, if students have the same experience (and they probably will because the questions were easy searches) they will go away thinking they are pretty good searchers when, in fact, they are not. They may be good finders, but they are not good evaluators. Because of the timed nature of the challenge, they are forced to take the first thing they find--evaluation is out of the question.

I would rather see (and you will find this increasingly on 21cif.com resources) a greater emphasis on evaluation challenges: "how do you know that information can be trusted?"

Would you call students who don't investigate what they find complete searchers?

Wednesday, January 27, 2010

"My Predicament (I need your help)"


First thing Saturday morning I was confronted with an appeal sent from my daughter-in-law's gmail account. Here is the full text of the message entitled "My predicament (I need your help):"


It is with profound sense of sadness i wrote this email to you. I don't know how you will find this but you just have to forgive me for not telling you before leaving. I traveled down to United Kingdom on Thursday for a short vacation but unfortunately,i got mugged at gun point on my way to the hotel where i lodged.All my money and all other vital documents including my credit cards and my cell phone have been stolen by the muggers.

I've been to the embassy and the Police here but they're not helping issues at all,Things are difficult here and i don't know what to do at the moment that why i email to ask if you can lend me £800.00 so i can settle the hotel bills and get a return ticket back home. Please do me this great help and i promise to refund the money as soon as i get back home

I look forward to your positive response,so i can send you the details you need to send the money to me through Western Union.

Maybe you too received a message like this from someone you know. It's an annoying example of what happens when an email account gets hacked; it also challenges you to use your digital investigation skills.

I would use this with grades 6 and up to stimulate critical information literacy skills.  The example provides some interesting clues which immediately caused me to be skeptical. These were the questions that ran through my mind:

1. Do I know where she is now? Not for sure, but having just spent 2 weeks with us for Christmas, I seriously doubted she would take off for a short vacation to the UK. Still, anything is possible. Quickest way to find out?  Call her cell phone. In fact, many of her friends who also received the message did just that. If she answers, the claim about her cell phone being stolen is false.

2. Is this how she writes? This is not how the person I know writes. Too many awkward phrases and grammatical mistakes. Words can be like body language, revealing things the writer intended to cover up. Who would write to a close relative using "I look forward to your positive response..."?  That's pretty formal for family. Other keywords from the context that don't match the style (personality) of the person I know are: 'profound sense of sadness', 'you just have to forgive me', 'traveled down to UK',  'great help' and 'refund'. There are other ways to say these things that would be more characteristic of my daughter-in-law.

3. Can you travel to the US from the UK without a passport? Since all vital documents were stolen, how can she board an airplane?

You may see other things in the email that don't add up. The point is, it always pays to investigate before committing money online. Since this email went out to all her gmail contacts, I'm glad no one that we know of is out the money.

As a teaching example, challenge your students to come up with questions about the content of the message that would prevent someone from being tricked into sending money. Have them develop a profile of the author from the keywords and how they are used. Reading between the lines in this way is an information literacy skill that's also useful for detecting bias.

Tuesday, January 19, 2010

The Death of Information Literacy


According to the blog Infonatives, information literacy is dead. Indeed, Google trends for the past five years shows a consistent decline in queries for "information literacy."

But does that mean it is dead?

Actually, the last two years show the trend has flattened (if a trend may be accurately judged from a single query). There may have been more interest in 2004 than today. The blog traffic at http://21cif.com/index.html has slowed down some as well (if a trend may accurately be judged from visits to one page).*

Saying information literacy is dead is a bit like saying keyboarding is dead. Maybe there's less interest in it, but it still matters. In fairness to Infonatives, the point is not the actual demise of information literacy, but a necessary paradigm shift:
"The idea that it is possible to teach localization, evaluation and use of information without reference to a subject-specific set of skill is ridiculous..."
I alluded to this point in my last blog.  Having content-specific knowledge is certainly a big help when evaluating possible scams--e.g., knowledge related to organizations promising humanitarian aid to Haiti.

I firmly believe there are some basic skills that underlie all locating, evaluating and using information ethically--things like interpreting urls, truncation, fact-checking, etc.--and these remain skills we wish to instill in K-12 students--students who may not think much of information literacy, but who, nonetheless, lack the skills to find, evaluate and ethically use digital information.

Subject-specific information literacy becomes more critical when it comes to identifying optimal databases to search and evaluation of content. Finding subject-specific databases is actually not all that challenging--a deep Web search skill that I still consider pretty basic stuff.  But accruing sufficient knowledge to ascertain the accuracy and reliability of subject-specific information--that is challenging.

For me, this points out the need to integrate subject-matter training with information literacy activities such as, for example, research leading to a science fair project. What does the student need to know about the subject? How does the student know if information is credible? Teachers support information literacy by helping students ask incisive questions. Those who leave students to find their own way, which includes leaving student evaluation of Internet sources unexamined, are doing little to prevent the death of information literacy. Put another way, helping students evaluate online sources promotes good subject-specific thinking while engaging in information literacy.

The challenge is to find subject-specific ways to help students find credible information online and keep moving in the direction of fluency.  Expect to hear more about this. Your thoughts are always welcome.


*At the same time, traffic to this blog has consistently increased since I started tracking it.

Thursday, January 14, 2010

Human Suffering, Haiti Scams


As if human suffering and loss of life aren't enough, major disasters spawn bad information, creating even more victims.

Preying on the good intentions of people who want to help, scammers piggyback on disasters like the Asian tsunami, Hurricane Katrina and, more recently, the earthquake in Haiti.

The media is aware of the pattern and is pretty good at warning us about the activity of online predators. For example, here's an excerpt of what the Better Business Bureau suggests (complete article):

BBB Wise Giving Alliance offers the following six tips to help people decide where to direct their donations:

1. Rely on expert opinion when it comes to evaluating a charity.

2. Be wary of claims that 100 percent of donations will assist relief victims.

3. Be cautious when giving online.

4. Find out if the charity has an on-the-ground presence in the impacted areas.

5. Find out if the charity is providing direct aid or raising money for other groups.

6. In the case of gifts of clothing, food or other in-kind donations, find out about transportation and distribution plans.

These are all good ideas for investigative searching, a critical component of information fluency. Checking out the author, publisher, date, references and content are the pillars of evaluation. The BBB's list provides insights for checking content.  Most sites that warn about such scams also emphasize NOT making impulsive (e.g., emotional) decisions.

Personally, I haven't received a scam request (yet). I have found one example online, however. If you are conducting workshops on evaluation or Web 2.0 for students or staff, the aftermath of disaster is a ripe time to incorporate real-life examples of bogus appeals and provide practice in evaluating their authenticity.

Challenge: If you've received a Haiti donation scam, please share it here! We can analyze it together.

Monday, January 4, 2010

Take Another Step



Information can be misleading.

This is why it's a good practice, once you've found potentially useful information, to take at least one more step to verify what you've found.

Consider the following example:






This is a snippet (or abstract, if you prefer) from Google for the query Isaac Newton fallacy.

January 4 is Newton's Birthday, and in commemoration, Google featured a pretty cool search box graphic. Looking to see if I could unearth some misinformation about Sir Isaac, on the tenth page of returns for the query, I came across the snippet above.

That reference to leprechauns is pretty interesting. If I were to stop here, I might think I just discovered a chink in the knight's armor. But it's never a good idea to form a conclusion on the basis of a snippet. It's hard to assess the big picture from bits and phrases of a larger work.

If you go an extra step by clicking on the page from which this snippet is extracted, you'll discover the author made up the part about leprechauns. First, you have to find the text (I used ctrl+F and looked for leprec) and see what else you can learn. This is the critical phrase:
"Remember, Isaac Newton believed in leprechauns! Well, not really, but you get my point."
Your students are likely to come to conclusions based on information in snippets. Impress on them the need to take an extra step and look at the actual material to which the snippet points.
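That extra step, reading the phrase in its surrounding context, can be sketched as a small helper (the sample text paraphrases the page quoted above):

```python
def phrase_in_context(page_text, phrase, window=60):
    """Locate a phrase in page text and return it with surrounding
    context, so it can be read as the author wrote it, not as a snippet."""
    i = page_text.lower().find(phrase.lower())
    if i == -1:
        return None
    start = max(0, i - window)
    end = min(len(page_text), i + len(phrase) + window)
    return page_text[start:end]

page = ("... Remember, Isaac Newton believed in leprechauns! "
        "Well, not really, but you get my point.")
print(phrase_in_context(page, "leprechauns"))
```

The point survives translation back to the browser: ctrl+F on the actual page does the same job, and the context around the match is what overturns the snippet's apparent meaning.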

If you want to make an activity out of this, challenge your students to answer the question: Did Sir Isaac Newton believe in leprechauns?

By the way, another really good snippet challenge is this: When is Isaac Newton's Birthday?  Query: sir isaac newton birthday -- it's hard to tell, just from reading snippets.

Tuesday, December 29, 2009

Tools for Evaluation


While browsing through library sites today, I came across a link to Human Cloning, The How To Page. The message on the library site (http://lib.nmsu.edu/staff/susabeck/checs98.html) offers the cloning page as an example why students should be taught to evaluate.

I wasn't familiar with the Cloning page or its author, Arthur Kerschen.  Not many sites link to Kerschen's pages (so using the link: command is not particularly useful). But good evidence can be found by browsing the site.

The challenge is for you (and your students) to determine whether this is a deliberate hoax or not and to back up your answer: find a page that supports your conclusion.

Something I'd like to create is a matrix of hoax sites and the techniques useful for investigating them. This is one example where browsing may be the most effective method.

Monday, December 21, 2009

Googling and Vetting

I found today's post by woodsiegirl pretty interesting. She's a librarian in a British law firm (I'm guessing). Apparently--as we know from our own behavior--it's not just kids and teens who fail to evaluate information online.

Entitled, "I'm sorry it doesn't exist," Woodsiegirl's post and others' comments describe how lawyers fall into the trap of thinking that anything they want is on Google and what they find there doesn't need to be scrutinized.

This comment by Jennie, who also works with attorneys, is particularly telling:

"(T)he amount of times I’ve had to deal with this!
The favourite is the “well, I found it on Google” option. Obviously, Google knows that Scots, English and many other laws are different, and will therefore give the location-appropriate results, yes? No.
These are fully qualified legal professionals, yet they don’t even take 10 seconds to check jurisdiction of materials that they’re using. It’s kinda scary…"

The recommendation put forth by Woodsiegirl is for information literacy skills to be introduced in primary school, "if we are to avoid having another generation grow up with no appreciation of how to find, evaluate and analyse information."

While it is scary that a lawyer may not be scrutinizing the relevance or accuracy of information for a case, I submit we all struggle with this same tendency. And we know better. That should be even scarier since the results could affect us directly and adversely.

Why the tendency to behave as if information googled is, practically speaking, the same as information vetted?  There could be a host of factors--and you are welcome to share your views on this--but I will limit my comments to two:
  1. Quick-serve information promotes a misperception that information is to be consumed quickly
  2. Published information has inherent value
Briefly, the first point: when we need information fast, we tend to take it at face value. In and out. Hit it and move on to the next thing. We're busy, and evaluation slows us down. For me, this is a big reason why kids don't evaluate. Adults are no different. We don't want to slow down, especially when we can get our hands on so much information so fast. One solution would be to take information from vetted sources. But that's often the problem: there may not be a vetted source, just prolific ones. Google may not have any plans to intervene where the credibility of the information it serves is concerned. That doesn't mean someone else won't eventually try.

The second point is one we adults learned in primary school and find hard to unlearn: if it exists in print, some authority thinks it is good. Now that everyone (not just the teacher, editor or publisher) has authority, information should be suspect. We can only overcome this fault by using reason and constantly jolting ourselves back to our senses with examples of what happens when we don't evaluate what we read. Thanks for the reminder, Woodsiegirl!

Today's challenge: remember, information needs to be vetted.

Monday, December 14, 2009

Santa Challenge

If you are looking for a quick challenge to involve middle schoolers or high schoolers in a keyword search with browsing and evaluation, have them identify:

1. The name of the airline that did this
2. Whether this information is accurate (is it real or photoshopped?)





Don't necessarily take the url as the answer (I chose this location of the photo on purpose).

This activity will likely lead to a Web 2.0 search (blogs), where clues can be found. You will find that people disagree about the credibility of the photo; whether the information is trustworthy is debatable. I'd have the students discuss why they think the photo is real or fake. What do they use as the basis for their decision?

Thursday, December 10, 2009

Too Fantastique to be True?

I love this one.

Over the last month or so, about a dozen people have sent me a link to an apparent engineering marvel located at the University of Iowa. Some have even talked about traveling there to see it.  Here's an excerpt from the video:



The claim is that the musical instrument is built mainly from John Deere machine parts and took over 13,000 hours to build, tune and perfect. The email, however, which most people simply forward, contains great clues for investigative searching:
Robert M. Trammell Music Conservatory
Sharon Wick School of Engineering
Matthew Gerhard Alumni Hall
University of Iowa

Fact Check: copy and paste any of the first three into a search query.
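For teachers who want to make the fact-checking step concrete, here's a short Python sketch (my illustration, not part of the original email or lesson) that turns each suspicious name into an exact-phrase search URL students can open in a browser. The Google search URL format is a reasonable assumption here; quoting the phrase asks the engine to match it exactly, which quickly shows whether the name exists anywhere outside the forwarded email.

```python
from urllib.parse import urlencode

# The three suspicious "facts" quoted from the forwarded email.
claims = [
    "Robert M. Trammell Music Conservatory",
    "Sharon Wick School of Engineering",
    "Matthew Gerhard Alumni Hall",
]

def phrase_search_url(claim):
    # Wrapping the claim in quotes forces an exact-phrase match,
    # the same technique as pasting "..." into the search box.
    return "https://www.google.com/search?" + urlencode({"q": f'"{claim}"'})

for c in claims:
    print(phrase_search_url(c))
```

If the only pages that mention these institutions are copies of the email itself, that's a strong clue the "facts" were invented.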

I hope my friends engage in a bit of investigative fact checking before they pack the car and head off to the University of Iowa!  They could check Snopes as well.

Your students may find this an interesting challenge.

The video is one of many similar animations first produced, not as a hoax, by Animusic.  Someone made it a believable hoax by making up some "facts" about it and thousands of people since have made it seem more credible by forwarding it to their friends. Did you help out in that regard?

Tuesday, December 8, 2009

Bias Literacy: Pick the Low-Lying Fruit


We all grow up in a world of bias. It's so much a part of our everyday experience that it's easy to overlook. The trouble comes when we fail to recognize or discern it. The disturbing thing, as some have observed, is that the digital generation treats all information as pretty much equal. When it comes to evaluating online information, it's treated as if it's all good.

Back in the analog age, when I was a kid, things were different. I'm not saying this was preferable, but it was obvious to me that there were points of view that were just plain wrong. With education and experience, one starts to see some good even in opposing points of view. But that's not the same as treating all information as equal.

I believe most educators feel it is important for students to learn to identify an author's point of view. Doing this in the context of teaching digital information fluency is one approach, although I think language arts or social studies is a better context. For me, this is an opportunity to integrate online search experiences within standard courses. Students will learn something about information fluency while focusing on authorship and point of view, instruction that naturally fits in language arts and social studies.

Rather than rely solely on textbooks (e.g., book reports) to have students describe point of view (aka bias), I'd bring in blogs and online editorials. Textbooks and reference books are probably the hardest places to detect bias. Works of non-fiction and fiction are comparatively easier. But the real low-lying fruit is the common daily blog post. It's an unparalleled opportunity to see bias up close.

I'll limit myself to one example for now. Let's say you are studying a current event, something students might find interesting like climate change. To make the point that there are different points of view on this subject, you could select (in advance) several blogs written about climate change. Have the students read them and, from the keywords used, identify the author's bias (single point of view) or objectivity (multiple, even opposing points of view). Compare the views. Are they all the same? How are they different? Are they all correct? What makes one better than another? Should everything be believed as written? Why or why not?

Here are three blogs on climate change. I've started to unpack the first one in terms of the keywords and phrases used. The challenge is to do the same with the other blogs.

1- 56 Chicken Little newspapers on climate change 

Keywords and phrases highlighted from just one paragraph may be used to detect bias: "Today, the eco-herd of papers published a collective editorial whipping up hysteria over the issue in the face of massive data manipulation, suppression, and bullying of dissenters." Whether the author considers herself among the dissenters may take more reading and online searching; she's clearly opposed to the position taken by the newspapers.

2- U.S. Unions Join Climate Change Talks in Copenhagen

3- The Climate-Change Travesty

In addition to differences in specific keywords that take a position, each blog also may be analyzed for its tone. Helping students to pay attention to keywords (and phrases) and tone is a positive step toward information fluency.
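To make the keyword technique tangible, here's a minimal Python sketch of the same idea: scan a passage for emotionally loaded terms and report which ones appear. The term list is my own, drawn from the example paragraph above; in a classroom, students would build (and debate) the list themselves, which is where most of the learning happens.

```python
# Hand-picked loaded terms taken from the example paragraph above.
# A real classroom list would be assembled and argued over by students.
loaded_terms = ["hysteria", "eco-herd", "whipping up", "bullying", "travesty"]

def loaded_term_hits(text):
    """Return the loaded terms that appear in the text (case-insensitive)."""
    lower = text.lower()
    return [term for term in loaded_terms if term in lower]

excerpt = ("Today, the eco-herd of papers published a collective editorial "
           "whipping up hysteria over the issue in the face of massive data "
           "manipulation, suppression, and bullying of dissenters.")

print(loaded_term_hits(excerpt))
```

A longer hit list doesn't prove bias by itself, but it gives students concrete words to point at when they argue about an author's tone.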

This activity may be suitable for upper middle school and high schoolers.

Wednesday, October 14, 2009

Ask the Expert


Choosing the "right database" to search is usually a matter of predicting who knows the answer.

Turning to an online source is not necessarily the fastest way to get an answer to a question. If there's an expert in the room, it's best just to ask the expert.

My usual inclination is to look online for an answer: What is it I want to know? And who might know the answer? Seems pretty straightforward. There are exceptions.

I've been having an issue with my eyesight lately. Different glasses haven't helped, so I scheduled a second-opinion exam with another ophthalmologist. The second diagnosis is that I need cataract surgery; the first was that cataracts are developing but glasses should correct the problem for now. Since three different attempts at glasses have not helped, I now have to make a decision that involves surgery.

My question: "Who can I trust to do this surgery?" Unfortunately, the second opinion was a referral "out of network" and going that way without insurance coverage will be costly. If I can't get another referral, that leaves the doctor who thought the problem could be corrected with glasses and her associates.

Since I know all their names, I can look them up online. Who might know the answer? I figure there has to be a site that collects information and references on doctors. It's not hard to find one, but that kind of information has a fee attached (example: http://www.healthgrades.com/). Some information, such as years since graduation, is provided for free, but it doesn't really answer my question.

Searching the Web (e.g., Google) for the doctors' names returns where they practice and their specialties. It doesn't tell me anything about the quality of their work.

I tried the blogosphere, in case someone had a good or bad experience and decided to write about it. Nothing.

This is probably one of those times that going online isn't going to be much help. Fortunately I asked the second opinion doctor about the reputations of the other doctors and got some good information from personal experience about one of them. This is one of those times a person turns out to be the best database.

I suppose my next step, if I can't get a referral, is to ask for references from the doctor I select. Previous patients ought to know something, although I'll likely be given ones who had a positive experience.

Have you found information-for-hire from online databases helpful?
What else would you do?