Tuesday, April 23, 2013

Fake Tweet Sends Stocks Plummeting

As many articles have already made clear, Americans react strongly to news that sounds like terrorism.

Today's fake tweet shows how sensitive consumers of information really are.

A hack of the Associated Press' Twitter account resulted in "an erroneous tweet" claiming that two explosions had occurred in the White House and that President Barack Obama was injured. It took Twitter only two minutes to suspend the @AP account.

More than 4,000 retweets later, the credibility of the message was dealt a fatal blow when an AP spokesperson told NBC News the news was false.

Like the EKG of a country, the Dow Jones industrial average just after 1 p.m. shows the collective heartbeat (above). More than 140 points were lost in a flash. Five minutes later, much of the loss had been regained.

As Bob Sullivan of NBC News put it: "It's incredible what a single 12-word lie can do."

How could being an investigative searcher make a breaking lie less effective?

Fact checking the accuracy of the claim is a little trickier in the case of Twitter. Breaking news often comes through this channel before it is picked up by major news outlets.

That is probably the clue. AP wouldn't be the first to break the news: someone on the scene would have reported it first, and AP would carry it a minute or more later. All one would have to do is look for the source of the AP tweet.

Not being able to find an earlier tweet about this news is the tell-tale sign that it lacks credibility. A good search engine for tweets is Topsy (http://topsy.com). Check it out before you react with your gut.
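
If you wanted to operationalize that check, the logic is simple: gather tweets that mention the claim, sort them by timestamp, and see whether anything precedes the wire service's post. Here is a minimal sketch in Python; the results list is hypothetical (no particular search engine's interface is assumed), so treat it as an illustration of the reasoning rather than a working client.

    from datetime import datetime

    # Hypothetical search results for a query like "explosions White House Obama"
    # (in practice these would come from a tweet search engine such as Topsy).
    results = [
        {"author": "@AP", "time": "2013-04-23 13:07",
         "text": "Breaking: Two Explosions in the White House..."},
        # No earlier, on-the-scene tweets were found making the same claim.
    ]

    def earlier_sources(results, wire_author="@AP"):
        """Return tweets about the claim that precede the wire service's first tweet."""
        parse = lambda r: datetime.strptime(r["time"], "%Y-%m-%d %H:%M")
        wire_times = [parse(r) for r in results if r["author"] == wire_author]
        if not wire_times:
            return []
        first_wire = min(wire_times)
        return [r for r in results if parse(r) < first_wire]

    if not earlier_sources(results):
        print("No earlier eyewitness tweets found -- treat the 'breaking' claim as suspect.")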

Tuesday, April 16, 2013

Crap Detection 101

Amateur Whale Research Kit?
Howard Rheingold is credited with Crap Detection 101: How to tell accurate information from inaccurate information, misinformation, and disinformation.

Put your crap detector to work here: http://www.icrwhale.net/products/amateur-whale-research-kit

Some of the usual investigative techniques (backlinks, fact checking) don't work very well. What is it that "tells" you this information, at face value, cannot be trusted?

Wednesday, March 13, 2013

High Cost of Being Gullible

The price of cyber crime is astounding.

  • UK Guardian: Consumers and businesses in the UK lost an estimated £27 billion in 2012 due to cybercrime.[i] 
  • Ponemon Institute: The average annualized cost of cybercrime for 56 benchmarked U.S. organizations is $8.9 million per year.[ii]  
  • People’s Public Security University of China: In 2012, economic losses from Internet crimes in China totaled an estimated $46.4 billion (RMB 289 billion).[iii]
And it's growing annually.

So what does being gullible cost the average American?

See if you can find the cost to the average Senior Citizen in the US today.

What does this say about the need to investigate online information?


[i] John Burn-Murdoch, “UK was the world’s most phished country in 2012 – why is it being targeted?”, www.guardian.co.uk, last modified on February 27, 2013, http://www.guardian.co.uk/news/datablog/2013/feb/27/uk-most-phishing-attacks-worldwide.
[ii] “2012 Cost of Cyber Crime Study: United States” Ponemon Institute, October 2012, http://www.ponemon.org/local/upload/file/2012_US_Cost_of_Cyber_Crime_Study_FINAL6%20.pdf
[iii] “Internet crimes cost China over $46 billion in 2012, report claims”, thenextweb.com, last modified January 29, 2013, http://thenextweb.com/asia/2013/01/29/china-suffered-46-4b-in-internet-crime-related-losses-in-2012-report/.

Friday, January 18, 2013

Invisible Query

Time flies! I've neglected this blog for about 6 weeks.

Dennis O'Connor and I are deep into authoring a book on Teaching Information Fluency. Our deadline is the end of April.

Writing a book is a discovery activity for me. The last time I wrote this much was my dissertation, and I discovered plenty about flow and mathematics while doing that.

This time, even though it would seem I've already traversed the topic of information fluency through this blog and the 21st Century Information Fluency Project website, there are still Aha! moments.

As I was thinking about the process of querying, it occurred to me that there's a lot more to it than translating a natural language question into a query. That's just the visible query--the one the search engine responds to. There's also an invisible query, the one you don't enter into the text box: the keywords or concepts that remain in your head.

These help you filter the results of the query. Some results are more relevant than others, not because of their ranking, but because you have priorities in mind that the search engine is unaware of.

It's generally ineffective to enter everything you're looking for in a search box. It over-constrains the search and produces fewer results--sometimes none. It's better to submit a small number of keywords (two or three) and scan the results against your invisible query.

Using one of our classic examples, "How many buffalo are there in North America today?", a good query is buffalo north america (bison is better than buffalo). Yet that's not really enough information to answer the question, whose answer needs to be 1) a number and 2) as recent as possible. That's the invisible part you have to keep in mind throughout the process. Choose results that satisfy 1 and 2; otherwise, you're probably not answering the question.
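
To make the idea concrete, here is a small sketch (in Python, with made-up snippets) of the invisible query acting as a filter after the visible query returns: keep only results that contain a figure and a recent year.

    import re

    # Visible query submitted to the search engine: bison north america
    # Invisible query kept in your head: the answer must be a number, and recent.

    # Hypothetical snippets returned for the visible query.
    snippets = [
        "Bison once roamed North America in the tens of millions before 1800.",
        "An estimated 500,000 bison live in North America today (2012 census).",
        "The bison is an iconic symbol of the American West.",
    ]

    def matches_invisible_query(snippet, min_year=2010):
        """True if the snippet offers a figure and dates it recently enough."""
        has_number = bool(re.search(r"\d{1,3}(?:,\d{3})+|\d+\s*(?:million|thousand)", snippet))
        years = [int(y) for y in re.findall(r"\b(?:19|20)\d{2}\b", snippet)]
        is_recent = any(y >= min_year for y in years)
        return has_number and is_recent

    relevant = [s for s in snippets if matches_invisible_query(s)]
    print(relevant)  # only the "500,000 ... 2012" snippet survives the invisible filter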

One premise of the Filter Bubble is that the machine is learning from us and will hone its output to our preferences. That becomes harder when we are not feeding the machine everything we have in mind. Keeping part of the query invisible may be a pretty good way to keep the Filter Bubble from encompassing us.

The next time you search, think about what you're still looking for that you're not querying.

Wednesday, December 5, 2012

Searching Isn't as Hard as Thinking

Just google it. You're bound to find something you're looking for.

Finding is no longer the challenge it once was. Knowing what you are looking for remains no easy matter.

Asking the right question usually precedes touching a computer. What are the hours of the Louvre? Go to the computer. What is 4 degrees Celsius in degrees Fahrenheit? Go to the computer.

These are the average one-off questions an Internet search is really good--and fast--at answering.
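
The temperature question is a good reminder of why these are easy: the answer is one arithmetic step, F = C × 9/5 + 32, so 4 °C works out to 39.2 °F. A minimal sketch of the computation the search engine does for you:

    def celsius_to_fahrenheit(c):
        """Convert degrees Celsius to degrees Fahrenheit: F = C * 9/5 + 32."""
        return c * 9 / 5 + 32

    print(celsius_to_fahrenheit(4))  # 39.2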

But what about the occasional harder question? Harder questions include: a) those that lack precision in what is being asked, b) those with competing or rival answers, and c) those with no known answers. Googling the Internet is not particularly useful for answering this last type.

A and B type questions occur frequently and make easy searching harder. I've blogged before on the need to fact check B type questions to avoid trusting misleading and/or malicious information. Investigative searching is a remedy for establishing the credibility of answers.

Let's focus on A, ambiguous questions. These are questions that may be answered different ways (with different answers) and still be right. An example from one of our website's search challenges: What is the top speed of Earth's fastest animal? Like most ambiguous questions, this one is unintentionally ambiguous. Only after searching and finding rival answers does its ambiguity become apparent. This requires real thinking.

Skimming the top ten Google results for the query speed fastest animal, possible answers include:
  • cheetah (3)
  • peregrine falcon (2)
  • sailfish (2)
  • pronghorn antelope
  • wildebeest
  • lion
  • Thomson's gazelle
  • quarter horse
  • man
  • cow dropped from a helicopter
The student is confronted by a common problem: which one is right? The underlying problem is not that there are multiple answers but that these are answers to different questions (which question am I supposed to answer?).

The problem could be simplified by rethinking the question: what animal travels the fastest? Now the differences between air, water and land don't factor in (the cheetah is fastest on land, the sailfish in water and the peregrine falcon in air). The fastest speed belongs to the falcon.
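
Treating the rival answers as data makes the rethought question easy to settle. The figures below are the approximate top speeds a search typically turns up, used here only to illustrate the comparison, not as authoritative values.

    # Approximate top speeds in km/h, of the sort found in search results (illustrative only).
    top_speeds = {
        "cheetah (land)": 112,
        "sailfish (water)": 110,
        "peregrine falcon (air, diving)": 389,
    }

    fastest = max(top_speeds, key=top_speeds.get)
    print(fastest, top_speeds[fastest])  # peregrine falcon (air, diving) 389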

But another search result--a cow dropped from a helicopter--reveals further ambiguity in the question. The originator of the question may have assumed the animal needs to travel under its own power. In that case, the falcon, which 'cheats' by virtue of gravity, could be bested by the cheetah. By the same token, what prevents an astronaut orbiting the earth from beating the falcon? It ultimately depends on the question.

The example is ridiculous but illustrates how 'right' answers may differ depending on how a question is interpreted and how thinking is aided by searching. Questions that leave room for interpretation make Internet searching more difficult (and may be more interesting). Teachers are advised to try the searches their students are likely to use in an attempt to avoid asking ambiguous questions and inviting 'smart' answers in return.

For the individual, questions may be improved the same way: try a search and see what happens. Don't expect the best answer the first time, because the right question has not yet been asked. It's very hard to think of a question you haven't thought of yet. Iterative queries are good at helping discover and refine questions.

So, how would you ask the fastest speed question? Go ahead and post your response.

Here are a couple more ambiguous questions that need refinement. See if you can figure out an unambiguous question without searching; then try a search.
  • How many buffalo are living in North America today? link
  • Between 1918 and 2012, in what year did Americans pay the most for a gallon of gas? link

Tuesday, November 13, 2012

Freshness Challenge: Young Entrepreneurs

In my career at the Illinois Mathematics and Science Academy I get to work with young entrepreneurs. I also enjoy playing guitar. So when I came across a video about Alex Niles, a South Florida middle school student who won an NFTE regional prize for his custom-made guitar business, I was intrigued. His work is pretty impressive and he has solid endorsements. This led to a search for Niles Custom Guitars.

Can you buy one of Alex's guitars today? If so, where or how?

Here's a link to more information about Alex (about two-thirds of the way down the page).

Post your answers to this blog.


Thursday, November 1, 2012

Re-release of Author Tutorial

Another refreshed tutorial is now available on 21cif:  Author

Author is the first in our series of Evaluation tutorials. Last month we re-released Publisher. In the coming weeks expect to see two more: Date Checking and Back Link Checking.

If you want to try the Author module in a non-Flash format, we just completed a revision.

There are three sections to the tutorial:

  • Practice the skills: to help novices, there are practice exercises that introduce methods for solving the challenges in the tutorial. These focus on fact-checking queries, truncation and browsing.
  • Find the Author: four challenges of increasing difficulty to identify the author of a page or site.
  • Investigate the Author: using clues on the site and external sites to determine whether the author has a good, poor or unknown reputation. There are three of these challenges.

Try it out!  Tutorials may be completed in as little as a few minutes by individuals or extended into a classroom activity if desired. The final page may be printed and turned in if you want to see how students fared.

Start the tutorial