Showing posts with label Evidence. Show all posts

Friday, November 18, 2016

Facts Matter

I recently updated a 21cif MicroModule on Evidence that got me thinking about recent elections in the US. What role did facts actually play?

Election outcomes don't boil down to just a few factors. Not everyone who voted one way or the other did so without weighing pros and cons; in all probability, some votes were cast even though the voter knew something about that candidate wasn't 100% satisfactory. For some voters, evidence to back up claims was critical; for others, not so much.

Throw into this mix Fake News. Headlines that appeared include these:
  • "Twitter, Google and Facebook are burying the FBI criminal investigation of Clinton."
  • "Donald Trump Protester Speaks Out: 'I Was Paid $3,500 To Protest Trump's Rally.'"
  • "FBI AGENT SUSPECTED IN HILLARY EMAIL LEAKS FOUND DEAD IN APPARENT MURDER-SUICIDE."
  • "Remember the voting days: Republicans vote on Tuesday, 11/8 and Democrats vote on Wednesday, 11/9"
  • "Just out according to @CNN: 'Utah officials report voting machine problems across entire country.'"
Source: http://www.mercurynews.com/2016/11/11/did-fake-news-on-facebook-send-trump-to-the-white-house/

Although none of these headlines can be supported with evidence, did they reinforce or sway voters? Perhaps.

Without evidence, believing something is true gives all the control to the news source.
"If you don't look for evidence you blindly place all your trust in the alleged accuracy of a source. How do you know they are right?"
(Source: http://21cif.com//tutorials/micro/mm/evidence/index.php)
Here's a helpful open source document on evaluating Fake News sites, thanks to Melissa Zimdars: https://docs.google.com/document/d/10eA5-mCZLSS4MQY5QGb5ewC3VAL6pLkT53V_81ZyitM/preview

She lists 11 tips:
  • Three involve checking the URL of the news source.
  • Three are about lack of author or publisher attribution.
  • Two are about checking other, known sources.
  • One is about the effects of biased writing creating an emotional response.
  • One is about formatting (pay attention to ALL CAPS).
  • One is about content that encourages bad Internet etiquette.
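The URL-checking tips above can be partly automated. Here is a minimal Python sketch of what such a check might look like; the function name, the specific red flags, and the example URL pattern (the ".com.co" imitation trick seen on fake-news sites) are illustrative assumptions, not part of Zimdars's document:

```python
from urllib.parse import urlparse

def url_clues(url):
    """Return simple warning clues found in a news URL.

    This is an illustrative sketch, not an exhaustive checker:
    it flags the '.com.co' imitation suffix and unusually long
    subdomain chains that can disguise the real domain.
    """
    host = urlparse(url).hostname or ""
    clues = []
    if host.endswith(".com.co"):
        clues.append("'.com.co' often imitates a legitimate '.com' site")
    if host.count(".") > 2:
        clues.append("long subdomain chain can disguise the real domain")
    return clues

# A URL styled like a real network's, with an extra country suffix:
print(url_clues("http://abcnews.com.co/some-story"))
```

No script replaces judgment, of course; the point is that the URL itself is checkable evidence, not decoration.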
Can you think of others? These are worth putting into practice, unless you don't think facts matter anymore.

Monday, July 14, 2008

Embedded Evidence, External Evidence


Over the weekend I created two new tutorial resources for the Website Investigator series (WSI): Accuracy and Evidence. In addition to knowing who the author and/or publisher is and when the work was written or published, finding embedded evidence and external evidence can be very important in verifying the credibility of both the source and the content.

In the simplest terms, credibility depends on source and content. Information about the author and publisher helps to define the source--where did these ideas originate? Is the author recognized as an expert? Does this publisher submit works to careful review before posting them? Embedded and external evidence helps to define the content--how are words used (signs of objectivity or bias)? When was this written? Who links to it? What do experts say about the content? A questionable source may produce brilliant content and a trusted author may produce flawed content--so it's important to check both before accepting information at face value.

As educators know, the majority of students today tend to accept information at face value. Somehow, merely finding information feels sufficient; investigating it seems unimportant.

To encourage investigation, students need to be shown a few basic techniques and given practice with them. These are not hard to learn and don't take much time. Compared to searching (which I now call speculative searching--when you don't know exactly what words to search with or where to look), investigative searching is much more precise: the keywords are clues embedded in the information, and the places (databases) to search are well-defined.

Think of embedded evidence as clues in the text, the url and metadata. These clues can be used to investigate the accuracy of information and often lead to external sources that have already done an evaluation.
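The idea that embedded clues feed precise secondary searches can be sketched in a few lines. Here is a hypothetical Python helper that turns two common embedded clues (a distinctive quoted phrase from the page and the site's domain) into investigative queries; the function name and the exact query forms (quoted-phrase, `-site:` exclusion, credibility search) are my own illustrative assumptions, not the tutorial's method:

```python
from urllib.parse import urlparse

def investigative_queries(url, distinctive_phrase):
    """Build precise secondary-search queries from embedded clues.

    Illustrative sketch: the clues are a distinctive phrase taken
    from the page's text and the host name taken from its URL.
    """
    host = urlparse(url).hostname or ""
    return [
        # Does the exact claim appear in other sources?
        f'"{distinctive_phrase}"',
        # The same claim, excluding the original source itself:
        f'"{distinctive_phrase}" -site:{host}',
        # What do others say about the source's credibility?
        f"{host} review OR hoax OR credibility",
    ]

for q in investigative_queries("http://example.com/story", "paid $3,500 to protest"):
    print(q)
```

Each query is narrow because the clue came from the page itself; that is what makes investigative searching more precise than speculative searching.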

The new Accuracy tutorial focuses on three areas: finding powerful clues embedded in a Web page; checking the evidence by doing a secondary search; and triangulating, that is, checking what three different sources have to say about the information.

The companion Evidence tutorial emphasizes using queries to find external evidence; checking whether pages that link to the information support it or contradict it; and triangulating information sources (with examples that differ from those in the Accuracy module).

These tutorials are geared for middle schoolers through adults. There's increasing demand for similar activities aimed at elementary grade students, and that's my next project.