On October 17, 2017, the American Institute in Taiwan (AIT) held a video conference with Steven Reiner, associate professor and coordinator of broadcast and digital media news reporting courses at the Stony Brook University School of Journalism, on the topic "Confronting the Fake News Invasion: Guarding Against the Potential Harm of False Reporting." We hope you enjoy the program.
MS. SONIA URBOM: All right. It is 9 o’clock. We are going to go ahead and get started. Let’s see where I can– there we go. So Professor Reiner can see us. Well, good morning, everyone. I am Sonia Urbom, the spokesperson at the American Institute in Taiwan. Very happy to welcome everyone to our three locations. We’re going to be saying hello to the other locations in just a moment. But I am here in Taipei at the American Institute in Taiwan’s American Center, and we are here with many of the people who are in Taiwan working on creating a media literacy and digital literacy curriculum for Taiwan.
And we are so pleased to have a very expert speaker here, with us, today, Mr. Steven Reiner. Here he is. Have a good look because when we start the presentation, we’re going to go to the PowerPoint so you won’t see his face anymore. I’m going to give a brief introduction. Mr. Steven Reiner is the director of video journalism, and an associate professor in the Stony Brook University School of Journalism. He has lectured and conducted workshops and seminars in news and media literacy through the Stony Brook Center for News Literacy and overseas. And he is also a chief presenter on a Coursera MOOC, a massive open online course about news literacy, which is co-produced with the University of Hong Kong. He has conducted training sessions in science communications at colleges, universities, and foundations around the United States. Mr. Reiner is also a 30-year veteran of network broadcast television. He has, in fact, won multiple Emmy awards for producing CBS News’ 60 Minutes, a program that I, and many other Americans, grew up with. He has also worked for NBC and ABC News. He was an editor of The Atlantic magazine, and a senior editor and executive producer of NPR’s flagship news program, All Things Considered, which happens to be one of my personal favorites. I like to listen to it every morning. So we’re very, very pleased to have him with us here today.
We’re also very fortunate to have a special guest with us, Digital Minister Audrey Tang. Audrey Tang has been a driving force in Taiwan’s efforts to develop a more open and digital government, and she has been a pioneer in the field of virtual and electronic education, and a strong advocate for enhancing digital literacy. Minister Tang has just returned from the U.S., where she worked closely with Silicon Valley-based firms on some of the topics that we are going to be discussing here, today. So I’m going to turn the floor over to you, Minister Tang. Thank you.
DIGITAL MINISTER AUDREY TANG: Hello, Professor Reiner. It’s a pleasure and an honor to be delivering this very short remark. I’m an optimist when it comes to misinformation or, for that matter, anything that challenges our current trust in the internet. I’ve been an optimist for a very long time. And this strange condition started when I was 15 years old, in 1996, when the World Wide Web was still very new. I dropped out of junior high school to pursue my education, because I found the information and community on the World Wide Web so helpful that the textbooks paled in comparison. So I told my teachers, my principals, that I was quitting school. And much to my surprise, they all thought it was a great idea.
And so they covered for me, and I dropped out of school and started a journey founding startups in social media, online auctions, and a lot of online communities. And along the way, I participated in this very interesting, anarchistic political system called the Internet Engineering Task Force. This political system basically means that everybody just embarks on a rough consensus, meaning people generally agree that this is an OK direction, without voting and without any kind of representatives. But with rough consensus, we produce running code, and with this running code, we’re bringing about a culture of radical transparency.
And so now, as the Digital Minister, I’m kind of importing this culture from the internet to Taiwan, to increase civic participation and to reduce the fear, uncertainty, and doubt from all stakeholders, especially career public servants. Before I was Digital Minister, I participated in the National Academy for Educational Research, where we made the K-12 basic education curriculum. And in this curriculum, we emphasized a change: whereas before, Taiwan’s education, like many places in East Asia, focused on a skills-based, capacity-based education model, with examinations and such.
And now, we’re reforming it into a character and literacy-based education system, where we focus on building the character of the students themselves. And of the three core literacies, or characters, [INAUDIBLE], one of the core things is media literacy and ICT literacy. We don’t see ICT literacy and media literacy as two things. They are the same thing in our new curriculum, which is going to take effect a couple of years from now. What we mean by this is that instead of just assigning one hour or two hours to media literacy or ICT literacy, the teacher, instead of lecturing the students, is now learning with the students across all the different fields– be it mathematics, be it physics, or any of the other science fields– by learning with materials from the internet, such as the abundant materials that Professor Reiner provides on media literacy.
But when the source comes from the internet, of course, the students must know that not everything on the internet can be swiftly trusted. They need to do their due diligence, their fact-checking, and all of that. So the teacher, instead of being authoritarian, actually works democratically with the students, to make sure that they can digest the information from all the different angles on the internet. And this is, I think, how we can evolve from a system of division, of mistrust, of disempowerment, into a system of co-creation.
As a veteran of internet culture, I’ve been through many divisive misinformation moments. I remember the original war over the Communications Decency Act, the Blue Ribbon campaign, and the original spam wars, when for a time, around 2002, everybody thought that spam mail was going to destroy the internet. Then it didn’t. And the misinformation during elections, and also, nowadays, the counterfeit advertisements we see on Facebook, and such. But through all these different scenarios, there’s one thing in common. It takes all the stakeholders involved, instead of one authoritarian actor.
So I would like to commend and acknowledge the work of many people here: the AIT for providing the venue for this forum; the people in the NCC for providing an internet-governance-based, multi-stakeholder approach instead of a regulatory design. In the multi-stakeholder meeting, I would also recognize the Media Watch organization, as well as the people from the g0v (“gov zero”) community, such as the [INAUDIBLE] bot, where if you add it as your friend on an online messaging system, you can send it a link to something that you don’t know whether it’s authentic or not, and this [INAUDIBLE] bot will respond saying, OK, this is genuine, or this is counterfeit. And behind this is a huge crowdsourced campaign. And of course, I also want to commend the original news helper community from g0v, [INAUDIBLE], for inspiring projects like [INAUDIBLE]. So, and this is the end of my remark, I would encourage you to just keep working together and collaborate, and as an optimist, I think we can solve this issue in due time. Thank you.
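[Editor's note] A crowdsourced fact-checking chatbot of the kind the Minister describes can be sketched in a few lines: the bot receives a link from a chat user, normalizes it, looks it up in a database of links already reviewed by volunteer editors, and replies with the community's verdict. This is a minimal illustration under those assumptions only; every function name, field, and database entry below is hypothetical, not the real bot's API or data.

```python
# Minimal sketch of a crowdsourced fact-checking chatbot, in the
# spirit of the messaging-app bot described in the talk.
# All names and entries here are hypothetical illustrations.

from urllib.parse import urlparse

# Tiny stand-in for the crowdsourced database of reviewed links.
REVIEWED_LINKS = {
    "example.com/pope-endorses-candidate": {
        "verdict": "counterfeit",
        "note": "Debunked by volunteer editors.",
    },
    "example.org/city-budget-2017": {
        "verdict": "genuine",
        "note": "Matches the official budget release.",
    },
}

def normalize(url: str) -> str:
    """Strip scheme, query string, and fragment so that variants
    of the same link match a single database entry."""
    parts = urlparse(url if "//" in url else "//" + url)
    return (parts.netloc + parts.path).rstrip("/")

def reply_to(message: str) -> str:
    """Given a link a user sends the bot, return the bot's reply."""
    entry = REVIEWED_LINKS.get(normalize(message))
    if entry is None:
        return "Not yet reviewed. Submitting to volunteer editors."
    return f"This looks {entry['verdict']}. {entry['note']}"

print(reply_to("https://example.com/pope-endorses-candidate?utm=fb"))
```

The key design point, which the real crowdsourced systems share, is that the bot itself decides nothing: it only relays verdicts that human volunteers have already recorded, so unreviewed links fall through to the editor queue rather than receiving a guess.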
MS. URBOM: Thank you so much Minister Tang. All right, now we’re going to turn and say hello to our other two locations. National Chengchi University, can you say hello to us? Patricia?
SPEAKER 1: Hi. This is the NCCU, National Chengchi University, and we have around 250 people here. We are Communication 101, so mostly college freshmen. So we are very interested in this topic. Thank you.
MS. URBOM: All right, we’re so happy to have all of you students here with us today. And now we’ll say hello to our third location. That’s the American Institute in Taiwan, Kaohsiung office. Can you say hello to us?
SPEAKER 2: Good morning. Good morning from Kaohsiung. We have about 25 students and professors from four different universities. We’re very excited to be a part of this, so thank you to everyone for coordinating and participating.
MS. URBOM: OK, wonderful. So it sounds like we can all hear and see each other. And I’m just going to remind everyone that we are doing this in two different languages. So I have some microphones here; this one is marked English. When I’m speaking in English, we have two interpreters in the booth who are interpreting for anyone in our audience who does not understand English. So when we’re speaking English, we’ll try to speak slowly and clearly, so that the interpreters can interpret for all of you. For the people in the audience, when we turn to the questions, we have several microphones here, in Taipei, marked Chinese. You are welcome to ask your question in Chinese. Please just let us know that you want to speak in Chinese, and the interpreters will interpret what you have to say in Chinese into English for Professor Reiner. At our other two locations, they will be speaking only in English. And now I would like to turn it over to Mr. Reiner.
PROFESSOR STEVEN REINER: Ah, thank you very much. I hope you can hear me fine. It’s a pleasure to be joining you. It is, as you probably know, a quarter after 9:00 in the evening on Monday night, in New York City. And it’s, as I said, a great pleasure to join you. Just a brief introduction: when I joined the faculty of Stony Brook University, which is a large public university on Long Island, about 60 miles east of New York City, it was about the time the School of Journalism was created. And along with the creation of the School of Journalism, the founding dean came up with an interesting idea, which was that in order to help stimulate and promote good journalism, it wasn’t only important to train the people who would be providing journalism, the future journalists, but also to help educate the future consumers of journalism– because without an educated consumer population, who would demand good journalism, who would be able to differentiate good journalism from bad journalism, there really would be no real impetus for journalism to survive.
10 years ago, of course, seems a lifetime ago, in terms of the changes that have taken place in the world of journalism, particularly in the United States, but also all over the world. The changing economics of journalism, the challenges that the internet poses, the remarkable variety and number of sources of information that are now available to anyone on the internet. And the fact that the traditional gatekeepers of information, the major media outlets– what’s now called legacy media– have been displaced: where once upon a time there was perhaps a handful of sources of information, now there are hundreds and hundreds of different sources of information that each of us can turn to.
And teaching, in our case, undergraduate students how to navigate through this tsunami of information, and to be able to differentiate fraudulent information from believable, actionable information, information that is credible enough and fact-based enough that you can take action as a result of receiving it, was the initial mandate of what we created as the Center for News Literacy. Of course, in those 10 years, the whole question of news literacy, and media literacy, and digital literacy, and the phenomenon of fake news– which has really exploded over the last several years– has come to dominate the news and media literacy landscape in the United States. So where 10 years ago, we were pioneers, now there are literally dozens and dozens of programs, all of them somewhat different, but all of them with the purpose of trying to educate the public, and help the public learn how to differentiate between misinformation and reliable information.
There are many theories, there are many approaches. Some approaches focus on digital literacy, and trying to help people understand how to navigate and understand how information comes to them through the web and through social media. Our program, while it does address those issues, primarily deals with trying to teach people about how to recognize good journalism. Looking for things that will signal responsible fact-based journalism, as that would be the key to beginning to differentiate fake news from real news. So in my PowerPoint presentation, I’m going to talk a little bit about the basic approach that we have toward teaching of fundamental news literacy, as we call it.
And then secondly, to talk about some of the specific challenges and some of the specific techniques that we can employ to help distinguish trustworthy information from untrustworthy information on the internet and on the web. But we believe that the basic principles of news literacy, of information literacy, of media literacy are applicable across all platforms– whether you’re reading an old-fashioned newspaper, whether you’re looking at a website, whether you’re navigating your way through social media, or whether you’re watching broadcast television or video, either on an old-fashioned television or on your computer screen. So with that, I want to start the PowerPoint now. And, of course, when it’s over, we’ll be having a discussion and we’ll be answering some questions.
So for the sake of this conversation, I’m going to talk about what we are going to term the fake news invasion. And the first question we’re going to ask is, what is fake news? Of course, I don’t personally like the term fake news because I think it is overused, it is misused, it is manipulated by whoever wants to mention it to mean news that you don’t agree with, news that is critical of you, news that poses a challenge. But certainly, it has become an umbrella term to describe all sorts of misinformation that comes at us through a variety of sources. As you may know, in the United States right now, we have a situation where the President of the United States is often accusing professional news media outlets of broadcasting and printing fake news. And what has resulted is a certain degree of confusion among the public and a very difficult situation, because, as you know, when someone cannot differentiate between truth and fiction, between the fake and the real, it is a challenge to the basic underpinnings of democracy, which is why it’s so terribly important to be able to ferret out reliable news from fake news.
So the question is, what can we do about misinformation? What can we do about fake news? And I was delighted to hear that the minister– that your Digital Minister, Minister Tang, was an optimist about how we can begin to navigate this perilous territory in the months and years ahead. I’m not sure whether I share that optimism, at the moment, because I think we have a tremendous number of challenges. But certainly, the effort to combat fake news can only serve us well, as we learn along the way.
So there are many sources of fake news, and there are many reasons that misinformation is disseminated to the public. As you may know, fake news can be lucrative. It can produce a tremendous amount of profit for those people who purvey it. Particularly when news stories that are sensational in nature, that are injected into political campaigns are introduced in the social media landscape, are introduced on Facebook, are introduced on Twitter, begin to show up in searches on Google. And there is a whole group of what we’re going to call digital entrepreneurs. People who don’t exactly have a political agenda, but who know that if they can plant a certain number of provocative, or incendiary, or controversial stories in the right fashion on social media, they can produce a tremendous number of clicks, and they can produce a tremendous amount of profit for them. And during the course of the American presidential election, it was reported that there were a number of digital entrepreneurs in Eastern Europe, in the Balkan countries that were really disseminating fake news for no reason, other than to make money, with no political agenda.
There are also, of course, propagandists and political partisans who have a vested interest in disseminating, in spreading, fake news. And, indeed, in this definition, there is very little differentiation between fake news and propaganda: fake news and misinformation spread for a certain purpose. This is also, of course, something that can become a question of partisan politics, where one can spread misinformation about a political opponent. And of course, governments themselves can sometimes disseminate fake news, both to their own people and, as we are all beginning to see through many investigations right now in the United States, abroad, where foreign governments can disseminate fake news through social media in an effort to have some influence over electoral politics. So the sources of fake news are many.
Now fake news, of course, can only thrive, can only live, if we are receptive to it– if it lives in an environment and in an atmosphere which is hospitable to it, which is friendly to it. So what is it about fake news that gives it such vitality, that gives it such strength to exist in the midst of so much solid journalism and solid reporting? Frequently fake news will manipulate our emotions, because fake news, usually, is directed in situations where there is already a tremendous amount of emotional involvement in a particular story, or in a political campaign. As we’ll talk about later, the manipulation of our emotions, as opposed to an appeal to our intellect, is something that is extremely powerful. So if fake news manipulates our emotions, it has a certain degree of fertile soil in which to grow.
Fake news also satisfies what is frequently called confirmation bias. And confirmation bias is an interesting phenomenon where we, as human beings, gravitate toward ideas that we already agree with. We don’t instinctively seek out information, and seek out points of view that are contrary to our own. We want our points of view, our biases, to be confirmed by others. And as the internet, now, can provide uncounted thousands of different sources of information, one can find information that confirms one’s own bias very, very easily. Fake news can target people’s own biases. Fake news can provide a safe haven for people who want to go and have their beliefs confirmed. Fake news will not challenge people. Fake news will only deepen and strengthen people’s existing beliefs, or beliefs of people who are still on the fence, and don’t really know what to believe.
Fake news– if you look at the kinds of information that goes out– fake news also encourages a certain sort of conspiracy thinking, a certain kind of thinking that provides a simple explanation for complicated issues. We live in an extremely complicated time, where there are no, really, easy answers to a lot of the vexing questions that we have. Fake news frequently gives very, very simple answers. Fake news, very often, points one finger at an understandable phenomenon that will explain something. So if there are certain individuals who, somehow, just can’t understand how– let’s use an example of an ongoing fake news story throughout the previous 18 months in American political history, what was called the birther movement in the United States, where there was a large belief system that former President Obama was not born in the United States, was not born in Hawaii, but instead was born in Africa, and, therefore, he would have been disqualified from being President of the United States. That’s a kind of conspiracy thinking that people who objected to someone like President Obama being elected president will gravitate toward. So fake news appeals to people who look for simple and sometimes outrageous explanations to explain things that they otherwise can’t understand.
And, of course, lastly, the digital ecosystem itself. The extraordinarily widespread reach of social media, the extraordinary widespread reach of platforms like Facebook, of platforms like Twitter– which have, at this point, a life of their own. Which are so big, and so robust, and so complicated that some people argue that even those individuals who created these platforms, and even those individuals who now are charged with attempting to self-regulate these platforms, don’t quite have control of what can occur on these social media platforms.
The digital ecosystem that we have is extremely fertile ground for fake news. And as most of you already know, the question of fake news, the issue of news stories that made their way into the digital ecosystem– onto Facebook, onto Twitter through retweeting, which we’ll deal with a little bit later– was a central issue in the recent U.S. presidential election. And it is an issue that, right now, is under daily discussion in the United States, whether it is the government investigation of stories that, it is argued, were planted by Russia on Facebook, or whether it is Facebook’s or Twitter’s inability to control and monitor fraudulent information. It’s an extremely complicated and troublesome issue that remains in the headlines in the United States. And as the next election cycle in the United States draws closer– which will be the 2018 congressional elections– many, many states and many, many players in the social network movement in the United States are grappling with ways to control and identify fake news more effectively.
As an example, after the election, a number of analysts concluded that by election day, the number of fake news stories that were posted on Twitter– rather, on Facebook– overtook real news by election day. Stories that, really, almost defy logic, stories that almost seem to be too ridiculous to be true received a tremendous amount of traction, and a tremendous amount of sharing on social media sites, like Facebook. One example, on the left, “WikiLeaks confirms that Hillary Clinton sold weapons to ISIS.” That’s a story that got a tremendous amount of sharing on Facebook. Far more so than a story that the mother of a former president of the United States gave an interview that was critical of, now, President Donald Trump.
As you can see by this chart, the number of fake news stories on Facebook continued to rise, and the closer we got to election day– which was in November– at some point, engagement with fake news stories exceeded engagement with mainstream, credible news stories by more than a million: 8.7 million fake news stories engaged with, shared, and read on Facebook, versus 7.3 million mainstream, credible stories on Facebook. Here you have two charts. On the left, the top five fake election stories by Facebook engagement, and on the right, the top five mainstream election stories by Facebook engagement.
And a few– the number one shared election story on Facebook, shared by almost a million people, was a fake news story. “Pope Francis shocks the world and endorses Donald Trump for president, in a statement that he released.” The second most shared story was the story I just referenced, that Wikileaks confirmed that Hillary Clinton sold weapons to ISIS. That first story exceeded any mainstream election story. And the fake news stories, usually, were shared far more than the mainstream stories. And if you hit the next slide, you’ll see the connections and the comparisons between the two.
As an example of how important social media was, and how important spreading a certain degree of fake news was, after the election there was some calculation about how effective the two candidates’ social media engagements were. As it turns out, Donald Trump’s tweets were retweeted three times more than Hillary Clinton’s tweets. The Facebook posts that were either pro-Donald Trump or anti-Hillary Clinton were re-shared five times more than Facebook posts that were pro-Hillary Clinton or anti-Donald Trump. 20% of the Trump campaign’s own tweets were actually retweets of the general public, and many of these retweets were retweets of fake news. Indeed, half of the tweets on behalf of the Trump campaign were actually links to other news media, much of it fake news media. And many of these tweets were also linked to 78% of the Trump campaign’s Facebook posts. All of these numbers really indicate that the energy, the targeting, and frequently the micro-targeting– with carefully targeted fake news reports about controversial issues– were done much more efficiently and much more frequently by the presidential campaign that wound up being the victorious one. Of course, we don’t know if there is a cause-and-effect relationship, whether this is the reason that one candidate won and not another. But it is certainly an indication that the virality, the frequency with which fake news tweets and misinformation tweets can go viral, played a very important role in the presidential election.
Of course, the other thing that many of you have probably heard of is the issue of bots– the issue of algorithmic, robotic tweeting of information, which basically has a life of its own on platforms like Twitter. An extraordinarily large amount of the misinformation and fake news spread through platforms like Twitter is spread without human intervention, basically spread by bots. Bots can spread information or misinformation. They can easily cause topics to trend online, through automated promotion of hashtags, stories, and the like. Indeed, in the month before the presidential election in the United States, 20% of all tweets– messages and links to stories, many of them fake news stories– were not generated by human beings, but were generated by bots. And in the run-up to the election, during the campaign, pro-Donald Trump Twitter bots generated about 400% more tweets than pro-Hillary Clinton bots. Simply another example of the effective use of misinformation in that digital ecosystem, which strengthens and deepens the effect of any sort of fake news.
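[Editor's note] The automated amplification described here can also be screened for, crudely, on the consumer side. One common first-pass heuristic (a deliberate simplification of what bot-detection research actually does, which combines many signals) is to flag accounts whose sustained posting rate is implausibly high for a human. The threshold and the account data below are invented purely for illustration.

```python
# Crude single-signal bot-screening heuristic: flag accounts that
# post far more often than a human plausibly could. Real detection
# systems combine many signals; this threshold and these accounts
# are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Account:
    name: str
    tweets: int        # total tweets observed
    days_active: int   # days since the account was created

# Assumption: averaging more than ~72 posts a day, every day,
# suggests automation rather than a human at a keyboard.
HUMAN_DAILY_LIMIT = 72

def looks_automated(account: Account) -> bool:
    rate = account.tweets / max(account.days_active, 1)
    return rate > HUMAN_DAILY_LIMIT

accounts = [
    Account("news_fan_1982", tweets=3_400, days_active=900),     # ~4/day
    Account("hot_takes_24_7", tweets=95_000, days_active=60),    # ~1,583/day
]

for a in accounts:
    label = "likely bot" if looks_automated(a) else "likely human"
    print(f"{a.name}: {label}")
```

A single threshold like this produces both false positives (prolific humans) and false negatives (bots tuned to post slowly), which is exactly why, as the talk notes, platforms struggle to control automated accounts at scale.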
So the question, of course, the question of the hour is: how can we– how can people test for the truth? How can we look at a tweet and decide whether or not to retweet it? How can we look at a Facebook post and know whether or not we should share it? What are the signs? What are the techniques that we can use to try to push back against what really does seem like an onslaught of information and misinformation, much of it generated almost on its own power? In many ways, it is a simple matter of supply and demand. And at the Stony Brook Center for News Literacy, we focus a lot on the demand side, on the consumer side. Platforms like Facebook and Twitter are developing, and will continue to develop, their own internal mechanisms to attempt to sift through extraordinary amounts of misinformation and misdirection. But we believe what we can do, as I’ve said before, is to help strengthen the critical thinking skills of news and information consumers. To help them wade through this flood of misinformation. So on the demand side, on the consumer side, what can we do?
We believe in something called news literacy. Which we’re going to define as the ability to use our own critical thinking skills to judge the reliability and the credibility of news reports, whatever platform we see them on. And the first thing we need to do is, really, to attempt to define what news is, as opposed to other sources of information. As opposed to advertising, or publicity, or entertainment, or propaganda, or other forms of information, which frequently can mimic– can disguise themselves as news, as journalism. We believe that news is information that’s in the public interest, that is shared, and is subject to a certain process. And this process is very key. It’s a process that has three key elements. It’s a process of verification. It’s a process that is independent. And it’s a process that is accountable. And the three letters in English are V, I, A. VIA, verification, independence, and accountability. Those are the three hallmarks of solid journalism.
Verification is the process that establishes or confirms the accuracy or truth of something. Verification is different from assertion. If I assert something, it is simply my opinion. It is simply a point of view. It is not necessarily based upon any verifiable fact. News reporting relies on a system of verification, where I can prove to you that the information that I am providing has a basis in fact. Verification is based on evidence. And the most important part of evidence, the most important category of evidence, is direct evidence: videos, audio recordings, photographs, documents, records, eyewitness accounts, accounts by observers. Direct evidence coming from reliable sources is really the bedrock of verifiable journalism. And you need to look for direct evidence that supports the facts that are presented to you. Direct evidence is more reliable than indirect evidence.
Indirect evidence can be accounts from spokespeople, or press secretaries, or press releases. They can be second-hand accounts, or third-hand accounts. There can be computer models. There can be inferences from evidence. But indirect evidence is not as reliable as direct evidence. Indirect evidence is hearsay. Indirect evidence is overheard conversations. It’s one person telling another person, telling another person. It’s rumor. It’s the stuff you frequently see in news reports that does not have the strength of verifiable evidence– of verifiable information from direct evidence.
What else makes news different is the question of independence, which is freedom from the control, or influence, or support of any interested parties, coupling that with a conscious effort to set aside any preexisting beliefs, and a system of checks and balances. We look for sources of information that are independent, that don’t have a vested interest in presenting one side of a story or another side of the story to you. You need to look for independence in your sources of information.
And of course, accountability. Being responsible or answerable for your work. Another hallmark of solid journalism. You always have to ask yourself some key questions when you read or watch any report, anything that purports to be news, anything that purports to be a presentation of reliable facts. What do I know? How do I know it? What don’t I know? You have to ask yourself those questions if you’re a journalist, and you have to ask yourself those questions if you are the consumer of journalism. And you have to ask yourself, have you set aside your own beliefs, your own biases and dispassionately evaluated a story?
Journalistic truth, truth in journalism, as we look for it, is interestingly defined as the best obtainable version of the truth, on any given day. It is a continuing journey toward understanding, which begins on the first day of a story and builds over time. And the bottom line, the takeaway of this, is that truth is always changing in journalism. The story is always changing. New developments always occur. As you may know, there was a tragic mass shooting in the United States several weeks ago, in Las Vegas, where more than 50 people were killed, and hundreds of people wounded, by a gunman. And if you followed that story over a number of days, you would see that one day’s truth changes, because new evidence comes about and new facts are revealed. And frequently, journalists are accused of inaccuracy and accused of fake news because people are only taking a snapshot of a story, and not realizing that stories change over time. Journalistic truth develops as more and more facts are uncovered through reporting.
Journalistic truth, we say, is provisional– that today’s evidence may be changed by tomorrow’s discoveries. Very important to keep this in mind. Journalistic truth is changing. It’s a process. There is no absolute truth in journalism, because new facts can always be discovered. Frequently, the hardest time to pinpoint accurate information is during a breaking news event– whether it is an earthquake, or a shooting, or an accident, or a fire, or a building collapse. Anything where news is breaking very, very quickly. And of course, one of the both positive and negative impacts of the internet is that news is now available to us 24 hours a day, seven days a week– and not just 24 hours a day, but every minute of every hour of every day. Where, if you’re a reporter, your deadline is now. Your deadline is one minute from now. And the rush to get news fast, the rush to be the first, if you’re a news organization, can become dangerous, because there’s so much chaos in a breaking news environment. So if you’re looking for information, if you’re looking for real news– if we can go back to that, I don’t know if we can reverse that. I want to go through some of these very quickly. Thank you. In the immediate aftermath of a breaking news story, news outlets will frequently get things wrong. You should understand that. News outlets don’t get things wrong because they want to, or because they’re incompetent. They get things wrong because information is changing very, very quickly. OK, you can move it on now.
Sometimes verification, that process of finding reliable, direct evidence, breaks down because journalists get sloppy, because they’re in a rush, or because people, sources of information, give reporters incorrect information, or even don’t give reporters the truth. Verification can be very difficult even when sources are helpful. One thing we have to realize– and again, this is largely because of this extraordinary wealth of information that is provided to us by the internet– is that there are what we call information neighborhoods. There are many sources of information. Not all of those sources are journalism. They can be propaganda. They can be publicity. They can be entertainment. They can be raw information. Particularly when they’re available online. Particularly when you look at the website of a journalistic operation, and you read a story, and it says at the top of it, “sponsored content.”
What does sponsored content mean? It means it’s an advertisement that’s dressed up as a news story. It’s very difficult sometimes to differentiate what the sources of information are. Just because something, of course, is on YouTube doesn’t mean it’s journalism. YouTube is populated by journalism, by publicity, by raw information. You have to be able to distinguish between those different sources of information, particularly when you see them on the internet.
When you’re looking for something that’s reliable, consider all of the evidence that you see, that you read, that you watch or hear. Note which evidence is direct, from a reliable document, from an eyewitness, from a credible expert, and which evidence is indirect. Note yourself whether relevant evidence is missing, and also note if there’s irrelevant evidence that stands in the way of clarity. Those are some techniques for looking to see if there is good journalism behind the information that you are consuming. And, as I’ve said, evidence exists on a spectrum. Every news account has some direct evidence and has some indirect evidence. Your job, as someone who is evaluating the reliability of news, is to weigh which kind of evidence dominates the story. You, also, must look for something we call transparency in a story. Are you understanding what the journalist– what the reporter is giving you? Do you understand the process through which that individual or that organization got their news reports? Are they clear about why, for instance, anonymous sources must, must be anonymous? Are things explained? How did they get their information? Why is some information missing? Are you being allowed into the journalistic process, which then gives you a greater understanding of the reliability of the information?
Very quickly, the other thing you need to look for is the difference between fairness, and balance, and bias. Because fairness and balance are not the same things. Fairness is a question of quality. Balance is a question of quantity. Fairness means being fair to the evidence. It means being fair to the facts. Balance means basically giving the same amount of attention, the same amount of weight, to either side of an argument. Sometimes, that is not fair. You can be fair, but not balanced. And you can be balanced, but not fair. Balance for balance’s sake, for the sake of making both sides of an argument even, is not necessarily always the most reliable way to go. You must be fair to the facts. You must give every side of an argument expression, but not falsely provide balance when the information– when the evidence does not justify it.
So, what I have talked about so far, of course, applies across all platforms, but so many of us get our information online, through our social networks, through social media, on websites, that looking at digital-age media and deconstructing and understanding how information comes to us, digitally, is of utmost importance. All those fundamental approaches to understanding basic journalism, all those techniques of critical thinking, absolutely apply to looking at digital information, information on the web, but there is another layer of understanding.
There is another way you can look at news and look at information– look at websites– by evaluating the architecture of a website, by looking at the URLs, by examining the photographs, by traveling upstream to see where the information came from, by traveling downstream to see where these websites that you may be getting information from are linking to. All of these different approaches help you evaluate individual websites for their credibility. Fraudulent websites presenting misinformation and fake news can be made to look remarkably like reliable websites. You must examine them very, very carefully to see where the little signs are that something is off. Are the dates off? Are the links suspicious? Is the About Us tab telling you something that makes you suspicious? Is the information that you are reading on these websites unavailable anywhere else? All these things can be little indicators that you may be getting misinformation.
– Facebook, Google, and Twitter have announced new efforts to stop the spread of fake news. Twitter has amped up its mute and report functions to slow dissemination of fake news, but you shouldn’t leave it up to these big websites to perfectly curate all your news for you all the time. Here are a few easy ways to spot fake news. One, double check the URL. Some fake sites look like they have legit URLs, but take a closer look. This site tries to fool readers into thinking it’s ABC News, but the web address is a few letters off. It was abc.com.co instead of abcnews.go.com, the real one. .co is the country code for Colombia.
Two, does the photo you’re looking at seem photoshopped or unrealistic? It could be. Drag and drop the photo you’re looking at into Google Images, and they can help you verify the original source of the image. Three, can you identify the original source of the information in the story? Check that source against other sources. If other reputable news outlets haven’t picked up the story, it’s likely you’re looking at fake news. Here are some sites we’ve seen deliver fake news.
Four, think about installing a Chrome extension to help you detect fake news. The plug-in called FiB– Stop living a lie was recently developed by four Princeton University students, and can help you to determine the validity of some news links. The plug-in purports to verify pictures, texts, tweets, and embedded links. It also checks the site for malware and dead links. Other Chrome extensions like B.S. Detector and Fake News Alert say they do similar things. Before you send an article to all your friends on social media, consider revisiting ideas one through four. You’ll thank us later.
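The first tip in the video, comparing a URL against the outlet’s real web address, can be sketched in a few lines of code. The following is a minimal Python illustration, not a production checker; the trusted-domain list here is a hypothetical example, and a real tool would consult a maintained database of legitimate news domains.

```python
from urllib.parse import urlparse

# Hypothetical trusted-domain list, for illustration only; a real checker
# would consult a maintained database of legitimate news domains.
TRUSTED_DOMAINS = {"abcnews.go.com", "bbc.com", "reuters.com"}

def looks_suspicious(url: str) -> bool:
    """Flag a URL whose host is not trusted but resembles a trusted brand."""
    host = urlparse(url).hostname or ""
    if not host or host in TRUSTED_DOMAINS:
        return False
    brand = host.split(".")[0]
    # A spoofed site often borrows a trusted brand name and changes the
    # suffix, e.g. "abc.com.co" mimicking "abcnews.go.com".
    return any(
        trusted.split(".")[0].startswith(brand)
        or brand.startswith(trusted.split(".")[0])
        for trusted in TRUSTED_DOMAINS
    )

print(looks_suspicious("http://abc.com.co/story"))        # True: the spoof from the video
print(looks_suspicious("http://abcnews.go.com/politics")) # False: the real domain
```

A heuristic like this can only narrow the field; a human reader still has to weigh the remaining tips, which is the point of the rest of the segment.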
PROFESSOR REINER: Again, ask yourself, what’s the evidence? Does this story support the headline? Ask yourself, says who? Do all the links work? What does the About Us page say? When was the information last updated? Check whether fact checking websites– and I’ll mention a few of them in a moment– have investigated the information. Cut and paste images into reverse search engines like one called TinEye.com. And lastly, and most interestingly in a way, beware of stories that come from people you trust, even from your friends and relatives. Don’t confuse the sender of the story with the source of the information. There are a number of websites now devoted to unmasking fake news. Some of them use crowdsourcing. Others of them use algorithms that they’ve developed.
Here is an example of a photoshopped story– a photoshopped photograph– that pretended to show, and claimed, that this was one of the brothers that carried out the Paris terrorist attacks a number of months ago. It’s a photograph of someone in a suicide vest, holding what looks like a Quran. But, of course, this is simply a photoshopped image that was originally a quite innocent selfie taken by someone holding an iPad. The crowdsourced fact-checking website Grasswire fact checked this and debunked it. But before they debunked it, this particular photograph– the one that has the X through it– went viral, and was picked up by newspapers all over the world. It was picked up in Spain. And when there were terrorist attacks– another set of terrorist attacks– in Nice, this photoshopped image of a simple selfie was reborn, and went viral again as a piece of fake news. Just an example of what can happen.
There are a growing number of watchdog websites in the United States. And I know there is an extremely active and robust effort in your country to also root out misinformation. Three of the most respected ones in the United States, available on the web, are FactCheck.org and PolitiFact, both of which deal mostly with political information, and Snopes.com, which is a highly reliable debunker of fraudulent misinformation on the web.
Seven quick steps to examining the journalism, whether it’s online or offline. Step one, always compare the main points of the story to the headline of the story. And if they don’t have any relationship, it’s a good sign that the headline is just there for clickbait, and the headline is not there to really help you understand the story, and it’s probably fronting a fraudulent story. Step two, look for direct evidence and indirect evidence. Step three, always evaluate the sources. Remember, multiple sources are always better than single sources. Knowledgeable sources are always better than sources that don’t have expertise. Independent sources are always more reliable than self-interested sources. Who the sources of information are is extremely important. And when the sources are anonymous, expect and demand the journalist tells you why those sources have requested anonymity, and whether those anonymous sources are in a position to provide the information that they are providing. Step four, does the reporter make his or her work transparent, let you in on the reporting process? Step five, does the reporter place all the facts in context? Step six, are the key questions answered? The who, what, when, where, why, and how– the basic questions of any journalistic report. Step seven, is this story fair? Is balance called for, or was balance simply done for balance’s sake? Was the language fair? Was the information presented fairly to the evidence? In the digital age, of course, headlines can be very, very different. Because headlines are frequently there to generate clicks. To sell. Not to tell the story.
Always ask a two word question. The digital age, in fact, has transformed the landscape for news producers and news consumers. But I can’t repeat enough, no matter how you’re getting your information, you have to ask yourself a two word question. “Says who?” Where is this information coming from? Where is the reporter getting his or her information? How reliable are those sources of information, and can you verify those sources? Can you do what scientists do when they need to replicate an experiment, in order to see whether or not it’s truthful? Ask yourself, who’s saying this and why? Remember, as I’ve said before, there are a couple of key things to keep in mind. Independent sources are better than self-interested sources. If I have an interest in one side of the story, I’m not a reliable source of information. You want multiple sources, rather than a single source of information. You want sources who can verify your information. You want sources who are authoritative and informed, and you want sources– when you can get them– who are named, and not unnamed sources. Independent, multiple, verifiable, authoritative, named sources are what you must look for all the time. If you don’t find them, you don’t necessarily have a trustworthy piece of journalism.
You’re not always going to be able to figure out what’s true. In fact, there are many things about who we are, as human beings, how our brains operate, how we receive information, that make it difficult for us to figure out what’s true. What’s, of course, true is this. What’s not true is this. If it’s on the internet, it doesn’t necessarily have to be true. It’s something that every child, every student in grammar school and middle school must understand again and again. Just because it’s there doesn’t mean it’s true.
So as a souvenir, here is a digital coffee cup for everyone. And one of the slogans I want to leave you with is this, “Think once before you act, think twice before you speak, and think three times before you post anything on Facebook, Instagram, or Twitter.” Because we are all not only news consumers, we are all news producers. We all– by virtue of the fact that we have a smartphone, that we have a Twitter account, that we have a Facebook page– all of us have more power to disseminate and spread and share information than any human beings who have ever walked the face of the earth before. We are all incredibly powerful producers and disseminators of information. And in a famous line from the movie Spider-Man, “With great power comes great responsibility.” We all have tremendous responsibility to pass on only information that we are convinced is accurate, and simultaneously, to call out information that we not only suspect is inaccurate, but have detected as inaccurate. And, as I’ve said, it’s not always easy for us to do that.
– I’ll answer the question. You want answers?
– I think I’m entitled.
– You want answers?
– I want the truth!
– You can’t handle the truth!
PROFESSOR REINER: That’s a pretty famous scene from a well-known American movie called A Few Good Men, and the punch line of that movie is, “You can’t handle the truth.” And the question is, can we handle the truth? Because we, as human beings, have our own biases. And, as I mentioned before, we tend to seek confirmation for our own biases. And as well, as it turns out, political scientists and [AUDIO OUT]. We, as voters, as citizens, of a democracy, frequently make judgments that are based on our emotional reactions and our emotional response– not necessarily on facts. It presents a tremendous challenge to journalism if facts themselves are sometimes not powerful enough to convey information and to be convincing. If we make a decision about what to buy, or we make a decision– more importantly– about who to vote for, and that decision can be based on emotional reactions, it means that we have to work even doubly hard to ferret out the truth.
It’s hard for all of us to admit that we’re wrong. It can be very uncomfortable, especially when admitting we’re wrong implicates some aspect of our identity or our world view. That is certainly what has happened in the United States, and, I suspect, is happening in many democracies around the world, where the internet is making possible the phenomenon that we all gravitate toward our own corners of belief and close ourselves off to ideas and points of view that are contrary to our own. There is something in our brains that’s called cognitive dissonance. Our brains rebel against information that goes against our core emotional beliefs. We tend to forget it. We also have an interesting phenomenon that’s called source amnesia, where if we get information that we don’t agree with, we tend to forget the source of that information. If we get information that we do agree with, we tend to give the sources of that information increased credibility. We tend to believe things that our emotions want us to believe.
So, again, we are particularly susceptible. We are particularly vulnerable to fake news, which is all the more reason why we have to have the critical thinking skills– the rational skills– to understand how to recognize it. The bottom line, in many cases, although we like to think so, we are frequently not rational human beings. We do not absorb information rationally. Our circuits are not objective. We make decisions on our emotions. And that’s why fake news can be so dangerous. So ultimately– in the digital age– you, me, we, all of us, really, are in charge of determining what is reliable and what is not. We, certainly, can have sources of information that we trust, and we can have sources of information about which we are suspicious. But in the end, we can’t outsource reliability. We have to make the judgments about what is reliable and what is not. We have to understand that we frequently, instinctively, rebel against information we disagree with. And we need to use critical thinking skills in order to navigate through an extraordinary amount of fake news that’s coming our way. And with that, I believe that’s the end of the PowerPoint.
All right, then, if anyone needs to leave, we understand. You could just politely excuse yourself, and we will go ahead with our Q&A. So, are there any questions from Taipei, here from the American Center? We’re going to start with you guys. If you would like to ask a question in Chinese, please raise your hand and someone will bring you the Chinese microphone. Yes, right here in the front.
AUDIENCE: Hi, I’m curious. It does seem like fake news in the U.S. is something that’s not stoppable. And speaking to many editors in the U.S., I think they’re all very frustrated with fake news, and even if they try to do media literacy, that effect is very limited. And the fact that you are trying to push media literacy– people don’t necessarily want to read about media literacy– so there’s a question of acceptance, as well. I’m curious, have you seen this phenomenon of fake news spreading in other parts of the world? And how can we, as media organizations, you know, do something about this? Because there are many media organizations that are producing, given the state that we are in, bad-quality journalism, which really adds to the problem of fake news. So as an industry, I’m just curious if there is any effort in the U.S., or around the world, that you see addressing this issue.
PROFESSOR REINER: You’re absolutely correct. It is a perplexing problem and what the industry is doing, and has done, may, indeed, not be sufficient. I agree. The proliferation of fact checking– not only fact checking sites, but fact checking sections of all the various major news media outlets, the focus on trying to fact check in real time during political debates, to call out misrepresentations and mistruths when they happen, as often as possible– it’s really all the industry can do. I really think, in the end, this really has to be, at this stage, in my opinion, a long run educational enterprise. Because people who are susceptible to fake news are not only susceptible to it because of our own human frailty and our own neuroscience, but I believe they’re susceptible to it because of a certain lack of basic education about government, about civics, about science.
I mean, one can turn from the political sphere to the scientific sphere, for instance, and one could replace the term news or media literacy with the term science literacy. Why is the public susceptible to fake news about issues like climate change, for instance? Well, I think the public is susceptible to fake news about climate change because of a lack of adequate education, starting in the very early grades– to teach science adequately. So to answer your question, I think, ultimately, the responsibility for this is going to be with the schools. I really do. Starting in a very, very early age. Which I know you’re going to be doing in your country.
All that the industry can do is attempt to maintain the highest standards of journalism possible. To be even more scrupulous, to be even more transparent, to work toward allowing and encouraging the audience to understand the process of news gathering, in an effort to build trust. Because, at least in the United States, there is a general distrust of institutions, of the old institutions that used to dominate– whether they were political parties, or the big broadcast networks and the big newspapers. And I think institutions that have lost trust have to rebuild trust by maintaining high standards, and by providing information that is relevant to people. But I certainly don’t–
MS. URBOM: All right, thank you. Thank you very much. Let’s go to AITK, now, Kaohsiung. You have a question for Mr. Reiner?
AUDIENCE: Yeah, thanks, Sonia. We have a question from a professor at Kaohsiung Normal University. The question is, it is undeniable that everyone should develop media literacy, but we should also think about the penalty for fake news. What do you think about those checks and balances, and possible penalties?
PROFESSOR REINER: Well, my question would be, who is going to impose the penalty? Are we– and I think that’s a very thorny issue. I think the penalty has to come from the public, itself. The penalty has to come from individuals who refuse to click that mouse, who refuse to buy that product, who refuse to share a tweet, or retweet, or share a post. I think the penalty has to come from the ground up. It has to come from consumers. I think it would be very, very dangerous if there were penalties that came from regulatory agencies, or from the government, because that can be a very slippery slope. So yes, I do think there need to be penalties. I think there should be a price to pay for disseminating false information. But I think the price has to be enacted by all of us, and not by a government.
MS. URBOM: All right, thank you very much. And now we’re going to take a question from our students.
AUDIENCE: Hi. I would like to ask you about the bots. Well, the bots give out misinformation or information, and I would like to ask: do you think what they send should be verified by human beings before they give out their information?
PROFESSOR REINER: Yes, I think ultimately that would be ideal. And I know that both Twitter– which is really where bots are prevalent– and Facebook– which has other ways of information, and ads, and stories being spread by algorithms and not by human beings– are hiring more and more people– hiring humans– to independently check. But how many humans can be hired, and how much of this can be done by algorithm is, really, an economic determination. There are enormous amounts of very, very smart people working on technological ways to try to address this problem, and it could be that it’s just going to be a battle of the computer scientists to see who prevails.
AUDIENCE: Thank you, and I also have another question, about– well, you said that multiple sources are better than only one source, right?
PROFESSOR REINER: Yes.
AUDIENCE: But in some cases– like, sometimes, the reporters are not allowed to get into a war area, and the military will be the one who gives the information. And in that case, is there any other way, or other sources, for reporters to get information, and also check whether the information that the military gives is true or not?
PROFESSOR REINER: Well, that’s a very good question. I mean, in a war situation, it’s very difficult to fact check an official military spokesman. If reporters don’t have access to the battlefield, or have not been able to be embedded with any of the troops, there may be sources elsewhere on the ground that might be sources of information. But a spokesman or a press representative is, really, not a direct source of information. So reporters need to be able to reach out for eyewitness testimony, as much as they can.
AUDIENCE: Thank you.
MS. URBOM: All right, thanks so much. We’re going to come back to Taipei, here, the American Center. Is there somebody else who would like to ask a question? We want to make sure everybody has a chance. Did you have a question, [INAUDIBLE]?
AUDIENCE: [NON-ENGLISH SPEECH]
MS. URBOM: Could you just– I’m sorry– tell us, where are you from?
AUDIENCE: I’m from the Commonwealth Magazine. I’m curious about the role of technology companies in this, because they are the platform where all the fake news disseminates. And recently, there have been a lot of cases in Europe where, you know, these technology companies are using or spreading information they like as a way to make money, or favoring instant articles on Facebook, because they are a source of income for Facebook, or certain types of posts, because those drive eyeballs. But they tend to say that they’re just a technology company and not publishers. I’m referring to Google, and Facebook, and Twitter. So I’m curious, what’s your stand on this, and what role do you think they should play in fake news?
PROFESSOR REINER: Well, as you well know, because you’re informed about this, this really is the question and the debate that’s going on right now, certainly in the United States. And not a day goes by without some major report about Facebook, whether it’s Mark Zuckerberg or Sheryl Sandberg– the number-two person there– going before Congress and going to Washington, D.C., to make the case that Facebook and/or Twitter can, essentially, self-regulate and resist government regulation. But I think it’s inevitable that there is going to be more and more pressure coming from the government to begin to deal with Facebook, and Twitter, and the like as something more than a simple platform, something more than a simple telephone wire, where information passes. Just to claim that they’re simply a platform is really not credible anymore. I think something new has been developed. And I don’t think anybody has quite figured out what it is yet. And I think that includes the people who created it.
There is a debate that even Facebook doesn’t really understand Facebook anymore, that it could be kind of a Frankenstein creation that has gotten so big, and so unwieldy, that even Mr. Zuckerberg doesn’t quite know what to do about it. And it seems to have been a very, very painful, slow process to get him to begin to recognize that Facebook is not an innocent in this circumstance, right now. So I think we can look toward a real battle over some kind of regulation. And of course, regulation can be potentially dangerous. It can be potentially harmful. So I don’t think anybody knows the answer right now. Because I do think that what has been created is something that we’ve just not seen before, and I don’t think we know the implications. I don’t think we know how to control it. I think we’re at the very beginning of figuring out what we can do.
MS. URBOM: All right, thank you very much. We’re going to go back to Kaohsiung now.
AUDIENCE: Thank you. We have a question from one of our professors.
AUDIENCE: OK, can you hear me?
PROFESSOR REINER: Yes.
AUDIENCE: I actually got two questions, but I was told that there’s not much time left, so I chose only one. My question is, should we regulate fake news? Because some people create fake news, not for political reasons, commercial interests, or triggering hate. They create fake news just for fun, or maybe they just want to practice writing skills. Is regulating fake news against freedom of speech? That’s my question.
PROFESSOR REINER: Well again, that’s a very important question. And I think there are tremendous dangers inherent in regulating so-called fake news. I think it should only be an absolute last resort, and hopefully not a resort that we ever need to employ, because I think there are just too many inherent risks, inherent dangers, in giving any single entity the power to regulate information. Because one person’s fake news is another person’s truth, in some cases. I think this has to be regulated by us, by the marketplace. By people who simply are educated enough, are smart enough, are concerned enough to make decisions that will starve fake news to death, and the only way to starve it to death is not to accept it, not to buy it, not to believe it, not to act on it. And I don’t know what that’s going to take, because again, it’s a term that’s bandied about too often, by too many politicians. It’s certainly something that has exploded in the United States, as you well know. There’s a lot of misinformation that’s sent out almost every day, and it’s very difficult to navigate, but I think it’s up to all of us to be vigilant.
MS. URBOM: All right, thanks so much. We are going to go back to our university students. They have two questions. We’re going to take two questions from the students right now.
AUDIENCE: Hello. Well, I want to know about– what do you personally think is the greatest obstacle in stopping fake news?
PROFESSOR REINER: I think the greatest obstacle, personally, in stopping fake news is our own ignorance, individual and collective ignorance.
AUDIENCE: OK, thank you. And OK, I have another question. As we all know, it is an obvious fact that the internet and social media enables fake news to spread more widely, and faster than we can ever imagine. Has this become a concern to you?
PROFESSOR REINER: Oh, absolutely. Yes, of course. It’s a concern to me, as someone who was a professional journalist for many years, someone who teaches journalism now, and as someone who has a son who is your age, or a little bit younger, who is going to be living in the world 10, 20, 30 years from now. And I want him to live in a healthy democracy, where people make decisions based upon facts. There’s a very famous phrase– you’re entitled to your own opinion, but you’re not entitled to your own facts. I mean, there will be disagreements, healthy disagreements, about how to approach issues, how to solve problems, how to make choices, but a democracy can only make those decisions if there is a broad agreement about a certain set of facts that we can all agree on, and then we can take our different approaches. But I think if we lose the ability to agree on certain fundamental facts, then democracy is threatened.
AUDIENCE: Thank you.
MS. URBOM: All right, and we’re going to come back to Taipei, here. We have a question. Please just let us know where you’re from.
AUDIENCE: OK, I have a question– a very practical one. Every day we can see President Trump criticizing America’s mainstream media, those very prestigious media, and saying that they are making fake news. And for us foreigners, it’s very difficult to discriminate, or to find the truth. Who is right? The president, or the mainstream news? What should we do? How do Americans usually try to figure out which side they can trust more?
OK, and another question: every day, maybe you can see some strong messaging from countries like China, and it’s very difficult, also for us, to know whether that is national propaganda– communist propaganda– or whether it is true or not. So what would you suggest? Thank you.
PROFESSOR REINER: Well, that’s a very difficult question to answer, obviously. The term fake news is being used rather promiscuously– rather, carelessly– and being used to dismiss things that you just don’t agree with. So I think one needs to hold the news media accountable. One needs to insist that– because so much of the information– I think one of the problems, right now, is that so much of the information that is subsequently called fake by the president is information that comes from anonymous sources. It’s information that comes from leaks within the administration, where no one’s name is ever used, where no one who actually– no one is willing to say, “My name is so-and-so and I am giving this information to the news media.” Everybody is doing it under cloak of darkness. So it’s very, very easy for the president to dismiss it as fake news. And I think that’s going to be a problem, unless and until more people decide to provide information not anonymously. And I think the media has to do more with trying to credential the anonymous sources of information. But certainly the term fake news is thrown around much too casually, and it’s very dangerous.
MS. URBOM: OK, let’s go back to Kaohsiung. They have a few more questions.
AUDIENCE: Hello, I’m [INAUDIBLE] from National Kaohsiung University, where I teach media literacy. We know, actually, a lot of so-called news comes from PR offices, which not only write news and give it to the media, but also spread a lot of misinformation online, and stage a lot of events, trying to make it appear as if it were true. Then how should we teach the students to differentiate?
PROFESSOR REINER: Well, that’s absolutely true and that’s what their job is, publicists and public relations people. Their job is not to inform, not– but their job is to convince, and their job is to present a certain perspective, and their job is not to present news. And certainly news organizations have to know that when they get their press release from the public relations department of a company, that that’s what that is, that’s a press release. Then that press release needs to be looked at journalistically. And the consumer needs to look at a body of information that is spread by a publicist, or by a public relations firm, and examine it and say, OK, now, is this a source of independent information? Do we have reliable sources of information? What is the purpose of this information? Do we have individuals, or sources quoted on different sides of the issue? Or is this just simply a piece of publicity?
I just think we need to learn how to detect the differences between publicity, and public relations, and news. And journalistic organizations need to put a magnifying glass on those publicity releases and explore them journalistically, before they just simply publish them. It goes on in print and it goes on in television, where you have video news releases. And television news stations will just run them, as if they’re news reports. Where some company says, oh, the best gift this Christmas is the new doll, and they put some expert on camera, but it turns out the expert is being paid by the company to say that their new toy is the best. That’s a pretty harmless example, but it’s not a piece of journalism.
MS. URBOM: All right, thanks very much. We are going to go back to Kaohsiung. I’m sorry, I’m sorry. We’re going back to the university, there’s one more question from a student at the university.
AUDIENCE: Hi, I want to ask about– you just mentioned the growing number of watchdogs in the U.S.A. I want to know, can we fully trust these websites? Because they may have their own standpoint, as well. So should we trust them, or are they just another information resource?
PROFESSOR REINER: Well, that’s an excellent question. Again, you know, you can trust some. You can do your own investigation to see how long they’ve been in business, and whether they’ve been proven to be correct, and what the source of their funding is, how they support themselves, where they get their money. That can frequently be an indication of whether or not they’re independent. And I think if they’re independent, and they’ve been in business a long time, and they’ve been proven to be credible and correct, then you can trust them. But you’re asking the right questions, certainly. You need to decide for yourselves whether or not they are trustworthy, but certainly, one way to help determine whether they’re trustworthy is to try to determine, as best you can, where they get their money from. Are they sponsored by an organization, or by a foundation that has a particular political point of view? Sometimes you can tell that by the other sites that they may link to. Because sometimes, you can tell what’s really going on, not by looking at the site, but by looking at where they link to.
AUDIENCE: OK, thank you.
PROFESSOR REINER: You’re welcome.
MS. URBOM: All right, OK, and we’re back here, to Taipei. We’ll take, maybe, one more round of questions, all around. Is there a last question from Taipei? Sorry, just remind– please let us know what organization you’re with.
AUDIENCE: I am involved in the news helper project mentioned by our Digital Minister. I want to briefly ask a question on two levels. First of all, for us, the general public, when we are using something like Wikipedia that provides access to everyone, and everyone can edit– so, we are not professional journalists. We are not from the authorities. And what we write doesn’t become information that can be readily disseminated right away. It cannot be streamed or verified easily. So what do you think about this?
And secondly, how can we block fake news? We can stop sharing them, of course, but for the news helper project, we have a campaign. We want people to review media outlets. So I think, in this way, every month, we can have a meeting and talk about what news are good, what news are bad. What do you think as a professional journalist?
PROFESSOR REINER: Well, I mean, from what I understand of the news helper approach, it sounds like a very valuable tool. So as a professional journalist, I think any time you can get the public, the news consumers, to become involved with monitoring, and evaluating, and discussing the work of professional news organizations, debating the different approaches that different news outlets take to report a given story, to understand how misinformation can be spread or how inaccuracies develop, I think that’s a step in the right direction. I think the more involvement from the general public, the better. So I would be in favor of that kind of approach.
MS. URBOM: OK, thank you. And we are going to take a last question from the university, from Dr. [INAUDIBLE].
AUDIENCE: OK. Hi, Professor Reiner, thanks for the very insightful lecture. I’m one of the professors teaching this class, and it seems to me that one of the approaches to combat fake news is to provide correct information, or more facts. And you also mentioned that people make decisions mainly based on emotions, so if we are emotionally unwilling to look for more information, how do we deal with fake news? So I would like to hear, what’s your point on this? Thank you.
PROFESSOR REINER: Well, that’s certainly the question, isn’t it? And I don’t know the answer. I mean, I don’t think that we’re completely impervious to fact based information. Certainly there are lots of people and lots of issues where things are– where opinions are unformed. I think it really, just, is really up to the news media to be as clear, and transparent, and open as possible. And to provide news that is so relevant to people that they begin to trust it, and they begin to test it, and they begin to see that it’s advantageous to them to have trust in their news media. Because it helps them live better lives, it helps them make better decisions, it results in better government. I think it’s a very, very complicated phenomenon.
There have been recent polls that I just read about. There was an editorial, or a column, in the New York Times just the other day, reporting that fewer and fewer American young people– people the age of those sitting all around you right now– think it’s so important to live in a democracy. It’s a very frightening result, when people begin to value their own security over their freedom. And I think one of the ways to combat that is for journalists– is for journalism to become really indispensable to people. And right now I’m not sure that it really is. You have to really– it just has to mean something to people on a real level, not on a hypothetical intellectual level.
MS. URBOM: OK, thank you. And we’re going to go back to Kaohsiung and let them ask the closing question of our program.
AUDIENCE: Am I on? Oh, hi, I’m a student from Normal Kaohsiung– Kaohsiung Normal University, NKNU. In your slides, you mentioned it is important for news literacy to be delivered to everybody in daily life. And I’m wondering, in the United States, is news literacy taught in regular schools, like elementary and middle schools, or only to university students in media departments? And don’t you think it is important to teach kids, going forward, to equip them with such an ability, to differentiate fake news from real news? Thank you.
PROFESSOR REINER: It’s an excellent question. As far as I know, and I’m certainly not familiar with everything, everywhere in the United States, news literacy is certainly taught at– beginning to be taught more at the high school level, as well as the college level. And I think there is a certain degree of general media literacy that’s occasionally taught in elementary schools. And by media literacy, it’s more in terms of understanding about advertising, and commercials, and things like this. But I think it absolutely must be taught at the earliest ages, to begin to involve people in understanding what journalism is all about. It certainly has become much, much more prevalent in the last five or six or seven years, where it’s taught in many, many universities, and in a number of high schools. I know that our Center for News Literacy works with a number of high schools in the New York area to help that, but it’s very important.
MS. URBOM: All right, thank you so much. On that note, we’ll close our program. Thank you all for staying with us. Apologies for the late hour, and please continue to keep in touch with AIT, to watch our Facebook, and also to speak directly with Irene and me. AIT will continue working in this space, to help Taiwan develop its media literacy and news literacy programs. And we look forward to cooperating with you. And students, enjoy your class, and enjoy your massive open online course on news literacy. Make sure you’re all following us on Facebook, where we will offer you some new information about news and media literacy. And thank you, Steven, for joining us today.
PROFESSOR REINER: Thank you very much for having me. I hope it was helpful.
MS. URBOM: Very helpful, thank you.
PROFESSOR REINER: Bye.