As mentioned before, technology and the way we consume information have dramatically changed. But do we really need to scrutinize and question every bit of information that we come across?
Even major publications regurgitate information from elsewhere on the Internet. There is a new term for it, "churnalism," and probably every publication is guilty of it. In the journalism world, whenever we use a piece of information, no matter how basic or simple it is, we must credit where we got it or back it up with evidence. But with information flowing so freely, it is easy to forget to verify information that seems obvious.
An article in The Atlantic describes a new, open-source plagiarism-detection tool created by the Sunlight Foundation that will help readers determine where information is coming from by scanning articles and comparing them against a plethora of press releases, Wikipedia entries and even quotes from speeches. In the future, the foundation hopes to expand the project beyond journalism into legislation.
In a world where anyone will believe anything they read on the Internet, a tool like this is necessary for readers to determine what is credible and what is not. But the Internet has also made people more gullible and lazy, and I am not sure the tool will be widely used by readers who demand immediate information.
If the program is used to detect what information was taken from press releases, how will it handle Associated Press stories, which are republished widely by design? I believe this tool will be quite useful for those who use it, but at this point it needs further development.