Under the Rose
Veteran Member
Wikipedia is used by many people as a first resource when doing a quick search on a topic of interest, largely because it also often provides links to a myriad of other sources. I just read a piece on ZME Science that makes the following remarks:
http://www.zmescience.com/research/w...nce%29#!bf3Iwr
This is very interesting to me and explains why many of the pages contain only very basic content, presented in a similar format. I was wondering how many of you were already aware of this, and what your thoughts and comments are regarding the use of bots as research assistants and authors.
Sverker Johansson may well define prolific. The 53-year-old Swede has so far edited 2.7 million articles on Wikipedia, or 8.5% of the entire collection. But there's a catch: he did this with the help of a bot he wrote. Wait, you thought all Wikipedia articles were written by humans?
Read more at "This author edits 10,000 Wikipedia entries a day"
Lsjbot's entries are categorized by Wikipedia as stubs – pages that contain only the most important, basic bits of information. This is why the bot works so well for animal species or towns, where it makes sense to automate the process. In fact, if Wikipedia is to have a chance of reaching its goal of encompassing the sum of all human knowledge, it needs bots. It needs billions of entries, and this is no task a community of humans can achieve alone, not even one as active and large as Wikipedia's.
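For readers curious how such a bot could churn out thousands of similar-looking stubs, here is a minimal sketch of the general template-filling idea. The sample records and template wording below are hypothetical, invented for illustration – this is not Lsjbot's actual code or data source, which reportedly draws on structured species and geography databases.

```python
# Minimal sketch of template-based stub generation (hypothetical data).
# A real bot would pull records from a structured database and handle
# many more fields, references, and edge cases.

SPECIES = [
    {"name": "Parus major", "family": "Paridae",
     "described_by": "Linnaeus", "year": 1758},
    {"name": "Erithacus rubecula", "family": "Muscicapidae",
     "described_by": "Linnaeus", "year": 1758},
]

# One fixed sentence pattern, filled in per record - which is why
# bot-written stubs all read so similarly.
TEMPLATE = ("{name} is a species of bird in the family {family}. "
            "It was described by {described_by} in {year}.")

def make_stub(record):
    """Fill the fixed template with one record's fields."""
    return TEMPLATE.format(**record)

for species in SPECIES:
    print(make_stub(species))
```

Running this prints one uniform stub paragraph per record, which mirrors why bot-generated pages share the same basic format.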