Armies of Expensive Lawyers, Replaced by Cheaper Software
New York Times (03/04/11) John Markoff
The automation of high-level jobs is becoming more common due to progress in computer science and linguistics. Recent advances in artificial intelligence have enabled software to inexpensively analyze documents in a fraction of the time it used to take highly trained experts to complete the same work. Some programs can even extract relevant concepts and specific terms, and identify patterns in huge volumes of information. "We're at the beginning of a 10-year period where we're going to transition from computers that can't understand language to a point where computers can understand quite a bit about language," says Carnegie Mellon University's Tom Mitchell. The most basic linguistic approaches use specific search words to find and sort relevant documents, while more sophisticated programs filter documents through large webs of word and phrase definitions. For example, one company has developed software designed to visualize a chain of events and search for digital anomalies. Meanwhile, another company has developed software that uses language analysis to study documents and find concepts instead of key words. These tools and others are based on the Enron Corpus, an email database that includes more than five million messages from the Enron prosecution and was made public for scientific and technological research by University of Massachusetts Amherst computer scientist Andrew McCallum.
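To make the contrast concrete, here is a minimal sketch (not taken from the article, and not the actual products it describes) of the "basic linguistic approach": ranking documents by how often a fixed set of search terms appears in them. All names, the sample emails, and the term list are hypothetical illustrations; the concept-based tools mentioned above go well beyond this kind of keyword matching.

```python
# Minimal keyword-ranking sketch of the "basic linguistic approach"
# described above. Everything here is illustrative, not the article's software.

import re
from collections import Counter


def score_document(text: str, search_terms: set[str]) -> int:
    """Count how many occurrences of the search terms appear in a document."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    return sum(counts[term] for term in search_terms)


def rank_documents(documents: dict[str, str], search_terms: set[str]) -> list[tuple[str, int]]:
    """Return (document id, score) pairs, most relevant first."""
    scored = [(doc_id, score_document(text, search_terms)) for doc_id, text in documents.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    # Tiny stand-in for an email collection such as the Enron Corpus.
    emails = {
        "msg-001": "Please review the attached trading contract before Friday.",
        "msg-002": "Lunch on Thursday? The new place downtown looks good.",
        "msg-003": "The contract terms for the trading desk need legal review.",
    }
    terms = {"contract", "trading", "review"}
    for doc_id, score in rank_documents(emails, terms):
        print(doc_id, score)
```

The more sophisticated systems in the article replace this literal term matching with language analysis that surfaces related concepts even when the exact keywords never appear.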