March 21, 2007
Slashdot comments:
I agree that the Semantic Web people haven't read their epistemology texts. Here's an interesting article on this topic [], explaining how, essentially, all this “web-of-meaning” stuff was tried by NLP/AI researchers decades ago and plainly does not work.

The article concludes that a “weak” version of the semantic web may be possible - no clever inference or anything, just a set of data interchange standards. That is basically the XML / data interchange standards bit of Web 2.0.

-- Zarkonnen (662709)
In free societies, everyone is master, and our language is conditioned only by the minimal need to communicate approximately with others. Beyond that, we are free to impose whatever semantics we want, and we do this to a far greater extent than most people realize. As a friend who works in GIS once said, “If I send out a bunch of geologists to map a site and collate their data at the end of the day, I can tell you who mapped where, but not what anyone mapped.” Individual meanings of terms as simple as “granite” or “schist” are sufficiently variable that even extremely concrete tasks are very difficult.

Imposing uniform ontologies on any but the most narrowly defined fields is impossible, and even within those fields nominally standard vocabularies will be used differently by rapidly-dividing “cultural” subgroups within the workers in the field.

The semantic web is doomed to fail because language is far more highly personalized than anyone wants to believe. I think this is a good thing, because the only way to impose standardized meanings on terms would be to impose standardized thinking on people, and if that were possible someone would have done it by now. We know, despite millennia of attempts, that no such standardization is possible, except in very small groups over a very specialized range of concepts. -- radtea (464814)

tags: ComputersAndTechnology