Maybe that’s a bit unfair – it incorporates an insightful look at the dual origins of dictionaries: recording word meanings, and recording word usage.
But it takes no prisoners with regard to the web 2.0 philosophy underpinning wordia.
There’s more than a hint of the Luddite in one or two of his arguments. For example…
Any dictionary that attempted to record the full gamut of usage… irrespective of the degree of error or accuracy, would be so cumbersome an object that the term “dictionary” would have to be substantially redefined.
Maybe so – but it’s hardly as absurd a notion as is implied.
If it were possible to geo-code those words, then a superstructure of local, regional and national usage (as determined by ‘the wisdom of the crowd’ at each level) would be possible, and could nest inside each other. This would provide a far more comprehensive range of usage than is possible from the top-down, proprietary entities which currently exist.
Also, a key issue Dammann fails to acknowledge in this piece is just how absurdly the ‘evolution’ of our language is measured – essentially according to the publishing patterns of the leading authorities in the field.
It’s with some irony that Dammann’s piece should appear the very week the OED published its latest missive on which words are and are not being used.
Every year – usually in January – the national news informs us which new words have entered the national lexicon.
But one of the powers of web 2.0 is its immediacy – which is why Wikipedia has been able to pick up the slack that Encyclopedia Britannica can’t, since Britannica’s review process takes so much time.
I’m not for one second arguing that wordia isn’t attempting to whip up controversy or celebrity in its approach. But please, is it really necessary to reduce every element of ‘interactivity’ in web sources to the lowest common denominator?