The Complete Dictionary

2003
Encyclopaedia of all possible words up to 6 letters.
‘How big is the complete dictionary?’ The question cannot be answered unless limits are placed on the definitions of the words ‘complete’ and ‘word’. From A to Z, The Complete Dictionary consists of 26 volumes, each containing all pronounceable words of up to 6 letters beginning with one letter – 50 million words in total. The words were generated by a formula that describes sounds according to our experience of words as alternating vowels and consonants. Processing by cognitive scientist Egon W. Stemle (currently working at Eurac, Bolzano, Italy).


Photos: Renaat Jannssen, Christine Clinckx. Bound by Boekbinderij Seugling, Amsterdam.


Excerpt from Taxidermy for Language-Animals (p. 351)

"THE COMPLETE DICTIONARY How large is a complete dictionary? Using this question as a departure point a possible answer was formulated through visual means rather than in the discursive form of a philosophical paper. However, knowledge from the field of language philosophy was included in the work The Complete Dictionary. Inspired by The New Art of Making Books by Mexican artist Ulises Carrión, a long series of questions on language, words, books and the reader presented themselves, from which the question ‘How big is a complete dictionary?’ emerged as the most pressing. The art movements of the 1960s have had a great influence on my own artistic interests, in part due to their fascination with the order of words, alphabetised sentences and language-games. The artists’ book had its origins in the collaboration between visual artists (often painters) and writers (often poets) and became an autonomous art form in the second half of the twentieth century, when artists reclaimed the book as art object and autonomous medium.


Today both books on art and artists’ books coexist, and numerous publications and fairs celebrate the book as a medium in a scene of its own. The Complete Dictionary included all possible words of up to six letters in the Latin alphabet. It contained both meaningful words and all meaningless letter-combinations, based on a formula designed to generate pronounceable words. Altogether fifty million words were computer-generated in alphabetical order, split and printed into twenty-six volumes from A to Z. Each of these thick A4 folios comprises 1,600 pages and weighs 3 kg. The corresponding data was issued in an edition on CD-ROM and takes up merely 350 MB of ‘space’; the physical edition, meanwhile, occupies two metres of shelf space. Despite the formal likeness to some of the pioneering works of Carrión, Kosuth, Weiner and other conceptual artists, as well as a kinship with the work of Borges and the fiction of British science fiction writer Arthur C. Clarke, The Complete Dictionary bears properties that are temporally situated in the year of its making, 2003. It was realised with the help of Egon Stemle over the course of a few months, using a programme developed for a personal computer. This technology was unavailable to artists between the 1960s and 1980s. Although the idea probably existed in the minds of earlier conceptual artists, listing all possible words would simply have taken too long: longer than a lifetime. The idea had, however, been proposed by several authors, including Clarke; the bookbinder who bound The Complete Dictionary brought this to my attention after finishing the binding.
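As a rough sanity check, the stated figures are mutually consistent (assuming a plain-text encoding of about one byte per letter):

50,000,000 words / 26 volumes ≈ 1.9 million words per volume
1,900,000 words / 1,600 pages ≈ 1,200 words per page
350 MB / 50,000,000 words ≈ 7 bytes per word, i.e. roughly six letters plus a separator

The last figure fits the fact that entries of the maximum length, six letters, vastly outnumber the shorter ones.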

Clarke’s short story ‘The Nine Billion Names of God’ (1953) concerned the relationship between words, naming and the ‘meaning of the world’ – the same themes at work in The Complete Dictionary. In Clarke’s story the monks of a Tibetan monastery try to list all the names of God, because they believe that the purpose of the universe is to collect all the possible names of God, an act that would at once complete and end the universe. Writing the names out by hand, as they had been doing, even after eliminating various nonsense combinations, would take another fifteen thousand years; the monks therefore wished to use modern technology to finish the task more quickly. In alphabet-based sign-systems such as the Latin alphabet, words are composed of one or more letters. The choices made in producing this work were based on arguments about its visual and physical impact. Restrictions were necessary to realise and complete the work. For instance, the limit of six letters per word was set because of the consequences word-length would have for the entire ‘dictionary’. With a six-letter limit, the series of volumes from A to Z alluded instantly to encyclopaedic works, where each volume pertains to one or more letters.

A word-length of up to, say, seven letters would already entail a twenty-fold increase in data, resulting in twenty books containing words starting with the letter A. The number of identical letters was also limited, in order to avoid letter-combinations that lose their appearance as words and look like abbreviations. The formula used for the calculation was very simple: it was based on the classification of the twenty-six letters of the Latin alphabet into vowels (v) and consonants (k). Letters from both subgroups were then combined into sequences of two to six letters, with restrictions applying to the frequency of each group (v or k) per word. The notion of ‘pronounceability’ entails a necessary exclusion of words consisting of consonants only (e.g. wrjnmn) or of a series of identical letters (like ttttt). The hypothesis was that the alternation between vowels and consonants would create forms we recognise as ‘words’. The rules were applied in order to approximate the experience of ‘pronounceable’ words in Standard Average European (SAE) languages.
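To make the procedure concrete, here is a minimal sketch in Python of the kind of rule-set described above. The specific thresholds are assumptions for illustration (treating y as a vowel, allowing at most two identical letters in a row and at most two letters of the same subgroup in a row, and requiring at least one vowel per word); the exact restrictions in the 2003 program are not documented here, so this sketch will not reproduce the fifty-million-word count exactly.

```python
import string
from itertools import islice

# Assumed parameters for illustration only; the thresholds used in the
# actual 2003 program are not documented here.
VOWELS = set("aeiouy")        # assumption: 'y' is counted as a vowel
MAX_RUN_SAME_LETTER = 2       # rejects abbreviation-like runs such as 'ttttt'
MAX_RUN_SAME_GROUP = 2        # enforces vowel/consonant alternation

def runs_ok(word: str) -> bool:
    """Check the run-length limits on identical letters and on consecutive
    letters drawn from the same subgroup (vowels v or consonants k)."""
    letter_run = group_run = 1
    for prev, cur in zip(word, word[1:]):
        letter_run = letter_run + 1 if cur == prev else 1
        group_run = group_run + 1 if (cur in VOWELS) == (prev in VOWELS) else 1
        if letter_run > MAX_RUN_SAME_LETTER or group_run > MAX_RUN_SAME_GROUP:
            return False
    return True

def words(prefix: str = "", max_len: int = 6):
    """Yield 'pronounceable' words of two to max_len letters in strict
    alphabetical (dictionary) order by depth-first extension of prefixes."""
    for letter in string.ascii_lowercase:
        word = prefix + letter
        if not runs_ok(word):
            continue          # prune: a violating run can never be repaired
        if len(word) >= 2 and any(c in VOWELS for c in word):
            yield word        # consonant-only strings like 'wrjnmn' are skipped
        if len(word) < max_len:
            yield from words(word, max_len)

# The opening entries of 'volume A':
print(list(islice(words(), 8)))   # ['aa', 'aab', 'aaba', 'aabaa', ...]
```

Generating depth-first, prefix before extensions, yields strict dictionary order, matching the alphabetical printing of the volumes. And since each additional letter multiplies the raw sequence space by twenty-six before filtering, the twenty-fold increase reported above for a seven-letter limit is plausible once unpronounceable combinations are excluded.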

In his influential work Language, Sapir claimed that words are provided with a ‘definitive plastic unity’, though he warned against mistaking this formal unity for a mental entity. In other words, Sapir considered the word not as a ready-made mental unit made up of its formal aspects, i.e. vowels and consonants; rather, a word is form in flux: ‘Such features as accent, cadence, and the treatment of consonants and vowels within the body of a word are often useful as aids in the external demarcation of the word, but they must by no means be interpreted, as is sometimes done, as themselves responsible for its psychological existence. They at best but strengthen a feeling of unity that is already present on other grounds.’ Sapir asked: ‘If function is not the ultimate criterion of the word, what is?’ He characterised the word as ‘one of the smallest, completely satisfying bits of isolated “meaning” into which the sentence resolves itself’, and continued that ‘the true, significant elements of language are generally sequences of sounds that are either words, significant parts of words, or word groupings.’ In his analysis he distinguished between the conceptual charge of words and their formal properties: ‘The word is merely a form, a definitely molded entity that takes in as much or as little of the conceptual material of the whole thought as the genius of language cares to allow. Thus it is that while the single radical elements and grammatical elements, the carriers of isolated concepts, are comparable as we pass from language to language, the finished words are not.’

For the word-list in The Complete Dictionary, the distinction between vowels and consonants led to the restriction of a maximum repetition of letters of the same subgroup per letter-sequence. Words can be pleasure and fetish. In the line of deconstruction, French postmodern philosophers dedicated much of their attention to the formal qualities of texts and words. In The Pleasure of the Text (1975) Barthes classified language-users into four categories taken from psychoanalysis: the hysteric, who ‘joins in bottomless [...] comedy of language’ [...] with ‘no critical scrutiny’; the fetishist, who is ‘intrigued by “divided up text”, “formulae” and “pleasure of the word”’; the obsessives, who are ‘logophiles, linguists, semioticians’; and the paranoids, ‘who consume and produce complicated texts’ with ‘constructions like games’ and ‘secret constraints’. French philosopher Gilles Deleuze is known to have experienced words as autonomous entities. After all, a philosopher is the inventor of new words and new concepts: early on, he recalled, philosophical concepts struck him with the same force as literary characters, having their own autonomy and style. The ‘autonomy’ of the letter springs from its formal unity, just as we see common words as single units. The linguistic unity with which concepts are often formulated also forms them.

The Complete Dictionary contains new potential words in the sense that most of the listed letter-combinations are void of meaning, but they are, in other words, still available for use. This encyclopaedic artists’ book (which is strictly speaking neither a ‘dictionary’, as it lacks translation, nor ‘complete’, as it embodies precisely the limits on which it is based) has no mother-tongue. Any reader can search the pool of possible words and recognise those s/he is accustomed to in her/his language(s).

A complete vocabulary is potentially infinite and ever-expanding, yet the abundance of words paradoxically does not solve the social dilemma of language: ‘how many words do we need in order to understand each other?’ A similar point was made by Prof. Dr. Martin Stokhof in a short article on this work, published in the art magazine HTV de IJsberg #50 in 2004 with the aim of introducing linguistic notions to the general public. He explained that apart from the formal aspects of the work (in which ‘a word is viewed as a sequence of letters’), the sound aspect and the meaning are perhaps more important. He pointed to the special alphabet used in phonology and phonetics, which is capable of expressing more sounds than our standardised Latin alphabet. He also noted that: ‘[...] the restrictions on letter-combinations are strongly linked to “pronounceability”. These restrictions, however, are different for different languages.

Therefore, the concept of “possible word” as such has little significance in linguistics.’ In relation to a word’s meaning, Stokhof remarked that there are many more homonyms than synonyms in any given language, and argued: ‘Form is subordinate to meaning in the sense that the same form (word) often represents more than one meaning. This is because the context (in spoken and written language) almost always makes clear which meaning is represented by the form in that specific context.’ Stokhof engaged with a Wittgensteinian perspective when he wrote about the function of words: ‘How words come into being and whether they survive depends on how well they serve this function: to convey meanings.’ He continued to explain that the functions words have in our practice will eventually determine what is ‘possible’. These practices: ‘[...] define to a high degree who we are; what it means to be human. To question possible words thus turns out to be related to the question of what possible practices are. And the latter question is essentially the question of what we, as individuals and as a community, could be. The question of possible words, lexica, languages, is therefore not only a linguistic question, but also an anthropological one: the question of our possible identities.’ Stokhof ended by sharing his experience of what such a complete overview, a complete index of the imaginary, would entail: ‘It is both intriguing and alarming when one is confronted with the idea of a complete list of all possible meanings, a list that effectively says: “Look, this is everything you could possibly think.”’"