Binary trees can be used to implement maps, where, at every node, instead of having a 'count' like Whiteknight has, you have a pointer to whatever you want to store.
Whiteknight, for instance, basically wrote a string-to-integer map.
So a binary tree could be an effective data structure whenever you need to store a very large number of string-to-xyz mappings. SMAUG does this a lot: skill tables, command tables, and so on.
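As a minimal sketch of the idea (names like `bst_node` and `bst_lookup` are my own, not from SMAUG or Whiteknight's code), here is an unbalanced string-keyed search tree where each node carries a `void *` in place of a 'count':

```c
#include <stdlib.h>
#include <string.h>

/* A minimal string-keyed binary search tree node. The 'value'
 * pointer plays the role of Whiteknight's 'count' field: it can
 * point at whatever you want to store. */
typedef struct bst_node {
    const char *key;
    void *value;
    struct bst_node *left, *right;
} bst_node;

/* Insert a key/value pair; returns the (possibly new) subtree root. */
bst_node *bst_insert(bst_node *root, const char *key, void *value)
{
    if (root == NULL) {
        bst_node *n = malloc(sizeof *n);
        n->key = key;
        n->value = value;
        n->left = n->right = NULL;
        return n;
    }
    int cmp = strcmp(key, root->key);
    if (cmp < 0)
        root->left = bst_insert(root->left, key, value);
    else if (cmp > 0)
        root->right = bst_insert(root->right, key, value);
    else
        root->value = value;  /* key already present: overwrite */
    return root;
}

/* Walk left or right by string comparison until the key is found. */
void *bst_lookup(bst_node *root, const char *key)
{
    while (root != NULL) {
        int cmp = strcmp(key, root->key);
        if (cmp == 0)
            return root->value;
        root = (cmp < 0) ? root->left : root->right;
    }
    return NULL;
}
```

Note this sketch does no rebalancing, which is exactly the weakness the next paragraph is about.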
What's nifty about what Whiteknight wrote is that the binary tree is balanced. An unbalanced binary tree degenerates, in the worst case, into a linked list, which means you lose all the good properties of a tree. A balanced tree, on the other hand, guarantees at all times that the left-right distribution of every node stays close to optimal: for every node, the number of nodes to the left is roughly equal to the number of nodes to the right. (Exactly equal isn't always possible, of course, e.g. in a tree with 4 elements.)
You don't do this in C a whole lot, but in C++, every time you create a map<foo, bar>, chances are you are using an implementation based on a balanced binary tree (typically a red-black tree).
More technically speaking, a balanced binary tree gives you O(log n) lookup times, as opposed to O(n) for an unsorted array or a linked list. It is very hard to beat a balanced binary tree in general.
One way is to use a hash table, which is only as good as its hash function. The 'ideal hash table' is O(1) (i.e. constant-time lookup), but ideal hash functions do not occur very often in practice. Practically speaking, assuming you have a very good hash function, a hash table is O(n/m), where m is the number of buckets.
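To make the O(n/m) claim concrete, here is a sketch of a chained hash table (the names `hash_table`, `ht_insert`, etc. are illustrative, and djb2 is just one common choice of string hash, not anything a particular codebase mandates):

```c
#include <stdlib.h>
#include <string.h>

#define NBUCKETS 64  /* m: more buckets means shorter chains */

/* One entry in a bucket's linked chain. */
typedef struct hash_entry {
    const char *key;
    void *value;
    struct hash_entry *next;
} hash_entry;

typedef struct {
    hash_entry *buckets[NBUCKETS];
} hash_table;

/* djb2: a simple, widely used string hash. */
static unsigned long hash_str(const char *s)
{
    unsigned long h = 5381;
    while (*s)
        h = h * 33 + (unsigned char)*s++;
    return h;
}

void ht_insert(hash_table *t, const char *key, void *value)
{
    unsigned long b = hash_str(key) % NBUCKETS;
    hash_entry *e = malloc(sizeof *e);
    e->key = key;
    e->value = value;
    e->next = t->buckets[b];   /* push onto the bucket's chain */
    t->buckets[b] = e;
}

/* With a good hash, n keys spread across m buckets give expected
 * chain length n/m -- hence the O(n/m) lookup cost. */
void *ht_lookup(hash_table *t, const char *key)
{
    unsigned long b = hash_str(key) % NBUCKETS;
    for (hash_entry *e = t->buckets[b]; e != NULL; e = e->next)
        if (strcmp(e->key, key) == 0)
            return e->value;
    return NULL;
}
```

A bad hash function sends everything to one bucket, and you are back to scanning a linked list.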
There are other tricks you can use to implement mappings where the key is something sequential (numbers and strings are both sequences of digits or letters). One of my all-time favorites is the "trie". This stores a tree-like structure of the keys, where every node is an element of the sequence. So to look up "foobar", you'd follow the f node of the root node, the o node of the f node, the o node of the o node, the b of the o, the a of the b and the r of the a.
What's so nifty about tries is that your lookup time is O(p), where p is the length of your key. A trie's time complexity is not affected at all by the number of elements stored. This is an incredibly good feature.
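The "foobar" walk described above can be sketched like this (a minimal trie assuming lowercase a-z keys to keep the node small; real keys would need a larger alphabet, and the names are again my own):

```c
#include <stdlib.h>

/* One trie node per character position. A non-NULL 'value'
 * marks the end of a stored key. */
typedef struct trie_node {
    struct trie_node *child[26];  /* assumes lowercase a-z keys */
    void *value;
} trie_node;

trie_node *trie_new(void)
{
    return calloc(1, sizeof(trie_node));  /* all children NULL */
}

void trie_insert(trie_node *root, const char *key, void *value)
{
    for (; *key; key++) {
        int i = *key - 'a';
        if (root->child[i] == NULL)
            root->child[i] = trie_new();
        root = root->child[i];
    }
    root->value = value;
}

/* Exactly p pointer-follows for a key of length p, no matter
 * how many keys are stored: the f node, then o, o, b, a, r... */
void *trie_lookup(trie_node *root, const char *key)
{
    for (; *key && root != NULL; key++)
        root = root->child[*key - 'a'];
    return root ? root->value : NULL;
}
```

Lookup never compares whole keys against each other; it just follows one child pointer per character.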
Anyhow, enough rambling...
David Haley aka Ksilyan
Legends of the Darkstone