Memory

The computer term for the maximum amount of information a machine can handle, as measured in bits.

The total amount of memory in a system bounds the amount of useful information that the system can manage. However, the size in bits of the computer representation of an object is often far greater than the useful information it actually contains, because representations carry a lot of redundancy, made necessary by requirements of processing speed. Hence memory is not used to its full information capacity, but rather according to the well-known compromise between processing time and space.
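As a hedged illustration of that time/space compromise (not anything specific to TUNES), the sketch below uses Python's standard zlib module to trade a highly redundant representation for a compact one at the cost of extra processing on every access.

```python
import time
import zlib

# A deliberately redundant representation: the same record repeated many times.
data = b"record: x=42 y=17 flags=0000\n" * 100_000

start = time.perf_counter()
compressed = zlib.compress(data)          # spend CPU time to save space
elapsed = time.perf_counter() - start

print(f"raw size:        {len(data)} bytes")
print(f"compressed size: {len(compressed)} bytes")
print(f"compression took {elapsed:.4f} s")

# Reading the data back now costs decompression time on every access.
assert zlib.decompress(compressed) == data
```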

Nowadays, computer memories are organized in layers: from a handful of very fast but volatile one-word registers, to multi-gigabyte disks that are very large and persistent but comparatively slow. In between sit various levels of hardware- or software-managed caches, built in various technologies (SRAM, DRAM), serving as progressive intermediate buffers between very fast but small central processors and very big but slow disks.

As these different layers of memory have quite different time and space constraints, layer-specific representations and tradeoffs must be used to adapt efficiently to them; hence the growing need for ever more complex algorithms for compression and decompression, caching and flushing, hashing and lookup, etc.
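As one illustrative sketch of caching and flushing between two such layers, the code below puts a small least-recently-used cache in front of a slow backing store; the names slow_read and capacity are hypothetical, not part of any particular system.

```python
from collections import OrderedDict

def slow_read(key):
    """Stand-in for a slow lower layer (e.g. a disk read). Hypothetical."""
    return f"value-for-{key}"

class LRUCache:
    """A small fast layer that caches results and flushes the least recently used entry."""
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        if key in self.entries:
            self.entries.move_to_end(key)       # hit: mark as recently used
            return self.entries[key]
        value = slow_read(key)                  # miss: fall through to the slow layer
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)    # flush the least recently used entry
        return value

cache = LRUCache(capacity=2)
print(cache.get("a"))   # miss, goes to the slow layer
print(cache.get("a"))   # hit, served from the fast layer
print(cache.get("b"))
print(cache.get("c"))   # "a" is flushed to make room
```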

As things grow in complexity, we at the TUNES project feel there is a call for metaprogramming techniques that automatically generate the code needed to interface efficiently between these different object spaces.
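A minimal sketch of what such metaprogramming might look like, under the assumption that an object's layout is described once and the layer-specific marshalling code is derived from it automatically; the field descriptions and function names here are purely illustrative, not the TUNES design.

```python
import struct

# A single declarative description of an object's fields (illustrative only).
POINT_FIELDS = [("x", "i"), ("y", "i"), ("tag", "16s")]

def make_codec(fields):
    """Generate packer/unpacker functions for one memory layer from a field description."""
    fmt = "<" + "".join(code for _, code in fields)
    names = [name for name, _ in fields]

    def pack(obj):
        return struct.pack(fmt, *(obj[name] for name in names))

    def unpack(raw):
        return dict(zip(names, struct.unpack(fmt, raw)))

    return pack, unpack

pack_point, unpack_point = make_codec(POINT_FIELDS)
raw = pack_point({"x": 3, "y": 4, "tag": b"origin"})
print(len(raw), unpack_point(raw))
```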
