Higher entropy does not necessarily mean higher disorder. In physics, entropy is a measure of the information capacity of a system: the number of bits it would take to describe its internal state, which can be determined by counting the number of ways its contents could be arranged. Entropy measures how much information can be packed into an object, so a limit on entropy is a limit on information.
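As a toy illustration of that counting argument (my own sketch, not from any physics text): if a system's contents can be arranged in W distinguishable ways, its information capacity in bits is log2(W).

```python
import math

def entropy_bits(arrangements: int) -> float:
    """Information capacity, in bits, of a system whose contents
    can be arranged in `arrangements` distinguishable ways."""
    return math.log2(arrangements)

# A register of 8 independent two-state "grains" has 2**8 = 256
# possible internal arrangements, so 8 bits describe it fully.
print(entropy_bits(2**8))  # → 8.0
```

Doubling the number of grains squares the number of arrangements but only doubles the bit count, which is why entropy is the logarithm of the arrangement count rather than the count itself.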

Points are regions with dimensions of about 10⁻³³ centimeters, the so-called Planck length, which physicists believe are the "grains" of space. Each of these can be assigned a value of zero or one, like the bits in a computer or the lines of the geomantic figures and charts.

The basic constituents of our universe are not quarks or protons or electrons but much smaller entities called "strings" or "superstrings", which vibrate, like violin strings, in multidimensional hyperspace, and whose vibrations at different resonances manifest as the elementary particles: the quarks, protons, and electrons.

The universe exists in multidimensional hyperspace, not just the four-dimensional spacetime with which Einstein made us familiar. Binary strings and superstrings are the very basis of geomancy; the figures of the system are themselves four-dimensional column vectors, tesseracts.

Consciousness could be the result of an optimization of information processing. Recently, a group of researchers suggested that human consciousness emerges from increasing entropy, linking human consciousness to the Universe ("Statistical mechanics of consciousness").

All this has got me thinking about random generation again, in a different light.
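Since each geomantic figure is four binary lines, a figure is just a 4-bit value, and random generation reduces to drawing four random bits. A minimal sketch along those lines (the rendering convention here, 1 = single point, 0 = two points, is my own assumption, and the function names are hypothetical):

```python
import random

def random_geomantic_figure(rng=None):
    """Draw four random bits, one per line of a geomantic figure.
    Assumed convention: 1 = single point (odd), 0 = two points (even)."""
    rng = rng or random.Random()
    return [rng.randint(0, 1) for _ in range(4)]

def render(figure):
    """Render the four lines as dots, top to bottom."""
    return "\n".join("  *  " if bit else " * * " for bit in figure)

fig = random_geomantic_figure(random.Random(0))
print(render(fig))
```

Four bits give 2⁴ = 16 possible figures, which matches the traditional count, so the whole figure set is exactly the state space of a 4-bit register.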