
Friday, March 14, 2014

The MAGICAL Number Seven (plus or minus two)

Why do we “chunk” things in groups of about seven – seven days of the week, seven seas, seven sins, etc? The presentation I gave to the Philosophy Club in The Villages, FL, 14 March 2014 provides the theoretical answer. You may download a PowerPoint Show that should run on any Windows computer here: https://sites.google.com/site/iraclass/my-forms/PhiloMAGICALsevenMar2014.ppsx?attredirects=0&d=1

This is an easy-to-understand version of a more technical presentation I made to the Science-Technology Club in February, see: http://tvpclub.blogspot.com/2014/02/optimal-span-amazing-intersection-of.html


MILLER - PERSECUTED BY THE NUMBER SEVEN!



Way back in 1956 a classic scientific paper appeared in the Psychological Review with the intriguing title: The Magical Number Seven, Plus or Minus Two – Some Limits on Our Capacity for Processing Information. That paper was extremely important and influential and is still available online. George A. Miller begins with a strange plea:
My problem is that I have been persecuted by an integer … The persistence with which this number plagues me is far more than a random accident …
He presents the results of twenty experiments in which human subjects were tested to determine what he calls our "Span of Absolute Judgment", that is, how many levels of a given stimulus we can reliably distinguish. Most of the results are in the range of five to nine, but some are as low as three or as high as fifteen. For example, our ears can distinguish five or six tones of pitch or about five levels of loudness. Our eyes can distinguish about nine different positions of a pointer in an interval. With a vibrator placed on the chest, a person can distinguish about four to seven different levels of intensity, location, or duration. The average Span of Absolute Judgment across Miller's twenty one-dimensional stimuli is 6.4.
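Miller expressed these spans as channel capacities measured in bits: a span of N reliably distinguishable levels corresponds to log2(N) bits. The short Python sketch below makes the conversion explicit; the specific level counts are picked from the ranges quoted above, so treat them as illustrative.

```python
import math

# Spans of absolute judgment (distinguishable levels) drawn from the ranges above;
# the corresponding channel capacity is log2(levels) bits.
spans = {
    "pitch": 6,
    "loudness": 5,
    "pointer position": 9,
    "vibration intensity": 5,
}

for stimulus, levels in spans.items():
    print(f"{stimulus}: {levels} levels ~= {math.log2(levels):.2f} bits")

# Miller's average span of 6.4 works out to about 2.7 bits of capacity.
print(f"average: 6.4 levels ~= {math.log2(6.4):.2f} bits")
```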

Miller also presents data for what he calls our "Span of Immediate Memory", that is, how many randomly presented items we can reliably remember. For example, we can remember about nine binary items (a random string of 1s and 0s), about eight decimal digits, about six letters of the alphabet, or about five monosyllabic words randomly selected from a set of 1,000.
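The striking thing is that the memory span stays roughly constant when counted in items ("chunks") even though the information content in bits varies enormously. A rough calculation, assuming each item is drawn uniformly from its alphabet (so bits per item = log2 of the alphabet size), makes this concrete:

```python
import math

# (items remembered, size of the alphabet each item is drawn from)
memory_spans = {
    "binary digits": (9, 2),
    "decimal digits": (8, 10),
    "letters": (6, 26),
    "monosyllabic words": (5, 1000),
}

for name, (items, alphabet) in memory_spans.items():
    bits = items * math.log2(alphabet)
    print(f"{name}: {items} items ~= {bits:.0f} bits")

# The item counts stay near seven while the bit counts range from ~9 to ~50,
# which is why Miller measured memory span in chunks rather than bits.
```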

At the end of his paper Miller rambles:
...And finally, what about the magical number seven? What about the seven wonders of the world, the seven seas, the seven deadly sins, the seven daughters of Atlas in the Pleiades, the seven ages of man, the seven notes of the musical scale, and the seven days of the week? What about the seven-point rating scale, the seven categories for absolute judgment, the seven objects in the span of attention, and the seven digits in the span of immediate memory?

For the present, I prefer to withhold judgment.

Perhaps there is something deep and profound behind all these sevens, something just calling out for us to discover it.
But I suspect that it is only a pernicious, Pythagorean coincidence. [my bold]
Well, it turns out that there IS something DEEP and PROFOUND behind "all these sevens" and I (Ira Glickstein) HAVE DISCOVERED IT. My insight applies not only to the span of human senses and memory, but also to the span of written language, management span of control, and even to the way the genetic "language of life" in RNA and DNA is organized. Furthermore, my discovery is not simply supported by empirical evidence from many different domains; it is mathematically derived from the basic Information Theory equation published in 1948 by Claude Shannon and from the adaptation of "Shannon Entropy" to the Intricacy of a biograph by Smith and Morowitz in 1982.
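For reference, the Shannon equation in question gives the entropy of a source as H = -Σ p·log2(p) bits per symbol. The sketch below shows only this basic calculation, not the Smith and Morowitz intricacy measure or the full derivation:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A source using seven equally likely symbols carries log2(7) ~ 2.81 bits per symbol.
print(shannon_entropy([1 / 7] * 7))
```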


SIMPLICITY VS COMPLEXITY VS INTRICACY 


Albert Einstein wisely advises us to:
Make things as simple as possible, but no simpler!
Good advice, but how to follow it? Well, Edward Teller suggests:
Threads of simplicity … are not easily discovered in music or in science. Indeed, they usually can be discerned only with effort and training. Yet the underlying simplicity exists and once found makes new and more powerful relationships possible.


How do we find those "threads of simplicity"? Well, we need to understand the difference between COMPLEXITY and INTRICACY, two terms that in normal usage are sometimes used interchangeably. However, there is an important distinction between them according to Smith and Morowitz (1982).

COMPLEXITY - Something is said to be complex if it has a lot of different parts, interacting in different ways. To completely describe a complex system you would have to completely describe each of the different types of parts and then describe the different ways they interact. A measure of complexity, therefore, is the length of the description one person competent in that domain of knowledge would need to explain the system to another.

A great example of UNNECESSARY COMPLEXITY is found in those "Rube Goldberg Inventions" where a relatively simple task is complicated by combining all sorts of different effects into a chain of ridiculous interactions.

INTRICACY - Something is said to be intricate if it has a lot of parts, but they may all be the same or very similar and they may interact in simple ways. To completely describe an intricate system you would only have to describe one or two or a few different parts and then describe the simple ways they interact.

A great example of useful INTRICACY is a window screen, which is intricate but not at all complex. It consists of equally-spaced vertical and horizontal wires criss-crossing in a regular pattern in a frame, with spaces small enough to exclude bugs down to some size. All you need to know is the material and diameter of the wires, the spacing between them, and the size of the window frame. Similarly, a field of grass is intricate but not complex.
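As a sketch of how little it takes to describe an intricate structure, the hypothetical class below captures an entire window screen with a material name and five numbers, yet implies on the order of a thousand individual wires. The dimensions are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class WindowScreen:
    # A material name and five numbers fully describe the intricate structure.
    wire_material: str
    wire_diameter_mm: float
    wire_spacing_mm: float
    frame_width_mm: float
    frame_height_mm: float

    def wire_count(self) -> int:
        """Number of wires implied by the short description."""
        vertical = int(self.frame_width_mm // self.wire_spacing_mm) + 1
        horizontal = int(self.frame_height_mm // self.wire_spacing_mm) + 1
        return vertical + horizontal

screen = WindowScreen("aluminum", 0.28, 1.5, 600.0, 900.0)
print(screen.wire_count())  # roughly 1000 parts from a six-field description
```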

If you think about it for a moment, it is clear that limited resources should be deployed in ways that minimize complexity to the extent possible and maximize intricacy! That is why nearly all natural and artificial structures are configured as hierarchies and share a common "Optimal Span".


A SIMPLE FORMULA FOR MAXIMIZING INTRICACY AND REDUCING COMPLEXITY

What is Optimal Span?

With so many different types of hierarchical structures, each with its own purpose and use, you might think there is no common property they share other than their hierarchical nature. You might expect a particular Span of Control that is best for Management Structures in Corporations and a significantly different Span of Containment that is best in Written Language.

If you expected the Optimal Span to be significantly different for each case, you would be wrong!
According to System Science research and Information Theory, there is a single equation that may be used to determine the most beneficial Span. That optimum value maximizes the effectiveness of the available resources. A Management Structure should have the Span of Control that makes the best use of the number of employees available. A Written Language Structure should have the Span of Containment that makes the best use of the number of characters (or bits, in the case of the Internet) available, and so on.

The simple equation for Optimal Span derived by [Glickstein, 1996] is:

So = 1 + D × e
(where D is the degree of the nodes and e is the natural number, approximately 2.7182818)

In the examples above, where the hierarchical structure may be described as a one-dimensional folded string in which each node has two closest neighbors, the degree of the nodes is D = 2, so the equation reduces to:

So = 1 + D × e = 1 + 2 × 2.7182818 = 6.4366

“Take home message”: OPTIMAL SPAN, So = ~ 6.4
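Here is a minimal Python sketch of the formula exactly as given above; the function name is mine.

```python
import math

def optimal_span(degree: float) -> float:
    """Optimal span So = 1 + D * e, where D is the degree of the nodes."""
    return 1 + degree * math.e

# One-dimensional structures, where each node has two nearest neighbors (D = 2):
print(f"{optimal_span(2):.4f}")  # 6.4366 -- the "magical number seven, minus a bit"
```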

Also see Quantifying Brooks' Mythical Man-Month (Knol), [Glickstein, 2003], and http://repository.tudelft.nl/assets/uuid:843020de-2248-468a-bf19-15b4447b5bce/dep_meijer_20061114.pdf for the applicability of Optimal Span to Management Structures.
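As a back-of-envelope illustration (my own, not taken from the cited papers), a span of about 6.4 also suggests how deep a management hierarchy needs to be. The hypothetical helper below simply counts how many levels are required before a span of roughly 6.4 covers a given headcount:

```python
import math

def levels_needed(total_people: int, span: float = 1 + 2 * math.e) -> int:
    """Levels required if each node at every level supervises about `span` subordinates."""
    levels, capacity = 1, span
    while capacity < total_people:
        capacity *= span
        levels += 1
    return levels

print(levels_needed(1000))  # a 1000-person organization needs about 4 levels at span ~6.4
```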
Ira Glickstein
