A constrained-choice system is one in which the very structure of the system eliminates certain options. The aims of the system are physically embodied in such a way that the incompatible desires of the user simply cannot be acted out. The Paris Metro, for example, provides places to sit, but shapes these seats so that they can act only as seats—they are physically unable to serve as places to lie down, thereby curbing public sleeping. Note that (a) the aims of the system do not include preventing litter, hence the newspaper on the second chair, and (b) the system assumes a relatively tight correlation between sleeping and lying down, between the action and the position: people can still use these chairs to sleep while sitting up.
It’s easy to notice the underlying presence of a constrained-choice system in Metro seating because the change from benches to chairs is a relatively recent one, brought about by renovations. The new seating eliminates an existing loophole, or exaptation—a use the system was never “designed” for. The seating in certain Metro stations, like the one above, misses the point: a constrained-choice system eliminates choices only if it is properly applied. Seating design is an especially powerful example of constrained choice because it modifies an ongoing action, in a way that keys or security checkpoints do not. A remarkable example of a constrained-choice system is the Airbus A320, whose flight control software ignores pilot inputs that would push the airframe beyond its design parameters.
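The A320’s envelope protection can be thought of as an input clamp: out-of-envelope commands are not rejected with a warning, they are silently limited, so the forbidden choice cannot be expressed at all. A minimal sketch of that idea in Python (the function names and the numeric limits here are illustrative assumptions, not Airbus’s actual control laws or values):

```python
# Sketch of a constrained-choice control law: pilot commands outside
# the design envelope are clamped rather than obeyed.
# All names and limits below are hypothetical, for illustration only.

from dataclasses import dataclass


@dataclass
class Envelope:
    max_bank_deg: float = 67.0    # hypothetical bank-angle limit
    max_pitch_deg: float = 30.0   # hypothetical nose-up pitch limit
    min_pitch_deg: float = -15.0  # hypothetical nose-down pitch limit


def constrain(pitch_cmd: float, bank_cmd: float, env: Envelope) -> tuple[float, float]:
    """Return the command the system will actually execute.

    The input is not refused; it is limited, so out-of-envelope
    choices simply cannot be acted out."""
    pitch = max(env.min_pitch_deg, min(env.max_pitch_deg, pitch_cmd))
    bank = max(-env.max_bank_deg, min(env.max_bank_deg, bank_cmd))
    return pitch, bank


# A pilot demanding a 45-degree climb and a 90-degree bank gets the
# envelope limits instead.
print(constrain(45.0, 90.0, Envelope()))  # (30.0, 67.0)
```

The design choice worth noticing is that the clamp is invisible in normal use: commands inside the envelope pass through unchanged, so the constraint only reveals itself at the boundary—exactly like a chair that works perfectly as a chair.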
Orwellian Newspeak is an example of a linguistic constrained-choice system:
The purpose of Newspeak was not only to provide a medium of expression for the world-view and mental habits proper to the devotees of Ingsoc [English Socialism], but to make all other modes of thought impossible. It was intended that when Newspeak had been adopted once and for all and Oldspeak forgotten, a heretical thought—that is, a thought diverging from the principles of Ingsoc—should be literally unthinkable, at least so far as thought is dependent on words. Its vocabulary was so constructed as to give exact and often very subtle expression to every meaning that a Party member could properly wish to express, while excluding all other meanings and also the possibility of arriving at them by indirect methods. This was done partly by the invention of new words, but chiefly by eliminating undesirable words and by stripping such words as remained of unorthodox meanings, and so far as possible of all secondary meanings whatever. To give a single example. The word free still existed in Newspeak, but it could only be used in such statements as ‘This dog is free from lice’ or ‘This field is free from weeds’. It could not be used in its old sense of ‘politically free’ or ‘intellectually free’ since political and intellectual freedom no longer existed even as concepts, and were therefore of necessity nameless. (George Orwell, 1984, appendix.)
The really fascinating thing is that, to a certain extent, all systems are constrained-choice systems. The difference lies in whether the system deliberately seeks to constrain a user’s choices, and in how well or poorly it does so. In most cases, a system must serve so many different people, is so underfunded, or is so crudely designed that there is plenty of wiggle room and all manner of exaptations are possible. (Dumpster diving is an exaptation; pouring bleach on thrown-out food is an attempt to maintain the integrity of the system.)
The process of design, of choosing this form over that one, necessarily excludes certain possible choices. The interesting stuff comes when certain choices are deliberately built out of a system, because those design decisions reveal the mindset of the designer, the aims of the system, and possible weak points. Every system, even a constrained-choice system, can be exploited. It’s just a matter of how or where, and finding a system’s weak points is easier when you can see what the system ‘wants’ its users to do.