Wednesday, April 15, 2009

Molecular computers -- A historical perspective. Part 2

We left off last time discussing the precision of an analog signal.

Consider a rising analog signal that looks like the following ramp.

Notice that there's noise polluting this signal. Clearly, this analog signal is not as precise as it would be without noise. How do we quantify this precision? The answer was worked out in the first half of the 20th century and is known as the Shannon-Hartley theorem. When the receiver decodes this analog variable, what is heard is not the intended signal alone but the intended signal plus the noise (S+N); this value can be compared to the level of the pure noise (N). The ratio (S+N)/N therefore describes how many discrete levels are available in the encoding.

The encoding on the left is very noisy and therefore only 4 discrete levels can be discerned without confusion; the one in the middle is less noisy and permits 8 levels; on the right, the low noise permits 16 levels. The number of discrete encodable levels is the precision of the signal and is conveniently measured in bits -- the number of binary digits it would take to encode this many discrete states. The number of binary digits needed is given by the log base 2 of the number of states, so we have log2( (S+N)/N ), which is usually algebraically simplified to log2(1+S/N).
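The three examples above can be checked with a few lines of code. This is just a sketch of the arithmetic in the paragraph: for a channel with a given number of discernible levels, (S+N)/N equals that level count, so S/N is one less, and log2(1+S/N) recovers the bit count.

```python
import math

def bits_of_precision(snr):
    """Bits encodable at a given signal-to-noise power ratio S/N,
    per the Shannon-Hartley precision term log2(1 + S/N)."""
    return math.log2(1 + snr)

# The three example signals: 4, 8, and 16 discernible levels.
for levels in (4, 8, 16):
    snr = levels - 1  # (S+N)/N = levels  =>  S/N = levels - 1
    print(levels, "levels ->", bits_of_precision(snr), "bits")
```

Running this prints 2, 3, and 4 bits for the noisy, medium, and clean encodings respectively.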

It is important to note that although Shannon and Hartley (working separately) developed this model in the context of electrical communication equipment, there is nothing in this formulation that speaks of electronics. The formula is a statement about information in the abstract -- independent of any particular implementation technology. The formula is just as useful for characterizing the information content represented by the concentration of a chemically-encoded biological signal as it is for the voltage driving an audio speaker or the precision of a gear-work device.

We're not quite done yet with this formulation. The log2(1+S/N) formula speaks of the maximum possible information content in a channel at any given moment. But signals in a channel change; channels with no variation are very dull!

(A signal with no variation is very dull. Adapted from Flickr user blinky5.)

To determine the capacity of a channel one must also consider the rate at which it can change state. If, for example, I used the 2-bit channel from above, I could vary the signal at some speed, as illustrated below.

(A 2-bit channel changing state 16 times in 1 second.)

This signal is thus sending 2 bits * 16 state changes per second = 32 bits per second.
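The same bookkeeping in code, using the numbers from the figure (2 bits per state, 16 state changes per second):

```python
# Data rate = bits per state change * state changes per second.
bits_per_symbol = 2   # a 4-level, i.e. 2-bit, channel
symbol_rate = 16      # state changes per second
bit_rate = bits_per_symbol * symbol_rate
print(bit_rate, "bits per second")  # 32 bits per second
```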

All channels -- be they transmembrane kinases, hydraulic actuators, or telegraph wires -- have a limited ability to change state. This capacity is generically called "bandwidth", but that term is a bit oversimplified, so let's look at it more carefully.

It is intuitive that real-world devices cannot instantaneously change their state. Imagine, for example, inflating a balloon. Call the inflated balloon "state one". Deflate it and call this "state zero". Obviously there is a limited rate at which you can cycle the balloon from one state to the other. You can try to inflate the balloon extremely quickly by hitting it with a lot of air pressure, but there's a limit -- at some point the pressure is so high that the balloon explodes during the inflation due to stress.

(A catastrophic failure of a pneumatic signalling device from over-powering it.)

Most systems are like the balloon example -- they respond well to slow changes and poorly to fast changes. Also like the balloon, most systems fail catastrophically when driven to the point where the energy flux is too high -- usually by melting.

(A device melted from overpowering it. Adapted from Flickr user djuggler.)

Consider a simple experiment to measure the rate at which you can switch the state of a balloon. Connect the balloon to a bicycle pump and drive the pump with a spinning wheel. Turn the wheel slowly and write down the maximum volume the balloon obtains. Repeat this experiment for faster and faster rates of spinning the wheel. You'll get a graph as follows.

(Experimental apparatus to measure the cycling response of a pneumatic signal.)

(The results from the balloon experiment where we systematically increased the speed of cycling the inflation state.)

On the left side of the graph, the balloon responds fully to the cycling and thus gives a good signal (S). But at these slow speeds very few bits can be transmitted, so not much information flows despite the balloon's good response. Further to the right, the balloon still responds well and now the bits are coming much more rapidly, so a lot of information can be sent at these speeds. At the far right of the graph, however, when the cycling is extremely quick, the balloon's response falls off and finally hits zero when the balloon pops; that defines the frequency limit.

The total channel capacity of our balloon device is an integral along this experimentally sampled frequency axis, where at each frequency we weight by log2( 1+S/N ), with S now being the measured response from our experiment, which we'll call S(f) = "the signal at frequency f". We didn't bother to measure noise as a function of frequency in our thought experiment, but we'll imagine we can do that just as easily, giving a second graph N(f) = "the noise at frequency f". The total information capacity (C) of the channel is the integral of log2( 1+S(f)/N(f) ) across the frequency samples we took, from zero up to the bandwidth limit (B) where the balloon popped.
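A minimal numerical sketch of that integral. The response and noise curves below are invented stand-ins for the balloon measurements (a first-order roll-off with an assumed corner frequency and a flat noise floor), not real data; the point is only the shape of the computation: C = integral from 0 to B of log2(1 + S(f)/N(f)) df.

```python
import math

B = 100.0   # assumed bandwidth limit in Hz (where the balloon "pops")
fc = 20.0   # assumed corner frequency of the response, Hz

def S(f):
    """Hypothetical measured signal power vs. frequency (low-pass roll-off)."""
    return 1.0 / (1.0 + (f / fc) ** 2)

def N(f):
    """Hypothetical measured noise power vs. frequency (flat floor)."""
    return 0.01

def capacity(n_samples=1000):
    """Trapezoid-rule approximation of the capacity integral up to B."""
    df = B / n_samples
    total = 0.0
    for i in range(n_samples):
        f0, f1 = i * df, (i + 1) * df
        g0 = math.log2(1 + S(f0) / N(f0))
        g1 = math.log2(1 + S(f1) / N(f1))
        total += 0.5 * (g0 + g1) * df
    return total

print(round(capacity(), 1), "bits per second")
```

With real balloon data you would replace S(f) and N(f) with lookups into the measured curves; the integration step stays the same.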

If you want to characterize the computational/communication aspects of any system, you have to perform the equivalent of this balloon thought experiment. Electrical engineers all know this by heart, as they've had it beaten into them since the beginning of their studies. But, unfortunately, most biochemists, molecular biologists, and synthetic biologists have never even thought about it. Hopefully that will start to change. Until we come to universally appreciate the importance of these characteristics, our understanding of biological pathways -- and our engineering of them -- will remain unnecessarily shallow, even as we learn more about those pathways and become more sophisticated engineers of them.

Next, amplifiers and digital devices. To be continued...

