Tuesday, April 14, 2009

Molecular computers -- A historical perspective. Part 1

I've been having discussions lately with Andy regarding biological/molecular computers and these discussions have frequently turned to the history of analog and digital computers as a reference -- a history not well-known by biologists and chemists. I find writing blog entries to be a convenient way to develop bite-sized pieces of big ideas and therefore what follows is the first (of many?) entries on this topic.


In order to understand molecular computers -- be they biological or engineered -- it is valuable to understand the history of human-built computers. We begin with analog computers -- devices that are in many ways directly analogous to most biological processes.

Analog computers are ancient. The oldest surviving example is the astonishing Antikythera Mechanism (watch this excellent Nature video about it). Probably built by the descendants of Archimedes' school, this device is a marvel of engineering that computed astronomical values such as the phase of the moon. The device predated equivalent devices by at least a thousand years -- thus furthering Archimedes' already incredible reputation. Mechanical analog computers all work by the now familiar idea of inter-meshed gear-work -- input dials are turned and the whirring gears compute the output function by mechanical transformation.


(The Antikythera Mechanism via WikiCommons.)

Mechanical analog computers are particularly fiddly to "program", especially to "re-program". Each program -- as we would call it now -- is hard-coded into the mechanism; indeed, it is the mechanism. Attempting to rearrange the gear-work to represent a new function requires retooling each gear, not only to change the relative sizes but also because the wheels will tend to collide with one another unless arranged just so.

Despite these problems, mechanical analog computers advanced significantly over the centuries and by the 1930s sophisticated devices were in use. For example, shown below is the Cambridge Differential Analyzer that had eight integrators and appears to be easily programmable by nerds with appropriately bad hair and inappropriately clean desks. (See this page for more diff. analyzers including modern reconstructions).


(The Cambridge differential analyzer. Image from University of Cambridge via WikiCommons).

There's nothing special about using mechanical devices as a means of analog computation; other sorts of energy transfer are equally well suited to building such computers. For example, MONIAC, built in 1949, was a hydraulic analog computer that simulated an economy by moving water from container to container via carefully calibrated valves.


(MONIAC. Image by Paul Downey via WikiCommons)


By the 1930s, electrical amplifiers were being used for such analog computations. An example is the 1933 Mallock machine that solved simultaneous linear equations.


(Image by University of Cambridge via WikiCommons)

Electronics have several advantages over mechanical implementations: speed, precision, and ease of arrangement. For example, unlike gear-work, electrical computers can have easily re-configurable functional components. Because the interconnecting wires have small capacitance and resistance compared to the functional parts, the operational components can be conveniently rewired without having to redesign the physical aspects of the mechanism; i.e., unlike gears, wires can easily avoid collision.

Analog computers are defined by the fact that the variables are encoded by the position or energy level of something -- be it the rotation of a gear, the amount of water in a reservoir, or the charge across a capacitor. Such simple analog encoding is very intuitive: more of the "stuff" (rotation, water, charge, etc.) encodes more of the represented variable. For all its simplicity, however, such analog encoding has serious limitations: range, precision, and serial amplification.

All real analog devices have limited range. For example, a water-encoded variable will overflow when the volume of its container is exceeded.



(An overflowing water-encoded analog variable. Image from Flickr user jordandouglas.)

In order to expand the range of variables encoded by such means, all of the containers -- be they cups, gears, or electrical capacitors -- must be enlarged. Building every variable for the worst-case scenario has obvious cost and size implications. Furthermore, such simple-minded containers only encode positive numbers. To encode negative values requires a sign flag or a second complementary container; either way, encoding negative numbers significantly reduces the elegance of such methods.

Analog variables also suffer from hard-to-control precision problems. It might seem that an analog encoding is nearly perfect -- for example, the water level in a container varies with exquisite precision, right? While it is true that the molecular resolution of the water in the cup is incredibly precise, an encoding is only as good as the decoding. For example, a water-encoded variable might use a small pipe to feed the next computational stage, and as the last drop leaves the source reservoir a meniscus will form due to water's surface tension; therefore the quantity of water passed to the next stage will differ from what was stored in the prior stage. This is but one example of many such real-world complications. For instance, electrical devices suffer from thermal effects that limit precision by adding noise. Indeed, the faster one runs an electrical analog computer, the more heat is generated and the more noise pollutes the variables.


(The meniscus of water in a container -- one example of the complications that limit the precision of real-world analog devices. Image via WikiCommons).

Owing to such effects, the precision of all analog devices is usually much less than one might intuit. The theoretical limit of the precision is given by Shannon's formula: the precision (the amount of information encoded by the variable, measured in bits) is log2(1 + S/N), where S is the signal level and N is the noise level. It is worth understanding this formula in detail as it applies to any sort of information storage and is therefore just as relevant to a molecular biologist studying a kinase as it is to an electrical engineer studying a telephone.
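
To make the formula concrete, here's a quick back-of-the-envelope sketch (in Python, purely for illustration) of how few bits even a seemingly clean analog variable can carry:

```python
import math

def analog_precision_bits(snr):
    """Shannon's limit: bits of information a single analog
    variable can carry at a given signal-to-noise ratio S/N."""
    return math.log2(1 + snr)

# A few representative signal-to-noise ratios (plain ratios, not dB):
for snr in (10, 100, 1000, 1e6):
    print(f"S/N = {snr:>9,.0f}  ->  {analog_precision_bits(snr):5.2f} bits")
```

Even a signal-to-noise ratio of a million -- far better than most real analog devices achieve -- yields only about 20 bits of precision.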

.... to be continued.

Utility yard fence




In the last few days I've finished up the fence line that separates the backyard from the utility yard. This involved staining more boards with Pinofin which is as malodorous as it is beautiful. Thanks to Jules for the help with staining! Fortunately she is hard-of-smelling so didn't notice how bad it was!

Saturday, April 11, 2009

Finished workshop drawers


Today I finished attaching the hardware to my new tool drawers. I'm stupidly excited about them as I can put away all my tools and clear out a lot of clutter from my shop.

We ordered the boxes from Drawer Connection. They really did a great job; they are perfectly square, dovetail-joined, glued, sanded, and polyed. As Bruce said, "I'll never build another box again." It's a demonstration to me of how custom web-based CNC construction is the future of a lot of products. We ordered about 30 boxes of all different sizes and the total was only about $1100 including shipping. There's no possible way we could have made them for that.

Thursday, April 9, 2009

Finished utility yard



Finished up the utility yard today, which involved raising the AC units and changing the grade a little bit. This weekend I'm going to stain the pickets and rebuild the rear fence line.

Tuesday, April 7, 2009

The 21st Century Chemical / Biological Lab.

White Paper: The 21st Century Chemical / Biological Lab.

Electronic and computer engineering professionals take for granted that circuits can be designed, built, tested, and improved in a very cheap and efficient manner. Today, the electrical engineer or computer scientist can write a script in a domain specific language, use a compiler to create the circuit, use layout tools to generate the masks, simulate it, fabricate it, and characterize it all without picking up a soldering iron. This was not always the case. The phenomenal tool-stack that permits these high-throughput experiments is fundamental to the remarkable improvements of the electronics industry: from 50-pound AM tube-radios to iPhones in less than 100 years!

Many have observed that chemical (i.e. nanotech) and biological engineering are to the 21st century what electronics was to the 20th. That said, chem/bio labs – be they in academia or industry – are still in their “soldering iron” epoch. Walk into any lab and one will see every experiment conducted by hand, transferring micro-liter volumes of fluid in and out of thousands of small ad-hoc containers using pipettes. This sight is analogous to what one would have seen in electronics labs in the 1930s – engineers sitting at benches with soldering iron in hand. For the 21st century promise of chem/nano/bio engineering to manifest itself, the automation that made large-scale electronics possible must similarly occur in chem/bio labs.

The optimization of basic lab techniques is critical to every related larger-scale goal, be it curing cancer or developing bio-fuels. All such application-specific research depends on experiments, and therefore reducing the price and duration of such experiments by large factors will not only improve efficiency but also make possible work that was not previously feasible. While such core tool paths are not necessarily “sexy”, they are critical. Furthermore, a grand vision of chem/bio automation is one that no single commercial company can tackle, as such a vision requires both a very long time commitment and a very wide view of technology. It is uniquely suited to the academic environment as it both depends upon and affords cross-disciplinary research towards a common, if loosely defined, goal.

Let me elucidate this vision with a science-fiction narrative:

Mary has a theory about the effect of a certain nucleic acid on a cancer cell line. Her latest experiment involves transforming a previously created cell line by adding newly purchased reagents, an experiment that involves numerous controlled mixing steps and several purifications. In the old days, she would have begun her experiment by pulling out a pipette, obtaining reagents from the freezer, off of her bench, and from her friend's lab, and then performing her experiment in an ad hoc series of pipette operations. But today, all that is irrelevant; today, she never leaves her computer.

She begins the experiment by writing a protocol in a chemical programming language. Like the high-level languages used by electrical and software engineers for decades, this language has variables and routines that allow her to easily and systematically describe the set of chemical transformations (i.e. “chemical algorithms”) that will transpire during the experiment. Many of the subroutines of this experiment are well-established protocols such as PCR or antibody separation, and for those Mary need not rewrite the code but merely link in the subroutines for these procedures just as a software engineer would. When Mary is finished writing her script, she compiles it. The compiler generates a set of fluidic gates that are then laid out using algorithms borrowed from integrated-circuit design. Before realizing the chip, she runs a simulator and validates the design before any reagents are wasted – just as her friends in EE would do before they sent their designs to “tape out.” Because she can print the chip on a local printer for pennies, she is able to print many identical copies for replicate experiments. Furthermore, because the design is entirely in a script, it can be reproduced next week, next year, or by someone in another lab. The detailed script means that Mary’s successors won’t have to interpret a 10-page hand-waving explanation of her protocol translated from her messy lab notes in the supplementary methods section of the paper she publishes – her script *is* the experimental protocol. Indeed, this abstraction means that, unlike in the past, her experiments can be copyrighted or published under an open source license just as code from software or chip design can be.
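
To give a flavor of what such a "chemical programming language" might look like, here is a purely hypothetical sketch in Python; the names (Protocol, sample, mix, pcr, measure, and so on) are invented for illustration and do not refer to any real tool or to the system described above.

```python
# Hypothetical sketch of a "chemical protocol as code". All names here
# (Protocol, sample, mix, pcr, measure) are invented for illustration;
# no real library or compiler is implied.

class Protocol:
    def __init__(self, name):
        self.name = name
        self.steps = []

    def _add(self, step):
        self.steps.append(step)
        return len(self.steps) - 1        # handle to the resulting fluid

    def sample(self, reagent_id, volume_ul):
        return self._add(("sample", reagent_id, volume_ul))

    def mix(self, a, b, ratio=1.0):
        return self._add(("mix", a, b, ratio))

    def pcr(self, template, cycles=30):   # a reusable, linked-in subroutine
        return self._add(("pcr", template, cycles))

    def measure(self, fluid, channel):    # like a "debug" or "trace" statement
        return self._add(("measure", fluid, channel))

p = Protocol("transform_cancer_line")
cells   = p.sample("cell_line_42", 5.0)
reagent = p.sample("new_nucleic_acid", 1.0)
mixed   = p.mix(cells, reagent)
product = p.pcr(mixed, cycles=25)
p.measure(product, channel="fluorescence")

# A downstream "compiler" would turn p.steps into fluidic gates, lay them
# out, simulate them, and print the chip; here we just show the script.
print(p.steps)
```

The point of the sketch is only that the experiment becomes a reproducible, linkable, publishable artifact -- the script *is* the protocol.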

Designing and printing the chip is only the first step. Tiny quantities of specific fluids need to be moved into and out of this chip – the “I/O” problem. But Mary’s lab, like any, both requires and generates thousands of isolated chemical and biological reagents, each of which has to be stored separately in a controlled environment and must be manipulated without risking cross-contamination. In the old days, Mary would have used hundreds of costly sterilized pipette tips as she laboriously transferred tiny quantities of fluid from container to container. Each tip would be wastefully disposed of despite the fact that only a tiny portion of it was actually contaminated – such was the cost when everything had to be large enough to be manipulated by hand. In the old days, each of the target containers – from large flasks to tiny plastic vials – would have had to be hand-labeled, resulting in benches piled with tiny cryptic scribbled notes and all of the confusion and inefficiency that results from such clutter.

Fortunately for Mary, today all of the stored fluids for her entire lab are maintained in a single fluidic database; she never touches any of them. In this fluidic database, a robotic pipette machine addresses thousands of individual fluids. These fluids are stored inside tubes that are spooled off of a single supply and cut to length and end-welded by the machine as needed. Essentially, this fluidic database has merged the concepts of “container” and “pipette” – it simply partitions out a perfectly sized container on demand, and therefore the consumables are cheaper and less wasteful. Also, the storage of these tube-containers is extremely compact in comparison to the endless bottles (mostly filled with air) that one would have seen in the old days. The fluid-filled tubes could be simply wrapped around temperature-controlled spindles and, just like an electronic database or disk drive, the system can optimize itself by “defragmenting” its storage spindles, ensuring there’s always efficient usage of the space. Furthermore, because the fluidic database knows the manifest of its contents, all reagent accounting can be automated and optimized.

Mary has her experiment running. But moving all these fluids around is just a means to an end. Ultimately she needs to collect data about the performance of her new reagent on the cancer line in question. In the old days, she would have run a gel, used a fluorescent microscope, or employed any number of other visualization techniques to quantify her results – and any of these measurements would have required a large and expensive machine. But today, most of these measurements are either printed directly on the same chip as the fluidics using printable chemical / electronic sensors or, for those that can’t be printed, interfaced to a standardized re-usable sensor array. The development of those standards was crucial to the low capital cost of her equipment. Before far-sighted university engineering departments set those standards, each diagnostic had its own proprietary interface and therefore the industry was dominated by an oligopoly of several companies. But now, the standards have promoted competition and thus the price and capabilities of all the diagnostics have improved.

As Mary’s chemical program executes on her newly minted chip, she gets fluorescent read-outs on one channel and antibody detection on another – all such diagnostics were written into her experimental program in the same way that a “debug” or “trace” statement is placed into a software program. After her experiment runs, the raw sensor data is uploaded to the same terminal where she wrote the program and she begins her analysis without getting out of her chair.

After the experiment, the disposable chip and the temporary plumbing that connected to it are all safely incinerated to avoid any external contamination. In the old days, such safety protocols would have had to be known by every lab member and this would have required a time-consuming certification process. But today, all of these safety requirements are enforced by the equipment itself and therefore there’s much less risk of human mistake. Furthermore, because of the enhanced safety and lower volumes, some procedures that were once classified as bio-safety level 3 are now BSL 2 and some that were 2 are now 1, meaning that more labs are available to work on important problems.

Mary’s entire experiment from design to data-acquisition took her under 1 hour – compared to a week with the old manual techniques. Thanks to all of this automation, Mary has evaluated her experiment and moved on to her next great discovery much faster than would have been possible before. Moreover, because so little fluid was used in these experiments, her reagents last longer and therefore the cost has also fallen. Mary can contemplate larger-scale experiments than anybody dreamed of just a decade ago. Mary also makes many fewer costly mistakes because of the rigor imposed by writing and validating the entire experimental script instead of relying on ad hoc procedures. Finally, the capital cost of the equipment itself has fallen due to standardization, competition, and economies of scale. The combined result of these effects is to make the acquisition of chemical and biological knowledge orders of magnitude faster than was possible just decades ago.

Monday, April 6, 2009

Macro-scale examples of chemical principles

I like macro-scale examples of chemical principles. Here are two I've noticed recently.


I was very slowly pouring popcorn into a pot with a little bit of oil. The kernels did not distribute themselves randomly but instead formed some long chain aggregations because, apparently, the oil made them more likely to stick to each other than to stand alone. This kind of aggregation occurs frequently at the molecular scale when some molecule has an affinity for itself.


This is wheelbarrow chromatography. During a rain, water and leaves fell into this wheelbarrow. Notice that the leaves and the stems separated; apparently the stems are lighter than water and the leaves are heavier. This sort of "phase separation" trick is frequently used by chemists to isolate one type of molecule from another in a complex mixture. Sometimes the gradient of separation might be variable density as in this example, but other times it might be hydrophobicity or affinity to an antibody or many other types of clever chemical separations known generically as "chromatography". Note that the stems clustered. Like the popcorn above, apparently there is some inter-stem cohesion force that results in aggregation as occurs in many chemical solutions.

Porch branches




Bruce and I finished up the porch branches on Friday; they have not yet been stained so the color is different. It's funny -- this is one of the very first details I thought of for the house design and one of the very last to be implemented so for me this small detail is very important in that it collapses some sort of psychic "todo" stack and thereby provides the relief that one feels in crossing-out a complicated set of tasks (never mind that the list has grown substantially since then! :-)

Geometry of Biological Time, Chapt 1.

I've started reading A. T. Winfree's (father of Erik) book: "Geometry of Biological Time". Sometimes one finds just the right book that fills in the gaps of one's knowledge; this book is just right for me at this moment, as if I was fated to read it.

It begins with an excellent introduction to topological mappings. I had picked up some of the ideas by osmosis, but the first 20 pages were an excellent and helpful series of discussions that helped solidify my understanding of this subject. He lucidly expands on about 15 topological mappings in increasing complexity. For each, he provides intuitive examples with lovely side discussions, such as relating the S1 -> I1 mapping to the international-date-line problem and the astonishment of Magellan's expedition at the loss of a day upon the first round-the-world trip. (I first heard of this idea as the climax of the plot of "Around the World in 80 Days".) He introduces the idea that all such mapping problems arise as singularities in the mapping functions. Again, this was something that I half-understood intuitively and thus it was very helpful to have it articulated clearly.

I now realize that in previous amorphous computing experiments described in this blog, I had been exploring S1 x S1 -> S1 mappings (circular space by oscillator space mapping to a visible phase). This S1xS1->S1 mapping is exactly where he heads after his introduction as a place of interest. In other words, I had ended up by intuition exactly where he did.
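
As a toy illustration of what an S1 x S1 -> S1 mapping looks like in this setting, here is a small Python sketch of my own (not code from those experiments): each point on a ring carries an oscillator, and the visible phase is the spatial angle plus the oscillator phase, wrapped back onto the circle.

```python
import math

TWO_PI = 2 * math.pi

def visible_phase(theta, phi):
    """A toy S1 x S1 -> S1 map: a spatial angle (theta) and an oscillator
    phase (phi), each living on a circle, map to one displayed phase."""
    return (theta + phi) % TWO_PI

# A ring of 8 cells observed at three successive oscillator phases:
for step in range(3):
    phi = step * TWO_PI / 3
    ring = [visible_phase(i * TWO_PI / 8, phi) for i in range(8)]
    print(" ".join(f"{p:4.2f}" for p in ring))
```

Each time step rotates the whole visible pattern around the ring -- one simple instance of circular space by oscillator phase mapping to a visible phase.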

It's a very long and dense book; if I can make my way through it, it may generate a lot of blog entries!

Wednesday, April 1, 2009

House projects


Arch cut for the handrail on the main stairs.


Kitchen nearly complete.


Bruce gets out the Gallagher Saw!



Bruce and I start mounting branches on the front porch.


This one branch was a bitch; we gnawed away at it with probably 30 cuts before we got it to fit right!


We wrapped up many of the recent house projects. The upstairs porch is screened in and has a pane of glass on one side which significantly reduced the noise from the neighbor's AC unit -- now I can have my bedroom windows open!

The rear kitchen is done except for drawer pulls and a couple of electric outlets.

We cut an arch into one of the supports alongside the staircase that had been bothering me for a long time because it didn't leave room for your hand to slide along the hand rail. Bruce and I came up with this cut-arch solution, which I also think looks really cool -- it opened up the space a lot.

We started adding branches to the front porch beams, which was part of the original plan but we had never gotten around to it. One of these branches was much harder than the others; we ended up making many small cuts until we got it positioned just right.

Monday, March 30, 2009

An Idea: Federal Reserve Random Moves


Reference historical DJIA (log scale)

Self-Organized Criticality (SOC) is a model to describe the dynamics of certain kinds of systems built out of many interacting non-linear actors. The "sand pile" model relates the frequency of avalanches to their magnitude by 1/f (i.e. avalanches happen with inverse frequency to their size).
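
For anyone who wants to play with the sand pile model, here is a minimal Bak-Tang-Wiesenfeld-style simulation sketch in Python (my own illustration, not taken from the paper mentioned below): grains are dropped one at a time and the number of topplings each drop triggers is recorded.

```python
import random
from collections import Counter

def sandpile_avalanches(size=20, grains=20000, threshold=4, seed=0):
    """Drop grains on a square grid; any cell reaching `threshold` topples,
    sending one grain to each neighbor (grains fall off the edges).
    Returns the size (number of topplings) of every avalanche."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        x, y = random.randrange(size), random.randrange(size)
        grid[y][x] += 1
        topples = 0
        unstable = [(x, y)] if grid[y][x] >= threshold else []
        while unstable:
            cx, cy = unstable.pop()
            if grid[cy][cx] < threshold:
                continue
            grid[cy][cx] -= threshold
            topples += 1
            if grid[cy][cx] >= threshold:        # may need to topple again
                unstable.append((cx, cy))
            for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                if 0 <= nx < size and 0 <= ny < size:
                    grid[ny][nx] += 1
                    if grid[ny][nx] >= threshold:
                        unstable.append((nx, ny))
        if topples:
            sizes.append(topples)
    return sizes

hist = Counter(sandpile_avalanches())
# Small avalanches vastly outnumber big ones -- roughly a power law.
for s in (1, 2, 4, 8, 16, 32, 64):
    print(f"avalanches of size {s:>3}: {hist.get(s, 0)}")
```

The same toy can be used to test the "add noise" idea at the end of this post: occasionally topple a randomly chosen cell before it reaches threshold and see whether the largest avalanches are suppressed.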

It seems intuitive that economic systems should also show this "sand pile" behavior, and this paper claims that stock markets do indeed show "near-self organized critical" behavior. (The exact function is not relevant to my argument.) The intuition for this comes from the fact that each economic actor relies on others in a complicated web of interactions. The values of assets in the system are subjective and are strongly biased by the perception of other actors' subjectively valued assets. Moreover, the perceived future value of those assets is a strong function of the cultural perception of the unknown future. In other words, the macroeconomic system is in a strong, multi-scale, positive feedback.

In the sandpile model, a few grains of sand will end up holding an enormous load of upstream stress and therefore their perturbation will create large avalanches. Analogously, a few economic actors (insurance companies, banks, hedge funds, etc) will end up with an enormous load of upstream dependencies that will similarly cause avalanches if they are disrupted.

In the sand pile model one can imagine a large conical basin of uphill dependencies resting on a few critical grains -- those critical bits are the ones that are "too big to be allowed to fail". Playing very loosely with the analogy, the stress on a grain from its uphill neighbors is analogous to the balance sheet of an economic actor. But not exactly. In the sand pile, all potential energy is explicitly accounted for -- there's no hiding the cumulative stresses due to the weight of each particle. This is not true in the economic analog. Real balance sheets do not account for total stresses because complicated financial transactions (like mortgages and insurance contracts) contain off-balance-sheet information that is usually one-way. For example, when a bank realizes that there is risk in a mortgage it will pass on this cost to the uphill actor, but when a debtor realizes that there's more risk (for example, they might know that their financial situation is not as stable as it appears on paper) they will not pass along this information. In other words, there will tend to be even more uphill stress than is accounted for by the balance sheets of each downhill actor.

Now the point.

If you wanted to reduce the number of large-scale catastrophic avalanches in the sand pile model, the method for doing so is easy: add noise. The vibration of the sand pile would ensure that potential energy in excess of the noise energy would not be allowed to build up. It's the same idea as forest management -- lots of small fires prevent larger ones. Therefore, by analogy, a good strategy for the Federal Reserve might be to similarly add noise. Conveniently, this "add noise" strategy is inherently simpler to execute than their current strategy -- they would simply roll a die every few months and change the discount rate by some number between zero and ten percent.

Crazy? Well, as it stands now, the Federal Reserve operates under the belief that it can act as a negative-feedback regulator of the macroeconomic system. The idea is sound, but based on my experience attempting to control even very simple systems, I'm skeptical of the reality. To begin with the obvious, the economy is anything but simple. Furthermore, the Fed does not have, never has had, and never will have an accurate measurement of the economy. To wit: it neglected the huge volume of CDOs built up in the last 10 years, and the S&L stress of the 80s, and the tech bubble of the 90s, etc., etc. History shows that there have always been, and will always be, bubbles and newfangled leveraging instruments, so anything short of draconian regulation that stopped all financial innovation (which would be worse) will not be anything but reactive. But it gets worse. There are also large and unpredictable latencies in both the measurements and the results of the Fed's actions. Even in simple linear systems, such latencies can have destabilizing effects, and since the macroeconomic system is highly non-linear and constantly evolving, the effects are essentially unknowable a priori.

In summary, I suspect that the macroeconomic system is not directly controllable in the way that is envisioned by the creators of the Federal Reserve due to non-linearity, poor measurability, and latency. Therefore, given that the economy probably has some SOC-like organization, I suspect that random Fed moves would probably be no worse than the current strategy and would probably be better.

Flagella assembly video (external link)


This is a really nice video about flagella assembly (thanks to Ken for the forward). One detail I didn't know before was that the flagellar proteins are denatured for export through a ludicrously small 1 nm channel. All the stuff about the hook length measurements was particularly interesting. Very cool!

http://video.google.com/videoplay?docid=14997924975209807&hl=en

Saturday, March 28, 2009

An Idea: Internet Security Through Random Compilation

This morning an idea occurred to me -- a way to stop malware, viruses, and worms. When someone wishes to crack an internet protocol for nefarious purposes, one way to do so is to exploit bugs in buffer handling. For example, some specific implementation of the email protocol might have a bug whereby if certain characters are passed in the address field then it causes a buffer overflow that could permit writing onto the stack. By sending the right set of characters, the overflow might be directed to upload and execute arbitrary instructions. Similar exploits have existed/still exist in many systems such as the image handlers for Microsoft Outlook and countless other programs.

As clever as it is, exploiting such a bug requires having a copy of the code locally during development so that the attacker can step through it and figure out exactly how to exploit the overflow. Thus, a way to defeat this is to ensure that every single instance of that code running on every machine is unique. Therefore the solution is simple: write a compiler that generates randomized code that performs the same task but with different execution paths. Such a compiler would stop all such exploits by effectively creating a locally unique encoding. A random compiler would be easy to write; indeed similar tools already exist for Java as "code obfuscators" for the purpose of hindering reverse engineering. The only difficulty in deploying such a system is that the relevant software could no longer be deployed on mass-produced media such as CDs, since each instance has to be different. But this is a declining issue as more and more software is delivered online, where each instance could be different. Furthermore, many of the main internet protocols have open source implementations where local compilation is already possible or, in many cases, already occurring. Therefore adding this feature to GNU C would be a big step in the right direction.
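
To sketch the randomized-compilation idea in miniature, here is a toy Python illustration (emphatically not a real diversifying compiler): it emits functionally identical variants of a tiny routine with shuffled statements, renamed locals, and junk padding, so layout knowledge gained from one copy does not transfer to another.

```python
import random

def emit_variant(seed):
    """Toy 'random compiler': generate source for a function that always
    computes the same result, but whose internal layout differs per seed.
    This only illustrates the concept of per-instance code diversity."""
    rng = random.Random(seed)
    a, b, t = (f"v{n}" for n in rng.sample(range(1000), 3))   # renamed locals
    stmts = [f"    {a} = x * x", f"    {b} = 3 * y"]           # independent, reorderable
    rng.shuffle(stmts)
    pad = [f"    _unused{i} = {rng.randrange(100)}"            # junk padding
           for i in range(rng.randrange(1, 4))]
    body = "\n".join(stmts + pad + [f"    {t} = {a} + {b}", f"    return {t}"])
    return f"def f(x, y):\n{body}\n"

# Every deployed copy gets its own layout but identical behavior:
for seed in (1, 2):
    src = emit_variant(seed)
    print(src)
    ns = {}
    exec(src, ns)                  # "compile" this unique variant locally
    assert ns["f"](4, 5) == 31     # 4*4 + 3*5 in every variant
```

A real implementation would work at the machine-code level (randomizing stack layouts, register allocation, and code addresses), but the principle is the same: an exploit developed against one instance has no guarantee of working against any other.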

Tuesday, March 17, 2009

Sound insulation, screens


The wallboard guys came today and added two layers of sound proofing to my bedroom wall.


Kurt finished the screen installation. You can hardly see them. Of course, after a while they'll get dirty. The section on the left is where glass will go which will hopefully abate some of the sound of the neighbor's AC unit.

Monday, March 16, 2009

Sound proofing, Screens


Looking up into the ceiling where Bruce cut a hole to gain access to the space underneath the bathroom.


This wall and the adjacent one are going to have sound-proofing board added tomorrow, so the bed and pictures were taken out of the room.


Bruce finished up the back of the new shop cabinets. The wall board guys will clean this up tomorrow.


Crammed my bed temporarily into the spare room.


Pulled the railings out from the back porch to insert the new screens. But the screens didn't quite fit so there's some modification to be done.

There is a bathroom immediately adjacent to my bedroom and anytime it is used at night I can hear, well, everything. This is despite the fact that the walls are filled with 3" of foam. To abate this, Bruce opened up the ceiling and sprayed in cellulose insulation above and below the bathroom. This made a tremendous amount of dust but didn't do much to dampen the sound. Tomorrow the wallboard guys come and we're going to expand the adjoining walls with another inch of special sound-proofing board. This requires rebuilding two walls and two doors and repainting, so I've temporarily moved my bed into the spare bedroom. Meanwhile, the cabinets were framed up, and an attempted mounting of the porch screens revealed a few mis-measurements that will require some modifications.

Sunday, March 15, 2009

Utility yard


Before (actually after moving a pallet of wood with the help of my neighbor -- thanks, Verner).


Yesterday I graded and filled gravel along the sides. The brick ramps are for moving the wheel-barrow.


We moved all the bricks and wood into this pile near where the BBQ will go.


After regrading and filling with gravel. (Not shown, about 1/3 as much still to go)


Today Aaron and I did a huge amount of work. I dug 3" deep channels under the fence line to improve drainage. Then we regraded the area and dug drainage troughs. We moved all the junk out of the back including a half pallet of bricks. I've moved piles of bricks so many damned times I've lost count; at least now they are sitting next to where the BBQ pit will be, so they are within arm's reach of their final resting place. Then we took down part of the fence in order to improve the grade, laid down a weed barrier, and hauled about 12 loads of gravel up the steps. I'm beat.

Friday, March 13, 2009

Pipe move



The plumber Dale came by today to reroute the pipe that was in the middle of my new cabinet space. Unfortunately, I forgot to mention to him that the bottom needed to be cut out (you can see the stub at the bottom) to make way for a larger drawer, so either I live with it or have him rebuild it when he comes over for the kitchenette plumbing next week.

Tuesday, March 10, 2009

Workshop demo, Kitchenette framing, fence



"Yeah, those aren't necessary", says Bruce as he cuts them out. (He's the one who put them there in the first place. :-)



After duct and electrical rearrangements, leaving only the pipe to be re-routed.



Having two excellent professional carpenters around sure boosts the productivity. We began the morning by demolishing part of the chase that is adjacent to the workshop wall, where we are opening space for a new set of tool drawers and storage. This required moving around a few supports, ducts, and electrical boxes (my job). There's a big sewage pipe in the middle of this which will be re-routed when the plumber comes on Thursday. Then Bruce and Kurt framed out the wall where the kitchenette will go, and we went over to a discount appliance store and purchased the microwave/vent hood, half-size dishwasher, and gas cook top, which look pretty nice for a reasonable price of about $1100. Then I worked for a few hours on revisions to Andy's paper, and after an hour at the gym I managed to get 4 of the 6 stringers up on the back fence.

Monday, March 9, 2009

Fence line, Bed spreads, Screening, Sophia Collier, and a paper for the Royal Society


Today was an oddly productive day. In the morning I dug post holes for the last bit of fence line that will separate my utility yard from my back yard. Then Bruce came over to take measurements for the screening for the upstairs porch. Then Amberlee came over and we had a little Christmas where we opened all the packages that she had ordered for me -- a new bed headboard, duvet, sheets, and pillows! For lunch I had a marvelous time meeting with Sophia Collier, who was introduced to me by my attorney. Sophia and I were apparently born with the same mutant genes; we both left school (I in 11th grade and she in 12th) and went off to various other endeavors (although hers have been generally more profitable than mine!). After a career in such things as soda and mutual fund management, she's now into CNC artwork. She has a CNC mill, 3D software, and a lot of fun ideas. We geeked out for hours on art and science projects of all kinds and she gave me much valuable feedback with regard to my various forthcoming enterprises. After lunch I set the fence posts and poured the footer concrete, and then started on the rewrite of the paper I've been writing with Andy Ellington for the Royal Society journal Interface, which came back with deservedly so-so reviews and which as a result (as often seems to be the case with peer-reviewed journals) is forcing a rewrite that will no doubt result in a better paper.

Saturday, March 7, 2009

Utility yard steps


This afternoon I built three steps along the utility yard that will be back-filled with pea gravel, forming a series of stepped terraces using the bricks my neighbors gave me.

Plants




I attached my custom tiles to the front of the planter, and my friend Scott Thurmon came over with dirt and plants, so the front planter is now complete (except that the irrigation still needs to be connected). In a year or so the red yucca in the middle should grow up to be more proportional to the mass of the planter.