Circuits of a cell
A modern microchip is a wonder: a tiny crystalline object packed with an integrated array of resistors, capacitors, inductors, amplifiers, and more, a network that can perform prodigies of signal processing and computation, as long as a manageable number of electrons flows through the system. Yet the circuitry of a single-celled organism, even tinier and common as dirt, is more wonderful still. Microbes adapt to changing environments, respond with versatility to threats from within and without, and replicate themselves in profusion. Simple viruses perform tricks almost as impressive; cells in complex organisms can do even
more.
more. "The tradition in biology is to try to understand intracellular communication from the
bottom up," says Adam Arkin, a member of the Department of Computational and
Theoretical Biology in the Physical Biosciences Division. "This involves dissecting cellular
function to the point that every molecular reaction in a biochemical pathway is known, leading
to reaction networks so dense, with so many interconnections, that a flowchart
of just the most important pathways in a simple cell covers a wall."
Arkin, who is also an assistant professor in Bioengineering and Chemistry at UC Berkeley,
says "it's difficult for biologists to make sense of what happens when reactions in such a dense and tangled network are modified even in a bacterium, much less to predict, in human cells, how particular mutations lead to disease, or how particular pharmaceuticals will affect an individual."
That's why he and his colleagues "want to design tools for understanding cellular
biochemical networks which are similar to the tools electronics engineers use to design
circuits," he says. "For all their complexity, there are interesting ways in which the
circuitry of a living cell resembles electronic circuitry."
A living cell is not (quite) a digital computer
At first glance the analogy between electronic circuits and living cells seems far-fetched. Diverse atoms and molecules, not indistinguishable electrons, are the cell's currency: units as simple as elemental ions, as specific as enzymes, or as complex as long strands of DNA or RNA, each with its own unique set of sources and targets.
And while an electrical circuit consists of an arrangement of physical parts, the "parts" of a biochemical network are mostly chemical reactions themselves, not entities but processes, whose existence depends on the presence of the right chemicals at the right place, at the right time, and in the right amounts.
The number of possible states in even elementary chemical "devices" is usually much greater than two; where large numbers of molecules are involved, the ratio of chemical concentrations may change virtually continuously. Still, the outcome of a given biochemical process is often binary, like an on/off switch.
For example, the amino acid glycine is produced from 3-phosphoglycerate by a series of enzyme-catalyzed reactions. If the supply of phosphoglycerate runs out, the reaction shuts down, but more often it shuts down when too much glycine inhibits the first reaction step; in either case the result is "off."
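The on/off character of end-product inhibition can be sketched numerically. The toy model below uses invented rate constants, not measured values from the glycine pathway: the rate of the first committed step falls as product accumulates, and drops to zero if the substrate runs out.

```python
def pathway_step(substrate, product, k_cat=1.0, k_i=5.0):
    """Rate of the first committed step; inhibited as end product builds up.

    k_cat and k_i are illustrative constants, not measured parameters.
    """
    if substrate <= 0:
        return 0.0                       # substrate exhausted -> "off"
    return k_cat * substrate / (1.0 + product / k_i)

def simulate(substrate=10.0, steps=500, dt=0.01):
    """Integrate the toy pathway forward in time (simple Euler steps)."""
    product = 0.0
    for _ in range(steps):
        made = pathway_step(substrate, product) * dt
        substrate -= made                # substrate converted...
        product += made                  # ...into end product
    return substrate, product
```

The interesting behavior is in the rate, not the amounts: as glycine-like product accumulates, the first step throttles toward zero, so the pathway reads as "off" well before the substrate is gone.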
Arkin and his collaborators have identified several classes of chemical devices which
function like basic units of electronic circuits, such as "voltage dividers" that can divide
or amplify chemical signals and "frequency filters" that respond only to particular timing
patterns in chemical concentrations. These sets of reactions are "wired" in series or
parallel to form regulatory feedback loops, binary switches, oscillators, and even, in a resonant metaphor from biology to electronics and back again, computer-like "neural nets" that calculate the cell's needs and adapt to changing conditions.
Reactions involving large numbers of molecules are capable of deterministic outcomes,
but when important functions make use of very small numbers of molecules, the
outcomes are not completely assured; rather, they are stochastic.
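When only a handful of molecules are involved, stochastic outcomes like these are conventionally captured by event-by-event simulation in the style of Gillespie's algorithm. Below is a minimal sketch for a single species with constant production and first-order degradation; the rates are assumed for illustration and are not taken from the article.

```python
import random

def gillespie(k_on=2.0, k_off=0.1, n0=0, t_end=50.0, seed=1):
    """Event-driven simulation of production (rate k_on) and
    degradation (rate k_off * n) of one molecular species.

    Rates are invented for illustration. At low copy numbers the
    trajectory is visibly noisy rather than smooth.
    """
    random.seed(seed)
    t, n, trace = 0.0, n0, []
    while t < t_end:
        a_prod, a_deg = k_on, k_off * n        # reaction propensities
        total = a_prod + a_deg
        t += random.expovariate(total)         # waiting time to next event
        if random.random() * total < a_prod:
            n += 1                             # a production event fires
        else:
            n -= 1                             # a degradation event fires
        trace.append((t, n))
    return trace
```

Averaged over many runs the copy number settles near k_on / k_off, but any single trajectory moves "by fits and starts," which is exactly the behavior described above.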
Noisy signals and missing clocks
"Gene regulation is the process by which a cell produces the proteins to do what needs doing at a given time, including controlling the expression of other genes," Arkin says. "Often the genes regulating a given pathway are present in only one or two copies per cell, and the transcription factors that regulate these genes may be at very low concentrations, less than a hundred molecules per cell."
When a crucial signal consists of fewer than a hundred molecules, disruptive jostling among them, known as thermal noise, is high; how the reaction proceeds and which of the possible signals gets sent is a random affair. The process moves by fits and starts, and the time to complete it varies, although once underway, the buildup of chemical concentrations is usually self-reinforcing.
One way electronic circuit designers improve the signal-to-noise ratio is through signal synchronization. But unlike most digital circuits, biochemical networks (with a few exceptions) have no clock. The timing of even seemingly regular oscillations such as cell growth and division varies widely and can hang up, waiting until swarms of internal processes are completed and external conditions, like nutrient supply, are right.
"Synchronous design cuts noise by ignoring signals that arrive at the wrong time, but asynchronous systems also have advantages, which even chip designers are beginning to appreciate," says Arkin. "An asynchronous circuit can speed up or slow down in response to changing conditions like temperature. There's no circuitry wasted maintaining synchrony, and if a new version of a component is placed in the system, the network can adapt to the change. Power requirements are adjustable and generally lower." Arkin points out that "these are just the sorts of advantages that favor evolution by natural selection."
Survival through stochastic logic
Arkin and his colleagues, including Harley McAdams and John Ross of Stanford University, have analyzed the stochastic logic of gene regulation in several cell systems, including well-known, long-investigated genetic switches in viruses and bacteria.
When a bacteriophage ("bacteria eater") virus injects its own DNA into a microorganism such as Escherichia coli, the host cell apparatus rapidly expresses the program on the viral DNA that decides whether or not to kill the host immediately. Under conditions that are less than optimal for replication, the phage may actually confer immunity to further infection upon the host (lysogeny). But if conditions are good, the virus produces so many copies of itself that the cell walls burst, a state known as lysis, and the infection spreads.
Two independently produced regulatory proteins compete to control whether the invading
genes will remain quiescent or be expressed. Because of inescapable thermal noise, the
outcome in any given case is random, and the proportion of the population in either state
changes according to conditions such as cell nutrition and the number of invading
particles per cell.
Arkin and his colleagues have found that the underlying stochastic mechanisms of the lysis-lysogeny decision circuit, which involves five genes and four promoters, depend entirely upon the chance timing and concentrations of bursts of competing proteins that act to reinforce or inhibit one another.
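A standard way to model two mutually repressing proteins is a noisy toggle-switch model. The sketch below uses invented Hill-type repression terms and Gaussian noise (the article gives no equations), but it shows the key point: cells that start under identical conditions can commit to different fates purely because of their noise histories.

```python
import random

def toggle(steps=2000, dt=0.01, k=4.0, n=2, decay=1.0, noise=0.4, seed=0):
    """Two proteins, a and b, each repressing the other's production.

    Hill coefficient n, rates k and decay, and noise amplitude are
    invented for illustration. Returns which protein "won."
    """
    random.seed(seed)
    a = b = 0.1                                  # identical starting state
    for _ in range(steps):
        da = k / (1 + b**n) - decay * a + noise * random.gauss(0, 1)
        db = k / (1 + a**n) - decay * b + noise * random.gauss(0, 1)
        a = max(0.0, a + da * dt)
        b = max(0.0, b + db * dt)
    return "lysis" if a > b else "lysogeny"

# Identical initial conditions, different noise histories -> different fates.
outcomes = {toggle(seed=s) for s in range(20)}
```

Deterministically, the symmetric starting point would never resolve; it is the thermal-noise term that breaks the tie, one way or the other, in each simulated cell.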
"Thermal fluctuation at the molecular level makes for diversity in cells that start out
under identical conditions," says Arkin. "The phage actually makes use of noise as a
survival mechanism: sometimes it pays to multiply and infect as many hosts as possible,
sometimes it pays to lie low. Either way, the viral population is prepared to cope with
changing conditions."
E. coli itself adopts a comparable strategy to increase its survival chances. In humans,
most E. coli live benign and even useful lives in the bowel. The bacteria can invade new
territory, however, by sprouting hairlike filaments known as pili and anchoring themselves
to tissues such as the lining of the urethra-where they resist being washed away and
can cause painful and dangerous infections.
Immobility is a perilous advantage. Pili make it hard for bacterial cells to divide and
multiply, and they can also trigger the body's immune system; their proteins make fine
binding sites for antibodies. In any population of uropathogenic E. coli, only a few at a time
display pili.
"What causes an individual cell to express pili or not is a DNA inversion switch," says Arkin, "in which a protein clips the gene in such a way that it reads 'pili' in one direction and 'no pili' in the other." Arkin and his colleagues have been investigating how noise in the system leads to random differences in the rate at which individual bacteria switch pili on and off, under different environmental conditions, "which leads to significant differences in the proportion of hairy cells in a population, which affects its spread and colonization."
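Population-level phase variation of this kind can be sketched as a two-state Markov process: each cell flips its pili switch on or off with some small per-generation probability. The probabilities below are invented for illustration; the steady-state piliated fraction works out to roughly p_on / (p_on + p_off).

```python
import random

def fraction_piliated(n_cells=10000, generations=100,
                      p_on=0.002, p_off=0.05, seed=42):
    """Fraction of cells displaying pili after many generations.

    p_on and p_off are assumed per-generation flip probabilities,
    not measured rates. With these numbers the steady state is about
    0.002 / 0.052, i.e. only a few percent of cells are "hairy."
    """
    random.seed(seed)
    piliated = [False] * n_cells                 # start with no pili anywhere
    for _ in range(generations):
        piliated = [
            (random.random() >= p_off) if state   # piliated cell may switch off
            else (random.random() < p_on)         # bare cell may switch on
            for state in piliated
        ]
    return sum(piliated) / n_cells
```

Changing p_on or p_off, as environmental conditions might, shifts the equilibrium fraction of hairy cells, which is exactly the population-level effect Arkin describes.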
Biochemical switches such as the lysogeny/lysis decision in viruses, governed by antagonistic feedback loops, and pili expression in bacteria, randomly implemented by a DNA inversion reaction, are found in many different infectious microorganisms. Indeed, they are typical of developmental pathways in general.
Simulating cell circuits
Arkin and his colleagues have investigated pathways in more complex cells, such as the
calcium-ion oscillator that may control embryonic cell division in organisms such as sea
urchins and frogs. Nevertheless, he is in no hurry to abandon simple organisms. "The
only way we can fully understand even a simple organism is by a formal mathematical
framework, a rigorous and testable theory, linked to experimental results," he says. "What
goes on in a single cell really is a big deal."
With testable predictions derived from theory and computer modeling, and with a growing
database of chemical devices and circuit motifs, Arkin hopes to make the understanding
of intracellular communication and control "a true engineering discipline. We've hardly
begun, but we're getting the tools."
One of the primary tools he is working to develop is a computer-aided simulation and
analysis program called bio/Spice, a name deliberately chosen to echo SPICE (for "Simulation Program with Integrated Circuit Emphasis"), developed at the University of California at Berkeley in the 1970s and widely used to analyze electronic circuits and
predict their behavior.
With bio/Spice, Arkin hopes to create a way for biologists to understand and build models
of living systems as easily as electronic-circuit designers lay out a chip. bio/Spice and
other tools will "yield new insights and new biological principles," he says, and "aid in
designing novel functions into cells."
If Arkin is right, both electronic and cellular circuitry are subsets of overarching
circuit-design principles. He thinks it's possible that a basic understanding of intracellular
communication "may even teach us how to design more robust electronic circuitry."
Meanwhile the focus is on fundamental biology. "The Department of Energy's national
laboratories have the best data-production and analysis facilities in the world-from tools
for decoding the genome, to determining protein structure, to analyzing the cellular
environment and whole tissues," Arkin says. "These are the facilities, and the people, we
need to ask the complex questions. And the opportunities are spectacular."
###