Currently Active Research Topics:
Extreme Synergy: Recent results obtained by Bill Bialek and collaborators at Princeton indicate that even relatively weak pairwise correlations between retinal neurons can, in aggregate, produce astonishing levels of order in the resulting firing patterns, much as interactions between local domains lead to global ordering of ferromagnetic materials below the Curie temperature. I have recently been investigating the hypothesis that realistic pairwise correlations of this kind could allow downstream targets to reconstruct visual stimuli from retinal spike trains more rapidly. My findings, reported in a preprint entitled Extreme Synergy, suggest that information about the local intensity of each pixel can often be distributed across a large population of hundreds of retinal ganglion cells all responding to the same contiguous stimulus. This non-local encoding strategy may have evolved to minimize the number of spikes needed to support rapid image reconstruction.
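The core statistical effect can be seen in a toy model. The sketch below is my own illustration, not code from the preprint, and the dichotomized-Gaussian common-input model and all parameter values are assumptions chosen for clarity: each of 100 binary neurons receives a weakly shared latent input, so pairwise correlations are small, yet the variance of the population spike count is several times that of an independent population with the same firing rates.

```python
import numpy as np

# Toy dichotomized-Gaussian model (illustrative parameters, not fit to data)
rng = np.random.default_rng(0)
n_neurons, n_samples = 100, 20000
rho = 0.1  # weak correlation of the shared latent input

# Each neuron thresholds a mix of shared and private Gaussian input
shared = rng.standard_normal(n_samples)
private = rng.standard_normal((n_samples, n_neurons))
spikes = (np.sqrt(rho) * shared[:, None]
          + np.sqrt(1 - rho) * private > 1.0)

# Independent control population with the same marginal firing probability
indep = rng.standard_normal((n_samples, n_neurons)) > 1.0

# Collective order shows up in the population spike-count variance
ratio = spikes.sum(axis=1).var() / indep.sum(axis=1).var()
print(f"population-count variance ratio: {ratio:.1f}")
```

Even though each pair of neurons is only weakly correlated, the shared input makes the whole population fluctuate together, which is the aggregate ordering effect described above.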
Self-Repairing Synapses: One of the great unsolved mysteries of neuroscience is how we retain long-term memories despite the known volatility of individual synapses, which undergo ongoing random changes from both intrinsic and extrinsic sources. We suggest that a robust solution to this problem requires a fundamental reassessment of what types of information can and cannot be learned by biological systems. Specifically, for the decision surfaces maintained by a given pattern of synaptic weights to remain stable under random fluctuations, those surfaces must correspond to separable independent components of the raw environmental input. The problem of storing memories over long periods, despite random fluctuations in individual synaptic weights, can thus be solved by exploiting the structure present in the environment itself. As a corollary, our findings suggest that in a purely random environment, long-term storage of information would be impossible.
[Figure] Self-Repairing Synapses: the decision surface through time for the standard static implementation (top) and the dynamic implementation (bottom). Faults are injected into first-layer nodes.
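A minimal numerical sketch of the corollary, using a linear readout with invented parameters rather than the actual model: a decision surface aligned with structure in the input survives large random weight fluctuations, while a readout that memorizes arbitrary labels on unstructured input degrades toward chance.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 500, 50  # samples, input dimensions (illustrative sizes)

def accuracy_under_noise(X, y, w, sigma=0.5, trials=200):
    """Mean accuracy after multiplicative random noise on the weights."""
    accs = []
    for _ in range(trials):
        w_noisy = w * (1 + sigma * rng.standard_normal(w.shape))
        accs.append(np.mean(np.sign(X @ w_noisy) == y))
    return float(np.mean(accs))

# Structured environment: labels follow a direction present in the input
y = rng.choice([-1.0, 1.0], n)
u = rng.standard_normal(d)
u /= np.linalg.norm(u)
X_struct = rng.standard_normal((n, d)) + 2.0 * y[:, None] * u
w_fit = X_struct.T @ y  # Hebbian-style readout aligned with the structure
acc_struct = accuracy_under_noise(X_struct, y, w_fit)

# Random environment: labels uncorrelated with any input structure
y_rand = rng.choice([-1.0, 1.0], n)
X_rand = rng.standard_normal((n, d))
w_mem = np.linalg.lstsq(X_rand, y_rand, rcond=None)[0]  # best memorized fit
acc_rand = accuracy_under_noise(X_rand, y_rand, w_mem)

print(f"structured: {acc_struct:.2f}, random: {acc_rand:.2f}")
```

The structured readout stays accurate under 50% weight jitter because its decision surface tracks a stable feature of the environment; the memorized labeling has no such feature to fall back on, consistent with the claim that long-term storage is impossible in a purely random environment.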
High-Performance Neural Computing: Simulating large, semi-realistic neural systems will clearly require massive computational resources. We are developing a suite of object-oriented tools that will allow any neural simulator to take maximum advantage of high-end computer clusters.
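The basic domain-decomposition idea behind such tools can be sketched in a few lines. This is not the toolkit itself: worker threads here stand in for cluster nodes, the leaky integrate-and-fire update and all constants are invented for illustration, and a real cluster deployment would use message passing (e.g. MPI) rather than shared memory.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def update_block(v, drive):
    """Advance one block of leaky integrate-and-fire neurons by one step."""
    v = 0.9 * v + drive           # leaky integration of the input current
    spikes = v > 1.0              # threshold crossing
    v = np.where(spikes, 0.0, v)  # reset neurons that spiked
    return v, spikes

def simulate(n_neurons=400, n_blocks=4, n_steps=50, seed=0):
    """Domain decomposition: each block is advanced by its own worker."""
    rng = np.random.default_rng(seed)
    blocks = np.array_split(np.zeros(n_neurons), n_blocks)
    total_spikes = 0
    with ThreadPoolExecutor(max_workers=n_blocks) as pool:
        for _ in range(n_steps):
            drives = [0.2 * rng.random(len(b)) for b in blocks]
            results = list(pool.map(update_block, blocks, drives))
            blocks = [v for v, _ in results]
            total_spikes += int(sum(s.sum() for _, s in results))
    return total_spikes
```

Partitioning neurons into blocks that communicate only at step boundaries is what lets the update scale across processors; the object-oriented layer described above would hide this partitioning behind the simulator's existing interface.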