Currently Active Research Topics:


Extreme Synergy: Recent results obtained by Bill Bialek and collaborators at Princeton indicate that even relatively weak pairwise correlations between retinal neurons can, in aggregate, produce astonishing levels of order in the resulting firing patterns, in much the same way that interactions between neighboring spins can lead to global ordering of ferromagnetic materials below the Curie temperature. I have recently been investigating the hypothesis that realistic pairwise correlations of this kind could allow downstream targets to reconstruct visual stimuli from retinal spike trains more rapidly. My findings, reported in a preprint entitled "Extreme Synergy," suggest that information about the local intensity of each pixel can in many cases be widely distributed across a large population containing hundreds of retinal ganglion cells all responding to the same contiguous stimulus, a non-local encoding strategy that may have evolved to minimize the number of spikes needed to support rapid image reconstruction.

Extreme Synergy. Spike trains 100 msec in duration with a baseline rate of 25 Hz were used to simulate firing activity in a 32×32 retinal patch. IMAGE: Stimuli were square spots with 10% of pixels randomly deleted. Intensity indicated to the left of each panel. RATE: Representative reconstructions based on spike counts alone. γMUA*: Reconstructions from the same spike trains, based instead on the largest singular vector of the pairwise correlation matrix, weighted by the oscillatory component of the Local Field Potential in the γ-band (γMUA*). Percentage of pixels correctly classified as ON or OFF indicated at bottom right of each panel. Spatiotemporal correlations yielded dramatic improvements in image reconstruction at all intensities without sacrificing fine spatial detail.
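The flavor of the result can be conveyed with a toy simulation. Everything below is a hedged sketch, not the analysis from the preprint: the shared 40 Hz modulation, the choice of equal mean rates for ON and OFF cells (so that a pure rate code is blind by construction), and a leading-eigenvector readout standing in for the γMUA*-weighted singular-vector analysis are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters only (chosen for a fast demo, not taken from
# the preprint): 32x32 patch, 1 ms bins, 80 repeated 100 ms trials.
# To isolate the synergy effect, ON and OFF cells share the SAME mean
# rate (25 Hz); stimulus identity is carried only by correlations.
n_side, n_cells = 32, 32 * 32
T = 80 * 100                                   # total bins (ms)
base_rate = 25.0 / 1000.0                      # spikes per 1 ms bin

# Square spot with 10% of its pixels randomly deleted.
stim = np.zeros((n_side, n_side), dtype=bool)
stim[8:24, 8:24] = True
stim &= rng.random((n_side, n_side)) > 0.10
labels = stim.ravel()

# Cells inside the spot share a gamma-band (40 Hz) rate modulation,
# inducing weak pairwise correlations among them.
gamma = 1.0 + np.sin(2 * np.pi * 0.040 * np.arange(T))
rates = np.where(labels[:, None], base_rate * gamma, base_rate)
spikes = rng.random((n_cells, T)) < rates      # Poisson-like spiking

# RATE readout: threshold each cell's spike count (uninformative here,
# since mean rates are equal by construction).
counts = spikes.sum(axis=1)
rate_recon = counts > counts.mean()

# Correlation readout: the leading eigenvector of the pairwise
# correlation matrix picks out the co-modulated (spot) cells.
z = spikes - spikes.mean(axis=1, keepdims=True)
z /= z.std(axis=1, keepdims=True)
corr = (z @ z.T) / T
u = np.linalg.eigh(corr)[1][:, -1]             # vector of largest eigenvalue
if u.sum() < 0:
    u = -u                                     # fix the arbitrary sign
corr_recon = u > u.mean()

rate_acc = (rate_recon == labels).mean()
corr_acc = (corr_recon == labels).mean()
print(f"rate readout: {rate_acc:.1%}   correlation readout: {corr_acc:.1%}")
```

With intensity information carried only in the shared modulation, the count-based readout should sit near chance while the eigenvector readout recovers the spot, which is the sense in which weak pairwise correlations are synergistic at the population level.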



Self-Repairing Synapses: One of the great unsolved mysteries of neuroscience is how we are able to retain long-term memories despite the known volatility of individual synapses, which undergo ongoing random changes arising from both intrinsic and extrinsic sources. We suggest that a robust solution to this problem requires a fundamental reassessment of what types of information can and cannot be learned by biological systems. Specifically, we suggest that in order to remain stable under random fluctuations, the decision surfaces maintained by any given pattern of synaptic weights must correspond to separable independent components of the raw environmental input. The problem of storing memories over long periods, despite random fluctuations in individual synaptic weights, can thus be solved by exploiting the structure present in the environment itself. As a corollary, our findings suggest that in a purely random environment, long-term storage of information would be impossible.

Self-Repairing Synapses. The decision surface through time for the standard static implementation (top) and the dynamic implementation (bottom). Faults are injected into first-layer nodes.
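The core claim, that ongoing relearning from a structured environment can stabilize a decision surface against synaptic noise while a structureless environment cannot, can be caricatured in a few lines. This is a sketch under strong simplifying assumptions and is not the actual model: a single tanh unit stands in for a pattern of synaptic weights, Gaussian input batches for experience, and a delta rule for the repair process.

```python
import numpy as np

rng = np.random.default_rng(1)

dim, steps, noise, lr = 20, 1000, 0.05, 0.1
w_true = rng.standard_normal(dim)
w_true /= np.linalg.norm(w_true)               # the "true" decision normal

def structured_batch(n=32):
    """Structured environment: labels are a fixed separable function."""
    x = rng.standard_normal((n, dim))
    return x, np.sign(x @ w_true)

def random_batch(n=32):
    """Purely random environment: labels carry no stable structure."""
    x = rng.standard_normal((n, dim))
    return x, rng.choice([-1.0, 1.0], size=n)

def cosine(a, b):
    return abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

w_env = rng.standard_normal(dim)
w_rand = rng.standard_normal(dim)
for _ in range(steps):
    # Ongoing synaptic turnover perturbs both weight vectors.
    w_env += noise * rng.standard_normal(dim)
    w_rand += noise * rng.standard_normal(dim)
    # Both units keep learning from fresh experience (delta rule).
    for w, batch in ((w_env, structured_batch), (w_rand, random_batch)):
        x, y = batch()
        w += lr * x.T @ (y - np.tanh(x @ w)) / len(y)

repair_cos = cosine(w_env, w_true)
random_cos = cosine(w_rand, w_true)
print("structured environment:", round(repair_cos, 2))
print("random environment:   ", round(random_cos, 2))
```

The unit trained on separable structure should keep its weight vector aligned with the true decision normal despite continual turnover, whereas the unit trained on random labels has nothing stable to relearn and drifts, echoing the corollary that a purely random environment supports no long-term storage.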



High-Performance Neural Computing: Simulating large, semi-realistic neural systems will clearly require massive computational resources. We are developing a suite of object-oriented tools that will allow any neural simulator to take maximum advantage of high-end computer clusters.


High-Performance Neural Computing. The SENSE pipeline consists of presenting rendered visual stimuli to a cortical model, decoding the output spike trains, and optimizing model parameters until the predicted segmentations match the labeled input. Input: naturalistic computer-rendered image sequences. Cortical model: hierarchical network of spiking neurons with realistic dendritic morphology and extensive feedback interactions. Output: spike trains encoding image segments as spatiotemporal patterns spanning multiple processing levels. Segmentation: labeled training data. Decoder: image segmentation extracted from cortical spike trains, optimized against the training data. Genetic algorithm: parameter-optimization routine.
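The outer optimization loop of such a pipeline can be sketched as follows. This is a toy stand-in, not SENSE itself: the "cortical model" is reduced to a two-parameter soft-thresholding unit and the labeled segmentation to a random binary mask; only the genetic-algorithm structure of selection, crossover, and mutation is the point.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-ins: a random test image and its "labeled
# segmentation" (a binary mask).
img = rng.random((16, 16))
target = (img > 0.6).astype(float)

def fitness(params):
    """Agreement between the model's soft segmentation and the labels.
    The two free parameters are a threshold and a gain."""
    thresh, gain = params
    prob = 1.0 / (1.0 + np.exp(-gain * (img - thresh)))
    return 1.0 - np.abs(prob - target).mean()

# Genetic algorithm: tournament selection, blend crossover, mutation.
pop = rng.uniform([0.0, 0.1], [1.0, 20.0], size=(40, 2))
for generation in range(40):
    scores = np.array([fitness(p) for p in pop])

    def pick():  # tournament of two
        i, j = rng.integers(len(pop), size=2)
        return pop[i] if scores[i] >= scores[j] else pop[j]

    pop = np.array([0.5 * (pick() + pick())            # blend crossover
                    + rng.normal(0.0, [0.02, 1.0])     # mutation
                    for _ in range(len(pop))])

best_fitness = max(fitness(p) for p in pop)
print("best fitness:", round(best_fitness, 3))
```

In the real pipeline the fitness evaluation would instead run the cortical model on rendered stimuli and decode its spike trains, but the evolutionary loop around that evaluation has the same shape.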