m8ta
{1449}
ref: -0 tags: sparse coding reference list olshausen field date: 08-04-2021 01:07 gmt revision:5

This was compiled from searching papers which referenced Olshausen and Field 1996 PMID-8637596 Emergence of simple-cell receptive field properties by learning a sparse code for natural images.

{1496}
ref: -2017 tags: locality sensitive hashing olfaction kenyon cells neuron sparse representation date: 01-18-2020 21:13 gmt revision:1

PMID-29123069 A neural algorithm for a fundamental computing problem

  • General idea: locality-sensitive hashing, i.e. hashing that is sensitive to the high-dimensional locality of the input space, can be performed efficiently by a circuit inspired by the insect olfactory system.
  • Here, activation of 50 different types of ORNs is mapped to 50 projection neurons, which 'centers the mean' -- concentration dependence is removed.
  • This is then projected via a random matrix of sparse binary weights to a much larger set of Kenyon cells, which in turn are inhibited by one APL neuron.
  • Normal locality-sensitive hashing uses dense matrices of Gaussian-distributed random weights, which means higher computational complexity...
  • ... these projections are governed by the Johnson-Lindenstrauss lemma, which says that projection from high-d to low-d space can preserve locality (distance between points) within an error bound.
  • Shows that WTA selection of the top 5%, plus the random binary weights, preserves locality -- as measured by overlap with exact nearest-neighbor search -- on toy data sets including MNIST and SIFT.
  • Flashy title as much as anything else got this into Science... indeed, has only been cited 6 times in Pubmed.
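The scheme above can be sketched in a few lines. This is a toy illustration, not the paper's code; the Kenyon-cell count, fan-in, and top-5% threshold are illustrative parameter choices:

```python
import numpy as np

def make_fly_hash(d, n_kenyon=2000, n_conn=6, top_frac=0.05, seed=0):
    """Fly-inspired LSH: sparse binary random projection followed by
    winner-take-all. Parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    M = np.zeros((n_kenyon, d))
    for i in range(n_kenyon):
        # each Kenyon cell samples a few projection neurons at random
        M[i, rng.choice(d, size=n_conn, replace=False)] = 1.0
    k = int(top_frac * n_kenyon)

    def fly_hash(x):
        x = x - x.mean()                # "center the mean" (remove concentration)
        y = M @ x                       # expand to the Kenyon-cell layer
        tag = np.zeros(n_kenyon)
        tag[np.argsort(y)[-k:]] = 1.0  # APL inhibition: keep only the top ~5%
        return tag

    return fly_hash

# nearby inputs should get strongly overlapping tags; unrelated inputs should not
h = make_fly_hash(d=50)
rng = np.random.default_rng(1)
x = rng.normal(size=50)
t = h(x)
t_close = h(x + 0.05 * rng.normal(size=50))
t_far = h(rng.normal(size=50))
```

Because the tag is a fixed-size binary vector with exactly k ones, overlap (dot product) between tags serves directly as the similarity measure.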

{1451}
ref: -2018 tags: sparse representation auditory cortex excitation inhibition balance date: 03-11-2019 20:47 gmt revision:1

PMID-30307493 Sparse Representation in Awake Auditory Cortex: Cell-type Dependence, Synaptic Mechanisms, Developmental Emergence, and Modulation.

  • Sparse representation arises during development in an experience-dependent manner, accompanied by differential changes of excitatory input strength and a transition from unimodal to bimodal distribution of E/I ratios.

{1448}
ref: -2004 tags: Olshausen sparse coding review date: 03-08-2019 07:02 gmt revision:0

PMID-15321069 Sparse coding of sensory inputs

  • Classic review, Olshausen and Field. 15 years old now!
  • Note the sparsity here is in neuronal activation, not synaptic activity (though one should follow the other).
  • References Lewicki's auditory studies, Efficient coding of natural sounds 2002; properties of early auditory neurons are well suited for producing a sparse independent code.
    • Studies have found near-binary encoding of stimuli in rat auditory cortex -- e.g. on the order of one spike per noise burst.
  • Suggests that overcomplete representations (e.g. where there are more 'second layer' neurons than inputs or pixels) are useful for flattening manifolds in the input space, making feature extraction easier.
    • But then you have an under-determined problem, where presumably sparsity metrics step in to restrict the actual coding space. Authors mention that this could lead to degeneracy.
    • An example is early visual cortex, where axons to higher areas outnumber those from the LGN by a factor of 25 -- which, they say, may be a compromise between over-representation and degeneracy.
  • Sparse coding is a necessity from an energy standpoint -- only one in 50 neurons can be active at any given time.
  • Sparsity increases when a classical receptive-field stimulus in V1 is expanded with a surround having natural-scene statistics (Gallant 2002).
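The sparse coding framework the review builds on can be written as inferring coefficients a that minimize ½||x − Φa||² + λ||a||₁ for an overcomplete dictionary Φ. A minimal inference sketch using ISTA (proximal gradient descent); a random, unlearned dictionary stands in for one learned from natural images:

```python
import numpy as np

def sparse_code(x, Phi, lam=0.1, n_iter=300):
    """Minimize 0.5*||x - Phi a||^2 + lam*||a||_1 over a via ISTA.
    Phi here is a fixed random dictionary; in Olshausen & Field it
    would be learned from natural image patches."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ a - x)         # gradient of the quadratic term
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(0)
Phi = rng.normal(size=(64, 256))             # overcomplete: 256 bases, 64 "pixels"
Phi /= np.linalg.norm(Phi, axis=0)           # unit-norm dictionary columns
a_true = np.zeros(256)
a_true[rng.choice(256, 5, replace=False)] = 1.0
x = Phi @ a_true                             # signal with a 5-sparse code
a = sparse_code(x, Phi, lam=0.1)
```

The L1 penalty is what resolves the under-determination mentioned above: with more bases than pixels, infinitely many a reconstruct x, and sparsity selects among them.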

{1431}
ref: -0 tags: betzig sparse and composite coherent lattices date: 02-14-2019 00:00 gmt revision:1

Sparse and composite coherent lattices

  • Focused on the math:
    • Linear algebra to find the wavevectors from the Bravais primitive vectors;
    • Iterative maximization @ lattice points to find the electric field phase and amplitude
    • (Read paper for details)
  • A high-NA objective naturally converts a plane wave into a spherical wave; this can be used to create spherically-constrained lattices at the focal point of the objective.
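The first step above (wavevectors from the Bravais primitive vectors) starts from the standard reciprocal-lattice relation a_i · b_j = 2π δ_ij, i.e. B = 2π A⁻ᵀ. A minimal sketch for a 2-D square lattice:

```python
import numpy as np

def reciprocal_vectors(A):
    """Columns of A are the Bravais primitive vectors a_i; returns B whose
    columns b_j satisfy a_i . b_j = 2*pi*delta_ij -- the standard starting
    point for choosing the lattice wavevectors."""
    return 2.0 * np.pi * np.linalg.inv(A).T

# 2-D square lattice with spacing d
d = 1.0
A = np.array([[d, 0.0],
              [0.0, d]])
B = reciprocal_vectors(A)
```

This only covers the linear-algebra step; the phase/amplitude optimization at the lattice points is the paper's own iterative procedure.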