{231} is owned by tlh24.
ref: -0 tags: sparse coding reference list olshausen field date: 08-04-2021 01:07 gmt revision:5

This was compiled by searching for papers that reference Olshausen and Field 1996, PMID-8637596 Emergence of simple-cell receptive field properties by learning a sparse code for natural images.
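As a reminder of the technique this list is organized around: sparse coding seeks a code in which each image patch is explained by only a few active dictionary coefficients. Below is a minimal sketch, not the Olshausen & Field learning rule itself; it solves the sparse inference step with ISTA on a fixed random dictionary, and all sizes and the `ista` helper are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: 64-dim "image patches", 128-atom dictionary.
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)        # unit-norm dictionary atoms
x = rng.standard_normal(64)           # one (fake) patch

def ista(x, D, lam=0.1, steps=200):
    """Minimize 0.5*||x - D a||^2 + lam*||a||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(D, 2) ** 2     # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(steps):
        g = D.T @ (D @ a - x)         # gradient of the quadratic term
        a = a - g / L
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft-threshold
    return a

a = ista(x, D)
print(np.mean(a != 0))                # fraction of active coefficients
```

In the full Olshausen & Field scheme the dictionary D is itself learned by gradient ascent on the same objective, alternating with this inference step.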

ref: -2017 tags: human level concept learning through probabilistic program induction date: 01-20-2020 15:45 gmt revision:0

PMID-26659050 Human-level concept learning through probabilistic program induction

  • Preface:
    • How do people learn new concepts from just one or a few examples?
    • And how do people learn such abstract, rich, and flexible representations?
    • How can learning succeed from such a sparse dataset and still produce such rich representations?
    • For any theory of learning, fitting a more complicated model requires more data, not less, to achieve good generalization, usually measured as the gap in performance between new and old examples.
  • Learning proceeds by constructing programs that best explain the observations under a Bayesian criterion, and the model 'learns to learn' by developing hierarchical priors that allow previous experience with related concepts to ease learning of new concepts.
  • These priors represent learned inductive bias that abstracts the key regularities and dimensions of variation holding across both types of concepts and across instances.
  • BPL can construct new programs by reusing pieces of existing ones, capturing the causal and compositional properties of real-world generative processes operating on multiple scales.
  • Posterior inference requires searching the large combinatorial space of programs that could have generated a raw image.
    • Our strategy uses fast bottom-up methods (31) to propose a range of candidate parses.
    • That is, they reduce the character to a set of lines (series of line segments), simplify the intersections of those lines, and run a series of parses to estimate how those lines were generated, with heuristic criteria to encourage continuity (e.g. no sharp angles, a penalty for abruptly changing direction, etc.).
    • The most promising candidates are refined by using continuous optimization and local search, forming a discrete approximation to the posterior distribution P(program, parameters | image).
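The last step above (score each candidate parse under a Bayesian criterion, keep the best few, and renormalize them into a discrete approximation to the posterior over programs) can be sketched as follows. The parse names and log scores here are entirely made up for illustration; the real model scores full motor programs against the raw image.

```python
import math

# Made-up candidate parses with (log prior, log likelihood) scores.
candidates = {
    "two-strokes-no-lift": (-2.1, -4.0),
    "two-strokes-lift":    (-2.5, -3.2),
    "one-stroke":          (-1.0, -9.5),
    "three-strokes":       (-3.8, -3.9),
}

K = 3  # keep the top-K parses as the discrete posterior approximation
scored = sorted(((lp + ll, name) for name, (lp, ll) in candidates.items()),
                reverse=True)[:K]

# Renormalize the kept scores with a stable log-sum-exp.
m = max(s for s, _ in scored)
z = m + math.log(sum(math.exp(s - m) for s, _ in scored))
posterior = {name: math.exp(s - z) for s, name in scored}

for name, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.3f}")
```

The poorly-scoring "one-stroke" parse is dropped entirely, which is the sense in which this is only an approximation to the true posterior.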

ref: -0 tags: standard enthalpy chemicals list pdf date: 06-25-2015 00:09 gmt revision:1

Standard thermodynamic properties of chemical substances

ref: Velliste-2008.06 tags: Schwartz 2008 Velliste BMI feeding population vector date: 01-06-2012 00:19 gmt revision:1

PMID-18509337[0] Cortical control of a prosthetic arm for self-feeding

  • Idea: move BMI into robotic control.
  • Population vector control, which has been shown to be inferior to the Wiener filter.
  • 112 units for control in one monkey. 2 monkeys used.
  • 4D control -- x, y, z, gripper.
  • 1064 trials over 13 days, average success rate of 78%
  • Gripper opened as the arm returned to mouth. Works b/c marshmallows are sticky.
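For reference, population vector decoding sums each unit's preferred direction, weighted by how far its firing rate sits above baseline. A minimal sketch, assuming cosine tuning and noiseless rates; the 112-unit count comes from the paper, but the tuning parameters and 3D-only decode are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each of 112 units gets a random unit-norm preferred direction (PD) in 3D.
n_units = 112
pd = rng.standard_normal((n_units, 3))
pd /= np.linalg.norm(pd, axis=1, keepdims=True)

true_dir = np.array([1.0, 0.0, 0.0])           # intended movement direction
baseline, gain = 10.0, 5.0
rates = baseline + gain * (pd @ true_dir)      # cosine tuning, no noise

# Population vector: PDs weighted by rate above baseline, summed.
pv = ((rates - baseline)[:, None] * pd).sum(axis=0)
pv /= np.linalg.norm(pv)

print(np.dot(pv, true_dir))                    # alignment with the true direction
```

With enough units whose PDs cover the sphere evenly, the decoded vector aligns closely with the intended direction; uneven PD coverage is one source of the bias that makes this inferior to regression-based filters.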


[0] Velliste M, Perel S, Spalding MC, Whitford AS, Schwartz AB. Cortical control of a prosthetic arm for self-feeding. Nature 453(7198):1098-101 (2008 Jun 19)

ref: notes-0 tags: debian apt sources list date: 0-0-2006 0:0 revision:0


#deb file:///cdrom/ sarge main

deb http://mirrors.kernel.org/debian/ unstable main contrib non-free
deb-src http://mirrors.kernel.org/debian/ unstable main contrib non-free

deb http://security.debian.org/ stable/updates main contrib non-free
Later, run as root:
  # apt-get update
  # aptitude install aptitude 
  # aptitude -f --with-recommends dist-upgrade
  # reboot
Then you'll probably want to get a newer kernel, e.g.: http://www.howtoforge.com/forums/showthread.php?t=21

ref: bookmark-0 tags: Bayes Bayesian_networks probability probabilistic_networks Kalman ICA PCA HMM Dynamic_programming inference learning date: 0-0-2006 0:0 revision:0

http://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html Very, very good! Many references, well explained too.
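A tiny worked example of the kind of inference the tutorial covers: Bayes' rule on a two-node network (Rain -> Wet). The probabilities are made up for illustration.

```python
# P(Rain) and P(Wet | Rain) fully specify this two-node Bayes net.
p_rain = 0.2
p_wet_given = {True: 0.9, False: 0.1}   # P(Wet=true | Rain)

# Posterior P(Rain=true | Wet=true) by Bayes' rule:
joint_true = p_rain * p_wet_given[True]          # 0.2 * 0.9 = 0.18
joint_false = (1 - p_rain) * p_wet_given[False]  # 0.8 * 0.1 = 0.08
posterior = joint_true / (joint_true + joint_false)

print(round(posterior, 3))  # 0.18 / 0.26 = 0.692
```

Exact inference in larger networks (enumeration, variable elimination, junction trees) is this same sum-and-normalize operation organized to avoid redundant work.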