m8ta

{1574} 
ref: 0
tags: ocaml application functional programming
date: 10-11-2022 21:36 gmt
revision:2
[1] [0] [head]


https://stackoverflow.com/questions/26475765/ocaml-function-with-variable-number-of-arguments

From this I learned that in OCaml you can return not just functions (e.g. currying) but applications of yet-to-be-named functions.

let sum f = f 0 ;;
let arg a b c = c (b + a) ;;
let z a = a ;;

then

sum (arg 1) ;;

is well-typed as (int -> 'a) -> 'a = <fun>, e.g. an application of a function that converts int to 'a. Think of it as the application of Xa to the argument (0 + 1), where Xa is the argument (per the type signature). Zero is supplied by the definition of 'sum'.

sum (arg 1) (arg 2) ;;

can be parsed as

(sum (arg 1)) (arg 2) ;;

'(arg 2)' outputs an application of an int & a yet-to-be-determined function to 'a, e.g. it's typed as int -> (int -> 'a) -> 'a = <fun>. So you can call it Xa, passed to the above. Or, Xa = Xb((0 + 1) + 2) where, again, Xb is a yet-to-be-defined function that is supplied as an argument. Therefore, you can collapse the whole chain with the identity function z. But, of course, it could be anything else - square root perhaps for MSE? All very clever.  
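The chain can be checked end-to-end in the toplevel; a minimal sketch of the same sum/arg/z definitions, with a squaring continuation added as an illustration (the squaring terminator is my addition, not from the original post):

```ocaml
(* variable-argument summation via continuation passing:
   sum starts the accumulator at 0, each (arg n) adds n and passes the
   running total to the next function, and the final function collapses
   the chain into a plain value *)
let sum f = f 0
let arg a b c = c (b + a)
let z a = a                       (* identity: just return the total *)

let () =
  Printf.printf "%d\n" (sum (arg 1) z);                                (* 1 *)
  Printf.printf "%d\n" (sum (arg 1) (arg 2) z);                        (* 3 *)
  (* any int -> 'a function can terminate the chain, e.g. squaring *)
  Printf.printf "%d\n" (sum (arg 1) (arg 2) (arg 3) (fun x -> x * x))  (* 36 *)
```

Each `arg n` returns a value of type `(int -> 'a) -> 'a`, so the chain extends for as many arguments as you like until a terminating function of type `int -> 'a` is supplied.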
{1572} 
ref: 2019
tags: Piantadosi cognition combinators function logic
date: 09-05-2022 01:57 gmt
revision:0
[head]


 
{1288} 
ref: 0
tags: automatic programming inductive functional igor
date: 07-29-2014 02:07 gmt
revision:0
[head]


Inductive Rule Learning on the Knowledge Level.
 
{1141} 
ref: 0
tags: putamen functional organization basal ganglia
date: 02-24-2012 21:01 gmt
revision:0
[head]


PMID6705861 Single cell studies of the primate putamen. I. Functional organization.
 
{1088}  
PMID18540149[0] Deep brain stimulation: how does it work?
____References____
 
{788} 
ref: 0
tags: reinforcement learning basis function policy specialization
date: 01-03-2012 02:37 gmt
revision:1
[0] [head]


To read:  
{60} 
ref: Douglas1991.01
tags: functional microcircuit cat visual cortex microstimulation
date: 12-29-2011 05:12 gmt
revision:3
[2] [1] [0] [head]


PMID1666655[0] A functional microcircuit for cat visual cortex
____References____
 
{821} 
ref: work0
tags: differential evolution function optimization
date: 07-09-2010 14:46 gmt
revision:3
[2] [1] [0] [head]


Differential evolution (DE) is an optimization method, somewhat like Nelder-Mead or simulated annealing (SA). Much like genetic algorithms, it utilizes a population of solutions and selection to explore and optimize the objective function. However, instead of perturbing vectors randomly or greedily descending the objective function's gradient, it uses the difference between individual population vectors to update hypothetical solutions. See below for an illustration. On my rather cursory reading, this serves to adapt the distribution of hypothetical solutions (or population of solutions, to use the evolutionary term) to the structure of the underlying function to be optimized. Judging from the paper by Price and Storn (the inventors), images/821_1.pdf, DE works in situations where simulated annealing (which I am using presently, in the robot vision system) fails, and is applicable to higher-dimensional problems than simplex methods or SA. The paper tests DE on 100-dimensional problems, and it is able to solve these with on the order of 50k function evaluations. Furthermore, they show that it finds function extrema quicker than stochastic differential equations (SDE, alas from '85), which use the gradient of the function to be optimized. I'm surprised that this method slipped under my radar for so long - why hasn't anyone mentioned this? Is it because it has no proofs of convergence? Has it more recently been superseded? (The paper is from 1997.) Yet I'm pleased, because it means that there are many other algorithms equally clever and novel (and simple?) out there in the literature, or waiting to be discovered.  
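To make the difference-vector update concrete, here is a minimal sketch of the classic DE/rand/1/bin scheme, minimizing a simple sphere function. The population size, F, CR, search range, and generation count are illustrative defaults of my own choosing, not settings from the paper:

```ocaml
(* Sketch of DE/rand/1/bin: each trial vector is built by adding a scaled
   difference of two random population members to a third, then crossing
   over with the current member. Parameters are illustrative, not tuned. *)
let () = Random.self_init ()

let sphere v = Array.fold_left (fun acc x -> acc +. x *. x) 0.0 v

let de ?(np = 30) ?(f = 0.5) ?(cr = 0.9) ~dim ~gens cost =
  let rand_vec () = Array.init dim (fun _ -> Random.float 10.0 -. 5.0) in
  let pop = Array.init np (fun _ -> rand_vec ()) in
  let fit = Array.map cost pop in
  for _ = 1 to gens do
    for i = 0 to np - 1 do
      (* pick three distinct population indices, all different from i *)
      let rec distinct () =
        let a, b, c = Random.int np, Random.int np, Random.int np in
        if a <> b && b <> c && a <> c && a <> i && b <> i && c <> i
        then (a, b, c) else distinct () in
      let r1, r2, r3 = distinct () in
      let jr = Random.int dim in      (* force at least one mutated gene *)
      let trial = Array.init dim (fun j ->
        if j = jr || Random.float 1.0 < cr
        then pop.(r1).(j) +. f *. (pop.(r2).(j) -. pop.(r3).(j))
        else pop.(i).(j)) in
      let ft = cost trial in
      (* greedy selection: keep the trial only if it is no worse *)
      if ft <= fit.(i) then (pop.(i) <- trial; fit.(i) <- ft)
    done
  done;
  let best = ref 0 in
  Array.iteri (fun i v -> if v < fit.(!best) then best := i) fit;
  (pop.(!best), fit.(!best))

let () =
  let _, best_cost = de ~dim:5 ~gens:200 sphere in
  Printf.printf "best cost: %g\n" best_cost
```

Note that the only "gradient" information comes from the population geometry itself: as the population contracts into a valley, the difference vectors shrink and reorient with it, which is presumably the self-adaptation described above.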
{774} 
ref: work0
tags: functional programming compilation ocaml
date: 08-24-2009 14:33 gmt
revision:0
[head]


The implementation of functional programming languages  book!  
{764} 
ref: work0
tags: ocaml mysql programming functional
date: 07-03-2009 19:16 gmt
revision:2
[1] [0] [head]


For my work I store a lot of analyzed data in SQL databases. In one of these, I have stored the anatomical target that the data was recorded from - namely, STN or VIM thalamus. After updating the analysis programs, I needed to copy the anatomical target data over to the new SQL tables. Where Perl may have been my previous go-to language for this task, I've had enough of its strange quirks, hence decided to try it in Ruby (worked, but was not so elegant, as I don't actually know Ruby!) and then OCaml.

#use "topfind"
#require "mysql"

(* this function takes a query and a function that converts entries in a row to OCaml tuples *)
let read_table db query rowfunc =
	let r = Mysql.exec db query in
	let col = Mysql.column r in
	let rec loop = function
		| None -> []
		| Some x -> rowfunc col x :: loop (Mysql.fetch r)
	in
	loop (Mysql.fetch r)
;;

let _ =
	let db = Mysql.quick_connect ~host:"crispy" ~database:"turner" ~password:"" ~user:"" () in
	let nn = Mysql.not_null in
	(* this function builds a table of files (recording sessions) from a given target,
	   then uses the mysql UPDATE command to propagate to the new SQL database. *)
	let propagate targ =
		let t = read_table db
			("SELECT file, COUNT(file) FROM `xcor2` WHERE target='" ^ targ ^ "' GROUP BY file")
			(fun col row -> (
				nn Mysql.str2ml (col ~key:"file" ~row),
				nn Mysql.int2ml (col ~key:"COUNT(file)" ~row) ))
		in
		List.iter (fun (fname, _) ->
			let query = "UPDATE `xcor3` SET `target`='" ^ targ ^
				"' WHERE STRCMP(`file`,'" ^ fname ^ "')=0" in
			print_endline query;
			ignore (Mysql.exec db query)) t
	in
	propagate "STN";
	propagate "VIM";
	propagate "CTX";
	Mysql.disconnect db ;;

Interacting with MySQL is quite easy with OCaml - though the type system adds a certain overhead, it's not too bad.  
{762} 
ref: work0
tags: covariance matrix adaptation learning evolution continuous function normal gaussian statistics
date: 06-30-2009 15:07 gmt
revision:0
[head]


http://www.lri.fr/~hansen/cmatutorial.pdf
 
{183}  
PMID17237780[0] Switching from automatic to controlled action by monkey medial frontal cortex.
____References____  
{409} 
ref: bookmark0
tags: optimization function search matlab linear nonlinear programming
date: 08-09-2007 02:21 gmt
revision:0
[head]


http://www.mat.univie.ac.at/~neum/ very nice collection of links!!  
{139}  
PMID9804671 Constructive incremental learning from only local information  
{140}  
PMID15649663 Composite adaptive control with locally weighted statistical learning.
