m8ta
{1537}
ref: -0
tags: cortical computation learning predictive coding reviews
date: 02-23-2021 20:15 gmt
revision:2
PMID-30359606 Predictive Processing: A Canonical Cortical Computation
PMID-23177956 Canonical microcircuits for predictive coding
Control of synaptic plasticity in deep cortical networks
{1512}
ref: -0
tags: rutherford journal computational theory neumann complexity wolfram
date: 05-05-2020 18:15 gmt
revision:0
The Structures for Computation and the Mathematical Structure of Nature
{1472}
ref: -0
tags: computational neuroscience opinion tony zador konrad kording lillicrap
date: 07-30-2019 21:04 gmt
revision:0
Two papers out recently on arXiv and bioRxiv:
{1391}
Evolutionary Plasticity and Innovations in Complex Metabolic Reaction Networks
Summary thoughts: This is a highly interesting study, insofar as the authors show substantial support for their hypothesis that phenotypes can be explored through random-walk, non-lethal mutations of the genotype, and that this is somewhat invariant to the source of carbon among known biochemical reactions. What gives me pause is the use of linear programming / optimization when setting the relative concentrations of biomolecules, and the permissive criteria for accepting these networks; real life (I would imagine) is far more constrained. Relative and absolute concentrations matter. Still, the study does demonstrate some robustness. I suggest that a good control would be to 'fuzz' the list of available reactions based on statistical criteria, and see if the results still hold. Then, go back and make the reactions un-biological or less networked, and see if this destroys the measured degree of robustness.
{1083}
ref: Holgado-2010.09
tags: DBS oscillations beta globus pallidus simulation computational model
date: 02-22-2012 18:36 gmt
revision:4
PMID-20844130[0] Conditions for the Generation of Beta Oscillations in the Subthalamic Nucleus–Globus Pallidus Network
____References____
{931}
ref: Deco-2009.05
tags: stochastic dynamics Romo memory computation
date: 01-16-2012 18:54 gmt
revision:1
PMID-19428958[0] Stochastic dynamics as a principle of brain function
____References____
{154}
ref: OReilly-2006.02
tags: computational model prefrontal_cortex basal_ganglia
date: 12-07-2011 04:11 gmt
revision:1
PMID-16378516[0] Making Working Memory Work: A Computational Model of Learning in the Prefrontal Cortex and Basal Ganglia
found via: http://www.citeulike.org/tag/basal-ganglia
____References____
{714}
PMID-12433288[0] Real-time computing without stable states: a new framework for neural computation based on perturbations.
____References____
{793}
Andrew Ng's notes on learning theory
{661}
ref: -0
tags: computational geometry triangulation ocaml kicadocaml zone fill edge
date: 01-26-2009 01:47 gmt
revision:3
I have been working hard to add zone support to kicadocaml, since the implementation in kicad's PCBnew is somewhat broken (at least for my boards). It is not an easy task! Roughly, the task is this: given a zone of copper pour, perhaps attached to the ground net, and a series of tracks, vias, and pads also on that layer of the PCB but not on the same net, form cutouts in the zone so that there is an even spacing between the tracks/vias and the zone.

Currently I'm attacking the problem using triangles (not polygons, like the other PCB softwares). I chose triangles since I'm using OpenGL to display the PCB, and triangles are a very native mode of drawing in OpenGL. Points are added to the triangle mesh with an incremental algorithm, where the triangles are stored as a linked mesh: each triangle has a pointer (index #) to the triangle across edges ab, bc, and ca. This makes finding the triangle that contains a point to be inserted a matter of jumping between triangles; since many of the points to be inserted are close to each other, this is a relatively efficient algorithm. Once the triangle containing a point to be inserted is found, the triangle is split into three, the pointers are updated appropriately, and each triangle is tested to see if flipping with its pair would result in a larger smallest interior angle between the two. (This is not the same as the Delaunay criterion, but it is simpler, and it produces equally beautiful pictures.)

The problem is when two triangles are allowed to overlap, or a gap is allowed - this makes the search algorithm die or get into a loop, and it is a major problem with the approach. In Guibas and Stolfi's paper, "Primitives for the manipulation of general subdivisions and the computation of Voronoi diagrams", they use an edge data structure rather than a triangle data structure, which I suppose avoids this problem. I was lazy when starting this project, and chose the more obvious triangle-centric way of storing the data.
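The flip test described above can be sketched self-contained. This is a minimal illustration, not the actual kicadocaml code: all helper names here (sub, dot, len, angle, min_angle, should_flip) are assumptions. Triangles (a,b,c) and (a,c,d) share diagonal a-c; the diagonal is flipped to b-d only if that increases the smallest interior angle of the pair.

```ocaml
(* Hypothetical 2-D vector helpers; names are illustrative only. *)
let sub (x1, y1) (x2, y2) = (x1 -. x2, y1 -. y2)
let dot (x1, y1) (x2, y2) = x1 *. x2 +. y1 *. y2
let len v = sqrt (dot v v)

(* interior angle at vertex b of triangle a-b-c *)
let angle a b c =
  acos (dot (sub a b) (sub c b) /. (len (sub a b) *. len (sub c b)))

(* smallest interior angle of a triangle *)
let min_angle (a, b, c) =
  min (angle c a b) (min (angle a b c) (angle b c a))

(* true if flipping shared diagonal a-c of quad a-b-c-d to b-d
   yields a larger smallest interior angle over the two triangles *)
let should_flip a b c d =
  let before = min (min_angle (a, b, c)) (min_angle (a, c, d)) in
  let after  = min (min_angle (a, b, d)) (min_angle (b, c, d)) in
  after > before +. 1e-9
```

For example, two slivers along the long diagonal of a thin quad get flipped to the short diagonal, while a square (where both diagonals are equivalent) is left alone.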
The insertion of points is actually not so hard; the big problem is making sure the edges in the original list of polygons are represented in the list of edges in the triangle mesh. Otherwise, triangles will span edges, which will result in DRC violations (e.g. copper too close to vias). My inefficient way of doing this is to calculate, for all triangles, their intersections with the polygon segments, then add these to the mesh until all segments are represented in the list. This process, too, is prone to numerical instability. Perhaps the solution is to move back to an edge-centric data representation, so that certain edges can be 'pinned' or frozen, and hence are guaranteed to be in the triangle mesh's edge list. I don't know; I need to think about this more.

Update: I got most of it working; at least the triangulation & making sure the edges are in the triangle mesh are working. Mostly there were issues with numerical precision with narrow / small triangles; I rewrote the inside-triangle function to use the cross product, which helped (this seems like the simplest way, and it avoids divisions!):

```ocaml
let insidetri a b c d =
    cross (sub b a) (sub d a) > 0.0 &&
    cross (sub c b) (sub d b) > 0.0 &&
    cross (sub a c) (sub d c) > 0.0
    ;;
```

as well as the segment-segment intersection algorithm:

```ocaml
let intersect a b c d =
    (* see if two line segments intersect *)
    (* return the point of intersection too *)
    let ab = sub b a in (* a prime is the origin *)
    let bp = length ab in
    let xx = norm ab in
    let yy = (-1.) *. (snd xx) , (fst xx) in
    let project e = (dot (sub e a) xx) , (dot (sub e a) yy) in
    let cp = project c in
    let dp = project d in
    let cd = sub dp cp in
    let m = (fst cd) /. (snd cd) in
    let o = (fst cp) -. m *. (snd cp) in
    let e = add (scl ab (o /. bp)) a in
    (* cp and dp must span the x-axis *)
    if ((snd cp) <= 0. && (snd dp) >= 0.) || ((snd cp) >= 0. && (snd dp) <= 0.) then (
        if o >= 0. && o <= bp then ( true, e ) else ( false, e )
    ) else ( false, e )
    ;;
```

Everything was very sensitive to ">" vs. ">=" - all must be correct. All triangles must be CCW, too, for the inside algorithm to work - this requires that points to be inserted close to a triangle edge be snapped to that edge, to avoid any possible CW triangles. (Determining whether a triangle is CW or CCW is as simple as checking the sign of the cross product between two of its edge segments.) I tried, for a day or so, to include a specialized function to insert points along a triangle's edge, but that turned out not to matter; the normal flipping routine works fine. I also tried inserting auxiliary points to break up very small triangles, but that didn't much affect the stability of the algorithm. It is either correct, or it is not, and my large board was a good test suite.

I have, however, seeded the triangulation with a grid of (up to) 20x20 points (this depends on the aspect ratio of the region to be filled - the points are equally spaced in x and y). This adds (at most) 800 triangles, but it makes the algorithm more stable - fewer very narrow triangles - and we are working with sets of 10,000 triangles anyway for the large zones of copper. Some corrections remain to be done regarding removing triangles based on DRC violations, and using the linked mesh of triangles when calculating edge / triangle-edge intersections, but that should be relatively minor.

Now I have to figure out how to store the result in Kicad's ".brd" file format. Kicad uses the "Kbool" library for intersecting polygons - much faster than my triangle methods (well, it's in C, not ocaml) - and it generates concave polygons, not triangles. I would prefer to match that, so I don't have to re-implement gerber export. (Of course, look at how much I have re-implemented! This was originally a project just to learn ocaml - well, gotta have some fun :-)
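The CCW requirement and the cross-product inside test above can be exercised as a self-contained sketch; the helpers sub, cross, and ccw here are the obvious 2-D versions, assumed for illustration rather than taken from kicadocaml:

```ocaml
(* Assumed 2-D helpers, not the actual kicadocaml functions. *)
let sub (x1, y1) (x2, y2) = (x1 -. x2, y1 -. y2)
let cross (x1, y1) (x2, y2) = x1 *. y2 -. y1 *. x2

(* triangle a-b-c is counterclockwise: the sign of the cross
   product of two of its edge segments is positive *)
let ccw a b c = cross (sub b a) (sub c a) > 0.0

(* point d is strictly inside CCW triangle a-b-c iff d lies to
   the left of each directed edge ab, bc, ca *)
let insidetri a b c d =
  cross (sub b a) (sub d a) > 0.0 &&
  cross (sub c b) (sub d b) > 0.0 &&
  cross (sub a c) (sub d c) > 0.0
```

Note that insidetri silently returns false everywhere for a CW triangle, which is exactly why CW triangles break the point-location search and must be avoided by snapping near-edge points.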
{80}
ref: Chan-2006.12
tags: computational model primate arm musculoskeletal motor_control Moran
date: 04-09-2007 22:35 gmt
revision:1
PMID-17124337[0] Computational Model of a Primate Arm: from hand position to joint angles, joint torques, and muscle forces
ideas:
____References____
{151}
PMID-11741014 Computational approaches to motor control. Tamar Flash and Terry Sejnowski.