**THIS IS HISTORY! Check fabien.benetou.fr for news.**



Compute delta(scientogram(domain x, instant t), scientogram(domain x, instant t-1)) and use information geometry (IG) to guess the state at t+1. The more scientograms available for a smaller domain x, the better the precision.
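A naive sketch of this idea, under invented assumptions: a scientogram is represented here as a probability distribution over terms of domain x, the "delta" is the Kullback-Leibler divergence (a basic information-geometric divergence), and the guess at t+1 is a simple linear extrapolation of each term's weight. All data and function names are hypothetical.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence between two term distributions:
    an elementary information-geometric 'delta' between snapshots."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def guess_next(prev, curr):
    """Naively extrapolate each term's weight one step forward
    (t+1 = t + (t - (t-1))), clip at a tiny positive floor, renormalize."""
    raw = [max(c + (c - p), 1e-9) for p, c in zip(prev, curr)]
    total = sum(raw)
    return [r / total for r in raw]

# Two toy snapshots of a domain x (term distributions at t-1 and t).
t_minus_1 = [0.50, 0.30, 0.20]
t_now = [0.40, 0.35, 0.25]

print(kl_divergence(t_now, t_minus_1))  # how much the domain shifted
print(guess_next(t_minus_1, t_now))     # guessed distribution at t+1
```

With more snapshots of a smaller domain, the extrapolation could be replaced by a fit over the whole trajectory, which matches the remark that precision grows with the number of available scientograms.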

Hypothesis: the limit on the amount of information one can memorize with a "cramming" strategy (rote learning), call it Lic, is much smaller than the limit of an "incremental" strategy based on Lic(t) = Lic(t-1) + transformation, which is roughly analogous to fractal compression based on the t-1 state, to MPEG-4, or to a Fourier transform.
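A toy numeric contrast between the two strategies, with made-up parameters (the cramming ceiling and the transformation gain are arbitrary, chosen only to illustrate the hypothesized gap): cramming saturates at a hard limit, while the recurrence Lic(t) = Lic(t-1) + transformation compounds on the previous state.

```python
def cramming(steps, ceiling=100, rate=10):
    """Rote learning: retained information grows linearly then
    saturates at a hard ceiling Lic (hypothetical numbers)."""
    return min(rate * steps, ceiling)

def incremental(steps, gain=0.1, start=10):
    """Incremental strategy: each step applies a transformation
    proportional to the t-1 state, so retention compounds."""
    state = start
    for _ in range(steps):
        state += gain * state  # Lic(t) = Lic(t-1) + transformation
    return state

for steps in (10, 50, 100):
    print(steps, cramming(steps), round(incremental(steps), 1))
```

The point of the sketch is only the shape of the curves: one bounded, one compounding, matching the claim that Lic (cramming) << the incremental limit.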

The pleasure given by teaching = finding the ideal transformation.

Transforming an individual from one knowledge/information state to another is where IG could be useful:

- by minimizing effort
- while maximizing the amount of information added (as that probably maximizes the pleasure for the learner)
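One way to make "minimize effort, maximize information added" concrete is a greedy ratio heuristic over candidate transformations. The candidate list, its effort costs, and its information gains below are entirely invented for illustration:

```python
# Candidate transformations of a learner's state:
# (name, effort cost, information added). Hypothetical values.
candidates = [
    ("re-read chapter", 5.0, 2.0),
    ("worked example", 3.0, 4.0),
    ("analogy to known concept", 1.0, 3.0),
    ("full formal proof", 8.0, 6.0),
]

def ideal_transformation(options):
    """Greedy pick: the transformation with the best
    information-added per unit of effort."""
    return max(options, key=lambda o: o[2] / o[1])

print(ideal_transformation(candidates)[0])
```

A real IG formulation would presumably measure "information added" as a divergence between the learner's current and target state distributions, but the ratio objective stays the same.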

- Briefly studied materials
  - LIX Emerging Trends in Visual Computing (ETVC'08)
  - The Intrinsic Geometries of Learning, Richard Nock, ETVC'08
  - Information Geometry and Its Applications, Shun-ichi Amari, ETVC'08
  - proceedings from this conference: Information Geometry and Its Applications: Convex Function and Dually Flat Manifold, Shun-ichi Amari, Emerging Trends in Visual Computing, Springer, 2009
  - Methods of Information Geometry, Shun-ichi Amari and Hiroshi Nagaoka, American Mathematical Society, 2001
  - Shun-ichi Amari on Information Geometry of Maximum Entropy Principle, MaxEnt2007, the 27th International Workshop on Bayesian Inference and Maximum Entropy, New York, 2007

Page last modified on July 29, 2009, at 03:21 PM