Kolmogorov entropy

Also known as metric entropy. Divide phase space into $D$-dimensional hypercubes of content $\epsilon^D$. Let $P_{i_0,\ldots,i_n}$ be the probability that a trajectory is in hypercube $i_0$ at $t=0$, $i_1$ at $t=T$, $i_2$ at $t=2T$, etc. Then define

 K_n = -\sum_{i_0,\ldots,i_n} P_{i_0,\ldots,i_n} \ln P_{i_0,\ldots,i_n},   (1)

where $K_{n+1}-K_n$ is the information needed to predict which hypercube the trajectory will be in at time $(n+1)T$ given trajectories up to $nT$. The Kolmogorov entropy is then defined by

 K = \lim_{T \to 0} \lim_{\epsilon \to 0^+} \lim_{N \to \infty} \frac{1}{NT} \sum_{n=0}^{N-1} \left(K_{n+1} - K_n\right).   (2)
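
As a rough illustration of (1) and (2), the sketch below estimates the block entropies $K_n$ and the rate $K_{n+1}-K_n$ for a concrete system. Everything specific in it is an assumption chosen for illustration, not part of the definition above: the logistic map $x \mapsto 4x(1-x)$ as the dynamical system, a uniform partition of $[0,1]$ into bins of width $\epsilon$ playing the role of the hypercubes, a sampling time $T = 1$ iteration, and block probabilities estimated by counting symbol blocks along a single long trajectory.

```python
# A minimal sketch of estimating the block entropies K_n of equation (1) and
# the entropy rate K_{n+1} - K_n entering equation (2).
# Assumed (not from the definition above): the logistic map x -> 4x(1-x),
# a uniform partition of [0, 1] into bins of width eps as the "hypercubes",
# sampling time T = 1 iteration, and block probabilities P estimated by
# counting symbol blocks along a single long trajectory.
import numpy as np
from collections import Counter

def block_entropy(symbols, n):
    """K_n = -sum over blocks of P(block) * ln P(block), from empirical counts."""
    blocks = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    counts = Counter(blocks)
    total = sum(counts.values())
    probs = np.array([c / total for c in counts.values()])
    return -np.sum(probs * np.log(probs))

# Coarse-grained trajectory of the logistic map.
eps = 0.05                                   # bin (hypercube) size epsilon
x = 0.123                                    # arbitrary initial condition
traj = []
for _ in range(200_000):
    x = 4.0 * x * (1.0 - x)
    traj.append(x)
symbols = np.minimum((np.array(traj) / eps).astype(int), int(1 / eps) - 1)

# Differences of successive block entropies approximate the entropy rate (T = 1).
for n in range(1, 6):
    K_lo, K_hi = block_entropy(symbols, n), block_entropy(symbols, n + 1)
    print(f"n={n}:  K_{n+1} - K_{n} ≈ {K_hi - K_lo:.3f} nats/iteration")
# For this fully chaotic map the rate should settle near ln 2 ≈ 0.693.
```

Here the rate is read off directly from successive block entropies at fixed $\epsilon$ and $T$; the full double limit in (2) would require refining the partition and lengthening the blocks together.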



The Kolmogorov entropy is related to the Lyapunov characteristic exponents by

 h_K = \int_P \sum_{\sigma_i > 0} \sigma_i \, d\mu.   (3)
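
For a one-dimensional map with a single positive exponent, the integral in (3) reduces to that exponent itself, provided the $\mu$-average can be replaced by a time average along one trajectory (an ergodicity assumption). The sketch below, again taking the logistic map $x \mapsto 4x(1-x)$ as an assumed example, estimates its Lyapunov exponent as the time average of $\ln|f'(x)|$ and compares it with $\ln 2$, consistent with the entropy-rate estimate above.

```python
# A minimal sketch of checking equation (3) for a one-dimensional map, where
# the sum over positive exponents contains a single term and the mu-integral
# is replaced (assuming ergodicity) by a time average along one trajectory.
# The logistic map x -> 4x(1-x), with f'(x) = 4 - 8x and exact exponent ln 2,
# is an assumed example, not part of the relation above.
import numpy as np

x = 0.123                                    # arbitrary initial condition
log_derivs = []
for _ in range(200_000):
    log_derivs.append(np.log(abs(4.0 - 8.0 * x)))   # ln |f'(x)| along the orbit
    x = 4.0 * x * (1.0 - x)

sigma = np.mean(log_derivs)                  # estimated Lyapunov exponent
print(f"Lyapunov exponent ≈ {sigma:.4f} nats/iteration  (ln 2 ≈ {np.log(2):.4f})")
# With sigma as the only positive exponent, (3) gives h_K = sigma ≈ ln 2,
# consistent with the block-entropy rate estimated above.
```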


