Invited speakers:
We review recent advances in the statistical analysis of neuronal spike trains based on Gibbs distributions in a broad sense (including non-stationarity). We also discuss some possible applications of Variable Length Markov Chains in this field. [pdf]
Abstract: VLMCs make it possible to model time series with finite state space and highly varied dynamics. In this presentation we consider the situation where the realizations of the VLMC are not observed directly but only through an observation process. Variable Length Hidden Markov Models (VLHMMs) are hidden versions of VLMCs and extend hidden Markov models. I will address the issue of context tree estimation in VLHMMs. The estimation is done by maximizing a penalized likelihood criterion. The strong consistency of the estimator is proved under general assumptions on the model. The estimator is built using a pruning technique combined with an Expectation-Maximization-based procedure. [pdf]
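To fix ideas, here is a minimal sketch of the kind of criterion involved (the notation is generic, not taken from the talk): the context tree $\tau$ is estimated by

\hat{\tau}_n = \arg\max_{\tau} \left\{ \log L_n(\tau) - \mathrm{pen}_n(\tau) \right\},

where $L_n(\tau)$ is the likelihood of the observations under the VLHMM with context tree $\tau$ and $\mathrm{pen}_n(\tau)$ is a penalty increasing with the number of contexts (leaves) of $\tau$; the maximization is carried out by pruning a large initial tree, with the likelihood terms computed via EM.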
Semi-Markov chains generalize both Markov chains and renewal chains. We will present the basic theory of semi-Markov chains with discrete and general state space. We will then discuss estimation of the semi-Markov kernel and of the transition function, together with applications to survival and reliability functions, earthquake modeling, and DNA analysis. Finally, some results concerning random evolutions in an asymptotic setting will be presented. [pdf]
Reference: V. Barbu, N. Limnios, Semi-Markov Chains and Hidden Semi-Markov Models Toward Applications: Their Use in Reliability and DNA Analysis, Lecture Notes in Statistics, vol. 191, Springer, 2008.
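For orientation, a sketch in the standard notation of the reference above (paraphrased, not quoted): the semi-Markov kernel of a discrete-time semi-Markov chain is

q_{ij}(k) = \mathbb{P}(J_{n+1} = j, \; S_{n+1} - S_n = k \mid J_n = i),

where $(J_n)$ is the sequence of visited states and $(S_n)$ the sequence of jump times. Its natural estimator from a trajectory observed up to time $M$ is the empirical ratio

\hat{q}_{ij}(k, M) = N_{ij}(k, M) / N_i(M),

the number of observed transitions from $i$ to $j$ with sojourn time $k$ divided by the number of visits to $i$.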
We consider a non-Markovian process with a countable number of interacting components. At each time unit, each component takes one of two values, indicating whether or not it spikes at that moment. For each component, the probability of spiking at the next time unit depends on the entire time evolution of the system since that component's last spike. This class of systems extends both interacting particle systems, which are Markovian, and stochastic chains with memory of variable length, which have finite state space.
We construct a stationary version of the process using a probabilistic tool, a Kalikow-type decomposition, either in a random environment or in space-time. This construction implies uniqueness of the stationary process. [pdf]
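A schematic form of such a transition probability (one possible specification in the spirit of this class of models, not taken from the abstract):

\mathbb{P}(X_t(i) = 1 \mid \text{past}) = \varphi_i\Big( \sum_{j} W_{j \to i} \sum_{s = \ell_t^i}^{t-1} g(t - s) \, X_s(j) \Big),

where $\ell_t^i$ is the last spike time of component $i$ before $t$, $W_{j \to i}$ quantifies the influence of component $j$ on $i$, and $\varphi_i$ is a spiking-rate function. The dependence on the past is cut off at each component's own last spike, which is what makes the memory of variable (and unbounded) length.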
G-measures are discrete-time stochastic processes generated by conditioning on the past. (One-dimensional) Gibbs measures correspond to random fields generated by conditioning simultaneously on the past and the future. The aim of this talk is to review and compare results from both theories. [pdf]
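In symbols (standard definitions, added here for comparison and not part of the abstract): a stationary measure $\mu$ on $A^{\mathbb{Z}}$ is a g-measure when its one-sided conditional probabilities have a prescribed continuous version,

\mu(x_0 \mid x_{-1}, x_{-2}, \dots) = g(\dots, x_{-2}, x_{-1}, x_0),

whereas a (one-dimensional) Gibbs measure is specified through two-sided conditioning,

\mu(x_0 \mid x_n, \, n \neq 0) = \gamma(x_0 \mid \dots, x_{-1}, x_1, \dots),

and the comparison of the two theories asks when these one-sided and two-sided specifications determine the same measures.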
I will define VLMC models and give an overview of related questions (probabilistic properties, connections to other non-Markov models, random walks). I will also present some applications to text algorithms, data structures, and neurobiology. [pdf]
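As an illustration of the basic mechanism (a minimal sketch; the context tree, probabilities, and function names below are invented for the example and are not from the talk): a VLMC chooses its next symbol from a distribution indexed by the longest suffix of the past that appears in its context tree.

import random

# Hypothetical context tree over the alphabet {'0', '1'}, written with the
# most recent symbol last: if the last symbol is '0' the context has length
# one; if it is '1' the context also looks at the preceding symbol.
CONTEXTS = {
    '0':  {'0': 0.2, '1': 0.8},
    '01': {'0': 0.9, '1': 0.1},
    '11': {'0': 0.5, '1': 0.5},
}
MAX_LEN = max(map(len, CONTEXTS))

def find_context(past):
    """Return the longest suffix of `past` that is a context of the tree."""
    for length in range(min(len(past), MAX_LEN), 0, -1):
        if past[-length:] in CONTEXTS:
            return past[-length:]
    raise ValueError("past matches no context")

def sample_vlmc(n, past='10'):
    """Draw n symbols; each draw uses only the matched variable-length context."""
    out = []
    for _ in range(n):
        dist = CONTEXTS[find_context(past)]
        nxt = random.choices(list(dist), weights=dist.values())[0]
        out.append(nxt)
        past = (past + nxt)[-MAX_LEN:]  # only a bounded window is ever needed here
    return ''.join(out)

print(sample_vlmc(20))

The point of the example is that the memory length is data-dependent: after a '0' the chain looks back one step, after a '1' it looks back two.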
Interactions between neurons can generate very complex and time-delayed patterns. Indeed, neural interactions may reflect a complex anatomical substrate in which chains of activations trigger collective, self-organized phenomena. This poses a problem in experimental settings where delayed communication between neurons must be taken into account: standard techniques such as correlation analysis are, in many cases, unable to detect such events. The problem calls for mathematical tools able to model arbitrarily long temporal relationships.
A novel framework has been proposed in which spike trains with arbitrarily long temporal dependencies are modeled by Markov stochastic models. In regular Markov models, each state depends only on the previous state, while higher-order Markov models suffer from the computational complexity of their large state spaces. Here, Variable-length Markov Models (VMMs) are considered because they overcome both limitations.
This methodology has shown considerable promise, capturing key elements of functional dependencies between pairs of neurons. [pdf]
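To make the idea concrete, here is a toy sketch (using a simplified, KL-based pruning rule; all names and the threshold are invented, and this is not the authors' implementation): fitting a VMM amounts to counting how often each candidate context precedes each symbol, then keeping a context only if it predicts the next symbol appreciably better than its parent (shorter) context.

from collections import Counter, defaultdict
from math import log

def context_counts(seq, max_depth):
    """Tabulate next-symbol counts for every context up to max_depth
    (the empty string '' is the root context)."""
    counts = defaultdict(Counter)
    for t in range(max_depth, len(seq)):
        for d in range(max_depth + 1):
            counts[seq[t - d:t]][seq[t]] += 1
    return counts

def dist(counter):
    n = sum(counter.values())
    return {s: c / n for s, c in counter.items()}

def kl(p, q):
    """Kullback-Leibler divergence; q's support contains p's by construction."""
    return sum(pv * log(pv / q[s]) for s, pv in p.items())

def prune(counts, threshold=0.05):
    """Keep a context only when it predicts the next symbol appreciably
    better than its parent (the context with the oldest symbol dropped)."""
    kept = ['']
    for ctx in sorted(counts, key=len):
        if ctx and kl(dist(counts[ctx]), dist(counts[ctx[1:]])) > threshold:
            kept.append(ctx)
    return kept

spikes = '00100100010110010010001011001001'  # toy binary spike train
print(prune(context_counts(spikes, max_depth=3)))

Practical VMM learners typically weight this divergence by how often the context occurs, but the structure is the same: count, compare each child context against its parent, and prune what does not improve prediction.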