Hidden Markov Models (HMMs) are among the most fundamental and widely used statistical tools for modeling discrete time series. In general, learning HMMs from data is computationally hard (under cryptographic assumptions), so practitioners typically resort to search heuristics that suffer from the usual local-optima issues. The authors prove that under a natural separation condition (bounds on the smallest singular value of the HMM parameters), there is an efficient and provably correct algorithm for learning HMMs. The sample complexity of the algorithm does not explicitly depend on the number of distinct (discrete) observations; it depends on this quantity only implicitly, through spectral properties of the underlying HMM.
- Format: PDF
- Size: 276.12 KB
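The spectral approach the abstract alludes to works by taking an SVD of low-order observation moments and building "observable operators" from them, avoiding local search entirely. The sketch below is a minimal, hypothetical illustration of that construction: it uses exact (population) moments of a tiny hand-made HMM rather than moments estimated from data, and the specific matrices `T`, `O`, and `pi` are invented for the example. Under the rank conditions the paper's separation assumption guarantees, the operator-based sequence probabilities match the true ones.

```python
import numpy as np

# Hypothetical HMM with 2 hidden states and 3 observations.
# T[i, j] = P(h' = i | h = j); O[x, j] = P(o = x | h = j); pi = initial dist.
T = np.array([[0.7, 0.2],
              [0.3, 0.8]])
O = np.array([[0.5, 0.1],
              [0.3, 0.2],
              [0.2, 0.7]])
pi = np.array([0.6, 0.4])

# Exact low-order moments of the observation process. In practice these are
# estimated from sampled triples of consecutive observations.
P1 = O @ pi                                # P1[x]       = P(o1 = x)
P21 = O @ T @ np.diag(pi) @ O.T            # P21[x2, x1] = P(o2 = x2, o1 = x1)
P3x1 = [O @ T @ np.diag(O[x]) @ T @ np.diag(pi) @ O.T
        for x in range(3)]                 # P3x1[x][x3, x1] = P(o3=x3, o2=x, o1=x1)

# Spectral step: U spans the top-m left singular subspace of P21.
U = np.linalg.svd(P21)[0][:, :2]

# Observable-operator representation built purely from moments.
b1 = U.T @ P1
binf = np.linalg.pinv(P21.T @ U) @ P1
B = [U.T @ P3x1[x] @ np.linalg.pinv(U.T @ P21) for x in range(3)]

def seq_prob(seq):
    """P(o1, ..., ot) via the observable operators (no hidden parameters)."""
    b = b1
    for x in seq:
        b = B[x] @ b
    return float(binf @ b)

def seq_prob_forward(seq):
    """Reference: the standard forward recursion using the true parameters."""
    alpha = pi * O[seq[0]]
    for x in seq[1:]:
        alpha = O[x] * (T @ alpha)
    return float(alpha.sum())
```

With exact moments the two routines agree, e.g. `seq_prob([0, 1, 2])` equals `seq_prob_forward([0, 1, 2])`; with empirical moments the paper bounds the estimation error in terms of the sample size and the singular values of `P21`.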