buchspektrum Internet-Buchhandlung

New Releases 2017

As of: 2020-02-01
Herderstraße 10
10625 Berlin
Tel.: 030 315 714 16
Fax 030 315 714 14
info@buchspektrum.de

Sathyanarayanan Kuppusami

Recurrent Neural Network based Probabilistic Language Model


Speech Recognition with Probabilistic Language Model
2017. 60 pp. 220 mm
Publisher/Year: AV AKADEMIKERVERLAG 2017
ISBN: 6-202-20544-X (620220544X)
New ISBN: 978-6-202-20544-3 (9786202205443)



Statistical n-gram language models are widely used for their state-of-the-art performance in continuous speech recognition systems. In a domain-specific scenario, speakers use widely varying word sequences to express the same context, but holding every possible sequence in the training corpus for estimating n-gram probabilities is practically infeasible. Capturing long-distance dependencies in a sequence is an important capability of a language model, since it allows a non-zero probability to be assigned to a sparse sequence during recognition. A simple back-off n-gram model struggles to estimate probabilities for sparse data as the n-gram order grows. Moreover, deducing knowledge from training patterns can help a language model generalize to an unknown sequence or word through its linguistic properties, such as part of speech (noun), number (singular or plural), or its position in a sentence. Even for weak generalization, n-gram models require very large training corpora. A simple recurrent neural network based language model is proposed here to efficiently overcome these difficulties for domain-based corpora.
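The idea described above can be illustrated with a minimal sketch of a simple (Elman-style) recurrent neural network language model: the hidden state carries information about the whole history, so even a word sequence never seen in training receives a non-zero probability. This is a toy illustration with a hypothetical class name and random, untrained weights, not the implementation from the book.

```python
import numpy as np


def softmax(z):
    """Numerically stable softmax over a score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()


class SimpleRNNLM:
    """Toy Elman RNN language model (hypothetical sketch, untrained)."""

    def __init__(self, vocab_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        # Input-to-hidden, recurrent, and hidden-to-output weights.
        self.U = rng.normal(0.0, 0.1, (hidden_size, vocab_size))
        self.W = rng.normal(0.0, 0.1, (hidden_size, hidden_size))
        self.V = rng.normal(0.0, 0.1, (vocab_size, hidden_size))
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size

    def predict(self, word_ids):
        """Return P(next word | history) after reading word_ids."""
        h = np.zeros(self.hidden_size)
        for w in word_ids:
            x = np.zeros(self.vocab_size)
            x[w] = 1.0                            # one-hot word encoding
            h = np.tanh(self.U @ x + self.W @ h)  # recurrent state update
        return softmax(self.V @ h)                # distribution over vocab


lm = SimpleRNNLM(vocab_size=5, hidden_size=8)
probs = lm.predict([0, 2, 1])  # probabilities for the next word
```

Because the softmax output is strictly positive, every continuation of any history gets a non-zero probability, in contrast to an unsmoothed n-gram model on sparse data.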
My keen research interest in robotics led me to pursue a Master's degree in Intelligent Adaptive Systems at Universität Hamburg. I have solid experience in machine learning, neural networks, ROS programming for the NAO robot, and image processing. I enjoy exploring decision making based on knowledge processing.