Large-scale Kernel Machines, edited by Léon Bottou, Olivier Chapelle, Dennis DeCoste, and Jason Weston

Pervasive and networked computers have dramatically reduced the cost of collecting and distributing large datasets. This volume presents and analyzes the state of the art in machine learning algorithms and theory in this novel field. A new domain of machine learning, in which the prediction must satisfy the additional constraints found in structured data, poses one of machine learning's greatest challenges: learning functional dependencies between arbitrary input and output domains.

The Internet gives us access to a wealth of information in languages we don't understand. The investigation of automated or semi-automated approaches to translation has become a thriving research field with enormous commercial potential. Machine learning develops intelligent computer systems that are able to generalize from previously seen examples. This volume investigates how machine learning techniques can improve statistical machine translation, currently at the forefront of research in the field. The book looks first at enabling technologies: technologies that solve problems that are not machine translation proper but are linked closely to the development of a machine translation system. These include the acquisition of bilingual sentence-aligned data from comparable corpora, automatic construction of multilingual name dictionaries, and word alignment. The book then presents new or improved statistical machine translation techniques, including a discriminative training framework for leveraging syntactic information, the use of semi-supervised and kernel-based learning methods, and the combination of multiple machine translation outputs in order to improve overall translation quality.