Palo Alto, California USA
Sept. 18, 2011 to Sept. 21, 2011
ISBN: 978-0-7695-4492-2
pp: 213-214
ABSTRACT
Machine Translation (MT) systems are evaluated and debugged using the BLEU automated metric. However, the standard community implementation of BLEU is not ideal for MT system developers and researchers since it produces only textual output. I present a novel tool called iBLEU that organizes BLEU scoring information visually and in an easy-to-understand manner, making it easier for MT system developers and researchers to quickly locate the documents and sentences on which their system performs poorly. It also allows comparing translations from two different MT systems. Furthermore, one can choose to compare against publicly available MT systems, such as Google Translate and Bing Translator, with a single click. The tool runs on all major platforms and requires no setup whatsoever.
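The sketch below is a minimal, hypothetical illustration of the kind of sentence-level BLEU ranking that iBLEU presents graphically; it is not the tool's own code. It assumes NLTK's sentence_bleu, whitespace tokenization, and a single reference per sentence, and simply orders hypothesis sentences from worst to best score.

# Hypothetical sketch: rank sentences by BLEU to find where a system does poorly.
# Assumes NLTK is installed; file handling and tokenization are illustrative only.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

smoother = SmoothingFunction().method1  # avoid zero scores on short sentences

def rank_sentences(hypotheses, references):
    """Return (score, index) pairs sorted from worst to best sentence BLEU."""
    scored = []
    for i, (hyp, ref) in enumerate(zip(hypotheses, references)):
        score = sentence_bleu([ref.split()], hyp.split(),
                              smoothing_function=smoother)
        scored.append((score, i))
    return sorted(scored)

if __name__ == "__main__":
    hyps = ["the cat sat on the mat", "a dog barked loud"]
    refs = ["the cat sat on the mat", "the dog barked loudly"]
    for score, idx in rank_sentences(hyps, refs):
        print(f"sentence {idx}: BLEU = {score:.3f}")

iBLEU surfaces this kind of per-sentence and per-document information visually rather than as raw numbers, which is what makes the poorly translated segments easy to spot.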
INDEX TERMS
machine translation, information visualization
CITATION
Nitin Madnani, "iBLEU: Interactively Debugging and Scoring Statistical Machine Translation Systems", Proceedings of the 2011 IEEE International Conference on Semantic Computing (ICSC 2011), pp. 213-214, doi:10.1109/ICSC.2011.36