《镜子大全》 and 《朝华午拾》, shared at http://blog.sciencenet.cn/u/liwei999. Author bio: once a Little Red Guard, then sent down to the countryside to farm; left China in 1991, whereabouts unsettled.



Viewed 4146 times | 2013-9-28 21:16 | Personal category: Liwei's Popular Science | System category: Research Notes | Keywords: scholar | rationalism

[Editor's note by Liwei] Drawing on decades of hands-on R&D experience, I wrote a series of articles reviewing the "Pride and Prejudice" that the mainstream harbors toward our field: biases favoring statistics over grammar, empiricism over rationalism. Prompted by this, the senior scholar Prof. Dong Zhendong recommended Prof. Church's mainstream self-reflection piece, "A Pendulum Swung Too Far." Prof. Church is a far-sighted, towering figure in the natural language field. His long article reviews the historical pattern in natural language and artificial intelligence by which rationalism and empiricism have alternately dominated, each rising and falling in turn. Commanding in vision and sweeping in scope, it is a true masterstroke that compels admiration. I therefore decided to translate and introduce it to readers, although I do not normally do translation work. I recently contacted Prof. Church and asked for his views on the deep learning craze and its effect on the oscillation between empiricism and rationalism. The professor predicts that this wave will add fuel to mainstream empiricism and dominate the field for a decade or so, thereby delaying the timetable for rationalism's return, but he does not believe the historical march toward a rationalist revival will truly change. He remains concerned about the one-sided statistical tilt of the mainstream community, worrying that the next generation, submerged in wave after wave of empiricist enthusiasm and deprived of an education in the rationalist classics, will repeat the mistakes of the past.

The professor's reply: because of the deep nets craze, the rationalist swing back may be delayed by another decade or so.

Kenneth Church

Your point about deep nets is well taken. I expect that will keep the community busy for a decade or so. So it may take a decade or so longer than I thought before the pendulum starts swinging back. I could believe that deep nets will seem super exciting at first, but after a while it will become harder and harder to make much progress.

As for your comments about center embedding, I made a similar argument in my master's thesis, and I still pretty much believe what I said then.

My concern is that while we continue to make progress with whatever is hot right now, we will graduate a generation of students who were never taught the classic treatment. At some point it may become useful to know that treatment, if only to avoid reinventing it and repeating the mistakes of the past.

On Sep 27, 2013 11:56 AM, "Wei Li" wrote:
Dr. Church,

Just to keep you informed of the development: the editors have decided to publish the entire translation of your article, so my assistant and I have been working on translating it in its entirety. There may be some technical details in your descriptions that we will need your help understanding as we come across them.

Meanwhile, I am sending you the parts (in bilingual parallel text) that are in relatively good shape, in case you have students who read Chinese and can offer advice. Most of the paper is done; we still have sections 3.2, 3.3, and 3.4 to work on.

By the way, Chomsky's remarks on the limitations of finite-state methods present a theoretical criticism both of the approximations used in the learning community and, of course, of the FSA-based rule approaches that have been practised in industry for many years. Based on my own decades of NLP development using cascaded/pipelined FSAs, the center-embedding argument poses no real-life challenge to deep parsing. With a proper multi-level design (from phrase chunking to SVO parsing, step by step), finite-state parsing can parse natural language as deeply as we want it to, and efficiently; it also proves much easier and more modular to design, develop, and maintain. (We use this seasoned technology, with back-off to machine-learning classification, in our current product that mines consumer sentiment from social-media big data: rules give results with high precision, and learning helps with recall.) This is practical feedback, from the traditional rule side, on the theoretical criticism and on the question of where we can reach fruits beyond the low-hanging ones. On the learning side, the recent buzz around deep learning seems to promise more beyond the low-hanging fruit as well. What do you think, if you have time to comment?
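The cascaded finite-state idea described above can be sketched in a few lines. The sketch below is a hypothetical illustration only, not the author's production grammar: each cascade level is a regular pattern over a space-delimited POS-tag stream, with level 1 rewriting noun-phrase spans into a single `NP` symbol and level 2 matching a simple SVO clause over the chunked output.

```python
import re

# Level 1: noun-phrase chunking over a POS-tag sequence.
# Input is a space-delimited tag stream, e.g. "DT JJ NN VB DT NN ".
# Here an NP is an optional determiner, any adjectives, then nouns
# (a toy pattern chosen purely for illustration).
NP = re.compile(r'\b(DT )?(JJ )*(NN )+')

def chunk_np(tags: str) -> str:
    """Rewrite each NP span as a single 'NP ' symbol (one cascade level)."""
    return NP.sub('NP ', tags)

# Level 2: a simple subject-verb-object clause over the chunked stream.
SVO = re.compile(r'\bNP VB NP ')

def parse_svo(tags: str) -> bool:
    """Run the cascade, then check for an SVO clause in the output."""
    return bool(SVO.search(chunk_np(tags)))

# "The old dog chased a cat" tagged as "DT JJ NN VB DT NN"
print(chunk_np("DT JJ NN VB DT NN "))  # -> "NP VB NP "
print(parse_svo("DT JJ NN VB DT NN "))  # -> True
```

Because each level consumes the previous level's output rather than recursing, the whole pipeline stays finite-state, which is the modularity point made above: deeper structure comes from adding levels, not from adding recursion.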

Thanks for your attention, and thanks again for the great article, which we have enjoyed reading and have the pleasure of translating and recommending to the Chinese community.






