WO

is Shi Wang, an Associate Professor at the Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China. I mainly work in the fields of LLM-based NLP, knowledge graphs, and neural-symbolic dual-process computing. Specifically, I focus on integrating symbolic knowledge with deep learning to realize dual-process cognitive computing for LLM-based natural language reasoning tasks. My research papers have been published at AAAI, WWW, EMNLP, ACL, and other top international conferences.

I am the vice secretary-general of the Chinese Association for Artificial Intelligence (CAAI) Mind Computation Committee, and a member of the TCM Informatization Professional Committee and the Beijing Chronic Disease Big Data Professional Committee. My research is supported by the National Key Research and Development Program of China, the National Natural Science Foundation of China, the National Information Security Program, the Beijing NOVA Program, etc.

I welcome undergraduate students with a solid foundation in computer science or mathematics to join me in practical and interesting research that can really change something. Feel free to contact me!

Research interests

  • LLM & NLP

    My research in LLM & NLP mainly focuses on training large language models and investigating their intriguing properties such as emergent abilities, hallucination, chain-of-thought (CoT) reasoning, etc. We also fine-tune LLMs for NLP applications including dialogue systems and text generation. LLM-based NLP is widely used in information retrieval, recommendation, online advertising, and many other important products.

  • Knowledge Graph

    My research in the knowledge graph field focuses on automatically extracting entities, relations, and events from unstructured or semi-structured data such as text, wiki pages, and heterogeneous tables. For example, we extract particular persons/times/locations and their relationships from web pages. Knowledge graphs are a fundamental resource for many NLP and other intelligent applications, and are even considered a bottleneck for reliable AI.

  • Neural-Symbolic Dual-Process Computing

    Deep learning implemented by neural networks has limitations in areas such as zero-shot learning, interpretability, and noise sensitivity. We believe that integrating symbolic knowledge graphs and logic rules with vectorized deep learning models in a dual-process way is a better simulation of human cognition. Research topics include symbolic knowledge representation, dual-process neural network structures, and knowledge-enhanced LLMs, etc.


Contact

Address: No. 6 Kexueyuan South Road, Zhongguancun, Haidian District, Beijing, China

Email: wangshi (at) ict dot ac dot cn