N.D. Todosiev, V.I. Yankovskiy, Y.E. Gapanyuk, A.M. Andreev "The Conceptual Modeling System Based on Metagraph Approach"
Abstract. 

The article is devoted to an approach to building a conceptual modeling system that includes parsing text into a conceptual structure and generating text from a conceptual structure. The metagraph is used as the conceptual structure. The architecture of the conceptual modeling system is proposed, and the metagraph model is considered as the data model for conceptual modeling. The main ideas behind the text parsing module and the text generation module are described.
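To make the data model concrete, below is a minimal sketch of the metagraph structure described in references [14] and [16]: a metagraph combines ordinary vertices, edges, and metavertices, where a metavertex may aggregate vertices, edges, and nested metavertices. The class and attribute names (Vertex, Edge, Metavertex, inner_vertices, inner_edges) are illustrative assumptions, not the authors' implementation.

from dataclasses import dataclass, field

@dataclass
class Vertex:
    # An ordinary metagraph vertex with a name and an attribute set.
    name: str
    attributes: dict = field(default_factory=dict)

@dataclass
class Edge:
    # A (possibly directed) edge between vertices or metavertices.
    source: "Vertex"
    target: "Vertex"
    directed: bool = True
    attributes: dict = field(default_factory=dict)

@dataclass
class Metavertex(Vertex):
    # A metavertex aggregates a fragment of the metagraph:
    # vertices, edges, and nested metavertices (the emergence property).
    inner_vertices: list = field(default_factory=list)
    inner_edges: list = field(default_factory=list)

# Hypothetical usage: a "sentence" concept aggregating two word concepts
# and the relation between them.
w1 = Vertex("cat")
w2 = Vertex("sits")
rel = Edge(source=w1, target=w2)
sentence = Metavertex("sentence", inner_vertices=[w1, w2], inner_edges=[rel])

In such a structure a whole recognized text fragment can be referred to as a single concept, which is presumably what makes the metagraph convenient as the shared conceptual structure for both the parsing and the generation modules.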

Keywords: 

Complex Graph Structures, Metagraph, Conceptual Compression, Text Parsing, Text Generation.

Pp. 176-184.

DOI: 10.14357/20790279230120
 
 
References

1. The XMind homepage. Available at: https://www.xmind.net/ (accessed August 30, 2022)
2. Baker, C.F., C.J. Fillmore and J.B. Lowe. 1998. The Berkeley FrameNet Project, In Proceedings of the 36th Annual Meeting of the Association for Computational Linguistics and 17th International Conference on Computational Linguistics - Volume 1, Association for Computational Linguistics, pp: 86–90.
3. Buzan, T. 2018. Mind Map Mastery: The Complete Guide to Learning and Using the Most Powerful Thinking Tool in the Universe. Watkins Media.
4. Cui, B., Y. Li, M. Chen and Z. Zhang. 2018. Deep attentive sentence ordering network, In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pp: 4340–4349.
5. Flesch, R. 1948. A new readability yardstick. The Journal of Applied Psychology, 32(3): 221–233.
6. Ishikawa, K. 1986. Guide to Quality Control. Asian Productivity Organization.
7. Lyashevskaya, O.N. and J.L. Kuznetsova. 2009. Russian FrameNet: constructing a corpus-based dictionary of constructions [Russkij Frejmnet: k zadache sozdanija korpusnogo slovarja konstruktsij], In Computational Linguistics and Intellectual Technologies: Proceedings of the International Conference “Dialog” [Komp’juternaja Lingvistika I Intelleltual’nye Tehnologii: Po Materialam Ezhegodnoj Mezhdunarodnoj Konferentsii “Dialog”], pp: 306–312.
8. Lyashevskaya, O. and E. Kashkin. 2015. FrameBank: A Database of Russian Lexical Constructions, In Analysis of Images, Social Networks and Texts, Springer International Publishing, pp: 350–360.
9. McCreesh, C., P. Prosser and J. Trimble. 2020. The Glasgow Subgraph Solver: Using Constraint Programming to Tackle Hard Subgraph Isomorphism Problem Variants. Graph Transformation, 316–324.
10. Novak, J. and A.J. Cañas. 2006. The Origins of the Concept Mapping Tool and the Continuing Evolution of the Tool. Information Visualization, 5(3): 175–184. https://doi.org/10.1057/palgrave.ivs.9500126
11. Raffel, C., N. Shazeer, A. Roberts, K. Lee, S. Narang, M. Matena, Y. Zhou, W. Li and P.J. Liu. 2019. Exploring the limits of transfer learning with a unified text-to-text transformer. arXiv [cs.LG]. https://doi.org/10.48550/arxiv.1910.10683
12. Swayamdipta, S., S. Thomson, C. Dyer and N.A. Smith. 2017. Frame-Semantic Parsing with Softmax-Margin Segmental RNNs and a Syntactic Scaffold. arXiv [cs.CL]. https://doi.org/10.48550/arxiv.1706.09528
13. Talvitie, V. 2018. The Foundations of Psychoanalytic Theories.
14. Gapanyuk, Y. 2021. The development of the metagraph data and knowledge model. In Selected Contributions to the 10th International Conference on “Integrated Models and Soft Computing in Artificial Intelligence (IMSC-2021)”, pp: 1–7.
15. Chernenkiy, V.M., Y.E. Gapanyuk, Y. Kaganov and I. Dunin. 2018. Storing Metagraph Model in Relational, Document-Oriented, and Graph Databases. DAMDID/RCDL.
16. Tarassov, V., Y. Kaganov and Y. Gapanyuk. 2021. The Metagraph Model for Complex Networks: Definition, Calculus, and Granulation Issues, In Artificial Intelligence, Springer International Publishing, pp: 135–151.
17. Chernenkiy, V., Y. Gapanyuk, A. Nardid and N. Todosiev. 2020. The Implementation of Metagraph Agents Based on Functional Reactive Programming, In 2020 26th Conference of Open Innovations Association (FRUCT), pp: 1–8. https://doi.org/10.23919/FRUCT48808.2020.9087470
18. Yin, Y., L. Song, J. Su, J. Zeng, C. Zhou and J. Luo. 2019. Graph-based Neural Sentence Ordering. arXiv [cs.CL].
19. Zhabotynska, S.A. 2010. Principles of building conceptual models for thesaurus dictionaries. Cognition, Communication, Discourse, 1: 75–92.
20. Ji, H., P. Ke, S. Huang, F. Wei, X. Zhu and M. Huang. 2020. Language Generation with Multi-Hop Reasoning on Commonsense Knowledge Graph. arXiv [cs.CL]. https://doi.org/10.18653/v1/2020.emnlp-main.54
21. Bai, H., P. Shi, J. Lin, Y. Xie, L. Tan, K. Xiong, W. Gao and M. Li. 2021. Segatron: Segment-Aware Transformer for Language Modeling and Understanding. Proceedings of the AAAI Conference on Artificial Intelligence, 35(14): 12526–12534.
22. Wu, G., W. Wu, L. Li, G. Zhao, D. Han and B. Qiao. 2020. BCRL: Long Text Friendly Knowledge Graph Representation Learning, In The Semantic Web – ISWC 2020, Springer International Publishing, pp: 636–653.
23. Rajpurkar, P., J. Zhang, K. Lopyrev and P. Liang. 2016. SQuAD: 100,000+ Questions for Machine Comprehension of Text. arXiv [cs.CL]. https://doi.org/10.18653/v1/D16-1264
24. Bordes, A., N. Usunier and A. Garcia-Duran. 2013. Translating embeddings for modeling multi-relational data. Advances in Neural Information Processing Systems.
25. Radford, A., J. Wu, R. Child, D. Luan, D. Amodei and I. Sutskever. 2019. Language models are unsupervised multitask learners. OpenAI Blog, 1(8): 9.
26. Devlin, J., M.-W. Chang, K. Lee and K. Toutanova. 2018. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv [cs.CL].
 
