Special Topic: AI in Materials Science
Material design accelerated by large language models: end-to-end empowerment from knowledge mining to intelligent design
HUANG Yudan, XIA Wanjun, DU Junmei, JIANG Yu, WANG Xin, CHEN Yuanzheng, WANG Hongyan, ZHAO Jijun, GUO Chunsheng
2025, 74 (18): 188101. doi: 10.7498/aps.74.20250497
Abstract
With the rapid development of artificial intelligence technology, large language models (LLMs) have become a core driving force behind the paradigm shift in materials science research. This review explores the comprehensive role of LLMs in accelerating material design across the entire research lifecycle, from knowledge mining to intelligent design. It emphasizes how LLMs can leverage their strengths in information retrieval, cross-modal data integration, and intelligent reasoning to address challenges in traditional materials research, such as data fragmentation, high experimental costs, and limited reasoning capabilities.

Key methods include applying LLMs to knowledge discovery through techniques such as retrieval-augmented generation (RAG), multi-modal information retrieval, and knowledge graph construction. These approaches can efficiently extract and structure material data from a vast repository of scientific literature and experimental records. Additionally, LLMs are integrated with automated experimental platforms to optimize workflows, from natural-language-driven experiment design to high-throughput iterative testing.

The results demonstrate that LLMs significantly enhance the efficiency and accuracy of materials research. For instance, in knowledge mining, LLMs improve information retrieval accuracy by up to 29.4% in tasks such as predicting material synthesis conditions. In material design, LLMs can accelerate computational modeling, structure and property prediction, and reverse engineering, reducing experimental trial-and-error cycles. Notably, LLMs perform well in cross-scale knowledge integration, linking material composition, processing parameters, and performance metrics to guide innovative synthesis pathways.

However, challenges remain, including dependence on high-quality data, the “black-box” nature of LLMs, and limitations in handling complex material systems.
Future directions emphasize improving data quality through multi-source integration, enhancing model explainability through visualization tools, and deepening interdisciplinary collaboration to bridge the gap between AI and domain-specific expertise. In summary, LLMs are reshaping materials science by enabling a data-driven, knowledge-intensive research paradigm. Their ability to integrate vast datasets, predict material properties, and automate experimental workflows makes them indispensable tools for accelerating material discovery and innovation. As LLMs continue to develop, their synergy with physical constraints and experimental platforms is expected to open new frontiers in material design.
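To make the retrieval-augmented generation (RAG) workflow mentioned above concrete, the sketch below shows its core loop in minimal form: retrieve the literature snippet most relevant to a query, then assemble it into an augmented prompt for an LLM. This is an illustrative toy, not code from the review: the three-snippet corpus is invented, and bag-of-words cosine similarity stands in for the learned embedding retrieval a real system would use.

```python
import math
import re
from collections import Counter

# Toy stand-in for a corpus of mined literature snippets (illustrative text, not real data).
CORPUS = [
    "LiFePO4 cathodes are synthesized by solid-state reaction at 700 C under argon.",
    "Perovskite solar cells degrade under humidity; encapsulation improves stability.",
    "High-entropy alloys show enhanced hardness after rapid quenching.",
]

def bow(text):
    """Bag-of-words term frequencies over lowercase alphanumeric tokens."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def retrieve(query, corpus, k=1):
    """Return the k snippets most similar to the query."""
    q = bow(query)
    return sorted(corpus, key=lambda s: cosine(q, bow(s)), reverse=True)[:k]

def build_prompt(query, corpus, k=1):
    """Assemble a retrieval-augmented prompt: retrieved context plus the question."""
    context = "\n".join(retrieve(query, corpus, k))
    return (f"Context:\n{context}\n\n"
            f"Question: {query}\n"
            f"Answer using only the context above.")

if __name__ == "__main__":
    print(build_prompt("What synthesis conditions are used for LiFePO4 cathodes?", CORPUS))
```

In a production pipeline the bag-of-words retriever would be replaced by dense embeddings over thousands of papers, and the assembled prompt would be sent to an LLM; the retrieve-then-prompt structure, which grounds the model's answer in mined literature, stays the same.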