Band gap prediction of perovskite materials based on transfer learning

Sun Tao, Yuan Jian-Mei

  • The band gap is a key physical quantity in materials design. First-principles calculations based on density functional theory can predict the band gap approximately, but they often require substantial computational resources and time. Deep learning models, which fit data well and extract features from data automatically, are increasingly used to predict band gaps. In this paper, aiming at obtaining the band gap of perovskite materials quickly, a feature fusion neural network model named CGCrabNet is established and combined with a transfer learning strategy to predict the band gap of perovskite materials. CGCrabNet is an end-to-end model: it extracts features from both the chemical formula and the crystal structure of a material and fits the mapping from these features to the band gap. After pre-training on data from the Open Quantum Materials Database (OQMD), the CGCrabNet parameters can be fine-tuned with only 175 perovskite samples, which improves the robustness of the model (a minimal sketch of this pre-train/fine-tune workflow is given below). The numerical results show that the prediction error of CGCrabNet on the OQMD dataset is 0.014 eV lower than that of the compositionally restricted attention-based network (CrabNet). The mean absolute error of the proposed model for perovskite materials is 0.374 eV, which is 0.304 eV, 0.441 eV and 0.194 eV lower than that of random forest regression, support vector machine regression and gradient boosting regression, respectively. The test-set mean absolute error of CGCrabNet trained only on perovskite data is 0.536 eV, whereas pre-training reduces it by 0.162 eV, indicating that the transfer learning strategy plays a significant role in improving prediction accuracy on small datasets such as the perovskite dataset. For several perovskite materials, such as SrHfO3 and RbPaO3, the difference between the band gap predicted by the model and the first-principles value is less than 0.05 eV, indicating that CGCrabNet can predict the properties of new materials quickly and accurately and thus accelerate their development.
      Corresponding author: Yuan Jian-Mei, yuanjm@xtu.edu.cn
    • Funds: Project supported by the Natural Science Foundation of Hunan Province, China (Grant Nos. 2023JJ30567, 2021JJ30650).
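
    A minimal sketch of the pre-train/fine-tune (transfer learning) workflow summarized in the abstract, written in PyTorch. The toy network, layer sizes, epoch counts and random data are placeholders for illustration only; they are not the published CGCrabNet architecture or training settings.

    ```python
    # Hedged sketch: pre-train on a large dataset (standing in for OQMD),
    # then fine-tune the same parameters on a small perovskite set.
    import torch
    import torch.nn as nn

    class ToyBandGapNet(nn.Module):
        """Toy regressor standing in for CGCrabNet."""
        def __init__(self, in_dim=128, hidden=64):
            super().__init__()
            self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
            self.head = nn.Linear(hidden, 1)        # band-gap output (eV)

        def forward(self, x):
            return self.head(self.backbone(x)).squeeze(-1)

    def train(model, x, y, epochs, lr):
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.L1Loss()                       # MAE, the metric reported in the paper
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
        return loss.item()

    model = ToyBandGapNet()

    # 1) Pre-training stage on a large dataset (placeholder random data).
    x_big, y_big = torch.randn(5000, 128), torch.rand(5000) * 6
    train(model, x_big, y_big, epochs=50, lr=1e-3)

    # 2) Fine-tuning stage on a small perovskite set (175 samples in the paper).
    x_small, y_small = torch.randn(175, 128), torch.rand(175) * 6
    train(model, x_small, y_small, epochs=30, lr=1e-4)  # smaller learning rate for fine-tuning
    ```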
    [1] Fan X L 2015 Mater. China 34 689 (in Chinese)
    [2] Wan X Y, Zhang Y H, Lu S H, Wu Y L, Zhou Q H, Wang J L 2022 Acta Phys. Sin. 71 177101 (in Chinese)
    [3] Xie T, Grossman J C 2018 Phys. Rev. Lett. 120 145301
    [4] Chen C, Ye W K, Zuo Y X, Zheng C, Ong S P 2019 Chem. Mater. 31 3564
    [5] Karamad M, Magar R, Shi Y T, Siahrostami S, Gates L D, Farimani A B 2020 Phys. Rev. Mater. 4 093801
    [6] Jha D, Ward L, Paul A, Liao W K, Choudhary A, Wolverton C, Agrawal A 2018 Sci. Rep. 8 17593
    [7] Goodall R E A, Lee A A 2020 Nat. Commun. 11 6280
    [8] Wang A Y T, Kauwe S K, Murdock R J, Sparks T D 2021 NPJ Comput. Mater. 7 77
    [9] Hu Y, Zhang S L, Zhou W H, Liu G Y, Xu L L, Yin W J, Zeng H B 2023 J. Chin. Ceram. Soc. 51 452 (in Chinese)
    [10] Guo Z, Lin B 2021 Sol. Energy 228 689
    [11] Gao Z Y, Zhang H W, Mao G Y, Ren J N, Chen Z H, Wu C C, Gates I D, Yang W J, Ding X L, Yao J X 2021 Appl. Surf. Sci. 568 150916
    [12] Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez A N, Kaiser L, Polosukhin I 2017 arXiv: 1706.03762v5 [cs.CL]
    [13] Nix D A, Weigend A S 1994 Proceedings of the 1994 IEEE International Conference on Neural Networks (ICNN'94) Orlando, FL, USA, June 28–July 2, 1994 p55
    [14] You Y, Li J, Reddi S, et al. 2020 arXiv: 1904.00962v5 [cs.LG]
    [15] Smith L N 2017 arXiv: 1506.01186v6 [cs.CV]
    [16] Saal J E, Kirklin S, Aykol M, Meredig B, Wolverton C 2013 JOM 65 1501
    [17] Jain A, Ong S P, Hautier G, Chen W, Richards W D, Dacek S, Cholia S, Gunter D, Skinner D, Ceder G, Persson K A 2013 APL Mater. 1 011002
    [18] Yamamoto T 2019 Crystal Graph Neural Networks for Data Mining in Materials Science (Yokohama: Research Institute for Mathematical and Computational Sciences, LLC)
    [19] Kirklin S, Saal J E, Meredig B, Thompson A, Doak J W, Aykol M, Rühl S, Wolverton C 2015 NPJ Comput. Mater. 1 15
    [20] Calfa B A, Kitchin J R 2016 AIChE J. 62 2605
    [21] Ward L, Agrawal A, Choudhary A, Wolverton C 2016 NPJ Comput. Mater. 2 16028
    [22] Tshitoyan V, Dagdelen J, Weston L, Dunn A, Rong Z Q, Kononova O, Persson K A, Ceder G, Jain A 2019 Nature 571 95
    [23] Breiman L 2001 Mach. Learn. 45 5
    [24] Wu Y R, Li H P, Gan X S 2013 Adv. Mater. Res. 848 122
    [25] Sun T, Yuan J M 2023 Acta Phys. Sin. 72 028901 (in Chinese)
    [26] Pedregosa F, Varoquaux G, Gramfort A, Michel V, Thirion B, Grisel O, Blondel M, Prettenhofer P, Weiss R, Dubourg V, Vanderplas J, Passos A, Cournapeau D, Brucher M, Perrot M, Duchesnay É 2011 J. Mach. Learn. Res. 12 2825

  • Figure 1.  CGCrabNet model algorithm.

    Figure 2.  Evolution of the CGCrabNet pre-training loss.

    Figure 3.  Predicted band gap values on the validation set.

    Figure 4.  Predicted band gap values on the test set.

    Figure 5.  Comparison of mean absolute errors (MAE) of band gap prediction for perovskite materials.

    Figure 6.  Scatter plot of predicted versus calculated band gaps.

    Table 1.  Hyperparameter values.

    Hyperparameter    Meaning    Value
    $d_{\rm m}$    Dimension of the vector constructed from element features    512
    $N_{f}$    Maximum number of element species in a chemical formula    7
    N    Number of attention layers    3
    n    Number of attention heads    4
    I    Total number of element species involved in training    89
    T    Number of graph convolution layers    3
    $V_{\rm cg}$    Dimension of the element vector after node embedding    16
    $w_1$∶$w_2$    Weight ratio parameter    7∶3
    Epochs    Maximum number of iterations    300
    batch_size    Batch size    256
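
    If the weight-ratio parameter $w_1$∶$w_2$ = 7∶3 in Table 1 is read as a fixed weighting between the two feature branches (an assumption made here for illustration, not a statement of the published implementation), the fusion step could look like the sketch below; the tensor names and shapes are placeholders.

    ```python
    # Hedged sketch of weighted feature fusion with w1:w2 = 7:3 (Table 1).
    import torch

    w1, w2 = 0.7, 0.3
    f_formula = torch.randn(4, 512)   # placeholder formula-branch features (d_m = 512)
    f_graph = torch.randn(4, 512)     # placeholder crystal-graph-branch features
    f_fused = w1 * f_formula + w2 * f_graph
    print(f_fused.shape)              # torch.Size([4, 512])
    ```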

    Table 2.  Element embedding method test results (in eV).

    Embedding method    Train MAE    Val MAE    Test MAE
    One-Hot    0.185    0.423    0.433
    Magpie    0.428    0.546    0.566
    Mat2vec    0.203    0.408    0.420
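
    As a concrete picture of what Table 2 compares: each embedding method maps every element species to a fixed vector, and the model looks these vectors up for the elements appearing in a formula. The sketch below shows the one-hot case with a truncated element list (the paper uses I = 89 species); Magpie and Mat2vec would substitute hand-crafted or text-mined descriptor vectors of their own dimensions.

    ```python
    # Sketch: element embedding as a lookup table (one-hot case).
    import numpy as np

    elements = ["O", "F", "Na", "Mg", "Sr", "Hf"]          # truncated element list
    one_hot = dict(zip(elements, np.eye(len(elements))))

    # Embed the elements of SrHfO3 (a formula from Table 5).
    embedded = np.stack([one_hot[el] for el in ["Sr", "Hf", "O"]])
    print(embedded.shape)                                  # (3, 6)
    ```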

    Table 3.  Deep learning model test results (in eV).

    Model    Train MAE    Val MAE    Test MAE
    CGCNN    0.502    0.605    0.601
    Roost    0.178    0.447    0.455
    CrabNet    0.226    0.422    0.427
    HotCrab    0.177    0.422    0.440
    CGCrabNet    0.187    0.408    0.413

    Table 4.  Regression model parameters.

    Method    Hyperparameter    Value
    RF    Number of sub-learners    90
    SVR    Kernel function    Polynomial
          Polynomial degree    3
          Regularization strength    2
          Gamma parameter    2
          Zero coefficient    1.5
    GBR    Number of sub-learners    500
          Learning rate    0.2
          Maximum depth    4
          Loss function    Absolute error
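
    One plausible reading of Table 4 in terms of scikit-learn's regressors [26] is sketched below, taking "regularization strength" as the SVR parameter C and "zero coefficient" as coef0; this mapping is our assumption for illustration, not necessarily the authors' exact configuration, and parameters not listed in the table stay at library defaults.

    ```python
    # Possible scikit-learn (>= 1.0) configuration of the Table 4 baselines.
    from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
    from sklearn.svm import SVR

    rf = RandomForestRegressor(n_estimators=90)
    svr = SVR(kernel="poly", degree=3, C=2.0, gamma=2.0, coef0=1.5)
    gbr = GradientBoostingRegressor(n_estimators=500, learning_rate=0.2,
                                    max_depth=4, loss="absolute_error")
    ```

    Each baseline would then be fit on the perovskite training features and scored with the mean absolute error, the metric used throughout the comparisons.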

    Table 5.  Comparison of predicted and calculated band gap values for perovskite materials (in eV).

    Formula    Calculated band gap    CGCrabNet    RF    SVR    GBR
    NbTlO3 0.112 0.658 1.458 1.296 1.614
    ZnAgF3 1.585 1.776 1.836 2.194 1.840
    AcAlO3 4.102 3.212 2.881 3.197 2.963
    BeSiO3 0.269 1.116 2.813 2.963 3.777
    TmCrO3 1.929 1.682 1.612 1.987 1.668
    SmCoO3 0.804 0.644 0.821 1.043 0.724
    CdGeO3 0.102 0.586 0.911 1.675 0.196
    CsCaCl3 5.333 4.891 4.918 5.116 5.157
    HfPbO3 2.415 2.724 1.733 2.346 1.967
    SiPbO3 1.185 1.327 1.407 1.543 1.079
    SrHfO3 3.723 3.683 2.821 3.370 3.253
    PrAlO3 2.879 3.139 2.665 2.091 2.984
    BSbO3 1.405 1.123 0.653 -0.025 0.579
    CsEuCl3 0.637 0.388 1.500 4.477 0.949
    LiPaO3 3.195 3.100 2.443 -0.306 2.553
    PmErO3 1.696 1.309 1.550 1.682 1.252
    TlNiF3 3.435 2.806 2.063 3.049 3.255
    MgGeO3 3.677 1.256 0.979 1.623 1.073
    NaVO3 0.217 0.785 0.911 0.180 0.989
    RbVO3 0.250 0.616 1.736 0.290 1.534
    KZnF3 3.695 3.785 2.853 3.203 3.295
    NdInO3 1.647 1.587 1.653 0.889 1.590
    RbCaF3 6.397 6.974 6.482 6.372 6.028
    RbPaO3 3.001 2.952 2.864 -0.234 2.937
    PmInO3 1.618 1.480 1.896 1.222 1.754
    KMnF3 2.656 2.991 2.647 2.428 2.730
    NbAgO3 1.334 1.419 1.369 1.227 1.265
    CsCdF3 3.286 3.078 2.990 2.724 2.879
    KCdF3 3.101 3.125 2.789 2.365 2.990
    CsYbF3 7.060 6.641 6.325 6.523 6.736
    NaTaO3 2.260 1.714 1.680 2.093 1.715
    CsCaF3 6.900 6.874 6.291 6.416 6.379
    RbSrCl3 4.626 4.470 4.966 4.647 4.795
    AcGaO3 2.896 3.199 2.740 2.869 2.981
    BaCeO3 2.299 1.789 3.918 2.696 3.655
    Note: CGCrabNet, RF, SVR and GBR denote the band gap values obtained with the feature fusion neural network, random forest regression, support vector regression and gradient boosting regression models, respectively.
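
    The per-model mean absolute errors quoted in the abstract and compared in Fig. 5 are averages of |predicted − calculated| over the test formulas. The snippet below only illustrates that computation on two rows of Table 5, so the values it prints are not the full test-set MAEs.

    ```python
    # Illustrative MAE computation using two rows of Table 5 (values in eV).
    rows = {
        "SrHfO3": {"calc": 3.723, "CGCrabNet": 3.683, "RF": 2.821, "SVR": 3.370, "GBR": 3.253},
        "RbPaO3": {"calc": 3.001, "CGCrabNet": 2.952, "RF": 2.864, "SVR": -0.234, "GBR": 2.937},
    }
    for model in ["CGCrabNet", "RF", "SVR", "GBR"]:
        mae = sum(abs(r[model] - r["calc"]) for r in rows.values()) / len(rows)
        print(f"{model}: {mae:.3f} eV")
    ```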

Publishing process
  • Received Date:  22 June 2023
  • Accepted Date:  31 July 2023
  • Available Online:  24 August 2023
  • Published Online:  05 November 2023

