Mini Review: Nano Lett. 2019, 19, 3387−3395


Materials Discovery and Properties Prediction in Thermal Transport via Materials Informatics: A Mini Review

Xiao Wan,†,‡,∥ Wentao Feng,‡,∥ Yunpeng Wang,†,‡ Haidong Wang,§ Xing Zhang,§ Chengcheng Deng,*,‡ and Nuo Yang*,†,‡

† State Key Laboratory of Coal Combustion and ‡ School of Energy and Power Engineering, Huazhong University of Science and Technology, Wuhan 430074, China
§ Department of Engineering Mechanics, Tsinghua University, Beijing 100084, China



ABSTRACT: There has been increasing demand for materials with functional thermal properties, but traditional experiments and simulations are costly and time-consuming. The emerging discipline of materials informatics is an effective approach that can accelerate materials development by combining materials science with big data techniques. Recently, materials informatics has been successfully applied to designing thermal materials, such as thermal interface materials for heat dissipation, thermoelectric materials for power generation, and so forth. This Mini Review summarizes the research progress associated with the prediction and discovery of materials with desirable thermal transport properties by materials informatics. On the basis of this review of past research, perspectives are discussed and future directions for studying functional thermal materials by materials informatics are given.

KEYWORDS: Materials informatics, machine learning, material discovery, thermal conductivity, thermoelectric properties, interfacial thermal conductance

Thermal properties, such as thermal conductivity, interfacial thermal conductance (ITC), and so forth, play a critical role in micro/nanoelectronics, optoelectronics, thermoelectrics, and other thermal/phonon engineering areas.1,2 For example, there is an increasing demand for materials with high thermal conductivities that can dissipate the massive heat in electronic devices.3−6 In addition, ITC dominates the thermal dissipation of composites with interfaces on the micro/nanoscale.7,8 Therefore, the effective discovery of materials with high thermal conductivities or ITC is crucial for improving the performance and extending the lifetime of a wide variety of related devices. On the other hand, thermoelectric power generation is essential for utilizing low-grade waste heat. Researchers have been seeking materials with high conversion efficiency for decades to improve their performance,9−14 for which materials with low thermal conductivity are essential. Because of the limitations of cost, time, and hardware, the discovery of materials with desirable thermal properties remains challenging in both experiments and simulations.15 Materials informatics (Figure 2) introduces a brand new way of accelerating the discovery of materials with special properties.16,17

Intrinsically, materials informatics is the process that allows one to survey complex, multiscale information in a high-throughput, statistically robust, and yet physically meaningful manner.17 Materials informatics is an emerging area of materials science16−18 based on simulations or experiments in materials science and machine learning algorithms.16 Materials informatics can effectively and accurately capture the relationship between structures and properties by data mining techniques for materials discovery and properties prediction. Seeking structure−property relationships is an accepted paradigm in materials science, yet these relationships are often nonlinear and complicated.17 There is rarely a well-accepted multiscale relationship that is accurately captured by traditional theory or experiments because different physical laws act at the macro-/microscale. Hence, there

Received: December 29, 2018. Revised: May 13, 2019. Published: May 15, 2019.

DOI: 10.1021/acs.nanolett.8b05196 Nano Lett. 2019, 19, 3387−3395


Figure 1. Schematics of applying the materials informatics method to studying thermal transport issues.

are opportunities for using materials informatics, which can build these relationships by data mining without concern for the underlying principles. Data mining is a new field that merges ideas from statistics, machine learning, databases, and parallel and distributed computing.19 Data mining takes the form of building models from a given data set, which can capture the nonlinear mapping relations between material structures and properties for materials discovery. In addition to pattern recognition, data mining in big data techniques has another primary function in understanding materials behavior: prediction. The predictive aspects of data mining, classification and regression analysis, can help facilitate the understanding of multivariable correlations in the "processing−structure−properties" paradigm that forms the core of materials development.16 In light of this feature, materials informatics, which seeks material structure−property relationships using big data techniques, can significantly advance all functional materials fields, such as optical/electronic/phononic materials, acoustic materials, magnetic materials, mechanical materials, nuclear materials, and so forth. Materials informatics is gaining popularity throughout all fields and applications in materials science and engineering.17

Recently, materials informatics has been successfully applied in the search for materials or structures with desirable thermal properties, such as thermal conductivity, ITC, and thermoelectric properties.20−23 Considering that there have already been studies in this emerging field, it is necessary to review their progress and provide an outlook on future work, which will be helpful for the development of materials informatics in the thermal field. In this paper, a mini-review is given of the recent research progress on the applications of materials informatics in studying thermal transport. First, we provide a brief introduction to materials informatics. Then, the related studies using materials informatics for thermal properties, including thermal conductivity, interfacial thermal conductance, and thermoelectric conversion efficiency, are summarized. Finally, some perspectives on the challenges, shortcomings, and outlook are provided to aid future investigations related to this topic.

Materials Informatics. The framework of materials informatics mainly consists of three parts: (1) data procurement, or the acquisition of data generated by simulations or experiments in materials science; (2) data representation, or the systematic storage of representative information about the structures and properties of these materials; and (3) data mining, or data analysis aimed at searching for relationships between structure information and desired properties.17 The procedure of materials informatics in the thermal field is shown in Figure 1, and the specific contents of the three steps are described as follows.

Figure 2. Processing−structure−property−performance relationships of materials science and engineering, and how materials informatics approaches can help decipher these relationships via forward and inverse models. Adapted with permission from ref 16, licensed under a Creative Commons Attribution (CC BY) license. Copyright 2016 AIP publishing.

Data Procurement. Data procurement is acquiring the physical properties and structural information of given materials. Calculations (such as first-principles,21,24 molecular dynamics,23,25,26 lattice dynamics,27,28 and so forth), experiments,29 and online libraries30 have been used to collect these data. With these different techniques, database repositories containing effective training data can be constructed.

Data Representation. Data representation refers to the systematic storage of representative information about the structures and properties of materials. The key component of data representation is the selection of characteristics (e.g., formation energies, band structure, density of states, magnetic moments) to describe the materials, which are called "descriptors". The descriptors represent different kinds of materials, and they are only one part of the input in data mining. One purpose of materials informatics is to establish mapping relations between the descriptors and target properties, which, herein, are thermal properties. Thus, good descriptors are the key to effective materials informatics. Once a series of good descriptors is identified, the search for optimum materials or properties prediction within the database can be performed intrinsically or extrinsically.31

Data Mining. Data mining aims at searching for novel materials or exploring new physical insights, in which machine



Figure 3. (a) Prototype Half-Heusler structure with primitive vectors and a conventional cell. (b) Elements considered in this study. Adapted with permission from ref 20, licensed under a Creative Commons Attribution (CC BY) license. Copyright 2014 APS publishing.

learning is widely used.17 The main machine learning algorithms used in materials informatics fall under supervised learning, the task of which is finding a function that maps an input to an output based on samples.32 Through the training models built by machine learning algorithms, materials with novel properties can easily be selected or predicted. Currently, the most popular algorithms include Bayesian optimization, random-forest regression, and artificial neural networks. A brief introduction of these algorithms is provided below.

Bayesian optimization is a well-established technique for the global optimization of black-box functions.33,34 Bayesian prediction models, most commonly based on the Gaussian process, are usually employed to predict the black-box function, where the uncertainty of the predicted function is also evaluated as a predictive variance.35 The Bayesian optimization algorithm (BOA) typically works by assuming that the unknown function is sampled from a Gaussian process and maintains a posterior distribution for this function as observations are made.34 The procedure for Bayesian optimization is as follows. First, a Gaussian process model is developed from two observations that are randomly selected from the database. The model is then updated by (i) sampling the point at which the observed property is expected to be the best and (ii) including the observation at the sampled point. These two steps are repeated until all data are sampled.31

Random forest36 is a prominent ensemble method adapted from bagging, which combines multiple decision trees into one predictive model to improve performance.20 Random forest is relatively robust across various problems, such as compound classification, and handles outlier or high-dimensional data well.36,37 A random forest model consists of K decision trees that are established in three steps. First, K sets of data are generated from the initial data set by a bootstrap method.
Second, a tree is grown with a particular random selection algorithm to obtain the predictions for each data point. Third, the final prediction is made by a weighted vote (in classification) or weighted average (in regression) of all forest predictions.37 In addition to performing the prediction task, random forest also provides an intrinsic metric to evaluate the importance of each descriptor.20

The artificial neural network (ANN) and deep neural network are well-developed machine learning methods that mimic human brains to learn the relationships between certain inputs and outputs based on experience.38 The ANN has recently been successfully applied to modeling and prediction in many thermal engineering systems.39−41 The

ANN has become increasingly attractive in the past decade. The assets of the ANN compared to classical methods are its high speed and simplicity, which decrease engineering effort.29,42,43 The most basic and commonly used ANN consists of three or more layers: an input layer, an output layer, and a number of hidden layers.29 The number of neurons in the input layer equals the number of parameters in the material selection process. The output layer represents the fitness of the candidate materials. The hidden layers represent the relationships between the input and output layers. The ANN model is established through training and testing stages. In the training stage, the network is trained to predict an output based on input data; training is stopped when the testing error is within the tolerance limits. In the testing stage, the network is evaluated to decide whether to stop or continue training according to measures of error.29,40−42

In addition to the three machine learning algorithms mentioned above, some other efficient algorithms are in progress, such as the autoencoder, convolutional neural networks, and generative adversarial networks, which are more advanced and powerful. The autoencoder is a type of artificial neural network that is used to learn efficient data codings in an unsupervised manner. The convolutional neural network (CNN) is a class of deep neural networks that requires relatively little preprocessing compared to other image classification algorithms.44 The generative adversarial network (GAN) is a class of machine learning systems in which two neural networks contest with each other in a zero-sum game framework.45 Recently, these three machine learning algorithms have become widely used for image recognition and data generation.

Thermal Conductivity. As a popular topic, thermal conductivity is one of the most important material properties.
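To make the Bayesian optimization loop described earlier concrete, the following is a minimal NumPy sketch that screens a one-dimensional candidate library for the minimum of a hidden "property" (loosely in the spirit of screening a compound library for low thermal conductivity). The descriptor, the hidden function, and all numerical choices are invented for illustration; this is not the workflow of any study reviewed here.

```python
import numpy as np

def rbf_kernel(a, b, length=0.2):
    # Squared-exponential covariance between two sets of 1-D descriptor values.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    # Gaussian-process posterior mean and standard deviation at the query points.
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_obs, x_query)
    Kinv = np.linalg.inv(K)
    mean = Ks.T @ Kinv @ y_obs
    var = np.diag(rbf_kernel(x_query, x_query) - Ks.T @ Kinv @ Ks)
    return mean, np.sqrt(np.clip(var, 0.0, None))

rng = np.random.default_rng(0)
x_all = np.linspace(0.0, 1.0, 50)           # candidate library (hypothetical descriptor)
y_all = np.sin(6.0 * x_all) + 0.5 * x_all   # hidden "property" to minimize

# Step 1: build the model from two randomly selected observations.
sampled = list(rng.choice(len(x_all), size=2, replace=False))
# Steps 2(i)-(ii): repeatedly observe the candidate expected to be best.
for _ in range(15):
    mean, std = gp_posterior(x_all[sampled], y_all[sampled], x_all)
    score = mean - 2.0 * std                # lower confidence bound (minimization)
    score[sampled] = np.inf                 # never resample a known candidate
    sampled.append(int(np.argmin(score)))

best = sampled[int(np.argmin(y_all[sampled]))]
print(f"best of {len(sampled)} evaluated candidates: x = {x_all[best]:.3f}")
```

In practice the loop is stopped after a fixed budget of evaluations rather than when all data are sampled; only a small fraction of the library ever needs to be computed.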
In some cases, materials with ultralow or superhigh thermal conductivities are essential for engineering applications.2,3,8,46−50 Many studies have focused on low thermal conductivity materials for thermoelectrics and thermal insulation.10,14 In the search for compounds with ultralow thermal conductivity, several studies have predicted the lattice thermal conductivity by materials informatics. In addition, high thermal conductivity materials for improving the thermal management of electronic devices have also attracted wide attention, such as single-crystal boron arsenide with its special band structure.46−50 However, no relevant reports on predicting high thermal conductivity materials by machine learning algorithms currently exist.



Figure 4. Correlation between the experimental values and the values of interfacial thermal resistance predicted by the AMM, DMM, GLR, GPR, and SVR using the same descriptors. Adapted with permission from ref 54, licensed under a Creative Commons Attribution (CC BY) license; no changes were made. Copyright 2017 Springer Nature.
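As a complement to the regression models compared in Figure 4, a feedforward ANN of the kind described earlier can be sketched in a few lines of NumPy. The two "descriptors" and the target mapping below are entirely synthetic (not data from ref 54); the point is only the mechanics of forward propagation, backpropagation through a tanh hidden layer, and gradient-descent training.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training set: two hypothetical interface descriptors mapped to a
# made-up smooth nonlinear target.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.tanh(2.0 * X[:, 0]) + 0.5 * X[:, 1]

# One hidden layer of 16 tanh units trained by plain full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr, n = 0.1, len(X)
for _ in range(10000):
    h = np.tanh(X @ W1 + b1)                 # forward pass through the hidden layer
    pred = (h @ W2 + b2).ravel()             # linear output layer
    err = (pred - y)[:, None]                # gradient of 0.5 * squared error
    gW2, gb2 = h.T @ err / n, err.mean(axis=0)
    dh = err @ W2.T * (1.0 - h ** 2)         # backpropagate through tanh
    gW1, gb1 = X.T @ dh / n, dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((pred - y) ** 2))
print(f"training MSE after 10000 steps: {mse:.4f}")
```

A held-out test split (the "testing stage" in the text) would normally decide when to stop training; it is omitted here for brevity.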

models and effective medium theory is consistent with experimental data. Furthermore, Tanaka's group combined Bayesian optimization with first-principles anharmonic lattice-dynamics calculations to find materials with ultralow thermal conductivity.21 In 2015, these authors discovered 221 materials with very low thermal conductivity in a library containing 54,779 compounds. Two compounds even have an electronic band gap [...].

In most of these studies, the coefficient of determination is high (>0.88), showing that the computational accuracy is acceptable. The accuracy is largely influenced by the selection of models and descriptors. Generally, simple models, such as generalized linear regression and second-order polynomial regression, show poor performance. Complicated models, such as random forest and artificial neural networks, can be much more accurate. Moreover, a sophisticated selection of descriptors can also improve the accuracy; for example, the coefficient of determination was improved from 0.92 to 0.96 in the prediction of ITC.54 Interestingly, the predicted value of the Seebeck coefficient is comparable to measurements of recently manufactured materials that were not included in the training database.59

However, machine learning methods can do little to predict abnormal properties. A machine learning model can predict a new sample within the normal scope, but the amount of abnormal data is usually not large enough to precisely predict outliers.60 For instance, in 2019 a superhigh thermoelectric figure of merit (ZT > 400) was reported.61 However, this high ZT exists only at the structural phase transition temperature; because there are insufficient similar data around that temperature, this abnormal value of ZT is difficult to predict using machine learning methods.

Aside from the challenges of materials informatics in the thermal field, there are some common issues that must be resolved.
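The influence of descriptor choice on the coefficient of determination can be illustrated on synthetic data. The 1/x-type "property" and the descriptor set below are invented for illustration and do not come from the cited studies; the sketch only shows how enriching a descriptor set lifts R² for the same simple fitting model.

```python
import numpy as np

def r_squared(y, yhat):
    # Coefficient of determination, the accuracy metric quoted in the text.
    return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(2)
x = rng.uniform(0.1, 1.0, 300)               # hypothetical scalar descriptor
y = 1.0 / x + 0.05 * rng.normal(size=300)    # made-up property with a 1/x trend

def fit_predict(features):
    # Ordinary least squares on the chosen descriptor matrix (with intercept).
    A = np.column_stack([np.ones_like(x)] + features)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

r2_raw = r_squared(y, fit_predict([x]))                  # single raw descriptor
r2_rich = r_squared(y, fit_predict([x, x**2, 1.0 / x]))  # enriched descriptor set
print(f"R^2 raw: {r2_raw:.3f}, enriched: {r2_rich:.3f}")
```

The same least-squares model jumps from a mediocre R² to nearly 1 once a physically motivated descriptor (here 1/x) is included, mirroring the 0.92 to 0.96 improvement reported for ITC prediction.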
When performing materials informatics, it remains challenging to develop more general and time-saving machine learning codes, select fast and effective descriptors, and translate data into practical knowledge or physical pictures. The main challenge lies in the physical interpretation of the machine learning process. The underlying physical mechanism cannot be fully understood by machine learning alone, which benefits from the use of other theoretical or simulative methods. Advances in studying heat transfer in nanomaterials/nanostructures are still needed to complement machine learning. Additionally, when preparing data, especially for complex structures, simulations or experiments require much time to obtain enough training data. For example, a neural network usually requires a large amount of data. To avoid the difficulty of obtaining massive amounts of data, researchers may make full use of data reported in existing papers instead of collecting all data themselves. Therefore, an online database containing comprehensively reported thermal properties of different materials is necessary and urgent.

Perspectives. Looking beyond the successes and shortcomings of materials informatics applications, there are some areas of progress that could be addressed in the near future. We conclude this paper by illustrating several important challenges that deserve further investigation.

Recently, in the field of thermal transport, three main machine learning methods, including Bayesian optimization, random forest, and artificial neural networks, have been used in predicting the thermal transport properties of materials. With [...]



[...] predictions. Otherwise, the predicted data will be added to the training database to improve the machine learning models. Hence, it may be possible to merge different machine learning models with a combinatorial experiments strategy to realize a loop process in materials informatics. The critical issues may include the management of workflows, the tracking of multivariable measurements, and data storage.

In addition, machine learning can be used to fit parameters in experiments. Recently, regression algorithms in machine learning have become very popular in economics and statistics, and they may also be widely applied to measuring the thermal properties of materials, especially at the nanoscale. A proper regression algorithm can establish a specific mathematical model and obtain the quantitative relationship between the target properties and the experimental data, after which the unknown thermal properties can be calculated. The essence of the regression algorithm is to adjust a smooth and balanced model function f(x, y, ...) that minimizes the fitting error while avoiding overfitting. In measurements of thermal properties, curve fitting poses hard issues for which machine learning algorithms could make a difference. For instance, in measurements by time-domain thermoreflectance (TDTR) or the 3ω method, thermal properties, such as the thermal conductivity, electron−phonon coupling factor, and interfacial thermal conductance, are extracted by multivariable fitting of the experimental data. Traditional successive iteration can produce a fitting curve with slight deviation, and the fitting result is sensitive to the setting of the initial values. Interestingly, among machine learning algorithms, kernel ridge regression can solve this multiple nonlinear model and avoid the sensitivity problem of initial values.
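The initial-value insensitivity of kernel ridge regression comes from its closed-form solution: the fit is a single linear solve rather than an iterative search. The sketch below fits a mock decay curve loosely inspired by thermoreflectance signals; all values, the kernel length scale, and the regularization strength are illustrative choices, not parameters from any cited measurement.

```python
import numpy as np

def krr_fit(x_train, y_train, x_query, length=0.5, lam=1e-3):
    # Closed-form kernel ridge regression with an RBF kernel: the solution is a
    # linear solve, so no initial parameter values need to be chosen.
    def k(a, b):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)
    alpha = np.linalg.solve(k(x_train, x_train) + lam * np.eye(len(x_train)), y_train)
    return k(x_query, x_train) @ alpha

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0.0, 5.0, 80))                   # delay times, arbitrary units
signal = np.exp(-t / 2.0) + 0.01 * rng.normal(size=80)   # mock decay curve with noise

t_grid = np.linspace(0.0, 5.0, 101)
fit = krr_fit(t, signal, t_grid)
rmse = float(np.sqrt(np.mean((fit - np.exp(-t_grid / 2.0)) ** 2)))
print(f"RMSE of the KRR fit against the noiseless curve: {rmse:.4f}")
```

The regularization term lam plays the role described in the text: it balances fitting error against smoothness and thereby guards against overfitting the measurement noise.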
On the other hand, many environmental parameters are involved in the fabrication of materials, such as temperature, time, humidity, intensity of illumination, and so on. These parameters may have a great influence on the thermal properties of nanomaterial samples. For instance, in the fabrication of metallic nanofilms by physical vapor deposition, parameters such as the pressure and temperature of the chamber, the deposition rate and time, the thickness of the adhesion layer, and the annealing temperature have significant effects on the final thermal properties of the samples. Similarly, in Si nanowire synthesis by chemical vapor deposition, the ambient temperature and the pause time in the ablation also have a great influence on the final morphology of the Si nanowires.73 By principal component analysis or a random forest algorithm, a relationship between the desired thermal properties and a complicated set of environmental parameters can be obtained. Then, a sample with the desired properties can be obtained by tuning the environmental parameters.

Materials informatics has emerged as a powerful tool for many fields in materials science and engineering. It is highly desirable that materials informatics be applied in more fields to solve more difficult thermal issues.
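The principal component analysis step mentioned above can be sketched with a singular value decomposition. The "deposition log" below is entirely synthetic; the parameter names and distributions are invented, with pressure deliberately correlated to temperature so that one principal component absorbs most of the shared variance.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
# Hypothetical deposition logs: four process parameters per sample.
temperature = rng.normal(500.0, 30.0, n)                  # chamber temperature
pressure = 0.01 * temperature + rng.normal(0.0, 0.05, n)  # correlated with temperature
rate = rng.normal(1.0, 0.2, n)                            # deposition rate
thickness = rng.normal(50.0, 5.0, n)                      # adhesion-layer thickness
X = np.column_stack([temperature, pressure, rate, thickness])

# Standardize each column, then perform PCA via singular value decomposition.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, s, _ = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)                       # variance-explained ratios
print("variance explained per component:", np.round(explained, 3))
```

The leading component captures roughly half of the standardized variance because two of the four parameters move together; in a real workflow, the retained components (or a random forest's descriptor importances) would then be regressed against the measured thermal property.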





Author Contributions

X.W. and W.F. contributed equally to this work.

Notes

The authors declare no competing financial interest.



ACKNOWLEDGMENTS

The work was sponsored by the National Natural Science Foundation of China No. 51576076 (N.Y.), No. 51606072 (C.D.), and No. 51711540031 (N.Y. and C.D.), the Natural Science Foundation of Hubei Province No. 2017CFA046 (N.Y.), and the Fundamental Research Funds for the Central Universities No. 2019kfyRCPY045 (N.Y.). We are grateful to Xiaoxiang Yu, Dengke Ma, and Han Meng for useful discussions. The authors thank the National Supercomputing Center in Tianjin (NSCC-TJ) and the China Scientific Computing Grid (ScGrid) for providing assistance in computations.



REFERENCES

(1) Moore, A. L.; Shi, L. Mater. Today 2014, 17 (4), 163−174.
(2) Pop, E. Nano Res. 2010, 3 (3), 147−169.
(3) Xu, X.; Chen, J.; Zhou, J.; Li, B. Adv. Mater. 2018, 30 (17), 1705544.
(4) Hansson, J.; Nilsson, T. M. J.; Ye, L.; Liu, J. Int. Mater. Rev. 2018, 63 (1), 22−45.
(5) Razeeb, K. M.; Dalton, E.; Cross, G. L. W.; Robinson, A. Int. Mater. Rev. 2018, 63 (1), 1−21.
(6) Bar-Cohen, A.; Matin, K.; Narumanchi, S. J. Electron. Packag. 2015, 137 (4), 040803.
(7) Norris, P. M.; Le, N. Q.; Baker, C. H. J. Heat Transfer 2013, 135 (6), 061604.
(8) Volz, S.; Shiomi, J.; Nomura, M.; Miyazaki, K. J. Therm. Sci. Technol. 2016, 11 (1), JTST0001.
(9) Yang, L.; Chen, Z.-G.; Dargusch, M. S.; Zou, J. Adv. Energy Mater. 2018, 8 (6), 1701797.
(10) Zhu, T.; Liu, Y.; Fu, C.; Heremans, J. P.; Snyder, J. G.; Zhao, X. Adv. Mater. 2017, 29 (14), 1605884.
(11) Tan, G.; Zhao, L. D.; Kanatzidis, M. G. Chem. Rev. 2016, 116 (19), 12123−12149.
(12) Kroon, R.; Mengistie, D. A.; Kiefer, D.; Hynynen, J.; Ryan, J. D.; Yu, L.; Muller, C. Chem. Soc. Rev. 2016, 45 (22), 6147−6164.
(13) Gorai, P.; Stevanović, V.; Toberer, E. S. Nature Reviews Materials 2017, 2 (9), 17053.
(14) Russ, B.; Glaudell, A.; Urban, J. J.; Chabinyc, M. L.; Segalman, R. A. Nature Reviews Materials 2016, 1 (10), 16050.
(15) Gomez-Bombarelli, R.; Aguilera-Iparraguirre, J.; Hirzel, T. D.; Duvenaud, D.; Maclaurin, D.; Blood-Forsythe, M. A.; Chae, H. S.; Einzinger, M.; Ha, D. G.; Wu, T.; Markopoulos, G.; Jeon, S.; Kang, H.; Miyazaki, H.; Numata, M.; Kim, S.; Huang, W.; Hong, S. I.; Baldo, M.; Adams, R. P.; Aspuru-Guzik, A. Nat. Mater. 2016, 15 (10), 1120−1127.
(16) Agrawal, A.; Choudhary, A. APL Mater. 2016, 4 (5), 053208.
(17) Rajan, K. Mater. Today 2005, 8 (10), 38−45.
(18) Rajan, K. Annu. Rev. Mater. Res. 2015, 45 (1), 153−169.
(19) Wu, X.; Zhu, X.; Wu, G.; Ding, W. IEEE Trans. Knowl. Data Eng. 2014, 26 (1), 97−107.
(20) Carrete, J.; Li, W.; Mingo, N.; Wang, S.; Curtarolo, S. Phys. Rev. X 2014, 4 (1), 011019.
(21) Seko, A.; Togo, A.; Hayashi, H.; Tsuda, K.; Chaput, L.; Tanaka, I. Phys. Rev. Lett. 2015, 115 (20), 205901.
(22) Ju, S.; Shiga, T.; Feng, L.; Hou, Z.; Tsuda, K.; Shiomi, J. Phys. Rev. X 2017, 7 (2), 021024.
(23) Yang, H.; Zhang, Z.; Zhang, J.; Zeng, X. C. Nanoscale 2018, 10 (40), 19092−19099.
(24) Mi, X. Y.; Yu, X.; Yao, K. L.; Huang, X.; Yang, N.; Lu, J. T. Nano Lett. 2015, 15 (8), 5229−5234.

AUTHOR INFORMATION

Corresponding Authors

*E-mail: [email protected] (C.D.).
*E-mail: [email protected] (N.Y.).

ORCID

Nuo Yang: 0000-0003-0973-1718



(25) Song, Q.; An, M.; Chen, X.; Peng, Z.; Zang, J.; Yang, N. Nanoscale 2016, 8 (32), 14943−14949.
(26) Li, S.; Yu, X.; Bao, H.; Yang, N. J. Phys. Chem. C 2018, 122 (24), 13140−13147.
(27) Ma, D.; Ding, H.; Wang, X.; Yang, N.; Zhang, X. Int. J. Heat Mass Transfer 2017, 108, 940−944.
(28) Ma, D.; Ding, H.; Meng, H.; Feng, L.; Wu, Y.; Shiomi, J.; Yang, N. Phys. Rev. B: Condens. Matter Mater. Phys. 2016, 94 (16), 165434.
(29) Kurt, H.; Kayfeci, M. Appl. Energy 2009, 86 (10), 2244−2248.
(30) Gaultois, M. W.; Oliynyk, A. O.; Mar, A.; Sparks, T. D.; Mulholland, G. J.; Meredig, B. APL Mater. 2016, 4 (5), 053213.
(31) Seko, A.; Hayashi, H.; Nakayama, K.; Takahashi, A.; Tanaka, I. Phys. Rev. B: Condens. Matter Mater. Phys. 2017, 95 (14), 144110.
(32) Caruana, R.; Niculescu-Mizil, A. An Empirical Comparison of Supervised Learning Algorithms. In Proceedings of the 23rd International Conference on Machine Learning; ACM: Pittsburgh, PA, USA, 2006; pp 161−168.
(33) Mockus, J. Bayesian Approach to Global Optimization; Kluwer Academic Publishers, 1989; pp 473−481.
(34) Snoek, J.; Larochelle, H.; Adams, R. P. Advances in Neural Information Processing Systems 2012, 2951−2959.
(35) Rasmussen, C. E.; Williams, C. K. I. Gaussian Processes for Machine Learning; The MIT Press, 2006.
(36) Breiman, L. Mach. Learn. 2001, 45 (1), 5−32.
(37) Svetnik, V.; Liaw, A.; Tong, C.; Culberson, J. C.; Sheridan, R. P.; Feuston, B. P. J. Chem. Inf. Comput. Sci. 2003, 43 (6), 1947−1958.
(38) Hopfield, J. J. IEEE Circuits Devices Mag. 1988, 4 (5), 3−10.
(39) Aydinalp, M.; Ismet Ugursal, V.; Fung, A. S. Appl. Energy 2002, 71 (2), 87−110.
(40) Ertunc, H. M.; Hosoz, M. Appl. Therm. Eng. 2006, 26 (5), 627−635.
(41) Yang, I.-H.; Yeo, M.-S.; Kim, K.-W. Energy Convers. Manage. 2003, 44 (17), 2791−2809.
(42) Kurt, H.; Atik, K.; Ozkaymak, M.; Binark, A. K. J. Energy Inst. 2007, 80 (1), 46−51.
(43) Kalogirou, S. A. Renewable Sustainable Energy Rev. 2001, 5 (4), 373−401.
(44) Zurada, J. M. Introduction to Artificial Neural Systems; West: St. Paul, 1992; Vol. 8.
(45) Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative Adversarial Nets. In Advances in Neural Information Processing Systems; Montréal, Québec, Canada, Dec 8−13, 2014; pp 2672−2680.
(46) Lindsay, L.; Broido, D. A.; Reinecke, T. L. Phys. Rev. Lett. 2013, 111 (2), 025901.
(47) Kang, J. S.; Wu, H.; Hu, Y. Nano Lett. 2017, 17 (12), 7507−7514.
(48) Kang, J. S.; Li, M.; Wu, H.; Nguyen, H.; Hu, Y. Science 2018, 361 (6402), 575.
(49) Li, S.; Zheng, Q.; Lv, Y.; Liu, X.; Wang, X.; Huang, P. Y.; Cahill, D. G.; Lv, B. Science 2018, 361 (6402), 579.
(50) Tian, F.; Song, B.; Chen, X.; Ravichandran, N. K.; Lv, Y.; Chen, K.; Sullivan, S.; Kim, J.; Zhou, Y.; Liu, T.-H.; Goni, M.; Ding, Z.; Sun, J.; Udalamatta Gamage, G. A. G.; Sun, H.; Ziyaee, H.; Huyan, S.; Deng, L.; Zhou, J.; Schmidt, A. J.; Chen, S.; Chu, C.-W.; Huang, P. Y.; Broido, D.; Shi, L.; Chen, G.; Ren, Z. Science 2018, 361 (6402), 582.
(51) Wei, H.; Zhao, S.; Rong, Q.; Bao, H. Int. J. Heat Mass Transfer 2018, 127, 908−916.
(52) Zendehboudi, A.; Saidur, R. Heat Mass Transfer 2019, 55, 397−411.
(53) Prasher, R. Proc. IEEE 2006, 94 (8), 1571−1586.
(54) Zhan, T.; Fang, L.; Xu, Y. Sci. Rep. 2017, 7 (1), 7109.
(55) Dresselhaus, M. S.; Chen, G.; Tang, M. Y.; Yang, R. G.; Lee, H.; Wang, D. Z.; Ren, Z. F.; Fleurial, J. P.; Gogna, P. Adv. Mater. 2007, 19 (8), 1043−1053.
(56) Majumdar, A. Science 2004, 303 (5659), 777.
(57) Carrete, J.; Mingo, N.; Wang, S.; Curtarolo, S. Adv. Funct. Mater. 2014, 24 (47), 7427−7432.
(58) Yamawaki, M.; Ohnishi, M.; Ju, S.; Shiomi, J. Sci. Adv. 2018, 4 (6), eaar4192.
(59) Furmanchuk, A.; Saal, J. E.; Doak, J. W.; Olson, G. B.; Choudhary, A.; Agrawal, A. J. Comput. Chem. 2018, 39 (4), 191−202.
(60) Witten, I. H.; Frank, E.; Hall, M. A. Implementations: Real Machine Learning Schemes. In Data Mining: Practical Machine Learning Tools and Techniques, 3rd ed.; Morgan Kaufmann: Boston, 2011; Chapter 6, pp 191−304.
(61) Byeon, D.; Sobota, R.; Delime-Codrin, K.; Choi, S.; Hirata, K.; Adachi, M.; Kiyama, M.; Matsuura, T.; Yamamoto, Y.; Matsunami, M.; Takeuchi, T. Nat. Commun. 2019, 10 (1), 72.
(62) Chen, G. Annu. Rev. Heat Transfer 2014, 17, 1−8.
(63) An, M.; Song, Q.; Yu, X.; Meng, H.; Ma, D.; Li, R.; Jin, Z.; Huang, B.; Yang, N. Nano Lett. 2017, 17 (9), 5805−5810.
(64) Xu, X.; Pereira, L. F. C.; Wang, Y.; Wu, J.; Zhang, K.; Zhao, X.; Bae, S.; Tinh Bui, C.; Xie, R.; Thong, J. T. L.; Hong, B. H.; Loh, K. P.; Donadio, D.; Li, B.; Özyilmaz, B. Nat. Commun. 2014, 5, 3689.
(65) Yang, N.; Zhang, G.; Li, B. Nano Today 2010, 5 (2), 85−90.
(66) Chen, S.; Wu, Q.; Mishra, C.; Kang, J.; Zhang, H.; Cho, K.; Cai, W.; Balandin, A. A.; Ruoff, R. S. Nat. Mater. 2012, 11, 203.
(67) Lim, J.; Hippalgaonkar, K.; Andrews, S. C.; Majumdar, A.; Yang, P. Nano Lett. 2012, 12 (5), 2475−2482.
(68) Davis, B. L.; Hussein, M. I. Phys. Rev. Lett. 2014, 112 (5), 055505.
(69) Yu, J.-K.; Mitrovic, S.; Tham, D.; Varghese, J.; Heath, J. R. Nat. Nanotechnol. 2010, 5, 718.
(70) Ma, D.; Arora, A.; Deng, S.; Xie, G.; Shiomi, J.; Yang, N. Materials Today Physics 2019, 8, 56−61.
(71) Qian, F.; Lan, P. C.; Freyman, M. C.; Chen, W.; Kou, T.; Olson, T. Y.; Zhu, C.; Worsley, M. A.; Duoss, E. B.; Spadaccini, C. M.; Baumann, T.; Han, T. Y.-J. Nano Lett. 2017, 17 (12), 7171−7176.
(72) Rajan, K. Annu. Rev. Mater. Res. 2008, 38 (1), 299−322.
(73) Gudiksen, M. S.; Lauhon, L. J.; Wang, J.; Smith, D. C.; Lieber, C. M. Nature 2002, 415 (6872), 617−620.
