Machine learning based on eye-tracking data to identify Autism Spectrum Disorder: A systematic review and meta-analysis

Full author names: "Wei, Qiuhong; Cao, Huiling; Shi, Yuan; Xu, Ximing; Li, Tingyu"

Author addresses: "[Wei, Qiuhong; Li, Tingyu] Chongqing Med Univ, Children Nutr Res Ctr, Natl Clin Res Ctr Child Hlth & Disorders, Minist Educ, Key Lab Child Dev & Disorders, Children, Chongqing, Peoples R China; [Cao, Huiling; Shi, Yuan] Chongqing Med Univ, Dept Neonatol, Childrens Hosp, Chongqing, Peoples R China; [Xu, Ximing] Chongqing Med Univ, Big Data Ctr Childrens Med Care, Childrens Hosp, 136 Zhongshan Er Rd, Chongqing 400014, Peoples R China; [Li, Tingyu] Chongqing Med Univ, Childrens Hosp, Children Nutr Res Ctr, 136 Zhongshan Er Rd, Chongqing 400014, Peoples R China"

Corresponding authors: "Xu, XM (corresponding author), Chongqing Med Univ, Big Data Ctr Childrens Med Care, Childrens Hosp, 136 Zhongshan Er Rd, Chongqing 400014, Peoples R China; Li, TY (corresponding author), Chongqing Med Univ, Childrens Hosp, Children Nutr Res Ctr, 136 Zhongshan Er Rd, Chongqing 400014, Peoples R China."

Source: JOURNAL OF BIOMEDICAL INFORMATICS

ESI subject category: COMPUTER SCIENCE

WOS accession number: WOS:000910605800001

JCR quartile: Q2

Impact factor: 4

Year: 2023

Volume: 137

Issue: 

Start page: 

End page: 

Document type: Article

Keywords: Machine learning; Eye-tracking; Autism spectrum disorder; Meta-analysis

Abstract: "Background: Machine learning has been widely used to identify Autism Spectrum Disorder (ASD) based on eye-tracking, but its accuracy is uncertain. We aimed to summarize the available evidence on the performance of machine learning algorithms in classifying ASD and typically developing (TD) individuals based on eye-tracking data. Methods: We searched Medline, Embase, Web of Science, Scopus, Cochrane Library, IEEE Xplore Digital Library, Wan Fang Database, China National Knowledge Infrastructure, Chinese BioMedical Literature Database, and the VIP Database for Chinese Technical Periodicals from database inception to December 24, 2021. Studies using machine learning methods to classify ASD and TD individuals based on eye-tracking technologies were included. We extracted data on study population, model performance, machine learning algorithms, and eye-tracking paradigms. This study is registered with PROSPERO, CRD42022296037. Results: 261 articles were identified, of which 24 studies with sample sizes ranging from 28 to 141 were included (n = 1396 individuals). Machine learning based on eye-tracking yielded a pooled classification accuracy of 81% (I2 = 73%), specificity of 79% (I2 = 61%), and sensitivity of 84% (I2 = 61%) in distinguishing ASD from TD individuals. In subgroup analysis, accuracy was 88% (95% CI: 85-91%), 79% (95% CI: 72-84%), and 71% (95% CI: 59-91%) for the preschool-aged, school-aged, and adolescent-adult groups, respectively. Eye-tracking stimuli and machine learning algorithms varied widely across studies; social, static, and active stimuli and the Support Vector Machine and Random Forest algorithms were most commonly reported. Regarding model performance evaluation, 15 studies reported their final results on validation datasets, four on testing datasets, and five did not report whether validation datasets were used. Most studies failed to report information on the eye-tracking hardware and the implementation process.
Conclusion: Using eye-tracking data, machine learning has shown potential for identifying ASD individuals with high accuracy, especially in preschool-aged children. However, heterogeneity between studies, the absence of test set-based performance evaluations, small sample sizes, and non-standardized implementation of eye-tracking may undermine the reliability of the results. Further well-designed and well-executed studies with comprehensive and transparent reporting are needed to determine the optimal eye-tracking paradigms and machine learning algorithms."
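The pooled accuracy and I2 heterogeneity statistics reported above come from a random-effects meta-analysis of per-study proportions. A minimal sketch of that kind of pooling, using the DerSimonian-Laird estimator on the logit scale with entirely hypothetical per-study counts (not data from the included studies):

```python
import math

def dl_pool_proportions(events, totals):
    """Pool proportions with a DerSimonian-Laird random-effects model
    on the logit scale; returns (pooled_proportion, I2_percent)."""
    y = [math.log(e / (n - e)) for e, n in zip(events, totals)]  # logit of each proportion
    v = [1 / e + 1 / (n - e) for e, n in zip(events, totals)]    # approximate logit variance
    w = [1 / vi for vi in v]                                     # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))       # Cochran's Q
    k = len(y)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                           # between-study variance
    wr = [1 / (vi + tau2) for vi in v]                           # random-effects weights
    pooled_logit = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    pooled = 1 / (1 + math.exp(-pooled_logit))                   # back-transform to proportion
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0     # I2 heterogeneity
    return pooled, i2

# Hypothetical counts of correctly classified / total participants per study
events = [45, 30, 88, 24]
totals = [56, 40, 100, 32]
pooled, i2 = dl_pool_proportions(events, totals)
print(f"pooled accuracy = {pooled:.3f}, I2 = {i2:.1f}%")
```

In practice a review like this would use dedicated tooling (e.g. the R `metafor` package) rather than a hand-rolled estimator; the sketch only illustrates what the pooled estimate and I2 figures summarize.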

Funding agencies: National Natural Science Foundation of China; Guangzhou and Guangdong Key Project; [81771223]; [202007030002]; [2018B030335001]

Funding text: "Acknowledgments: This study was supported by the National Natural Science Foundation of China [grant number 81771223] and the Guangzhou and Guangdong Key Project [grant numbers 202007030002, 2018B030335001]."