UM

Browse/Search Results: 1-10 of 12

Enhanced n-butanol production from lignocellulosic biomass hydrolysates by metabolically engineered Clostridium tyrobutyricum immobilized in fibrous-bed bioreactor Conference paper
Nanjing, October 29-30, 2022
Authors:  Hojae Shim
Microsoft PowerPoint | TC[WOS]:0 TC[Scopus]:0 | Submit date:2022/10/29
KDCTime: Knowledge distillation with calibration on InceptionTime for time-series classification Journal article
Information Sciences, 2022,Volume: 613,Page: 184-203
Authors:  Gong, Xueyuan;  Si, Yain Whar;  Tian, Yongqi;  Lin, Cong;  Zhang, Xinyuan;  Liu, Xiaoxiang
TC[WOS]:0 TC[Scopus]:0 | Submit date:2022/11/07
Deep Neural Networks  InceptionTime  Knowledge Distillation  Overfitting  Time-series Classification
Integrated in Silico Formulation Design of Lipid-based Drug Delivery Systems Thesis
University of Macau, 2022
Authors:  Haoshi Gao;  Li HF(李海峰);  Defang Ouyang
Adobe PDF | Submit date:2022/08/15
Experimental Investigation of Heat Transfer and Pressure Drop Characteristics for Vertical Downflow using Traditional and 3D-printed Mini tubes Conference paper
Xi'an, China, July 27-31, 2022
Authors:  J.H. Chen;  L.M. Tam;  A.J. Ghajar
Adobe PDF | TC[WOS]:28 TC[Scopus]:0 | Submit date:2022/08/30
Heat Transfer  Pressure Drop  3D-Printed Mini Tube
IID-Net: Image Inpainting Detection Network via Neural Architecture Search and Attention Journal article
IEEE Transactions on Circuits and Systems for Video Technology, 2022,Volume: 32,Issue: 3,Page: 1172-1185
Authors:  Wu, Haiwei;  Zhou, Jiantao
Adobe PDF | TC[WOS]:4 TC[Scopus]:3 | Submit date:2022/03/28
Deep Neural Networks  Generalizability  Inpainting Forensics  
“In-Network Ensemble”: Deep Ensemble Learning with Diversified Knowledge Distillation Journal article
ACM Transactions on Intelligent Systems and Technology, 2021,Volume: 12,Issue: 5,Page: 63:1-63:19
Authors:  Xingjian Li;  Haoyi Xiong;  Zeyu Chen;  Jun Huan;  Cheng-Zhong Xu;  Dejing Dou
Adobe PDF | TC[WOS]:1 TC[Scopus]:3 | Submit date:2022/08/22
Transfer Learning  Knowledge Distillation  Ensemble Learning  
Recent Advances in Dialogue Machine Translation Journal article
Information, 2021,Volume: 12,Issue: 11
Authors:  Liu, Siyou;  Sun, Yuqi;  Wang, Longyue
Adobe PDF | TC[WOS]:0 TC[Scopus]:0 | Submit date:2022/08/28
Dialogue  Neural Machine Translation  Discourse Issue  Benchmark Data  Existing Approaches  Real-life Applications  Building Advanced System  
Knowledge Distillation with Attention for Deep Transfer Learning of Convolutional Networks Journal article
ACM Transactions on Knowledge Discovery from Data, 2021,Volume: 16,Issue: 3,Page: 42:1-42:20
Authors:  Xingjian Li;  Haoyi Xiong;  Zeyu Chen;  Jun Huan;  Ji Liu;  Cheng-Zhong Xu;  Dejing Dou
Adobe PDF | TC[WOS]:0 TC[Scopus]:0 | Submit date:2022/08/22
Transfer Learning  Framework  Algorithms  Knowledge Distillation
Rejuvenating Low-Frequency Words: Making the Most of Parallel Data in Non-Autoregressive Translation Conference paper
The 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing
Authors:  Ding, L.;  Wang, L.;  Liu, X.;  Wong, F.;  Tao, D.;  Tu, Z.
TC[WOS]:0 TC[Scopus]:0 | Submit date:2022/06/14
Non-Autoregressive  Neural Machine Translation  
Understanding and Improving Lexical Choice in Non-Autoregressive Translation Conference paper
Ninth International Conference on Learning Representations
Authors:  Ding, L.;  Wang, L.;  Liu, X.;  Wong, F.;  Tao, D.;  Tu, Z.
TC[WOS]:0 TC[Scopus]:0 | Submit date:2022/08/19
Non-Autoregressive  Neural Machine Translation