Doctoral and Master's Thesis Award Winners
Doctoral Dissertation Award
Distinguished Award (優等)
Recipient: [name illegible] (Graduate Institute of Computer Science and Information Engineering, National Taiwan University)
Thesis title (Chinese): Exploiting the duality between language understanding and generation, and beyond
Thesis title (English): Exploiting the Duality between Language Understanding and Generation and Beyond
Advisor: Prof. Yun-Nung Chen (陳縕儂)
Merit Award (佳作)
Recipient: [name illegible] (Institute of Electrical Engineering, National Tsing Hua University)
Thesis title (Chinese): Learning deep feature and label spaces to strengthen discriminative recognition tasks
Thesis title (English): Learning Deep Feature and Label Space to Enhance Discriminative Recognition Tasks
Advisor: Prof. [name illegible]
Master's Thesis Award
Distinguished Award (優等)
Recipient: [name illegible] (Institute of Computer Science and Information Engineering, National Central University)
Thesis title (Chinese): A news-based Hokkien-Mandarin code-mixing dataset and neural machine translation model
Thesis title (English): Hokkien-Mandarin Code-Mixing Dataset and Neural Machine Translation
Advisor: Prof. Tzong-Han Tsai (蔡宗翰)
Merit Award (佳作)
1. Recipient: [name illegible] (Institute of Communications Engineering, National Yang Ming Chiao Tung University)
Thesis title (Chinese): Contrastive disentangled memory for sequential learning
Thesis title (English): Contrastive Disentangled Memory for Sequential Learning
Advisor: Prof. Jen-Tzung Chien (簡仁宗)
2. Recipient: [name illegible] (Graduate Institute of Computer Science and Information Engineering, [university illegible])
Thesis title (Chinese): KE-BERT: pre-training a knowledge-enhanced word representation model for language understanding
Thesis title (English): KE-BERT: Pre-training of Knowledge-Enhanced Word Representation For Language Understanding
Advisor: Prof. [name illegible]
3. Recipient: [name illegible] (Graduate Institute of Computer Science and Information Engineering, National Taiwan University)
Thesis title (Chinese): Incorporating peer reviews and rebuttal counter-arguments for meta-review generation
Thesis title (English): Incorporating Peer Reviews and Rebuttal Counter-Arguments for Meta-Review Generation
Advisor: Prof. Hsin-Hsi Chen (陳信希)
4. Recipient: [name illegible] (Master's Degree Program in Artificial Intelligence Technology, National Yang Ming Chiao Tung University)
Thesis title (Chinese): [title illegible]
Advisors: Prof. Jen-Tzung Chien (簡仁宗) and Prof. [name illegible]