Revision as of 14:24, 15 November 2022

KiYoon Yoo (유기윤)


Combined M.S./Ph.D. program, Graduate School of Convergence Science and Technology, Seoul National University (Sep. 2019 ~ )
B.S., Food Science and Biotechnology / Industrial Engineering (minor), Seoul National University (Aug. 2019)


Tel: +82-31-888-9579
e-mail: 961230@snu.ac.kr

Publications

International Conferences
    [1] KiYoon Yoo, Nojun Kwak, "Backdoor Attacks in Federated Learning by Rare Embeddings and Gradient Ensembling", EMNLP 2022 (Long Paper, oral presentation).
    [2] KiYoon Yoo, Nojun Kwak, "Backdoor Attacks in Federated Learning by Poisoned Word Embeddings", Workshop on Federated Learning for Natural Language Processing (FL4NLP) at ACL 2022, May 2022.
    [3] KiYoon Yoo, Jangho Kim, Jiho Jang, Nojun Kwak, "Detection of Word Adversarial Examples in Text Classification: Benchmark and Baseline via Robust Density Estimation", Findings of ACL 2022 (Long Paper).
    [4] Jangho Kim, KiYoon Yoo, Nojun Kwak, "Position-based Scaled Gradient for Model Quantization and Sparse Training", Thirty-fourth Conference on Neural Information Processing Systems (NeurIPS 2020), Dec. 2020, Online.


International Journals
    [1] Hojun Lee, Donghwan Yun, Jayeon Yoo, Kiyoon Yoo, Yong Chul Kim, Dong Ki Kim, Kook-Hwan Oh, Kwon Wook Joo, Yon Su Kim, Nojun Kwak, Seung Seok Han, "Deep Learning Model for Real-Time Prediction of Intradialytic Hypotension", Clinical Journal of the American Society of Nephrology, Accepted.


Open Archive
    [1] SeongUk Park, Kiyoon Yoo and Nojun Kwak, "On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective", arXiv, Sep. 2020.
    [2] Sang-ho Lee, Kiyoon Yoo, Nojun Kwak, "Asynchronous Edge Learning using Cloned Knowledge Distillation", arXiv, Oct. 2020.