KiYoon Yoo

From MIPAL

Latest revision as of 13:20, 31 May 2023

KiYoon Yoo (유기윤)


Integrated M.S./Ph.D. program, Graduate School of Convergence Science and Technology, Seoul National University (Sep. 2019 ~ )
B.S. in Food Science and Biotechnology with a minor in Industrial Engineering, Seoul National University (graduated Aug. 2019)


Tel: +82-31-888-9579
e-mail: 961230@snu.ac.kr

Publications

International Conferences
    KiYoon Yoo, WonHyuk Ahn, Jiho Jang, Nojun Kwak, "Robust Natural Language Watermarking through Invariant Features", ACL 2023, July 2023 (Long Paper, accepted).
    KiYoon Yoo, Nojun Kwak, "Backdoor Attacks in Federated Learning by Rare Embeddings and Gradient Ensembling", EMNLP 2022 (Long Paper, oral presentation).
    KiYoon Yoo, Nojun Kwak, "Backdoor Attacks in Federated Learning by Poisoned Word Embeddings", Workshop on Federated Learning for Natural Language Processing (FL4NLP) at ACL 2022, May 2022.
    KiYoon Yoo, Jangho Kim, Jiho Jang, Nojun Kwak, "Detection of Word Adversarial Examples in Text Classification: Benchmark and Baseline via Robust Density Estimation", Findings of ACL 2022 (Long Paper).
    Jangho Kim, KiYoon Yoo, Nojun Kwak, "Position-based Scaled Gradient for Model Quantization and Sparse Training", Thirty-fourth Conference on Neural Information Processing Systems (NeurIPS 2020), Dec. 2020, Online.


International Journals
    Hojun Lee, Donghwan Yun, Jayeon Yoo, Kiyoon Yoo, Yong Chul Kim, Dong Ki Kim, Kook-Hwan Oh, Kwon Wook Joo, Yon Su Kim, Nojun Kwak, Seung Seok Han, "Deep Learning Model for Real-Time Prediction of Intradialytic Hypotension", Clinical Journal of the American Society of Nephrology, Accepted.


Open Archive
    SeongUk Park, Kiyoon Yoo and Nojun Kwak, "On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective", arXiv, Sep. 2020.
    Sang-ho Lee, Kiyoon Yoo, Nojun Kwak, "Asynchronous Edge Learning using Cloned Knowledge Distillation", arXiv, Oct. 2020.