We enthusiastically welcome researchers who take initiative and are highly motivated.
To apply for our internship programs, you can proceed in either of two ways: (1) pick one of
our internship topics and email the contact listed in that topic's announcement;
or (2) simply express your interest to us (internmilab@gmail.com), even if all of our topics below are closed.
In your application email, please include your
transcript, whether you plan to apply to our lab in the future, the purpose of your internship (e.g., further education, study abroad), your expected internship period, and the topics you are interested in.
We prefer applicants whose interests align with our
current projects or
past publications, but this is not mandatory.
Completing one of our internship programs is NOT MANDATORY for joining our lab. However, if you are interested in joining our lab, we encourage you to apply for an internship at least six months before your admission.
Internship Contact
If you are interested in any of the topics below, please send an email to the contact listed for that topic. Otherwise, please send an email to the address below.
Deep Learning for Natural Language Processing (Recruiting)
Description
Deep learning methods have been applied for solving a wide range of natural language processing (NLP) tasks such as dialog, summarization, and question answering.
You may propose any NLP task you would like to tackle. Our lab's
recent publications may help you choose a task.
Candidate Qualifications
- Strong motivation for research
- Basic Python skills (required)
- Understanding of recent algorithms in the NLP literature (e.g., BERT)
- Experience implementing algorithms in PyTorch or TensorFlow
Expected Internship Period
- Minimum three months, excluding the preliminary study period (PyTorch, basic ML, NLP basics)
Contact
Learning Consistent Skills from Denoised Knowledge Distribution (Recruiting)
Description
A neural network learns a proposal distribution that best explains the distribution of the given dataset. In other words, the clearer and more consistent the target distribution is, the more stable learning becomes and the more robust the model is to out-of-distribution data. This research aims to find a consistent subset of a noisy dataset, or a stable denoising methodology, so that an LLM agent can learn consistent knowledge.
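To illustrate the idea of extracting a consistent subset from a noisy dataset, here is a minimal sketch (a hypothetical example for intuition, not the lab's actual methodology) that keeps only examples whose multiple noisy labels agree above a threshold:

```python
from collections import Counter

def filter_consistent(examples, threshold=0.8):
    """Keep examples whose noisy labels agree above `threshold`.

    `examples` is a list of (input, [label, label, ...]) pairs, where each
    example has several noisy labels (e.g., from multiple annotators or
    model runs). The agreement ratio is the frequency of the majority label;
    examples below the threshold are dropped as inconsistent.
    """
    kept = []
    for x, labels in examples:
        majority, count = Counter(labels).most_common(1)[0]
        if count / len(labels) >= threshold:
            kept.append((x, majority))
    return kept

noisy = [
    ("example A", ["pos", "pos", "pos", "pos", "neg"]),  # 0.8 agreement: kept
    ("example B", ["pos", "neg", "neg", "pos"]),         # 0.5 agreement: dropped
]
print(filter_consistent(noisy))  # [('example A', 'pos')]
```

A real denoising pipeline would replace annotator agreement with a learned consistency signal (e.g., per-example loss or prediction agreement across models), but the subset-selection structure is the same.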
Candidate Qualifications
- Strong motivation for research
- Experience fine-tuning 7B+ models (e.g., with LoRA)
- Understanding of recent algorithms in the NLP literature
- A commitment of sufficient time and effort is a further plus, as there is an opportunity to participate as first author depending on your contribution
Expected Internship Period
- Minimum three months, excluding the preliminary study period (PyTorch, basic ML, NLP basics)
Contact
- {koojahyun,minbeomkim,kyungmin97}@snu.ac.kr