Fine-tune Bert for specific domain (unsupervised)

Submitted by 孤人 on 2021-01-20 08:39:56

Question


I want to fine-tune BERT on texts that are related to a specific domain (in my case related to engineering). The training should be unsupervised since I don't have any labels or anything. Is this possible?


Answer 1:


What you in fact want to do is continue pre-training BERT on text from your specific domain. In this case, you keep training the model with the masked-language-modeling (MLM) objective, which is self-supervised and needs no labels, but run it on your domain-specific data.

You can use the run_mlm.py script from Hugging Face's Transformers examples.
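A minimal invocation might look like the following sketch. The corpus path, output directory, and hyperparameters are placeholders you should adapt; check the script's `--help` output for the full option list:

```shell
# Continue MLM pre-training of BERT on a plain-text domain corpus.
# domain_corpus.txt and the hyperparameter values are illustrative.
python run_mlm.py \
    --model_name_or_path bert-base-uncased \
    --train_file domain_corpus.txt \
    --do_train \
    --per_device_train_batch_size 8 \
    --num_train_epochs 3 \
    --output_dir ./bert-domain-adapted
```

The resulting checkpoint in the output directory can then be loaded like any other BERT model (e.g. with `AutoModel.from_pretrained`) and fine-tuned on a downstream task later if labels become available.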



Source: https://stackoverflow.com/questions/64712375/fine-tune-bert-for-specific-domain-unsupervised
