Question asking pipeline for Huggingface transformers


Question


Hugging Face Transformers has a pipeline for question answering that is fine-tuned on the SQuAD dataset.

What would I need to do to develop a question-asking pipeline? It would use the context, question and answer to generate new questions (with answers) from a given context. Are there any examples of creating new Hugging Face pipelines?


Answer 1:


Pipelines can simply be treated as a wrapper around pre-trained models. In this case, you could perform the fine-tuning/pre-training in the same way as existing example scripts such as run_squad, and of course load the resulting checkpoint into a pipeline afterwards.
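As a rough illustration (not part of the original answer), here is a minimal fine-tuning sketch for question generation using the Trainer API. It assumes a sequence-to-sequence formulation in which the answer and context are encoded as the source and the question as the target; the t5-small checkpoint, the toy example record and the qg-model output directory are placeholders you would replace with your own data and choices:

import torch
from transformers import (AutoModelForSeq2SeqLM, AutoTokenizer,
                          Trainer, TrainingArguments)

# Toy training data -- in practice you would build this from SQuAD-style records
examples = [
    {"context": "The transformers library is maintained by Hugging Face.",
     "answer": "Hugging Face",
     "question": "Who maintains the transformers library?"},
]

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

class QuestionGenerationDataset(torch.utils.data.Dataset):
    def __init__(self, examples):
        self.examples = examples

    def __len__(self):
        return len(self.examples)

    def __getitem__(self, idx):
        ex = self.examples[idx]
        # Encode "answer + context" as the source and the question as the target
        source = f"answer: {ex['answer']}  context: {ex['context']}"
        enc = tokenizer(source, max_length=512, truncation=True,
                        padding="max_length", return_tensors="pt")
        labels = tokenizer(ex["question"], max_length=64, truncation=True,
                           padding="max_length", return_tensors="pt")["input_ids"]
        labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss
        return {"input_ids": enc["input_ids"].squeeze(0),
                "attention_mask": enc["attention_mask"].squeeze(0),
                "labels": labels.squeeze(0)}

training_args = TrainingArguments(output_dir="qg-model",
                                  num_train_epochs=1,
                                  per_device_train_batch_size=2)
trainer = Trainer(model=model, args=training_args,
                  train_dataset=QuestionGenerationDataset(examples))
trainer.train()
trainer.save_model("qg-model")        # checkpoint you can later reload
tokenizer.save_pretrained("qg-model")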

An example of how to load custom checkpoint models is given on the Hugging Face website (slightly modified here):

from transformers import pipeline

# Question answering pipeline, specifying the checkpoint identifier
qa = pipeline('question-answering', model='distilbert-base-cased-distilled-squad', tokenizer='bert-base-cased')
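For completeness, a short usage sketch of the pipeline created above (the question and context strings are made up for illustration):

result = qa(question="Who maintains the transformers library?",
            context="The transformers library is maintained by Hugging Face.")
print(result["answer"], result["score"])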

If your pipeline differs from existing options to the point where you indeed need to implement a new pipeline class, then I would suggest looking at the respective implementations. Specifically, most pipelines do "nothing more" than neatly wrap the tokenization step around a pre-trained model and call model.forward with the correct parameters. Those implementations are probably as concise an example as you will currently (version 2.8) find, since the feature is still relatively new.
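To make the "tokenize, call the model, decode" point concrete, here is a hedged sketch of what a question-generation pipeline class would essentially wrap. It assumes a sequence-to-sequence question-generation checkpoint, such as the hypothetical qg-model directory from the fine-tuning sketch above; no such pipeline ships with the library:

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# "qg-model" is a placeholder for your own fine-tuned question-generation checkpoint
tokenizer = AutoTokenizer.from_pretrained("qg-model")
model = AutoModelForSeq2SeqLM.from_pretrained("qg-model")

# The three steps a pipeline class would wrap: tokenize, run the model, decode
text = "answer: Hugging Face  context: The transformers library is maintained by Hugging Face."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))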

I hope these pointers help to get you started!



Source: https://stackoverflow.com/questions/61132448/question-asking-pipeline-for-huggingface-transformers
