How to access BERT intermediate layer outputs in TF Hub Module?

Posted by 假装没事ソ on 2020-01-14 13:27:08

Question


Does anybody know a way to access the outputs of the intermediate layers from BERT's hosted models on Tensorflow Hub?

The model is hosted here. I have explored the meta graph and found that the only signatures available are "tokens", "tokenization_info", and "mlm". The first two are illustrated in the examples on GitHub, and the masked language model signature doesn't help much. Some models, like Inception, let you access all of the intermediate layers, but not this one.
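For reference, a minimal sketch of how the available signatures can be listed with tensorflow_hub, assuming TF 1.x; the exact module URL (bert_uncased_L-12_H-768_A-12) is my guess at which hosted BERT is meant:

```python
# Minimal sketch, assuming TF 1.x and tensorflow_hub.
# The module URL is an assumption about which hosted BERT is meant.
import tensorflow_hub as hub

bert = hub.Module("https://tfhub.dev/google/bert_uncased_L-12_H-768_A-12/1")

print(bert.get_signature_names())           # e.g. ['tokens', 'tokenization_info', 'mlm']
print(bert.get_output_info_dict("tokens"))  # only exposes pooled_output / sequence_output
```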

Right now, all I can think of doing is:

  1. Run [op.values() for op in tf.get_default_graph().get_operations()] to get the names of all tensors, and find the ones I want (out of thousands), then
  2. Call tf.get_default_graph().get_tensor_by_name(name_of_the_tensor) to access those values, stitch them together, and connect them to my downstream layers (roughly as in the sketch after this list).
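A rough sketch of that workaround, assuming TF 1.x and tensorflow_hub; the module URL and the intermediate tensor name are placeholders, not values confirmed by the module itself:

```python
# Sketch of the workaround above, assuming TF 1.x and tensorflow_hub.
# The module URL and the tensor name below are hypothetical placeholders;
# find the real tensor name by listing the graph's operations first.
import tensorflow as tf
import tensorflow_hub as hub

bert = hub.Module("https://tfhub.dev/google/bert_uncased_L-12_H-768_A-12/1")

# Build the module's graph by calling its "tokens" signature once.
inputs = dict(
    input_ids=tf.placeholder(tf.int32, [None, 128]),
    input_mask=tf.placeholder(tf.int32, [None, 128]),
    segment_ids=tf.placeholder(tf.int32, [None, 128]),
)
outputs = bert(inputs, signature="tokens", as_dict=True)

# Step 1: list every operation's output tensors and pick the layer you want.
all_tensors = [t for op in tf.get_default_graph().get_operations()
               for t in op.values()]
for t in all_tensors:
    if "encoder/layer" in t.name:
        print(t.name, t.shape)

# Step 2: grab an intermediate tensor by name (placeholder name shown)
# and wire it into downstream layers.
intermediate = tf.get_default_graph().get_tensor_by_name(
    "module/bert/encoder/layer_6/output/LayerNorm/batchnorm/add_1:0")
```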

Does anybody know a cleaner solution with TensorFlow?

Source: https://stackoverflow.com/questions/55333558/how-to-access-bert-intermediate-layer-outputs-in-tf-hub-module
