Question
I have a prototxt as follows:
layer {
  name: "data"
  type: "HDF5Data"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  hdf5_data_param {
    source: "./train.txt"
    batch_size: 2
  }
}
layer {
  name: "data_scale2"
  type: "HDF5Data"
  top: "data_scale2"
  top: "label_scale2"
  include {
    phase: TRAIN
  }
  hdf5_data_param {
    source: "./train_scale2.txt"
    batch_size: 2
  }
}
The second layer contains the original data scaled by a factor of 2, so I named it data_scale2. During training, I only use data, label, and data_scale2, without label_scale2. As a result, the value of label_scale2 is printed to my terminal on every iteration. How can I suppress this? The log looks like:
I0302 18:01:57.356312 30995 solver.cpp:245] Train net output #221177: label_scale2 = 0
I0302 18:01:57.356314 30995 solver.cpp:245] Train net output #221178: label_scale2 = 0
I0302 18:01:57.356320 30995 solver.cpp:245] Train net output #221179: label_scale2 = 0
I0302 18:01:57.356324 30995 solver.cpp:245] Train net output #221180: label_scale2 = 0
I0302 18:01:57.356328 30995 solver.cpp:245] Train net output #221181: label_scale2 = 0
I0302 18:01:57.356329 30995 solver.cpp:245] Train net output #221182: label_scale2 = 0
I0302 18:01:57.356333 30995 solver.cpp:245] Train net output #221183: label_scale2 = 0
Answer 1:
General solution:
If you want to silence a "top" blob, you can use a "Silence" layer:
layer {
  type: "Silence"
  name: "silence_this_layer_for_me"
  bottom: "label_scale2"
}
And that's it! You won't hear from label_scale2 again.
A solution specific to "HDF5Data":
Alternatively, you do not have to expose all the datasets of an hdf5 input. You can simply comment out the redundant "top":
layer {
  name: "data_scale2"
  type: "HDF5Data"
  top: "data_scale2"
  # top: "label_scale2"
  include {
    phase: TRAIN
  }
  hdf5_data_param {
    source: "./train_scale2.txt"
    batch_size: 2
  }
}
Source: https://stackoverflow.com/questions/42550998/how-to-ignore-log-print-a-unused-layer-in-caffe