Save and load custom optimizers for continued training in TensorFlow
My question is essentially the same as the one specified here, but without using the Keras backend. Namely, how does one save and restore custom optimizers (e.g. L-BFGS-B, Adam) to their last state in TensorFlow when continuing training? As per the solution here for the Adam optimizer specifically, it appears one approach is to use `tf.add_to_collection` and `tf.get_collection`, but that does not seem to work if I need to restore the optimizer in a new session/shell. I have written a simple test
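For context on what "restoring the optimizer's state" entails: a minimal sketch of the classic TF1-style approach, assuming a graph-mode workflow via `tf.compat.v1`. The key point is that `Saver()` with no arguments saves all variables in the graph, which includes Adam's internal slot variables (the first/second moment accumulators), so restoring in a fresh session continues from the optimizer's last state. Variable names and the checkpoint path here are illustrative.

```python
import os
import tempfile

import tensorflow as tf

tf.compat.v1.disable_eager_execution()
v1 = tf.compat.v1

# A trivial model: minimize (x - 1)^2 with Adam.
x = v1.get_variable("x", initializer=3.0)
loss = (x - 1.0) ** 2
opt = v1.train.AdamOptimizer(learning_rate=0.1)
train_op = opt.minimize(loss)

# Saver() with no var_list captures ALL graph variables,
# including Adam's slot variables (m, v) and beta power accumulators.
saver = v1.train.Saver()
ckpt_dir = tempfile.mkdtemp()

with v1.Session() as sess:
    sess.run(v1.global_variables_initializer())
    sess.run(train_op)                      # one training step
    x_after = sess.run(x)                   # value after the step
    save_path = saver.save(sess, os.path.join(ckpt_dir, "model.ckpt"))

# Simulate a new session/shell: rebuild (here, reuse) the same graph,
# then restore every variable, optimizer state included.
with v1.Session() as sess:
    saver.restore(sess, save_path)
    x_restored = sess.run(x)                # matches x_after
    sess.run(train_op)                      # resumes with Adam's moments intact

print(x_after, x_restored)
```

In a truly new shell the graph must be reconstructed with the same variable names before calling `restore`; the checkpoint stores values keyed by those names, not the graph itself.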