I would like to know how I can create placeholders like dnc_core.initial_state(batch_size=n) where the batch size n can change during training.
import dnc
import sonnet as snt
import tensorflow as tf

dnc_core = dnc.DNC(...)
initial_core_state = dnc_core.initial_state(batch_size=1)
core_state_placeholders = snt.nest.map(lambda t: tf.placeholder(t.dtype, shape=t.shape),
                                       initial_core_state)
When I create the graph, the shape of initial_state seems to be fixed to the batch_size I pass in.
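What I am hoping for is something like the sketch below (an assumption on my part, not tested), where the leading batch dimension of each placeholder is left as None so the same placeholders can accept tensors of any batch size at feed time:

# Sketch (assumption): replace the fixed batch dimension with None so the
# placeholders accept any batch size; the rest of each state tensor's shape
# is kept as-is.
core_state_placeholders = snt.nest.map(
    lambda t: tf.placeholder(t.dtype, shape=[None] + t.shape.as_list()[1:]),
    initial_core_state)

The idea would then be to feed these placeholders with values produced by dnc_core.initial_state(batch_size=n) for whatever n the current step uses, but I am not sure this is the intended way to do it.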