Description
Weight sharing is a very popular technique. BigDL already provides weight sharing inside some containers (e.g. Recurrent). It would be better to expose this as a general API.
The API could follow the Keras style:
layer = Layer(...)
output1 = layer(input1)
output2 = layer(input2)
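For reference, this is how the same call pattern looks in Keras today (a minimal runnable sketch using tf.keras, not BigDL code; the layer/variable names are illustrative):

```python
from tensorflow import keras
from tensorflow.keras import layers

input1 = keras.Input(shape=(16,))
input2 = keras.Input(shape=(16,))

shared = layers.Dense(8, activation="relu")  # a single layer instance
output1 = shared(input1)                     # first call creates the weights
output2 = shared(input2)                     # second call reuses the same weights

model = keras.Model([input1, input2], [output1, output2])
model.summary()  # the Dense layer's parameters are counted only once
```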
In the backend, when the graph detects that one layer instance is used by multiple nodes, it can automatically clone the layer and share the weights among the clones.
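A rough sketch of the backend idea, in standalone Python with toy classes (the names `Linear`, `node_for`, etc. are hypothetical and not part of the existing BigDL API): when graph construction sees a layer that is already bound to another node, it attaches a shallow clone whose parameters alias the original's, so every node computes and accumulates gradients against one set of shared weights.

```python
import copy
import numpy as np

class Linear:
    """Toy layer with a weight matrix and a bias vector (illustrative only)."""
    def __init__(self, n_in, n_out):
        self.weight = np.random.randn(n_out, n_in) * 0.01
        self.bias = np.zeros(n_out)

    def forward(self, x):
        return x @ self.weight.T + self.bias

def node_for(layer, seen):
    """Return the layer object to attach to a new graph node.

    If `layer` already backs another node, attach a clone whose parameters
    alias the original's, so all nodes update one shared set of weights."""
    if id(layer) not in seen:
        seen.add(id(layer))
        return layer
    clone = copy.copy(layer)      # shallow copy: new object, same arrays
    clone.weight = layer.weight   # explicit aliasing for clarity
    clone.bias = layer.bias
    return clone

# Usage: calling the same layer on two inputs yields two node-level objects
# that compute with the same underlying parameter arrays.
seen = set()
layer = Linear(16, 8)
n1 = node_for(layer, seen)
n2 = node_for(layer, seen)
assert n1.weight is n2.weight
```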