Using Weight Sharing to Improve Machine Learning (IMAGE)
Caption
An example of grouped partial weight sharing, here with two groups. Lease's team stochastically selects embedding weights to be shared among words belonging to the same group. Weight sharing constrains the number of free parameters the model must learn, increases the efficiency and accuracy of the neural model, and serves as a flexible way to incorporate prior knowledge, combining the best of human knowledge with machine learning. A toy sketch of the selection mechanism follows this record.
Credit
Ye Zhang, Matthew Lease, UT Austin; Byron C. Wallace, Northeastern University
Usage Restrictions
None
License
Licensed content
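Below is a minimal sketch, not from the source, of how grouped partial weight sharing could work as the caption describes it: each word keeps its own embedding vector, and at lookup time each dimension is stochastically swapped for the corresponding group-shared weight. The sharing probability `p_share`, the group assignments in `word_to_group`, the helper `lookup`, and the toy sizes are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, embed_dim, num_groups = 5, 4, 2
p_share = 0.5  # assumed probability that a dimension uses the group-shared weight

# Word-specific and group-shared embedding tables (toy initialization)
word_emb = rng.normal(size=(vocab_size, embed_dim))
group_emb = rng.normal(size=(num_groups, embed_dim))

# Group membership for each word, e.g. derived from a lexicon (toy assignment)
word_to_group = np.array([0, 0, 1, 1, 0])

def lookup(word_id):
    """Return an embedding that stochastically mixes word-specific and
    group-shared weights, one Bernoulli draw per dimension."""
    mask = rng.random(embed_dim) < p_share  # True where the group weight is used
    g = word_to_group[word_id]
    return np.where(mask, group_emb[g], word_emb[word_id])

print(lookup(2))  # word 2 draws some dimensions from group 1's shared vector
```

Because shared dimensions reuse a single group vector, updates to those dimensions accumulate across every word in the group, which is one way to realize the reduction in free parameters the caption describes.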