No Looking Back(prop): Forward Pass to Faster Gradients
Gradients without backpropagation: this paper computes gradients using only forward passes with forward-mode automatic differentiation. Each forward pass yields an exact directional derivative along a random tangent direction; scaling that direction by the directional derivative gives an unbiased estimate of the gradient, reported to be up to about 2x faster than backpropagation.
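A minimal sketch of the forward-gradient idea, using a hand-rolled dual-number class for forward-mode AD (illustrative only; the names `Dual`, `forward_gradient`, and the example objective `f` are hypothetical, not the paper's code). One forward pass with a random tangent `v` produces the exact directional derivative ∇f·v, and (∇f·v)·v is an unbiased estimator of the gradient:

```python
import random

# Dual number: carries a value and a tangent (derivative) component.
class Dual:
    def __init__(self, val, tan=0.0):
        self.val, self.tan = val, tan
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.tan + o.tan)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule propagates tangents forward.
        return Dual(self.val * o.val, self.tan * o.val + self.val * o.tan)
    __rmul__ = __mul__

def f(x, y):
    # Example objective; true gradient is (2x + 3y, 3x).
    return x * x + 3.0 * x * y

def forward_gradient(fn, args):
    # Sample a random tangent direction v, push it through ONE forward pass
    # to get the exact directional derivative jvp = ∇f·v, then return
    # jvp * v — an unbiased estimate of ∇f (no backward pass needed).
    v = [random.gauss(0.0, 1.0) for _ in args]
    duals = [Dual(a, vi) for a, vi in zip(args, v)]
    jvp = fn(*duals).tan
    return [jvp * vi for vi in v]

g = forward_gradient(f, [2.0, 1.0])  # one noisy estimate of (7.0, 6.0)
```

A single estimate is noisy, but averaging estimates (or using them directly in SGD, as the paper does) recovers the true gradient in expectation, at the cost of one forward pass per sample instead of a forward plus backward pass.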