Why do you apply SiLU to the input x before passing it through the linear layer and then adding the residual connection? Why not the linear layer first and then SiLU? Or why use SiLU at all?
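To make the two orderings concrete, here is a minimal numpy sketch of the pattern the question describes. This is an illustration only, not pykan's actual implementation: the weight matrix `W_base` and the dimensions are hypothetical, and the real KANLayer combines this base branch with a learned spline branch.

```python
import numpy as np

def silu(x):
    # SiLU (a.k.a. swish): x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))       # batch of 4 samples, input dim 3
W_base = rng.standard_normal((3, 5))  # hypothetical base-branch weights

# Ordering asked about: activation first, then the linear map.
# The result is typically added to the spline branch as a residual term.
silu_then_linear = silu(x) @ W_base

# The alternative raised in the question: linear map first, then SiLU.
linear_then_silu = silu(x @ W_base)

print(silu_then_linear.shape)  # (4, 5)
print(linear_then_silu.shape)  # (4, 5)
```

Both orderings produce the same shape, but they are different functions: applying SiLU first keeps the nonlinearity per input coordinate (so the linear layer mixes already-activated features), while applying it after makes the nonlinearity act on the mixed outputs.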
Title. See pykan/kan/KANLayer.py, line 65 at commit ef4861e.