Hello, I'm confused about one thing: in models.py, the forward(x) of class ADNet(nn.Module) never seems to use the attention mechanism class Self_Attn(nn.Module). Am I misunderstanding something? Any guidance would be appreciated.
Please look carefully at the network diagram and the original paper: https://www.sciencedirect.com/science/article/abs/pii/S0893608019304241
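For context: the ADNet paper describes its attention step as ordinary convolution operations applied inside the forward pass (a 1x1 convolution produces per-pixel weights that rescale the features), rather than as a separate self-attention module. If that reading is right, the idea can be sketched as below; the class name, channel count, and sigmoid gating here are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Hypothetical sketch of a conv-based attention step in the spirit of
    the ADNet paper: a 1x1 conv maps features to per-pixel weights, and the
    weights rescale the features element-wise. Illustrative only."""

    def __init__(self, channels=64):
        super().__init__()
        # 1x1 conv: collapses the channel dimension into a single weight map
        self.conv1x1 = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):
        # Sigmoid keeps the weights in (0, 1); assumption for this sketch
        weights = torch.sigmoid(self.conv1x1(x))
        # Attention-guided features: same shape as the input
        return x * weights

# Usage: the output keeps the input's shape, so the block drops into a
# forward pass between ordinary conv layers with no plumbing changes.
x = torch.randn(1, 64, 32, 32)
out = AttentionBlock(64)(x)
```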
Got it, thank you!