AB attention module #4

Open
yichuan123 opened this issue Mar 7, 2020 · 3 comments

Comments

@yichuan123

Hello, I'm puzzled by one thing: in models.py, the forward(x) of class ADNet(nn.Module) doesn't seem to use the attention mechanism class Self_Attn(nn.Module) anywhere. Am I misunderstanding something? Any guidance would be appreciated.

@hellloxiaotian
Owner

Please look carefully at the network diagram and the original paper: https://www.sciencedirect.com/science/article/abs/pii/S0893608019304241
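
For anyone else stuck on the same question: as the paper describes it, the attention block (AB) boils down to a 1×1 convolution that turns the current feature map into weights, which are then multiplied back onto the features, so the gating can sit directly inside a forward pass without calling a separate Self_Attn module. The sketch below is only a minimal illustration of that idea; the class, channel counts, and activation choice are assumptions, not the actual code in models.py.

```python
import torch
import torch.nn as nn


class AttentionBlockSketch(nn.Module):
    """Hypothetical sketch of an attention-block-style gate:
    a 1x1 conv compresses the feature map into per-pixel weights,
    which then multiplicatively gate the features."""

    def __init__(self, channels=64):
        super().__init__()
        # 1x1 convolution that maps features to a single-channel weight map
        self.weight_conv = nn.Conv2d(channels, 1, kernel_size=1)
        self.tanh = nn.Tanh()

    def forward(self, features):
        # per-pixel attention weights in (-1, 1)
        attn = self.tanh(self.weight_conv(features))
        # broadcast multiply: gate every channel by the learned weight map
        return features * attn


if __name__ == "__main__":
    x = torch.randn(1, 64, 40, 40)                # dummy feature map
    print(AttentionBlockSketch()(x).shape)        # torch.Size([1, 64, 40, 40])
```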


@yichuan123
Author

Got it, thank you!
