
Doubt about the accuracy claimed for Attention SNN #4

Open · A227902 opened this issue Mar 18, 2024 · 3 comments

Comments

A227902 commented Mar 18, 2024

I set 'dt' to 15 and 'T' to 60 as given in TABLE I for the DVS128 Gesture dataset, and got a testing accuracy of 89.9305% at epoch 138, whereas your paper reports an accuracy of 96.53%. Is there any modification to the code needed to reach the claimed testing accuracy?


A227902 commented Mar 20, 2024

Thank you for your response. I am also using a CPU; I will try your 'dt' setting.

StCross commented Apr 26, 2024

I have the same question. I got 90.1% accuracy when reproducing the results. Can you provide the dt and T values needed for better accuracy?

oteomamo commented Apr 27, 2024

I tested all A-SNNs for this paper, and my results are consistent with those reported. If you are getting accuracy around 90%–92%, it's because you are training a vanilla SNN without attention. To enable attention, specify the attention type in each dataset's Config.py file via the self.attention hyperparameter (which defaults to "no"). Valid values are CA, TA, SA, CSA, TCA, TSA, TCSA, or no. The results shown in the paper use dt = 15 and T = 60 for the DVS128 Gesture dataset. The only difference in my results was that CSA gave the best test accuracy, at 96.32%. These are all GPU results; CPU results should be the same, just with a longer training time.
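A minimal sketch of what the relevant part of a dataset's Config.py might look like with these settings applied; the actual class structure and field names in the repo may differ, so treat this only as an illustration of the hyperparameters discussed above.

```python
# Hypothetical sketch of a dataset Config.py; field names beyond
# dt, T, and attention are illustrative, not taken from the repo.
class Config:
    def __init__(self):
        self.dt = 15          # time-bin size from TABLE I (DVS128 Gesture)
        self.T = 60           # number of time steps from TABLE I
        # Attention module: one of "CA", "TA", "SA", "CSA", "TCA",
        # "TSA", "TCSA", or "no" ("no" = vanilla SNN, ~90%-92% accuracy).
        self.attention = "CSA"


config = Config()
# A vanilla SNN ("no") will not reach the ~96% reported with attention.
assert config.attention != "no"
```

The key point is simply that self.attention must be changed from its default "no" before training, otherwise the attention modules are never instantiated and the run reproduces only the vanilla-SNN baseline.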

3 participants