
Issue search results · repo:pmichel31415/are-16-heads-really-better-than-1 language:Shell

11 results

Hi, I am trying to understand why we need different normalization factors for the last layer of BERT compared to all the other layers? https://github.com/pmichel31415/pytorch-pretrained-BERT/blob/18a86a7035cf8a48d16c101a66e439bf6ab342f1/examples/classifier_eval.py#L246 ...
  • Hritikbansal
  • 1
  • Opened on Jul 6, 2022
  • #11

Hi, I'm currently working on attention head pruning for models. I think in your reported experiments, you fine-tuned BERT when training on the downstream MNLI task, right? But does it also work to fix the BERT ...
  • Huan80805
  • 2
  • Opened on Apr 26, 2022
  • #10
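The question above asks whether the BERT encoder can be kept frozen while only the task classifier is trained on MNLI. A minimal sketch of that setup, assuming the current HuggingFace transformers API rather than the repo's pytorch-pretrained-BERT fork:

    # Freeze the BERT encoder and train only the classification head.
    # Assumes HuggingFace `transformers`; the repo itself uses a fork of
    # pytorch-pretrained-BERT, so treat this as an illustration only.
    from transformers import BertForSequenceClassification

    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=3  # MNLI has three labels
    )

    # Freeze every encoder parameter ...
    for param in model.bert.parameters():
        param.requires_grad = False

    # ... so that only the classifier head receives gradient updates.
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"{trainable} trainable parameters")

Whether the head-importance results reported in the paper still hold with a frozen encoder is exactly what the issue is asking; the sketch only shows the mechanical setup.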

Hi, I am trying to reproduce your BERT results. I followed the prerequisites: # Pytorch pretrained BERT git clone https://github.com/pmichel31415/pytorch-pretrained-BERT cd pytorch-pretrained-BERT git ...
  • bing0037
  • 3
  • Opened on Aug 28, 2021
  • #9

Hi! @pmichel31415 1. In are-16-heads-really-better-than-1/experiments/MT/prune_wmt.sh you have --raw-text $EXTRA_OPTIONS, and I don't know what it means. Can you tell me its explanation and how to use ...
  • LiangQiqi677
  • Opened on Mar 12, 2021
  • #8

Sorry to bother you. I hit a bug while running heads_pruning.sh, and the error is: 12:21:27-INFO: ***** Running evaluation ***** 12:21:27-INFO: Num examples = 9815 12:21:27-INFO: Batch size = 32 ...
  • YJiangcm
  • 2
  • Opened on Feb 22, 2021
  • #7

1. (1) I do this and get a pruned model: model.bert.prune_heads(to_prune). (2) I then set n_retrain_steps_after_pruning to a value greater than 0 (next: aaa, then: bbb) to retrain my pruned model. Is that OK? ...
  • Ixuanzhang
  • 1
  • Opened on Jul 21, 2020
  • #5
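The workflow this issue describes, prune heads structurally and then retrain briefly, looks roughly like the sketch below. It assumes the HuggingFace transformers API and an invented pruning map to_prune; the repo's fork exposes the same prune_heads interface, but the retraining loop here is only illustrative:

    # Structural head pruning followed by a short re-training phase.
    import torch
    from transformers import BertForSequenceClassification

    model = BertForSequenceClassification.from_pretrained("bert-base-uncased")

    # Hypothetical pruning map: {layer index: [head indices to remove]}.
    to_prune = {0: [2, 5], 3: [0], 11: [1, 4, 7]}
    model.bert.prune_heads(to_prune)  # physically removes those heads' weights

    # Retrain for a few steps so the remaining heads can compensate
    # (this is what n_retrain_steps_after_pruning controls in the repo).
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    # for step, batch in zip(range(n_retrain_steps), dataloader):
    #     loss = model(**batch).loss
    #     loss.backward(); optimizer.step(); optimizer.zero_grad()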

Hello, I am trying to run the MT ablation experiments. When I ran the command wget https://s3.amazonaws.com/fairseq-py/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 I got the following error: --2020-03-31 ...
  • marwash25
  • 2
  • Opened on Mar 31, 2020
  • #4

Hi, thanks for your code! The pruning works great using masking. However, when I tried to actually prune the model to see if there's a speedup, it fails: bash experiments/BERT/heads_pruning.sh SST-2 ...
  • pglock
  • 1
  • Opened on Jul 4, 2019
  • #3
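The distinction behind this issue, masking heads at run time versus actually removing them, can be illustrated with a small sketch. It assumes the HuggingFace transformers API; only the second variant shrinks the attention projections and could therefore give a real speedup:

    # Contrast run-time head masking with structural head pruning.
    import torch
    from transformers import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")
    inputs = tokenizer("Are sixteen heads really better than one?",
                       return_tensors="pt")

    # (1) Masking: zero out head 3 in every layer; same compute, no speedup.
    head_mask = torch.ones(model.config.num_hidden_layers,
                           model.config.num_attention_heads)
    head_mask[:, 3] = 0.0
    masked_out = model(**inputs, head_mask=head_mask)

    # (2) Structural pruning: remove head 3 from every layer, which shrinks
    # the query/key/value/output projections of each attention block.
    model.prune_heads({layer: [3]
                       for layer in range(model.config.num_hidden_layers)})
    pruned_out = model(**inputs)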

Hi, I am running the command bash experiments/BERT/heads_ablation.sh MNLI and I am getting the following error: Traceback (most recent call last): File pytorch-pretrained-BERT/examples/run_classifier.py ...
need more info
  • ishita1995
  • 7
  • Opened on Jun 28, 2019
  • #2