BASE/IQL_S3 v20221003
- Type: Transformer encoder layers (the same network structure as the one used for BERT-BASE; see the sketch after this list)
- Dimension: 768
- # of heads: 12
- Dimension of feedforward networks: 3072
- # of layers: 12
- Activation function: GELU
- Dropout rate in training: 0.1
- Initialization: Random
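
The list above describes a BERT-BASE-sized encoder. A minimal sketch of such a stack in PyTorch, assuming the stock `nn.TransformerEncoder` modules; the actual implementation in this repository may differ in details such as embedding and normalization:

```python
import torch
from torch import nn

# Encoder hyperparameters taken from the list above (BERT-BASE size).
layer = nn.TransformerEncoderLayer(
    d_model=768,           # dimension
    nhead=12,              # number of attention heads
    dim_feedforward=3072,  # dimension of feedforward networks
    dropout=0.1,           # dropout rate in training
    activation='gelu',     # activation function
    batch_first=True,
)
encoder = nn.TransformerEncoder(layer, num_layers=12)  # number of layers

# Example: a batch of 2 sequences of 32 already-embedded feature vectors.
x = torch.randn(2, 32, 768)
h = encoder(x)  # -> shape (2, 32, 768)
```
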
- Type: Single-layer position-wise feedforward network (see the sketch after this list)
- Dimension: 3072
- Activation function: GELU
- Dropout rate in training: 0.1
- Initialization: Random
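
A minimal sketch of such a single-layer position-wise feedforward decoder; the hidden dimension, activation, and dropout follow the list above, while the 768-dimensional input and the scalar output are illustrative assumptions:

```python
import torch
from torch import nn

class FeedforwardDecoder(nn.Module):
    """Single-layer position-wise feedforward network (sketch).

    Hidden dimension, activation, and dropout follow the list above;
    the input and output dimensions are illustrative assumptions.
    """

    def __init__(self, in_dim: int = 768, hidden_dim: int = 3072,
                 out_dim: int = 1, dropout: float = 0.1) -> None:
        super().__init__()
        self.hidden = nn.Linear(in_dim, hidden_dim)
        self.activation = nn.GELU()
        self.dropout = nn.Dropout(dropout)
        self.output = nn.Linear(hidden_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # "Position-wise" means the same mapping is applied independently
        # to the feature vector at every position.
        return self.output(self.dropout(self.activation(self.hidden(x))))
```
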
- Type: Transformer encoder layers (the same network structure as the one used for BERT-BASE)
- Dimension: 768
- # of heads: 12
- Dimension of feedforward networks: 3072
- # of layers: 12
- Activation function: GELU
- Dropout rate in training: 0.1
- Initialization: Random
- Type: Dueling network with two single-layer position-wise feedforward networks (see the sketch after this list)
- Dimension: 3072
- Activation function: GELU
- Dropout rate in training: 0.1
- Initialization: Random
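
The dueling decoder splits the prediction into a state-value stream and an action-advantage stream and recombines them into action values. A minimal sketch, assuming the standard aggregation Q(s, a) = V(s) + A(s, a) − mean_a A(s, a); the input dimension and action count are illustrative assumptions:

```python
import torch
from torch import nn

class DuelingDecoder(nn.Module):
    """Dueling network built from two single-layer position-wise feedforward
    networks (sketch); dimensions other than the 3072 hidden size are
    illustrative assumptions."""

    def __init__(self, in_dim: int = 768, hidden_dim: int = 3072,
                 num_actions: int = 64, dropout: float = 0.1) -> None:
        super().__init__()

        def ffn(out_dim: int) -> nn.Sequential:
            return nn.Sequential(
                nn.Linear(in_dim, hidden_dim),
                nn.GELU(),
                nn.Dropout(dropout),
                nn.Linear(hidden_dim, out_dim),
            )

        self.value_stream = ffn(1)                # V(s)
        self.advantage_stream = ffn(num_actions)  # A(s, a)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        v = self.value_stream(x)                  # (batch, 1)
        a = self.advantage_stream(x)              # (batch, num_actions)
        # Subtracting the mean advantage makes V and A identifiable.
        return v + a - a.mean(dim=-1, keepdim=True)  # Q(s, a)
```
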
- Type: Implicit Q-learning (IQL); see the loss and target-update sketch after this list
- Reward: Per-game change (delta) in grading points as a Saint 3 player in the Jade Room
- Training data: Crawled Game Records v202007_202109
  - 110,000,000 samples randomly sampled from the crawled game records and shuffled.
  - (Since one of the Q loss functions shows a large jump after 120,000,000 samples, the checkpoint with the lowest value of that loss before the jump is used.)
- Discount factor (γ): 1.0
- Expectile (τ): 0.90
- Soft update (Polyak averaging) rate of target networks (α): 0.1
- Optimizer: LAMB
- Learning rate: 0.001
- ε: 1.0e-6
- Batch size: 65536
- # of training epochs: N/A
(TODO)
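
The hyperparameters above fix the IQL expectile (τ = 0.90), the discount factor (γ = 1.0), and the Polyak rate for the target networks (α = 0.1). A minimal sketch of the corresponding losses and target update, assuming the standard IQL formulation; the repository's actual training loop may differ:

```python
import torch

def expectile_loss(diff: torch.Tensor, tau: float = 0.90) -> torch.Tensor:
    """Asymmetric L2 (expectile) loss for the IQL value function,
    where diff = Q_target(s, a) - V(s)."""
    weight = torch.abs(tau - (diff < 0).float())
    return (weight * diff.pow(2)).mean()

def q_loss(q: torch.Tensor, reward: torch.Tensor, next_v: torch.Tensor,
           done: torch.Tensor, gamma: float = 1.0) -> torch.Tensor:
    """TD regression of Q(s, a) toward r + gamma * V(s')."""
    target = reward + gamma * (1.0 - done) * next_v
    return (q - target.detach()).pow(2).mean()

@torch.no_grad()
def polyak_update(target_net: torch.nn.Module, online_net: torch.nn.Module,
                  alpha: float = 0.1) -> None:
    """Soft update of the target-network parameters with rate alpha."""
    for tp, op in zip(target_net.parameters(), online_net.parameters()):
        tp.mul_(1.0 - alpha).add_(alpha * op)
```

The LAMB optimizer listed above is not part of core PyTorch; third-party implementations (e.g., `torch_optimizer.Lamb`) accept `lr` and `eps` arguments matching the values given here.
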
Quantitative Comparison with BASE/BC_H13 v20220210 as the Baseline
(TODO)