DeepPoly ReLU and FullyConnected
MekAkUActOR committed Nov 21, 2022
1 parent 1b1f55d commit fd5f98d
Showing 3 changed files with 34 additions and 13 deletions.
26 changes: 23 additions & 3 deletions code/deeppoly.py
@@ -44,7 +44,7 @@ def compute_verify_result(self, true_label):

return weights

################################################################## to understand
# TODO: implement Conv in resolve
def resolve(self, constrains, layer, lower=True):
"""
lower = True: return the lower bound
@@ -82,7 +82,7 @@ def __init__(self, in_features):
super(DPReLU, self).__init__()
self.in_features = in_features
self.out_features = in_features
self.alpha = torch.nn.Parameter(torch.ones(in_features) * 0.3)
self.alpha = torch.nn.Parameter(torch.ones(in_features))
self.alpha.requires_grad = True

def forward(self, x):
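
For context, a minimal sketch of how a learnable alpha is commonly used in a DeepPoly ReLU relaxation: for a crossing neuron (l < 0 < u) the upper bound is the chord u / (u - l) * (x - l) and the lower bound is alpha * x with alpha in [0, 1]. The function and names below are illustrative assumptions, not the repository's actual forward code; the commit itself only changes the initial value of alpha from 0.3 * ones to ones.

import torch

def deeppoly_relu_bounds(l, u, alpha):
    """Sketch of the DeepPoly ReLU relaxation, not the repo's implementation.

    Stable cases: u <= 0 fixes the output to 0; l >= 0 keeps the output equal to x.
    Crossing case (l < 0 < u):
        upper bound: y <= u / (u - l) * (x - l)
        lower bound: y >= alpha * x, with a learnable alpha clamped to [0, 1]
    Returns (lower_slope, lower_offset, upper_slope, upper_offset).
    """
    crossing = (l < 0) & (u > 0)
    positive = (l >= 0).float()
    denom = (u - l).clamp(min=1e-12)          # guard against division by zero

    upper_slope = torch.where(crossing, u / denom, positive)
    upper_offset = torch.where(crossing, -l * u / denom, torch.zeros_like(l))
    lower_slope = torch.where(crossing, alpha.clamp(0.0, 1.0), positive)
    lower_offset = torch.zeros_like(l)
    return lower_slope, lower_offset, upper_slope, upper_offset
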
@@ -151,5 +151,25 @@ def forward(self, x):


class DPConv(nn.Module):
def __init__(self, nested: nn.Conv2d):
def __init__(self, nested: nn.Conv2d, in_feature):
super(DPConv, self).__init__()
self.weight = nested.weight.detach()
self.bias = nested.bias.detach()
self.in_channels = nested.in_channels
self.out_channels = nested.out_channels
self.kernel_size = nested.kernel_size
self.stride = nested.stride
self.padding = nested.padding
self.padding_mode = nested.padding_mode
self.dilation = nested.dilation
self.groups = nested.groups
self.in_feature = in_feature
self.out_feature = func(self.in_feature)  # TODO: func is still a placeholder; see the output-size sketch below

def forward(self, x):
x.save()  # stub: the Conv forward transformer is still incomplete (see the to-do list in project_log)

def feature_size(self):
return self.in_feature
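
The out_feature that func stands in for would presumably come from the standard Conv2d output-size formula; the helper below is a hedged sketch (conv_output_size is an illustrative name, not something defined in the repository).

import math

def conv_output_size(in_size, kernel_size, stride=1, padding=0, dilation=1):
    # Standard Conv2d output-size formula for one spatial dimension.
    return math.floor((in_size + 2 * padding - dilation * (kernel_size - 1) - 1) / stride) + 1

# Example: a 28x28 input with a 3x3 kernel, stride 2, padding 1 gives a 14x14 output,
# so a flattened out_feature would be out_channels * 14 * 14.
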


8 changes: 4 additions & 4 deletions code/evaluate
@@ -1,13 +1,13 @@
#!/bin/bash

rm ../examples/res.txt
rm $1/res.txt
for net in {1..3}
do
echo Evaluating network net${net}...
for spec in `ls ../examples/net${net}/`
for spec in `ls $1/net${net}/`
do
echo ${spec}
res=$(python verifier.py --net net${net} --spec ../examples/net${net}/${spec})
echo net${k}_${net},$spec,$res >> ../examples/res.txt
res=$(python verifier.py --net net${net} --spec $1/net${net}/${spec})
echo net${net},$spec,$res >> $1/res.txt
done
done
13 changes: 7 additions & 6 deletions project_log
@@ -5,14 +5,15 @@


Done:

Implement the function (class) that applies the DeepPoly relaxation to ReLU
Implement BackSub for Linear in resolve
Finish testing FullyConnected
Verify each network in verifier.py using the verifiable networks

------------------------------------------------
To do:
Implement the function (class) that adds DeepPoly to ReLU in deeppoly.py
- Do linear and conv neurons also need DeepPoly?
Implement BackSub for Conv and Res in resolve (see the sketch after this list)
Compute out_feature for DPConv
Use the above functions (classes) in verifier.py to convert each network into a verifiable network
- FullyConnected
- Conv
- NormalizedResnet
Verify each network in verifier.py using the verifiable networks
- NormalizedResnet
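
A minimal sketch of the BackSub step through one affine layer, under the usual DeepPoly formulation; the function and names below are illustrative assumptions, not the code in resolve.

import torch

def backsub_affine_lower(coeff, const, W, b, l_prev, u_prev):
    # Push a lower-bound expression  coeff @ x_k + const  one layer back through
    # x_k = W @ x_{k-1} + b, then concretize it with interval bounds on x_{k-1}.
    new_coeff = coeff @ W
    new_const = const + coeff @ b
    pos = new_coeff.clamp(min=0.0)   # positive coefficients take the lower bound
    neg = new_coeff.clamp(max=0.0)   # negative coefficients take the upper bound
    return new_const + pos @ l_prev + neg @ u_prev

The Conv case reduces to the same substitution once the convolution is viewed as an (implicit) affine map, which is the usual way Conv BackSub is handled.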
