Better version of PR #7985 (Modify load() for inference) #8024

Merged
21 commits merged on Feb 8, 2018

Conversation

sidgoyal78 (Contributor) commented Feb 1, 2018

Related PR #7985
Fix #7972

@Xreki added the 预测 (formerly "Inference"; covers C-API inference issues, etc.) label on Feb 1, 2018
framework::Scope& scope,
const std::string& dirname,
const framework::ProgramDesc& main_program,
const std::string& param_filename) {
Contributor

There are two LoadPersistables. Can we merge them into one?

Contributor Author

Done

framework::Executor& executor,
framework::Scope& scope,
const std::string& dirname,
const std::string& param_filename) {
Contributor

I'd like to keep two Load() interfaces as:

std::unique_ptr<framework::ProgramDesc> Load(
     framework::Executor& executor,
     framework::Scope& scope,
     const std::string& dirname);

std::unique_ptr<framework::ProgramDesc> Load(
     framework::Executor& executor,
     framework::Scope& scope,
     const std::string& prog_filename,
     const std::string& params_filename);

Users are free to rename the file that stores the program, and the argument list of save_inference_model may change in the future. Now that #7995 is merged, we need to test the new Load; please use the new Load in the unittest instead.

Contributor Author

Okay.

Contributor Author

Done

@Xreki requested a review from luotao1 on February 2, 2018
framework::Scope& scope,
const std::string& dirname,
const std::string& prog_filename,
const std::string& param_filename) {
Contributor

Sorry for not describing my suggestion clearly. I meant to define the interface as:

std::unique_ptr<framework::ProgramDesc> Load(
    framework::Executor& executor,
    framework::Scope& scope,
    const std::string& prog_filepath,
    const std::string& param_filepath);

There is no need for dirname any more.

Contributor Author

Oh, I think you described your suggestion clearly: #8024 (comment)

But I chose to keep the extra dirname argument to stay consistent with the Python side. More specifically, in save_vars() we have: https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/v2/fluid/io.py#L124, and in load_vars() we have: https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/v2/fluid/io.py#L202

Hence, I added that extra argument. But since you would prefer an implementation without it, I will modify it. Thanks.

inputfs.seekg(0, std::ios::beg);
VLOG(3) << "program_desc_str's size: " << program_desc_str.size();
inputfs.read(&program_desc_str[0], program_desc_str.size());
inputfs.close();
Contributor

Please try to reuse this code (lines 101-109 and lines 124-133). Maybe we can directly call the following Load() to simplify the implementation.
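
A minimal sketch of one way to reuse it, assuming the repeated read-file-into-string logic is factored into a small helper that both Load() overloads call (the helper name ReadFileToString is illustrative, not the PR's final code):

#include <cstddef>
#include <fstream>
#include <string>

// Hypothetical shared helper: read a whole binary file into a string so
// that both Load() overloads avoid repeating the seekg/read sequence.
// Error handling is omitted for brevity.
inline std::string ReadFileToString(const std::string& filename) {
  std::ifstream fin(filename, std::ios::in | std::ios::binary);
  fin.seekg(0, std::ios::end);
  std::string contents(static_cast<std::size_t>(fin.tellg()), '\0');
  fin.seekg(0, std::ios::beg);
  fin.read(&contents[0], contents.size());
  return contents;
}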

Contributor Author

Done

cc_test(test_inference_recognize_digits_mlp_combine
SRCS test_inference_recognize_digits_combinels.cc
DEPS ARCHIVE_START paddle_fluid ARCHIVE_END
ARGS --dirname=${PYTHON_TESTS_DIR}/book/recognize_digits_mlp_combine.inference.model --prog_filename=__model__ --param_filename=params.dat)
Contributor

We can implement the two unittests in one .cc file.

Contributor Author

Done


DEFINE_string(dirname, "", "Directory of the inference model.");
DEFINE_string(prog_filename, "", "File storing programDesc of inference model");
DEFINE_string(param_filename, "", "File storing parameters of inference model");
Contributor

In our unittests, we can fix param_filename to __PARAMS__, so that there is no need for the two gflags variables.

Contributor Author

Okay, sure. I will hardcode the string then.


// 2. Initialize the inference_program and load all parameters from file
auto inference_program = paddle::inference::Load(
executor, *scope, dirname, prog_filename, param_filename);
Contributor

Change the definition of TestInference to:

template <typename Place, typename T, bool IsCombined>
void TestInference(const std::string& dirname,
                   ...

Then change line 38 to:

if (IsCombined) {
  ...
} else {
  ...
}

So that we can define and call a common TestInference function for the two unittests.
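
A rough sketch of that suggestion (the file names and the exact Load() calls are assumptions based on this thread, not the final code):

template <typename Place, typename T, bool IsCombined>
void TestInference(const std::string& dirname) {
  // T is the input element type used when preparing feed tensors (omitted).
  auto place = Place();
  paddle::framework::Executor executor(place);
  auto* scope = new paddle::framework::Scope();

  std::unique_ptr<paddle::framework::ProgramDesc> inference_program;
  if (IsCombined) {
    // Combined save: the program and all parameters live in two single files.
    inference_program = paddle::inference::Load(
        executor, *scope, dirname + "/__model_combined__",
        dirname + "/__params_combined__");
  } else {
    // Separate save: Load() derives the default file names from dirname.
    inference_program = paddle::inference::Load(executor, *scope, dirname);
  }

  // ... feed inputs, run the inference program, and check outputs ...
  delete scope;
}

Each unittest would then only instantiate TestInference with IsCombined set to true or false.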

Contributor Author

Done.

delete scope;
}

TEST(inference, recognize_digits_combine) {
Contributor

Move this function to test_inference_recognize_digits.cc.
Please note that we should avoid repeated code.

Contributor Author

Sorry, I didn't think about hardcoding the strings, hence I couldn't think of a way to have everything together.

@@ -214,7 +226,8 @@ def inject_all_tests():
for use_cuda in (False, True):
for parallel in (False, True):
for nn_type in ('mlp', 'conv'):
inject_test_method(use_cuda, parallel, nn_type)
for combine in (False, True):
Contributor

Please don't add this loop. It will double the CI burden. If there is no other way, we may test only the combined save/load.

Contributor Author

I added just one unit test for the 'separate' params case.

@@ -384,7 +389,12 @@ def load_inference_model(dirname, executor, load_file_name=None):
if not os.path.isdir(dirname):
raise ValueError("There is no directory named '%s'", dirname)

model_file_name = dirname + "/__model__"
model_file_name = ""
Contributor

Same here. Please delete this line.

Contributor Author

Done.

@@ -342,7 +342,12 @@ def save_inference_model(dirname,
prepend_feed_ops(inference_program, feeded_var_names)
append_fetch_ops(inference_program, fetch_var_names)

model_file_name = dirname + "/__model__"
model_file_name = ""
Contributor

Please delete this line.
Unlike C++, if/else statements don't define a scope in Python, so model_file_name is visible outside the branch.
https://stackoverflow.com/questions/7382638/python-variable-scope-in-if-statements

Contributor Author

Done, thanks.

kexinzhao previously approved these changes on Feb 8, 2018
Contributor

@kexinzhao left a comment

LGTM!

@@ -21,6 +21,18 @@ limitations under the License. */
namespace paddle {
namespace inference {

void ReadProgramDescFromFile(const std::string& model_filename,
std::string& program_desc_str) {
Contributor

I think this is a generic function for reading a binary file into a string. If there is no other use for it, how about changing it to:

std::unique_ptr<framework::ProgramDesc> ReadProgram(const std::string& filename)

But we can change this in the next PR.
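
For illustration, a sketch of that alternative, assuming (as the existing Load() already does) that framework::ProgramDesc can be constructed from the serialized string; Paddle headers and error handling are omitted:

#include <fstream>
#include <iterator>
#include <memory>
#include <string>

// Sketch only: fold the parsing into the reader so callers receive a
// ProgramDesc directly instead of a raw string.
std::unique_ptr<framework::ProgramDesc> ReadProgram(const std::string& filename) {
  std::ifstream fin(filename, std::ios::in | std::ios::binary);
  std::string buffer((std::istreambuf_iterator<char>(fin)),
                     std::istreambuf_iterator<char>());
  return std::unique_ptr<framework::ProgramDesc>(
      new framework::ProgramDesc(buffer));
}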

Contributor Author

I think this function will also be useful for the "loading-from-buffer" case. I have changed the name to ReadBinaryFile.


if (!param_filename.empty()) {
// sort paramlist to have consistent ordering
std::sort(paramlist.begin(), paramlist.end());
Contributor

Tensor and LoDTensor have no name. If something goes wrong (for example, the user accidentally gives a wrong param_filename), there is no warning information at all, which makes it difficult for users to debug. Can we add a mechanism to check or ensure correctness? We can refer to other frameworks. This can also be done in the next PR.

Contributor Author

Good point. I will think about it and, as you mentioned, address it in a later PR.

if save_file_name == None:
model_file_name = dirname + "/__model__"
else:
model_file_name = dirname + "/__model_combined__"
Contributor

The naming style is not friendly for users. We need to refine this interface.

Contributor

@kexinzhao left a comment

Some of the issues mentioned by @Xreki will be fixed in future PRs.
