0

I have built syntaxnet and tensorflow-serving using bazel. Both embed their own (partial?) copy of tensorflow itself. I already have a problem where I'd like to "import" some parts of tensorflow-serving in a script that "lives" in the syntaxnet tree, which I can't figure out how to do (without doing some VERY ugly things).

Now I'd like "tensorboard", but that apparently doesn't get built as part of the embedded tensorflow inside of syntaxnet or tensorflow-serving.

So now I'm sure "I'm doing it wrong". How am I supposed to be combining the artifacts built by various separate bazel workspaces?

In particular, how can I build tensorflow (with tensorboard) AND syntaxnet AND tensorflow-serving and have them "installed" for use so I can start writing my own scripts in a completely separate directory/repository?

Is "./bazel-bin/blah" really the end-game with bazel? There is no "make install" equivalent?

dmansfield

1 Answer

0

You're right, currently Tensorboard targets are only exposed in the Tensorflow repo, and not the other two that use it. That means that to actually bring up Tensorboard, you'll need to checkout Tensorflow on its own and compile/run Tensorboard there (pointing it to the generated logdir).
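For concreteness, here is roughly what that looks like from a standalone tensorflow checkout. The target path reflects the tensorflow source layout of that era, and the logdir is a placeholder for wherever your training writes summaries; treat this as a sketch, not verified commands:

```shell
# From the root of a standalone tensorflow checkout (paths are assumptions):
bazel build tensorflow/tensorboard:tensorboard
./bazel-bin/tensorflow/tensorboard/tensorboard --logdir=/tmp/syntaxnet_logs
```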

Actually generating the training summary data in a log directory is done during training, in your case in the tensorflow/models repo. It looks like SummaryWriter is used in inception_train.py, so perhaps you can add something similar to syntaxnet. If that doesn't work and you're not able to link Tensorboard, I'd recommend filing an issue in tensorflow/models to add support for Tensorboard there. You shouldn't need Tensorboard in Tensorflow Serving.
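A minimal sketch of the summary-writing pattern referred to above, using the `tf.train.SummaryWriter` / `tf.scalar_summary` API of that TF version. The logdir and the `loss` tensor are placeholders for whatever your training loop actually computes:

```python
import tensorflow as tf

# Hypothetical scalar you already compute during training.
loss = tf.constant(0.5, name="loss")
summary_op = tf.scalar_summary("loss", loss)

with tf.Session() as sess:
    # Writes event files that TensorBoard reads via --logdir.
    writer = tf.train.SummaryWriter("/tmp/syntaxnet_logs", sess.graph)
    for step in range(10):
        summary_str = sess.run(summary_op)
        writer.add_summary(summary_str, step)
    writer.close()
```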

Importing parts of Tensorflow Serving in syntaxnet would require you to add this new dependency as a submodule (as is done with tensorflow), or possibly as a git_repository in the WORKSPACE file, if that works. We've never tried this, so it's possible that something is broken for this untested use case. Please file issues if you encounter a problem with this.
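A rough sketch of what such a WORKSPACE entry might look like. As noted above this is untested; the repository name is arbitrary, and the commit is a placeholder you'd pin yourself:

```python
# In syntaxnet's WORKSPACE file (hypothetical, untested):
git_repository(
    name = "tf_serving",
    remote = "https://github.com/tensorflow/serving.git",
    commit = "<pin a commit you have built against>",
)
```

Targets from the external repo would then be referenced in BUILD files as `@tf_serving//path/to:target`.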

As for just installing and running: Tensorflow Serving doesn't support that right now. It's a set of libraries that you link directly into your server binary and compile (the repo offers some example servers and clients); right now there is no simple "installed server". Tensorflow, along with Tensorboard, can however be installed and used from anywhere.

  • Thanks. I guess I just don't understand the bazel mentality where the repository IS the installation. To build against "x" you check it out and point your WORKSPACE at it. I'm used to building, then installing, then compiling against /usr/include/*, /usr/lib64/*, and consuming /usr/bin, /usr/lib/pythonx.y/site-packages, etc. Bazel is a different model. I'll try it out the way you've suggested. Thanks! – dmansfield Jun 14 '16 at 22:19