
I created a class and defined the train_step function inside it, following the TF tutorial: NMT_attention. Omitting @tf.function significantly increases the training time, but when I add the decorator I get a conversion error for the private variables declared inside the class.

    @tf.function
    def train_step(self, input, target, encoderHidden):
        loss = 0

        with tf.GradientTape() as tape:
            encoderOutput, encoderHidden = self.__encoder(input, encoderHidden)  # throws error

Below is the traceback:

    Using TensorFlow backend.
        2019-09-20 12:54:32.676302: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
        Traceback (most recent call last):
          File "/Users/Users/Library/Python/lib/python/site-packages/proj/Models/attention_model.py", line 499, in <module>
            model.fit(path, epochs)
          File "/Users/Users/Library/Python/lib/python/site-packages/proj/Models/attention_model.py", line 383, in fit
            loss = self.train_step(input, target, encoderHidden)
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 416, in __call__
            self._initialize(args, kwds, add_initializers_to=initializer_map)
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 359, in _initialize
            *args, **kwds))
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 1360, in _get_concrete_function_internal_garbage_collected
            graph_function, _, _ = self._maybe_define_function(args, kwargs)
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 1648, in _maybe_define_function
            graph_function = self._create_graph_function(args, kwargs)
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 1541, in _create_graph_function
            capture_by_value=self._capture_by_value),
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py", line 716, in func_graph_from_py_func
            func_outputs = python_func(*func_args, **func_kwargs)
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 309, in wrapped_fn
            return weak_wrapped_fn().__wrapped__(*args, **kwds)
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 2155, in bound_method_wrapper
            return wrapped_fn(*args, **kwargs)
          File "/Users/Users/work_env/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py", line 706, in wrapper
            raise e.ag_error_metadata.to_exception(type(e))
        AttributeError: in converted code:
            relative to /Users/Users:

            Library/Python/lib/python/site-packages/proj/Models/attention_model.py:262 train_step  *
                encoderOutput, encoderHidden = self.__encoder(input, encoderHidden)
            work_env/lib/python3.7/site-packages/tensorflow/python/autograph/impl/api.py:329 converted_call
                f = getattr(owner, f)

            AttributeError: 'Model' object has no attribute '__encoder'
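
For reference, the `AttributeError` can be reproduced outside TensorFlow with plain Python name mangling: inside a class body, `self.__encoder` is compiled to `self._Model__encoder`, so the attribute never exists under the literal name `'__encoder'`. The traceback line `f = getattr(owner, f)` suggests AutoGraph resolves the attribute by its unmangled string name, which would fail the same way (this is an assumption based on that traceback line, not a confirmed reading of the AutoGraph internals). A minimal sketch:

```python
# Sketch: double-underscore attributes are name-mangled inside class bodies.
class Model:
    def __init__(self):
        self.__encoder = "encoder layer"  # actually stored as _Model__encoder


m = Model()

# Mangling happens at compile time inside the class body only, so a
# string-based lookup with the literal name fails:
print(hasattr(m, "__encoder"))        # False
print(hasattr(m, "_Model__encoder"))  # True

# Reproducing what `getattr(owner, f)` would do with the unmangled name:
try:
    getattr(m, "__encoder")
except AttributeError as e:
    print(e)  # 'Model' object has no attribute '__encoder'
```

Renaming the attribute to a single leading underscore (`self._encoder`) avoids the mangling entirely.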