MLBEDSW-2637 Refactor util funcs out of softmax.py
A number of "TensorUtil" functions were defined in softmax.py.
These have been moved into the Tensor and Operation classes as
methods.
Two of the functions were not simple tensor/op methods. These helpers
have been moved to tensor.py because they return Tensors.
Signed-off-by: Michael McGeagh <michael.mcgeagh@arm.com>
Change-Id: I17d39c4e11f0837b7867b4a54da2e4a56383e095
diff --git a/ethosu/vela/operation.py b/ethosu/vela/operation.py
index 7134fd8..adbbff5 100644
--- a/ethosu/vela/operation.py
+++ b/ethosu/vela/operation.py
@@ -311,3 +311,12 @@
self.attrs["fused_activation_function"] = "LUT"
self.activation_lut = lut_tensor
self.inputs.append(lut_tensor)
+
+ def add_input_tensor(self, tens):
+ self.inputs.append(tens)
+ if self not in tens.consumer_list:
+ tens.consumer_list.append(self)
+
+ def set_output_tensor(self, tens):
+ tens.ops = [self]
+ self.outputs = [tens]
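
The two new methods keep the graph links between operations and tensors
consistent in both directions. The sketch below illustrates that
behaviour with minimal stand-in `Tensor` and `Operation` classes; these
are hypothetical simplifications carrying only the fields the diff
touches (`inputs`, `outputs`, `ops`, `consumer_list`), not the real
vela classes.

```python
class Tensor:
    """Hypothetical stand-in: a tensor tracking its producers/consumers."""

    def __init__(self, name):
        self.name = name
        self.ops = []            # operations that produce this tensor
        self.consumer_list = []  # operations that consume this tensor


class Operation:
    """Hypothetical stand-in: an op with the two helpers from the diff."""

    def __init__(self, op_type):
        self.type = op_type
        self.inputs = []
        self.outputs = []

    def add_input_tensor(self, tens):
        # Append the input and register this op as a consumer exactly once.
        self.inputs.append(tens)
        if self not in tens.consumer_list:
            tens.consumer_list.append(self)

    def set_output_tensor(self, tens):
        # Make this op the sole producer of the tensor.
        tens.ops = [self]
        self.outputs = [tens]


op = Operation("Exp")
ifm, ofm = Tensor("ifm"), Tensor("ofm")
op.add_input_tensor(ifm)
op.set_output_tensor(ofm)
assert op in ifm.consumer_list
assert ofm.ops == [op]
```

Note that `add_input_tensor` guards against duplicate consumer entries
(an op may read the same tensor twice), while `set_output_tensor`
overwrites rather than appends, since a tensor has a single producer.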