MLBEDSW-2688: LeakyRelu rewrite to LUT or MUL/MAX
Replaces LeakyRelu operations with a LUT-based activation function when
possible, otherwise with a combination of multiplication (MUL) and
maximization (MAX) operations.
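For reference, LeakyRelu(x) = x for x >= 0 and alpha * x otherwise, which for
0 <= alpha <= 1 is equivalent to max(x, alpha * x); that identity is what the
MUL/MAX fallback exploits. A minimal sketch of the identity (illustrative
only, not Vela code):

```python
def leaky_relu_ref(x: float, alpha: float) -> float:
    # LeakyRelu(x) = x if x >= 0 else alpha * x.
    # For 0 <= alpha <= 1 this equals max(x, alpha * x):
    # the MUL computes alpha * x, the MAX selects the larger value.
    return max(x, alpha * x)
```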
Signed-off-by: Louis Verhaard <louis.verhaard@arm.com>
Change-Id: I3d2eb2dba7145997c3cc711d0ef18ab355fbb416
diff --git a/ethosu/vela/tensor.py b/ethosu/vela/tensor.py
index 5fdea97..f0e7ea4 100644
--- a/ethosu/vela/tensor.py
+++ b/ethosu/vela/tensor.py
@@ -728,6 +728,9 @@
return True
return False
+ def is_scaling_equal(self, tens):
+ return self.quantization.is_scaling_equal(tens.quantization)
+
def equivalent(self, tens):
return self.equivalence_id == tens.equivalence_id
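The new `Tensor.is_scaling_equal` helper simply delegates to the tensor's
quantization parameters. A self-contained sketch of what such a comparison
might look like, assuming a hypothetical quantization record with `scale_f32`
and `zero_point` fields (field names are assumptions, not necessarily Vela's
actual `QuantizationParameters` API):

```python
from dataclasses import dataclass

@dataclass
class Quantization:
    # Hypothetical minimal quantization record for illustration.
    scale_f32: float
    zero_point: int

    def is_scaling_equal(self, other: "Quantization") -> bool:
        # Two tensors share scaling when both the scale and the
        # zero point match; this mirrors the delegation pattern of
        # Tensor.is_scaling_equal in the diff above.
        return (
            other is not None
            and self.scale_f32 == other.scale_f32
            and self.zero_point == other.zero_point
        )
```

Equal input/output scaling is what lets a rewrite reuse values without
requantization, so a cheap equality check like this gates which rewrite path
is taken.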