Add threshold for floating-point SOFT_RELU activation

Added the missing threshold for calculating SOFT_RELU in the SVE and CL implementations. As a result, removed the testing bounds that previously restricted input values to the interval [-40, 40].
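
For large inputs, exp(x) overflows in single precision even though log(1 + exp(x)) is numerically indistinguishable from x, which is why the kernels need a threshold. A minimal sketch of the thresholded form follows; the cut-off value here is illustrative only and is not the exact constant used by the SVE/CL kernels:

    #include <cmath>

    // Thresholded soft-relu: above the cut-off, log(1 + exp(x)) equals x to
    // within floating-point precision, so return x directly and avoid
    // overflowing exp(x). The cut-off value is a hypothetical choice for
    // illustration only.
    template <typename T>
    T soft_relu(T x)
    {
        constexpr T threshold = static_cast<T>(16);
        return (x > threshold) ? x
                               : static_cast<T>(std::log(static_cast<T>(1) + std::exp(x)));
    }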

Resolves: COMPMID-5658
Signed-off-by: Milos Puzovic <Milos.Puzovic@arm.com>
Change-Id: I3d14df60125e36e4eb85aeb222f4fb0cc5741521
Reviewed-on: https://review.mlplatform.org/c/ml/ComputeLibrary/+/8536
Comments-Addressed: Arm Jenkins <bsgcomp@arm.com>
Reviewed-by: Viet-Hoa Do <viet-hoa.do@arm.com>
Reviewed-by: Gunes Bayir <gunes.bayir@arm.com>
Tested-by: Arm Jenkins <bsgcomp@arm.com>
Benchmark: Arm Jenkins <bsgcomp@arm.com>
diff --git a/tests/validation/reference/ActivationLayer.h b/tests/validation/reference/ActivationLayer.h
index 2bf9683..a813ba5 100644
--- a/tests/validation/reference/ActivationLayer.h
+++ b/tests/validation/reference/ActivationLayer.h
@@ -64,7 +64,7 @@
             ret = (x > 0) ? x : a * x;
             break;
         case ActivationLayerInfo::ActivationFunction::SOFT_RELU:
-            ret = std::log(static_cast<T>(1) + std::exp(x));
+            ret = std::log(static_cast<T>(1) + std::exp(static_cast<double>(x)));
             break;
         case ActivationLayerInfo::ActivationFunction::ELU:
             ret = (x > 0) ? x : a * (std::exp(x) - static_cast<T>(1));