MLBEDSW-5384 FC layers run on NPU if underlying shape is 2D

* Added a generic function that checks whether the underlying shape of
a FullyConnected operation is 2D and, if so, performs the shape
reduction
* FullyConnected operations with more than 2 dimensions now run on the
NPU when the above condition is satisfied
* Refactored constraint_fc_output_2d and rewrite_fully_connected_input
* Added a unit test to confirm this functionality

Signed-off-by: Ayaan Masood <Ayaan.Masood@arm.com>
Change-Id: I0e29c767e5b84841eb53bbc44464b36a454f7b38
diff --git a/ethosu/vela/tensor.py b/ethosu/vela/tensor.py
index 38b0e43..e981584 100644
--- a/ethosu/vela/tensor.py
+++ b/ethosu/vela/tensor.py
@@ -823,6 +823,19 @@
         else:
             return self.values.item(0)
 
+    def get_shape_as_2d(self, dimension_2_size: int) -> Optional[Shape4D]:
+
+        elms = self.elements()
+        dimension_1_size = elms // dimension_2_size
+        # The reduction is valid if the element count divides evenly and the shape is not 1D
+        is_reducible = dimension_1_size * dimension_2_size == elms and len(self.shape) != 1
+
+        new_shape = None
+        if is_reducible:
+            new_shape = Shape4D([dimension_1_size, 1, 1, dimension_2_size])
+
+        return new_shape
+
     def __lt__(self, other: "Tensor") -> bool:
         return self.equivalence_id < other.equivalence_id
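
For illustration, a minimal standalone sketch of the reduction that
get_shape_as_2d performs. The reduce_to_2d helper below is hypothetical
and does not use the Vela Tensor or Shape4D classes; it only mirrors
the arithmetic of the method added above.

    from typing import List, Optional


    def reduce_to_2d(shape: List[int], dimension_2_size: int) -> Optional[List[int]]:
        # Collapse all leading dimensions into one while keeping the last axis intact.
        elms = 1
        for dim in shape:
            elms *= dim
        dimension_1_size = elms // dimension_2_size
        # The reduction is valid only if the element count divides evenly and the
        # original shape is not already 1D.
        if dimension_1_size * dimension_2_size == elms and len(shape) != 1:
            return [dimension_1_size, 1, 1, dimension_2_size]
        return None


    # A FullyConnected input of shape [2, 3, 8] with a last dimension of 8
    # reduces to [6, 1, 1, 8], so the operation can be mapped to the NPU.
    assert reduce_to_2d([2, 3, 8], 8) == [6, 1, 1, 8]
    # A 1D tensor is not reducible.
    assert reduce_to_2d([8], 8) is None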