IVGCVSW-6929 Support for models with implicit expanded dimensions

 * Added allow-expanded-dims to TFLite parser and ArmNN delegate
   * If true, ArmNN will disregard dimensions with a size of 1 when
     validating tensor shapes. The total tensor size (number of
     elements) must still match.
   * This allows us to support models where tensors have expanded
     dimensions (i.e. extra dimensions with a size of 1).
 * Fixed bug in Network where it assumed that only the first option
   could be ShapeInferenceMethod.
 * Fixed bug where m_ShapeInferenceMethod was lost when copying or
   moving Graphs.
 * Changed Delegate to pass "infer-output-shape", "allow-expanded-dims",
   and other BackendOptions through to the Network during construction.

Signed-off-by: Mike Kelly <mike.kelly@arm.com>
Change-Id: Ibe7c5ae6597796fc9164cb07bd372bd7f8f8cacf
diff --git a/tests/ExecuteNetwork/ExecuteNetwork.cpp b/tests/ExecuteNetwork/ExecuteNetwork.cpp
index ddabf3c..f0a3d08 100644
--- a/tests/ExecuteNetwork/ExecuteNetwork.cpp
+++ b/tests/ExecuteNetwork/ExecuteNetwork.cpp
@@ -389,6 +389,7 @@
         // Creates an InferenceModel, which will parse the model and load it into an IRuntime.
         typename InferenceModel<TParser, TDataType>::Params inferenceModelParams;
         inferenceModelParams.m_ModelPath                      = params.m_ModelPath;
+        inferenceModelParams.m_AllowExpandedDims              = params.m_AllowExpandedDims;
         inferenceModelParams.m_IsModelBinary                  = params.m_IsModelBinary;
         inferenceModelParams.m_ComputeDevices                 = params.m_ComputeDevices;
         inferenceModelParams.m_DynamicBackendsPath            = params.m_DynamicBackendsPath;