IVGCVSW-2029 Tweak results handling for batch size 2 test

When looking for the top probability, use the first result, not the second.
This avoids an issue where the classification index was reported incorrectly for batched tests.

Still doesn't correctly handle multiple results with the exact same probability, or batched testing,
but it's slightly more correct than before.

Change-Id: I57d33552754667613e222d9d2037e12c87a96854
diff --git a/tests/InferenceTest.inl b/tests/InferenceTest.inl
index 5e858f0..7ce017c 100644
--- a/tests/InferenceTest.inl
+++ b/tests/InferenceTest.inl
@@ -60,7 +60,18 @@
         int index = 0;
         for (const auto & o : output)
         {
-            resultMap[ToFloat<typename TModel::DataType>::Convert(o, m_QuantizationParams)] = index++;
+            float prob = ToFloat<typename TModel::DataType>::Convert(o, m_QuantizationParams);
+            int classification = index++;
+
+            // Take the first class seen for each probability.
+            // This avoids strange results when looping over batched outputs
+            // produced from identical test data.
+            std::map<float, int>::iterator lb = resultMap.lower_bound(prob);
+            if (lb == resultMap.end() ||
+                resultMap.key_comp()(prob, lb->first)) {
+                // If the key is not already in the map, insert it.
+                resultMap.insert(lb, std::map<float, int>::value_type(prob, classification));
+            }
         }
     }