1. 758b5ba COMPMID-3069: Improve build time by splitting up ToolchainSupport.h by Matthew Bentham · 4 years, 4 months ago
  2. 1856ff7 COMPMID-3097 Fuse activation with fully connected layer CL by Giorgio Arena · 4 years, 5 months ago
  3. 6e9d0e0 COMPMID-2856 Add PrintLayer at graph level by Giorgio Arena · 4 years, 6 months ago
  4. f4261ad COMPMID-2779: Add support for generating synthetic int8 graphs. by Georgios Pinitas · 4 years, 7 months ago
  5. 5dea19e COMPMID-2579: Fuse batch normalization with convolution and depthwise convolution at graph level on NEON by Gian Marco Iodice · 4 years, 8 months ago
  6. 04ea4e8 COMPMID-2581: Fuse batch normalization with convolution and depthwise convolution at graph level for OpenCL - FP16 by giuros01 · 4 years, 9 months ago
  7. 351bd13 COMPMID-2573: Investigate FP16 Winograd reference implementations by giuros01 · 4 years, 11 months ago
  8. f948b10 COMPMID-2582: Disable fuse batch normalization on OpenCL for FP16 by Gian Marco Iodice · 4 years, 11 months ago
  9. 169cda3 COMPMID-2055: Fusion of ConvolutionLayer with BatchNormalization at graph only for CL by Manuel Bottini · 5 years ago
  10. 2ea3761 COMPMID-2336: Fix InPlaceMutator condition and add SaveNumpyAccessor by Isabella Gottardi · 5 years ago
  11. bffb41e COMPMID-2273: Fuse Batch Normalization with Depthwise Convolution layer at graph level (only for CL) by Manuel Bottini · 5 years ago
  12. 299fdd3 COMPMID-2177 Fix clang warnings by Michalis Spyrou · 5 years ago
  13. 9e4824c COMPMID-2111: ConcatenateLayer API should accept an index instead of an enum by Georgios Pinitas · 5 years ago
  14. cadb368 COMPMID-1995: Fixed graph fusion mutator for float types. by Georgios Pinitas · 5 years ago
  15. 0ae5de9 COMPMID-1995: Prepare Graph to support different input/output quantization info by Isabella Gottardi · 5 years ago
  16. 749021a COMPMID-1995: Revert fusing convolution to batch norm due to performance regressions by giuros01 · 5 years ago
  17. acce504 COMPMID-1740: Fuse batch normalization with Convolution Layer at graph level by giuros01 · 5 years ago
  18. 1c32bf396 COMPMID-1451: Perform fusion before GroupConvolution unrolling by Georgios Pinitas · 6 years ago
  19. 60e9825 COMPMID-1451: Fuse activation in DepthwiseConvolution. by Georgios Pinitas · 6 years ago
  20. 08346e9 COMPMID-1451: Fuse RELU, LU_BOUNDED_RELU with requantization in NEGEMMConvolutionLayer. by Georgios Pinitas · 6 years ago
  21. 890ad1b COMPMID-1246: Fix bug in handling backends that can't be loaded in the Graph API by Anthony Barbier · 6 years ago
  22. 2a2db59 COMPMID-1505: Add native grouping support at graph level by Georgios Pinitas · 6 years ago
  23. e222055 COMPMID-1367: Enable NHWC in graph examples by Georgios Pinitas · 6 years ago
  24. 6f109bd COMPMID-1409: Disable BN + Act fusion in the graph if activation is not supported by Georgios Pinitas · 6 years ago
  25. d3a78ab COMPMID-1283: (GitHub issue) after convolution output data is zero by Georgios Pinitas · 6 years ago
  26. cac13b1 COMPMID-1097: Port mobilenet to NHWC by Georgios Pinitas · 6 years ago
  27. d9eb275 COMPMID-797: Switch to new graph. by Georgios Pinitas · 6 years ago