MLECO-4732: Updating FVP version

Dockerfile updated to download the latest Fixed Virtual Platform
(FVP) version, 11.24.

Minor documentation change to say that Arm Corstone-310 FVP is
also available to download from the Arm Ecosystem FVP page.

Change-Id: Icd19f71f53b4a6f1b912919e42e941fb23800d57
Signed-off-by: Kshitij Sisodia <kshitij.sisodia@arm.com>

Readme.md

Arm® ML embedded evaluation kit

Overview

The ML embedded evaluation kit provides a range of ready-to-use machine learning (ML) applications for users to develop ML workloads running on the Arm® Ethos™-U NPU and Arm® Cortex®-M CPUs. You can also access metrics, such as inference cycle count, to estimate performance.

The Arm® Ethos-U NPU is a new class of ML processor, specifically designed to accelerate ML computation in constrained embedded and IoT devices.

ML use cases

Experiment with the included end-to-end software use cases and create your own ML applications for Cortex-M CPU and Ethos-U NPU.

| ML application | Description | Neural Network Model |
|---|---|---|
| Image classification | Recognize the presence of objects in a given image | Mobilenet V2 |
| Keyword spotting (KWS) | Recognize the presence of a key word in a recording | MicroNet |
| Automated Speech Recognition (ASR) | Transcribe words in a recording | Wav2Letter |
| KWS and ASR | Utilise Cortex-M and Ethos-U to transcribe words in a recording after a keyword was spotted | MicroNet, Wav2Letter |
| Anomaly Detection | Detect abnormal behavior based on a sound recording of a machine | MicroNet |
| Visual Wake Word | Recognize if a person is present in a given image | MicroNet |
| Object detection | Detect and draw face bounding boxes in a given image | Yolo Fastest |
| Noise Reduction | Remove noise from audio while keeping speech intact | RNNoise |
| Generic inference runner | Code block allowing you to develop your own use case for the Ethos-U NPU | Your custom model |

Recommended build targets

This repository is for building and deploying Machine Learning (ML) applications targeted at Arm® Cortex®-M CPUs and the Arm® Ethos™-U NPU. To run evaluations using this software, we suggest using an Arm® Corstone™-300 or Corstone™-310 based target.

Arm® Corstone™-300 and Corstone™-310 design implementations are publicly available on the Download FPGA Images page, or as a Fixed Virtual Platform (FVP) of the MPS3 development board. The Arm® Corstone™-310 FVP is also available to download from the Arm Ecosystem FVPs page.
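
As an illustrative sketch of installing a downloaded FVP on Linux (the archive name below is an assumption based on FVP version 11.24, and the installer flags should be checked against the installer's own --help output):

    # Unpack the FVP package downloaded from the Arm Ecosystem FVPs page.
    tar -xf FVP_Corstone_SSE-300_11.24_13_Linux64.tgz

    # Install non-interactively into a directory of your choice.
    ./FVP_Corstone_SSE-300.sh --i-agree-to-the-contained-eula --no-interactive -d ~/FVP_Corstone_SSE-300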

Quick Start

To run ML applications on the Cortex-M and Ethos-U NPU:

  1. First, verify that you have installed all of the required prerequisites.

    NOTE: A Dockerfile is also available if you would like to build a Docker image; a sketch of typical usage follows.
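
    As a minimal sketch (the image tag ml-eval-kit below is illustrative, not defined by the repository), you can build and enter the image from the repository root, where the Dockerfile lives, once you have cloned the repository in step 2:

    # Build the Docker image from the repository root.
    docker build -t ml-eval-kit .

    # Start an interactive shell in a container, mounting the repository as the working directory.
    docker run -it --rm -v "$PWD":/workspace -w /workspace ml-eval-kit bash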

  2. Clone the Ethos-U evaluation kit repository:

    git clone "https://review.mlplatform.org/ml/ethos-u/ml-embedded-evaluation-kit"
    cd ml-embedded-evaluation-kit
    
  3. Pull all the external dependencies with the following command:

    git submodule update --init
    
  4. Next, run the build_default Python script. It handles downloading the neural network models, compiling them with Vela, and building the project with CMake.

    Using the Arm Compiler:

    python3.9 ./build_default.py --toolchain arm

    Using the GNU Arm Embedded toolchain (default):

    python3.9 ./build_default.py
  5. Change directory to the generated CMake build folder, which contains the .axf output file in its bin subdirectory. Launch the application by passing this .axf file to the FVP you downloaded when installing the prerequisites. Alternatively, from the root directory, prefix the path to the .axf file with <cmake-build-your_config> and use one of the following commands:

    From auto-generated (or custom) build directory:
    <path_to_FVP>/FVP_Corstone_SSE-300_Ethos-U55 -a ./bin/ethos-u-kws.axf
    
    From root directory:
    <path_to_FVP>/FVP_Corstone_SSE-300_Ethos-U55 -a <cmake-build-your_config>/bin/ethos-u-kws.axf
    
  6. A telnet window is launched through which you can interact with the application and obtain performance figures.
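
    If a telnet window does not open automatically (for example, when running the FVP on a remote machine over SSH), you can usually attach to the FVP's serial terminal yourself. The port to use is printed in the FVP start-up output; 5000 is a common default for the first UART, but treat the value below as an assumption:

    telnet localhost 5000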

For more details, you can view the quick start guide.

Note: The default flow assumes use of the Arm® Ethos™-U55 NPU, configured with 128 Multiply-Accumulate units and sharing SRAM with the Arm® Cortex®-M55.

The ML embedded evaluation kit supports:

| Ethos™-U NPU | Default MACs/cc | Other MACs/cc supported | Default Memory Mode | Other Memory Modes supported |
|---|---|---|---|---|
| Ethos™-U55 | 128 | 32, 64, 256 | Shared_Sram | Sram_Only |
| Ethos™-U65 | 256 | 512 | Dedicated_Sram | Shared_Sram |
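
The NPU configuration is selected at build time. As a rough sketch of configuring a non-default build with CMake (the option names, values, and toolchain file path below are recalled rather than taken from this page, so verify them against the Building guide):

    # Sketch: configure an Ethos-U65 build using the Dedicated_Sram memory mode.
    # Option and toolchain-file names are assumptions; check the Building guide.
    cmake -B cmake-build-mps3-u65 \
      -DCMAKE_TOOLCHAIN_FILE=scripts/cmake/toolchains/bare-metal-gcc.cmake \
      -DTARGET_PLATFORM=mps3 \
      -DTARGET_SUBSYSTEM=sse-300 \
      -DETHOS_U_NPU_ID=U65 \
      -DETHOS_U_NPU_MEMORY_MODE=Dedicated_Sram
    cmake --build cmake-build-mps3-u65 -j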

For more information see Building.

See full documentation:

Software and hardware overview

  • The ML use cases have common code such as initializing the Hardware Abstraction Layer (HAL)

  • The common application code can be run on a native host machine (x86_64 or aarch64) or on the Arm Cortex-M architecture, thanks to the HAL

  • Google® TensorFlow™ Lite for Microcontrollers inference engine is used to schedule the execution of neural network models

  • The Ethos-U NPU driver is integrated into TensorFlow Lite for Microcontrollers

    • ML operators are delegated to the NPU with CPU fall-back for unsupported operators
    • CMSIS-NN is used to optimise CPU workload execution with int8 data type
    • Final ML operator fall-back is TensorFlow™ Lite for Microcontrollers' reference kernels
  • The provided set of common ML use-case functions will assist in implementing your application logic

    • When modifying use-case code, there is no requirement to modify other components of the eval kit
  • The CMake build system will discover and automatically include new ML application code into the compilation workflow

A high-level overview of the different components in the software, and of the platforms supported out of the box, is shown in the diagram below.

APIs

Note: The Ethos-U NPU software stack is described here.

For a more detailed description of the build graph with all major components, see Building.

Reusable software

There are source files in the repository that form the core of the Machine Learning flow for all the use cases. These are exposed as APIs that the examples use, and they can even be combined to form chained use cases. The API sources are designed to be portable across platforms and provide functionality for preprocessing of data, running an inference, and postprocessing of results. They allow a common flow for all use cases, with minor differences in how each of these blocks is instantiated.

As an independent CMake project, these APIs can be used by or integrated into other projects easily. We also produce CMSIS Packs with these sources, so they could be used in all tools/IDEs (for example, Arm® Development Studio and Keil® µVision®) that support the use of CMSIS Packs.

Contributions

The ML embedded eval kit welcomes contributions. For more details on contributing to the eval kit, please see the contributors guide.

Communication

To start a public discussion, or to raise any issues or questions related to this repository, please use the https://discuss.mlplatform.org/c/ml-embedded-evaluation-kit forum.

Inclusive language commitment

This product conforms to Arm's inclusive language policy and, to the best of our knowledge, does not contain any non-inclusive language. If you find something that concerns you, email terms@arm.com.

Licenses

The ML Embedded application samples are provided under the Apache 2.0 license; see License Apache 2.0.

Application input data sample files are provided under their original license:

| Sample files | Licence | Provenience |
|---|---|---|
| Automatic Speech Recognition Samples | Creative Commons Attribution 4.0 International Public License | http://www.openslr.org/12/ |
| Image Classification Samples | Creative Commons Attribution 1.0 | https://www.pexels.com |
| Keyword Spotting Samples | Creative Commons Attribution 4.0 International Public License | http://download.tensorflow.org/data/speech_commands_v0.02.tar.gz |
| Keyword Spotting and Automatic Speech Recognition Samples | Creative Commons Attribution 4.0 International Public License | http://download.tensorflow.org/data/speech_commands_v0.02.tar.gz |
| Visual Wake Word Samples | Creative Commons Attribution 1.0 | https://www.pexels.com |
| Noise Reduction Samples | Creative Commons Attribution 4.0 International Public License | https://datashare.ed.ac.uk/handle/10283/2791/ |
| Object Detection Samples | Creative Commons Attribution 1.0 | https://www.pexels.com |