///
/// Copyright (c) 2017-2018 ARM Limited.
///
/// SPDX-License-Identifier: MIT
///
/// Permission is hereby granted, free of charge, to any person obtaining a copy
/// of this software and associated documentation files (the "Software"), to
/// deal in the Software without restriction, including without limitation the
/// rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
/// sell copies of the Software, and to permit persons to whom the Software is
/// furnished to do so, subject to the following conditions:
///
/// The above copyright notice and this permission notice shall be included in all
/// copies or substantial portions of the Software.
///
/// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
/// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
/// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
/// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
/// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
/// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
/// SOFTWARE.
///
namespace arm_compute
{
/**
@page data_import Importing data from existing models

@tableofcontents

@section caffe_data_extractor Extract data from a pre-trained Caffe model

Pre-trained Caffe models can be found in the <a href="https://github.com/BVLC/caffe/wiki/Model-Zoo">Model Zoo</a> on
Caffe's official GitHub repository.
The caffe_data_extractor.py script provided in the scripts folder is an example that shows how to
extract the parameter values from a trained model.

@note Complex networks might require modifications to the script in order to work correctly.

@subsection caffe_how_to How to use the script

Install Caffe by following <a href="http://caffe.berkeleyvision.org/installation.html">Caffe's installation guide</a>.
Make sure pycaffe has been added to your PYTHONPATH.

Download the pre-trained Caffe model.

Run the caffe_data_extractor.py script with:

python caffe_data_extractor.py -m <caffe model> -n <caffe netlist>

For example, to extract the data from the pre-trained Caffe AlexNet model to binary files:

python caffe_data_extractor.py -m /path/to/bvlc_alexnet.caffemodel -n /path/to/caffe/models/bvlc_alexnet/deploy.prototxt

The script has been tested with Python 2.7.
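The core of such an extraction script can be sketched as follows. This is a minimal illustration, not the real script: a plain dict stands in for pycaffe's net.params, and the _w/_b file-name suffixes are assumptions made for this example.

```python
import numpy as np

def extract_params(params, out_dir="/tmp"):
    """Save each layer's weights and biases as .npy files.

    `params` mimics pycaffe's net.params: a mapping from layer name to a
    list of blobs, where blob 0 holds the weights and blob 1 the biases.
    """
    for name, blobs in params.items():
        for blob, suffix in zip(blobs, ("_w", "_b")):
            print(name + suffix, blob.shape)  # report name and shape of each parameter
            np.save("%s/%s%s.npy" % (out_dir, name, suffix), blob)

# Stand-in for a tiny trained network: one convolution layer, one fully connected layer.
fake_params = {
    "conv1": [np.zeros((96, 3, 11, 11)), np.zeros((96,))],
    "fc8":   [np.zeros((1000, 4096)),    np.zeros((1000,))],
}
extract_params(fake_params)
```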
@subsection caffe_result What is the expected output from the script

If the script runs successfully, it prints the name and shape of each layer to the standard
output and generates *.npy files containing the weights and biases of each layer.

The arm_compute::utils::load_trained_data function shows how to load
the weights and biases from the .npy files into a tensor with the help of an Accessor.

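On the Python side, the generated files are ordinary NumPy archives, so they can be inspected with numpy.load before being fed to the library; the file name and shape below are illustrative.

```python
import numpy as np

# Create a sample file standing in for one produced by the extractor.
np.save("/tmp/conv1_w.npy", np.ones((96, 3, 11, 11), dtype=np.float32))

# Load it back and check that shape and data type match the target tensor.
weights = np.load("/tmp/conv1_w.npy")
print(weights.shape, weights.dtype)
```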
@section tensorflow_data_extractor Extract data from a pre-trained TensorFlow model

The tensorflow_data_extractor.py script extracts the trainable parameters (e.g. the values of the weights and biases) from a
trained TensorFlow model. A TensorFlow model consists of the following two files:

{model_name}.data-{step}-of-{max_step}: A binary file containing the values of each variable.

{model_name}.meta: A binary file containing a MetaGraph struct which defines the graph structure of the neural
network.

@note Since TensorFlow version 0.11 the binary checkpoint file which contains the values of each parameter has the format:
{model_name}.data-{step}-of-{max_step}
instead of:
{model_name}.ckpt
When dealing with binary files from version >= 0.11, pass only {model_name} to the -m option;
when dealing with binary files from version < 0.11, pass the whole file name {model_name}.ckpt to the -m option.
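The rule above can be captured in a small helper. This function is purely illustrative and is not part of the extractor script.

```python
import re

def model_arg(checkpoint_file):
    """Return the value to pass to the -m option for a given checkpoint file."""
    if checkpoint_file.endswith(".ckpt"):
        # Pre-0.11 checkpoints: pass the whole file name.
        return checkpoint_file
    # Post-0.11 checkpoints: strip the .data-{step}-of-{max_step} suffix
    # so that only the model name is passed.
    return re.sub(r"\.data-\d+-of-\d+$", "", checkpoint_file)

print(model_arg("/path/to/bvlc_alexnet.ckpt"))                  # -> /path/to/bvlc_alexnet.ckpt
print(model_arg("/path/to/bvlc_alexnet.data-00000-of-00001"))   # -> /path/to/bvlc_alexnet
```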

@note This script relies on the parameters to be extracted being in the
'trainable_variables' tensor collection. By default all variables are automatically added to this collection unless
the user specifies otherwise. Thus, should a user alter this default behavior and/or want to extract parameters from other
collections, tf.GraphKeys.TRAINABLE_VARIABLES should be replaced accordingly.

@subsection tensorflow_how_to How to use the script

Install TensorFlow and NumPy.

Download the pre-trained TensorFlow model.

Run tensorflow_data_extractor.py with:

python tensorflow_data_extractor.py -m <path_to_binary_checkpoint_file> -n <path_to_metagraph_file>

For example, to extract the data from the pre-trained TensorFlow AlexNet model to binary files:

python tensorflow_data_extractor.py -m /path/to/bvlc_alexnet -n /path/to/bvlc_alexnet.meta

Or, for binary checkpoint files created before TensorFlow 0.11:

python tensorflow_data_extractor.py -m /path/to/bvlc_alexnet.ckpt -n /path/to/bvlc_alexnet.meta

@note With versions >= TensorFlow 0.11 only the model name is passed to the -m option.

The script has been tested with TensorFlow 1.2 and 1.3, on Python 2.7.6 and Python 3.4.3.
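The TensorFlow side follows the same pattern as the Caffe case: iterate over the trainable variables and dump each one to a .npy file. The sketch below is illustrative only; a plain dict stands in for the evaluated 'trainable_variables' collection, and the name-sanitising rule (TensorFlow variable names such as conv1/weights:0 contain characters that are awkward in file names) is an assumption of this example, not necessarily what the real script does.

```python
import numpy as np

def save_variables(variables, out_dir="/tmp"):
    """Dump each (name -> array) entry to a .npy file.

    `variables` stands in for the evaluated tf.GraphKeys.TRAINABLE_VARIABLES
    collection. Slashes and colons in TensorFlow variable names are replaced
    with underscores so each name maps to a flat file name.
    """
    for name, value in variables.items():
        safe = name.replace("/", "_").replace(":", "_")
        print(safe, value.shape)  # report name and shape of each parameter
        np.save("%s/%s.npy" % (out_dir, safe), value)

# Stand-in for the evaluated trainable variables of a small network.
fake_vars = {
    "conv1/weights:0": np.zeros((11, 11, 3, 96), dtype=np.float32),
    "conv1/biases:0":  np.zeros((96,), dtype=np.float32),
}
save_variables(fake_vars)
```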

@subsection tensorflow_result What is the expected output from the script

If the script runs successfully, it prints the name and shape of each parameter to the standard output and generates
*.npy files containing the weights and biases of each layer.

The arm_compute::utils::load_trained_data function shows how to load
the weights and biases from the .npy files into a tensor with the help of an Accessor.
*/
}