# Demo of Image Classification

The following describes how to use the MindSpore Lite C++ APIs (Android JNI) and MindSpore Lite image classification models to perform on-device inference, classify the content captured by a device camera, and display the most likely classification result on the application's image preview screen.

## Running Dependencies

- Android Studio 3.2 or later (Android Studio 4.0 or later is recommended.)

## Building and Running

1. Load the sample source code into Android Studio.

    

    Start Android Studio, click `File > Settings > System Settings > Android SDK`, and select the corresponding `SDK Tools`. As shown in the following figure, select an SDK and click `OK`. Android Studio automatically installs the SDK.

    

    > Android SDK Tools is installed by default. You can see it by unchecking the `Hide Obsolete Packages` box.
    >
    > If you encounter any Android Studio configuration problems while trying this demo, refer to item 4 below to resolve them.

2. Connect to an Android device and run this application.

    Connect to the Android device through a USB cable for debugging. Click `Run 'app'` to run the sample project on your device.

    

    > Android Studio will automatically download MindSpore Lite, model files, and other dependencies during the compilation process. Please be patient during this process.
    >
    > For details about how to connect Android Studio to a device for debugging, see <https://developer.android.com/studio/run/device?hl=zh-cn>.
    >
    > "USB debugging mode" must be enabled on the mobile phone before Android Studio can recognize it. On Huawei phones, it is generally enabled in Settings -> System & updates -> Developer options -> USB debugging.

3. Continue the installation on the Android device. After the installation is complete, you can view the content captured by a camera and the inference result.

    

4. Solutions to configuration problems:

    4.1 Problems with NDK, CMake, and JDK tools:

    If the tools installed in Android Studio are not recognized, you can re-download and install them from the corresponding official websites and configure their paths.

    - NDK >= 21.3 [NDK](https://developer.android.google.cn/ndk/downloads?hl=zh-cn)
    - CMake >= 3.10.2 [CMake](https://cmake.org/download)
    - Android SDK >= 26 [SDK](https://developer.android.com/studio?hl=zh-cn)
    - JDK >= 1.8 [JDK](https://www.oracle.com/cn/java/technologies/javase/javase-jdk8-downloads.html)

    

    4.2 NDK version mismatch:

    Open `Android SDK`, click `Show Package Details`, and select the appropriate NDK version according to the error message.

    

    4.3 Android Studio version problems:

    Update the Android Studio version in `Toolbar - Help - Check for Updates`.

    4.4 Gradle dependencies install too slowly:

    As shown in the picture, open the `build.gradle` file in the Demo root directory and add the Huawei mirror source address `maven {url 'https://developer.huawei.com/repo/'}`, change the classpath to 4.0.0, and click `Sync`. Once the download is complete, restore the classpath version and synchronize again.

## Detailed Description of the Sample Program

This image classification sample program on the Android device includes a Java layer and a JNI layer. At the Java layer, the Android Camera2 API is used to enable a camera to obtain image frames and process images. At the JNI layer, the model inference process is completed in [Runtime](https://www.mindspore.cn/tutorial/lite/en/master/use/runtime.html).

### Sample Program Structure

```text
app
│
├── src/main
│   ├── assets # resource files
│   │   └── mobilenetv2.ms # model file
│   │
│   ├── cpp # main logic encapsulation classes for model loading and prediction
│   │   ├── mindspore_lite_x.x.x-runtime-arm64-cpu # MindSpore Lite version
│   │   ├── MindSporeNetnative.cpp # JNI methods related to MindSpore calling
│   │   └── MindSporeNetnative.h # header file
│   │
│   ├── java # application code at the Java layer
│   │   └── com.mindspore.classification
│   │       ├── gallery.classify # implementation related to image processing and MindSpore JNI calling
│   │       │   └── ...
│   │       └── widget # implementation related to camera enabling and drawing
│   │           └── ...
│   │
│   ├── res # resource files related to Android
│   └── AndroidManifest.xml # Android configuration file
│
├── CMakeLists.txt # CMake compilation entry file
│
├── build.gradle # Other Android configuration file
├── download.gradle # MindSpore version download
└── ...
```

### Configuring MindSpore Lite Dependencies

When MindSpore Lite C++ APIs are called at the Android JNI layer, related library files are required. You can use MindSpore Lite [source code compilation](https://www.mindspore.cn/tutorial/lite/en/master/use/build.html) to generate the MindSpore Lite version. In this case, you need to use the build command that generates the image preprocessing module.

In this example, the build process automatically downloads the `mindspore-lite-1.1.1-runtime-arm64-cpu` library through the `app/download.gradle` file and saves it in the `app/src/main/cpp` directory.

Note: if the automatic download fails, please manually download the relevant library files and put them in the corresponding location.

mindspore-lite-1.1.1-runtime-arm64-cpu.tar.gz [Download link](https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.1.1/MindSpore/lite/release_0220/android/mindspore-lite-1.1.1-runtime-arm64-cpu.tar.gz)

In the `app/build.gradle` file, configure the CMake compile arguments and the supported ABIs:

```groovy
android {
    defaultConfig {
        externalNativeBuild {
            cmake {
                arguments "-DANDROID_STL=c++_shared"
                cppFlags "-std=c++17 -fexceptions -frtti"
            }
        }

        ndk {
            abiFilters 'armeabi-v7a', 'arm64-v8a'
        }
    }
}
```

Create a link to the `.so` library files in the `app/CMakeLists.txt` file:

```cmake
# ============== Set MindSpore Dependencies. ==============
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION})
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/include)

add_library(mindspore-lite SHARED IMPORTED)
add_library(minddata-lite SHARED IMPORTED)

set_target_properties(mindspore-lite PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/lib/libmindspore-lite.so)
set_target_properties(minddata-lite PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/src/main/cpp/${MINDSPORELITE_VERSION}/minddata/lib/libminddata-lite.so)
# --------------- MindSpore Lite set End. --------------------

# Link target library.
target_link_libraries(
    ...
    # --- mindspore ---
    minddata-lite
    mindspore-lite
    ...
)
```

### Downloading and Deploying a Model File

In this example, the `app/download.gradle` file automatically downloads `mobilenetv2.ms` and places it in the `app/src/main/assets` directory.

Note: if the automatic download fails, please manually download the model file and put it in the corresponding location.

mobilenetv2.ms [mobilenetv2.ms](https://download.mindspore.cn/model_zoo/official/lite/mobilenetv2_openimage_lite/mobilenetv2.ms)

### Compiling On-Device Inference Code

Call MindSpore Lite C++ APIs at the JNI layer to implement on-device inference.

The inference code process is as follows. For details about the complete code, see `src/main/cpp/MindSporeNetnative.cpp`.

1. Load the MindSpore Lite model file and build the context, session, and computational graph for inference.

    - Load a model file. Create and configure the context for model inference.

      ```cpp
      // Buffer is the model data passed in by the Java layer.
      jlong bufferLen = env->GetDirectBufferCapacity(buffer);
      char *modelBuffer = CreateLocalModelBuffer(env, buffer);
      ```
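
      `CreateLocalModelBuffer` is a helper in this demo's source that is not shown here. A plausible minimal sketch, assuming it simply copies the Java direct `ByteBuffer` into locally owned memory so the model data outlives the JNI call:

      ```cpp
      #include <jni.h>
      #include <cstring>

      // Copy the direct ByteBuffer passed from the Java layer into a buffer
      // owned by the native side (sketch; the demo's real helper may differ).
      char *CreateLocalModelBuffer(JNIEnv *env, jobject modelBuffer) {
        jbyte *modelAddr = static_cast<jbyte *>(env->GetDirectBufferAddress(modelBuffer));
        int modelLen = static_cast<int>(env->GetDirectBufferCapacity(modelBuffer));
        char *buffer = new char[modelLen];
        memcpy(buffer, modelAddr, modelLen);
        return buffer;
      }
      ```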

    - Create a session.

      ```cpp
      void **labelEnv = new void *;
      MSNetWork *labelNet = new MSNetWork;
      *labelEnv = labelNet;

      // Create the context.
      mindspore::lite::Context *context = new mindspore::lite::Context;
      context->thread_num_ = num_thread;

      // Create the MindSpore session.
      labelNet->CreateSessionMS(modelBuffer, bufferLen, "device label", context);
      delete context;
      ```

    - Load the model file and build a computational graph for inference.

      ```cpp
      void MSNetWork::CreateSessionMS(char *modelBuffer, size_t bufferLen,
                                      std::string name, mindspore::lite::Context *ctx) {
        // Create the inference session and compile the imported model.
        session = mindspore::session::LiteSession::CreateSession(ctx);
        auto model = mindspore::lite::Model::Import(modelBuffer, bufferLen);
        int ret = session->CompileGraph(model);
      }
      ```
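
      The snippet above omits error handling. A hedged sketch of the same routine with basic failure checks (the messages and early returns are illustrative additions, not the demo's exact code):

      ```cpp
      void MSNetWork::CreateSessionMS(char *modelBuffer, size_t bufferLen,
                                      std::string name, mindspore::lite::Context *ctx) {
        session = mindspore::session::LiteSession::CreateSession(ctx);
        if (session == nullptr) {
          MS_PRINT("Create session failed.");
          return;
        }
        auto model = mindspore::lite::Model::Import(modelBuffer, bufferLen);
        if (model == nullptr) {
          MS_PRINT("Import model failed.");
          return;
        }
        // CompileGraph returns RET_OK (0) on success; see MindSpore Lite's errorcode.h.
        int ret = session->CompileGraph(model);
        if (ret != mindspore::lite::RET_OK) {
          MS_PRINT("CompileGraph failed.");
        }
      }
      ```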

2. Convert the input image into the Tensor format of the MindSpore model.

    Convert the image data to be detected into the Tensor format of the MindSpore model.

    ```cpp
    if (!BitmapToLiteMat(env, srcBitmap, &lite_mat_bgr)) {
      MS_PRINT("BitmapToLiteMat error");
      return NULL;
    }
    if (!PreProcessImageData(lite_mat_bgr, &lite_norm_mat_cut)) {
      MS_PRINT("PreProcessImageData error");
      return NULL;
    }

    ImgDims inputDims;
    inputDims.channel = lite_norm_mat_cut.channel_;
    inputDims.width = lite_norm_mat_cut.width_;
    inputDims.height = lite_norm_mat_cut.height_;

    // Get the MindSpore inference environment created in loadModel().
    void **labelEnv = reinterpret_cast<void **>(netEnv);
    if (labelEnv == nullptr) {
      MS_PRINT("MindSpore error, labelEnv is a nullptr.");
      return NULL;
    }
    MSNetWork *labelNet = static_cast<MSNetWork *>(*labelEnv);

    auto mSession = labelNet->session();
    if (mSession == nullptr) {
      MS_PRINT("MindSpore error, Session is a nullptr.");
      return NULL;
    }
    MS_PRINT("MindSpore get session.");

    auto msInputs = mSession->GetInputs();
    if (msInputs.size() == 0) {
      MS_PRINT("MindSpore error, msInputs.size() equals 0.");
      return NULL;
    }
    auto inTensor = msInputs.front();

    float *dataHWC = reinterpret_cast<float *>(lite_norm_mat_cut.data_ptr_);
    // Copy dataHWC to the model input tensor.
    memcpy(inTensor->MutableData(), dataHWC,
           inputDims.channel * inputDims.width * inputDims.height * sizeof(float));
    ```
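
    `BitmapToLiteMat` and `PreProcessImageData` are helpers in this demo's source that decode the camera frame and produce a normalized float `LiteMat`. As a self-contained illustration of the normalization step only, here is a minimal sketch (the per-channel mean/std constants are caller-supplied; any concrete values would be illustrative assumptions, not the demo's exact constants):

    ```cpp
    #include <cstddef>
    #include <vector>

    // Scale each 8-bit pixel to [0, 1], then normalize per channel.
    // Data is assumed to be in HWC layout, matching dataHWC above.
    std::vector<float> NormalizeHWC(const unsigned char *pixels,
                                    int height, int width, int channel,
                                    const float *mean, const float *std) {
      std::vector<float> out(static_cast<size_t>(height) * width * channel);
      for (size_t i = 0; i < out.size(); ++i) {
        int c = static_cast<int>(i % channel);  // channel index varies fastest in HWC
        float v = pixels[i] / 255.0f;
        out[i] = (v - mean[c]) / std[c];
      }
      return out;
    }
    ```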

3. Perform inference on the input tensor based on the model, obtain the output tensor, and perform post-processing.

    - Perform graph execution and on-device inference.

      ```cpp
      // After the model and image tensor data are loaded, run inference.
      auto status = mSession->RunGraph();
      ```
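
      `RunGraph` returns a status code. A minimal check, as a sketch (assuming `RET_OK` from MindSpore Lite's `errorcode.h`):

      ```cpp
      if (status != mindspore::lite::RET_OK) {
        MS_PRINT("MindSpore run net error.");
        return NULL;
      }
      ```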

    - Obtain the output data.

      ```cpp
      auto names = mSession->GetOutputTensorNames();
      std::unordered_map<std::string, mindspore::tensor::MSTensor *> msOutputs;
      for (const auto &name : names) {
        auto temp_dat = mSession->GetOutputByTensorName(name);
        msOutputs.insert(std::pair<std::string, mindspore::tensor::MSTensor *>{name, temp_dat});
      }
      std::string retStr = ProcessRunnetResult(RET_CATEGORY_SUM, labels_name_map, msOutputs);
      ```

    - Perform post-processing on the output data.

      ```cpp
      std::string ProcessRunnetResult(const int RET_CATEGORY_SUM, const char *const labels_name_map[],
                                      std::unordered_map<std::string, mindspore::tensor::MSTensor *> msOutputs) {
        // Get the branch of the model output.
        // Use an iterator to get the map elements.
        std::unordered_map<std::string, mindspore::tensor::MSTensor *>::iterator iter;
        iter = msOutputs.begin();

        // The mobilenetv2.ms model outputs just one branch.
        auto outputTensor = iter->second;

        int tensorNum = outputTensor->ElementsNum();
        MS_PRINT("Number of tensor elements:%d", tensorNum);

        // Get a pointer to the first score.
        float *temp_scores = static_cast<float *>(outputTensor->MutableData());
        float scores[RET_CATEGORY_SUM];
        for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
          scores[i] = temp_scores[i];
        }

        // Rescale each score with its per-class threshold so that 0.5 becomes
        // a unified decision boundary across categories.
        float unifiedThre = 0.5;
        float probMax = 1.0;
        for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
          float threshold = g_thres_map[i];
          float tmpProb = scores[i];
          if (tmpProb < threshold) {
            tmpProb = tmpProb / threshold * unifiedThre;
          } else {
            tmpProb = (tmpProb - threshold) / (probMax - threshold) * unifiedThre + unifiedThre;
          }
          scores[i] = tmpProb;
        }

        for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
          if (scores[i] > 0.5) {
            MS_PRINT("MindSpore scores[%d] : [%f]", i, scores[i]);
          }
        }

        // Concatenate the score for each category into the text information
        // displayed in the APP.
        std::string categoryScore = "";
        for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
          categoryScore += labels_name_map[i];
          categoryScore += ":";
          std::string score_str = std::to_string(scores[i]);
          categoryScore += score_str;
          categoryScore += ";";
        }
        return categoryScore;
      }
      ```
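
      To make the rescaling concrete, here is a standalone example with an illustrative threshold of 0.6 (not a value from `g_thres_map`): a raw score of 0.30 falls below the threshold and maps to 0.30 / 0.6 * 0.5 = 0.25, while 0.80 maps to (0.80 - 0.6) / (1.0 - 0.6) * 0.5 + 0.5 = 0.75.

      ```cpp
      #include <cstdio>

      // Piecewise rescaling used above: scores below the per-class threshold
      // map into [0, 0.5), scores at or above it map into [0.5, 1.0].
      float RescaleScore(float prob, float threshold) {
        const float unifiedThre = 0.5f;
        const float probMax = 1.0f;
        if (prob < threshold) {
          return prob / threshold * unifiedThre;
        }
        return (prob - threshold) / (probMax - threshold) * unifiedThre + unifiedThre;
      }

      int main() {
        const float threshold = 0.6f;  // illustrative, not from g_thres_map
        printf("%f\n", RescaleScore(0.30f, threshold));  // prints 0.250000
        printf("%f\n", RescaleScore(0.80f, threshold));  // prints 0.750000
        return 0;
      }
      ```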