MicroPython - DEEPCRAFT™ Integration
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

`DEEPCRAFT™ <https://www.infineon.com/cms/en/design-support/software/deepcraft-edge-ai-solutions/>`_ is Infineon's comprehensive Edge AI software and tools offering, designed to fast-track edge machine learning
application development.

`DEEPCRAFT™ Studio`_ is a development platform for AI on edge devices. It provides unique modeling capabilities
for building custom edge AI models for PSOC-based hardware. Its integration with MicroPython enables an easy end-to-end
application development and deployment journey.

Follow along to build your first edge Machine Learning (ML) application with MicroPython and DEEPCRAFT™ Studio. 🚀

Overview
=========
This solution follows a standard edge ML application workflow, consisting of two primary phases: training and inferencing.
The training phase is efficiently managed within DEEPCRAFT™ Studio.

The diagrams below provide a detailed overview of the workflows for both phases, highlighting the tools and steps involved,
from data acquisition to model deployment.
22 | 22 |
|
23 | 23 | .. image:: img/training_phase.png |
24 | 24 | :width: 1000 |
25 | 25 |
|
During the training phase, the edge device executes a MicroPython script that streams raw sensor data to a host machine. A capture server running on the host
receives this data and stores it as timestamped files. These files are then imported into DEEPCRAFT™ Studio, where they are labeled, preprocessed,
and used to train a machine learning model tailored to the specific application. With MicroPython integration, the trained model can be seamlessly converted
into a runtime-loadable format and deployed to the device's filesystem. This streamlined workflow, which supports iterative tuning for optimal accuracy,
greatly simplifies the model conversion and deployment process.
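
On the device side, the streaming step amounts to reading sensor samples and sending them over a socket in the binary layout
the capture server is configured for. A minimal sketch of the packing helper (the ``"<fff"`` layout assumes a 3-feature
float32 sensor such as an IMU — an illustrative choice, not a fixed requirement; match it to your sensor's data type and
feature count):

.. code-block:: python

    import struct

    def pack_samples(samples, fmt="<fff"):
        """Pack an iterable of sensor samples into one binary payload.

        Each sample is a tuple of per-feature values; fmt "<fff" means
        little-endian, three float32 values per sample (an assumption
        for a 3-axis sensor -- adjust to your configuration).
        """
        payload = bytearray()
        for sample in samples:
            payload += struct.pack(fmt, *sample)
        return bytes(payload)

    # On the device (MicroPython), the payload would then be pushed to the
    # capture server over a TCP socket, e.g. sock.send(pack_samples(batch))
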

.. image:: img/inferencing_phase.png
   :width: 1000

After achieving satisfactory accuracy, the model is ready for deployment in the final application. During this phase, a MicroPython script stored on the edge device's filesystem
manages sensor data acquisition and feeds it to the dynamically loaded model. The pre-trained model processes the input data and generates class probabilities based on the learned patterns.
These outputs can then be used to trigger actions through peripherals (e.g., LEDs, buzzers) or transmitted to external interfaces, such as a web dashboard, for visualization.
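
The class-probability output typically feeds a small decision step on the device. A minimal sketch (the label names and the
0.7 confidence threshold are illustrative assumptions, not part of the DEEPCRAFT™ API):

.. code-block:: python

    def classify(probabilities, labels, threshold=0.7):
        """Return the label of the most probable class, or None if no
        class clears the confidence threshold."""
        best = max(range(len(probabilities)), key=lambda i: probabilities[i])
        if probabilities[best] >= threshold:
            return labels[best]
        return None

    # Device-side usage (hypothetical model and LED objects):
    #   if classify(model.predict(window), ("background", "cry")) == "cry":
    #       led.on()
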

With this streamlined development workflow, let's proceed to build a complete edge ML application step by step.

Prerequisites
================

Ensure the following tools are installed on your system:

1. `DEEPCRAFT™ Studio`_
2. `Capture Server <https://bitbucket.org/imagimob/captureserver/src/master/>`_ cloned locally

Supported Boards
==================
- `CY8CKIT-062S2-AI <https://www.infineon.com/cms/en/product/evaluation-boards/cy8ckit-062s2-ai/>`_

1. Data Acquisition
======================

This part is covered in the `data acquisition repository <https://github.com/Infineon/deepcraft-micropython-data-acquisition>`_, which provides
instructions and examples for data acquisition using MicroPython.

2. Model Deployment
=====================

This part is covered in the `model converter repository <https://github.com/Infineon/deepcraft-micropython-converter>`_, which provides instructions
on how to convert a DEEPCRAFT™ model into a MicroPython-compatible format and deploy it on the PSOC board.

Example Projects
==================
- `Edge AI-based baby cry detector with Home Assistant integration <https://www.hackster.io/Infineon_Team/ai-baby-cry-detector-with-home-assistant-integration-05576f>`_

.. _DEEPCRAFT™ Studio: https://softwaretools.infineon.com/tools/com.ifx.tb.tool.deepcraftstudio