Commit e205b47
docs/psoc6: Add docs for DEEPCRAFT-mpy integration.
Signed-off-by: NikhitaR-IFX <nikhita.rajasekhar@infineon.com>
1 parent 9715c95

8 files changed (179 additions & 1 deletion)

New file (167 additions & 0 deletions):
.. _psoc6_mpy_deepcraft_integration:

MicroPython - DEEPCRAFT™ Integration
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
DEEPCRAFT™ is Infineon's comprehensive Edge AI software and tools offering, designed to fast-track edge machine learning
application development.

DEEPCRAFT™ Studio is a development platform for AI on edge devices. It provides unique modeling capabilities
to build a custom edge AI model, or to bring your own model and optimize it for the edge or for specific hardware.
Its integration with MicroPython enables an easy end-to-end application development and deployment journey.

Follow along to build your first edge ML application with the MicroPython and DEEPCRAFT™ integration 🚀
Overview
========

Like any typical edge ML application workflow, this solution revolves around two key phases — training and inferencing
— with the training phase handled seamlessly in DEEPCRAFT™ Studio.

The diagrams below illustrate the detailed workflows for each of these phases, outlining the
tools and steps involved from data acquisition to model deployment.
.. image:: img/training_phase.png
   :width: 1000

In the training phase, the edge device runs a MicroPython application script that streams raw sensor data to a host machine. A capture server on the host
listens for this incoming data and saves it as timestamped files. This recorded dataset is then imported into DEEPCRAFT™ Studio,
where it is labeled, preprocessed, and used to train a machine learning model tailored to the application. With the MicroPython integration, the trained model
can be seamlessly converted into a runtime-loadable format and deployed to the device's filesystem. Since achieving optimal accuracy often requires iterative tuning,
this streamlined model conversion and deployment process significantly reduces friction in the development cycle.
.. image:: img/inferencing_phase.png
   :width: 1000

Once the model achieves satisfactory accuracy, it is ready for integration into the final application. In this phase, a MicroPython script stored on the edge device's filesystem
typically handles sensor data acquisition and feeds the data to the dynamically loaded model. The pre-trained model processes the input and outputs class probabilities
based on the learned patterns. Depending on the application, these outputs can be used to trigger actions via peripherals (e.g., LEDs, buzzers) or be
streamed to external interfaces such as a web dashboard for visualization.

With this development workflow in place, let's begin building a complete edge ML application step by step.
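The "class probabilities" step above can be sketched as follows. The label names, the stubbed raw scores, and the softmax post-processing are assumptions for illustration, not the actual DEEPCRAFT™ model API.

```python
import math

LABELS = ["background", "alarm", "glass_break"]   # hypothetical classes

def softmax(scores):
    """Convert raw model scores into class probabilities."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def on_inference(raw_scores, threshold=0.7):
    """Pick the most likely class and decide whether to act on it,
    e.g. toggle an LED, sound a buzzer, or push to a dashboard."""
    probs = softmax(raw_scores)
    best = max(range(len(probs)), key=probs.__getitem__)
    if probs[best] >= threshold:
        return LABELS[best], probs[best]
    return None, probs[best]

# Stubbed model output in place of a real inference call.
label, confidence = on_inference([0.2, 3.1, 0.4])
```

Thresholding the top probability, as done here, is one common way to avoid acting on low-confidence predictions.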
Pre-requisites
==============

Ensure the following are available and installed as needed:

1. `DEEPCRAFT™ Studio <https://softwaretools.infineon.com/tools/com.ifx.tb.tool.deepcraftstudio>`_
2. `Capture server cloned locally <https://bitbucket.org/imagimob/captureserver/src/master/>`_

Tested Boards
=============

- `CY8CKIT-062S2-AI <https://www.infineon.com/cms/en/product/evaluation-boards/cy8ckit-062s2-ai/>`_
1. Data acquisition
===================

Setup & Installation
--------------------

1. Flash the MicroPython firmware onto the device using the `mpy-psoc6.py utility <https://ifx-micropython.readthedocs.io/en/latest/psoc6/installation.html>`_.

2. Clone the repository to get the data acquisition scripts:

.. code-block:: bash

    git clone https://gitlab.intra.infineon.com/epe-sw/innovation/edgeml/imagimob-mpy-app.git

and open the data_acquisition.py script in a MicroPython-supported IDE such as Thonny.

3. Clone the capture server repository:

.. code-block:: bash

    git clone https://bitbucket.org/imagimob/captureserver/src/master/

and follow the instructions to set it up.
Steps
-----

While the capture server supports multiple data acquisition interfaces, this integration currently enables only
TCP, allowing wireless streaming of data from the edge device to the host machine.

1. To enable Wi-Fi-based data streaming from your edge device to the host, update the data_acquisition.py script
with your network credentials:

.. code-block:: python

    SSID = "your_wifi_name"
    PASSWORD = "your_wifi_password"
2. Navigate to the generic folder in the cloned capture server repository:

.. code-block:: bash

    cd examples/generic
3. To start the capture server using TCP, run the following command (update the parameters as needed):

.. code-block:: bash

    python generic_local_capture_interface.py \
        --output-dir "output directory location" \
        --protocol TCP \
        --ip-address "IP address" \
        --port 5000 \
        --data-format ".data or .wav based on sensor output" \
        --data-type "expected sensor data type" \
        --samples-per-packet "sensor values in a single output" \
        --features "no. of sensor features" \
        --sample-rate "sampling-rate" \
        --video-disabled

To learn about each parameter, please check the `documentation <https://bitbucket.org/imagimob/captureserver/src/master/>`_.
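To make the "timestamped files" idea concrete, here is a hypothetical sketch of how a capture session could be written to, and read back from, a .data file of raw little-endian floats. The naming scheme and binary layout are assumptions for illustration, not the capture server's actual format.

```python
import datetime
import pathlib
import struct
import tempfile

def save_capture(samples, out_dir, features=3):
    """Write one capture session to a file named after its start time."""
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    path = pathlib.Path(out_dir) / f"capture-{stamp}.data"
    with open(path, "wb") as f:
        for sample in samples:
            assert len(sample) == features
            f.write(struct.pack("<%df" % features, *sample))
    return path

def load_capture(path, features=3):
    """Read the file back into a list of per-sample feature lists."""
    raw = pathlib.Path(path).read_bytes()
    count = len(raw) // (features * 4)
    flat = struct.unpack("<%df" % (count * features), raw)
    return [list(flat[i * features:(i + 1) * features]) for i in range(count)]

out_dir = tempfile.mkdtemp()
path = save_capture([[0.0, 1.0, 9.8], [0.1, 1.1, 9.7]], out_dir)
restored = load_capture(path)
```

The `--data-format`, `--data-type`, and `--features` options above determine the real on-disk layout; this sketch merely shows why those parameters must match the sensor's output.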
4. Open DEEPCRAFT™ Studio and either create a new project or open an existing one. Navigate to the DATA tab and click the Add Data button. Select the output directory where the captured .wav or .data files and associated label files were saved.

.. image:: img/training_add_data.png
   :width: 800px

5. Upon selection, the studio will automatically detect and load the audio/data and label files into a new data session.

.. image:: img/training_data_view.png
   :width: 800px

6. Once the data session is created, your dataset is available inside the studio and ready for preprocessing, labeling, and model training.

.. image:: img/training_data_session.png
   :width: 800px
2. Model deployment
===================

Setup & Installation
--------------------

1. Clone the deepcraft-micropython-converter repository into the root of your DEEPCRAFT™ project:

.. code:: bash

    git clone https://github.com/Infineon/deepcraft-micropython-converter.git

2. To make the script executable directly from the DEEPCRAFT™ Studio environment:

- Go to the Tools tab and select ``Options``.
- Navigate to ``External Tools``.
- Click ``New Row`` to add a custom command to the file context menu.
- Set the following parameters:

  - File Filter: ``*.py``
  - Console: Cmd (keep)
  - Confirm Dialog: ✅ Checked
  - Command: python "path to deepcraft_micropython_converter script"

*where python is the interpreter and is expected to be installed in your environment*

.. image:: img/deployment_script_addition.png
   :width: 800px
Steps
-----

1. Once your model is ready to be deployed, simply right-click on the deepcraft_micropython_converter.py script and click ``Run``.
This will generate a deepcraft_model.mpy which you can directly drop onto your edge device.

Placeholder to add image!!

2. To drop the model onto your edge device, open a MicroPython-supported IDE such as Thonny, select the model file locally, then right-click and select ``Upload to \``
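Conceptually, once deepcraft_model.mpy is on the device's filesystem, the application script imports it like any other module. The following CPython analogy mimics that runtime loading with a plain .py stand-in; the ``predict`` entry point and its return value are assumptions for illustration, not the converter's actual interface.

```python
import importlib
import pathlib
import sys
import tempfile

# Write a stand-in "model" module to a temporary location, mimicking
# dropping deepcraft_model.mpy onto the device's filesystem.
model_dir = tempfile.mkdtemp()
stub = pathlib.Path(model_dir) / "deepcraft_model.py"
stub.write_text(
    "def predict(window):\n"
    "    # stand-in for the real model's inference entry point\n"
    "    return [0.1, 0.9]\n"
)

# On the device this is simply `import deepcraft_model`: MicroPython
# resolves the module from sys.path at runtime.
sys.path.insert(0, model_dir)
deepcraft_model = importlib.import_module("deepcraft_model")
probs = deepcraft_model.predict([0.0] * 128)
```

Because the model is resolved from the filesystem at import time, swapping in a retrained model is just a matter of replacing the file and resetting the device.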
Use cases
=========

Placeholder to add hackster/DEEPCRAFT™ newsletter links

Resources
=========

docs/psoc6/quickref.rst

Lines changed: 12 additions & 1 deletion

@@ -21,6 +21,16 @@ working with this port it may be useful to get an overview of the microcontroller

      installation.rst
      mpy-usage.rst

+  And if you are already familiar with MicroPython enablement for PSoC6 and want to try its integrations
+  in other application domains, check below:
+
+  .. toctree::
+      :maxdepth: 1
+      :includehidden:
+
+      integrations/deepcraft_integration.rst

   .. note::

       The PSoC6™ port is now a mature port and it is expected that any MicroPython built-in

@@ -1006,4 +1016,5 @@ The NeoPixel driver can be used as follows (see the :mod:`neopixel` for more details)

   .. note::
       The timing parameter can be used in the `NeoPixel()` constructor with timing tuples supported by the `machine.bitstream()` module. The timing parameter is optional and by default set to 1, which is the default timing [400, 850, 800, 450] for WS2812B LEDs at 800kHz.
       Use timing = 0 for WS2812B LEDs at 400kHz, i.e. [800, 1700, 1600, 900].
-      Use timing = [300, 900, 600, 600] for SK6812 LEDs.
+      Use timing = [300, 900, 600, 600] for SK6812 LEDs.