Training large datasets in NanoEdge AI Studio
Hi everyone,
I am trying to train a model on a large dataset in NanoEdge AI Studio: five classes, each around 500 MB in size.
However, I am running into limitations related to the data structure, memory usage, and the training workflow in NanoEdge AI Studio. I would like guidance on:
Recommended data structuring or segmentation methods for very large datasets
Best practices to handle or reduce large data sizes (windowing, feature extraction, downsampling, etc.)
Any tool limitations or configuration settings to be aware of
Possible alternative approaches if direct training with such large data is not supported
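For context, this is a minimal sketch of the kind of windowing plus downsampling preprocessing I could run offline before importing the data (the window size and decimation factor here are placeholders I picked for illustration, not values required by NanoEdge AI Studio):

```python
import numpy as np

def window_and_downsample(signal, window_size=256, decimate=4):
    """Split a 1-D signal into non-overlapping windows, then keep every
    `decimate`-th sample in each window to shrink the data volume."""
    n_windows = len(signal) // window_size
    # Drop the trailing samples that do not fill a complete window
    trimmed = signal[: n_windows * window_size]
    windows = trimmed.reshape(n_windows, window_size)
    # Simple decimation: keep every decimate-th sample per window
    return windows[:, ::decimate]

# Example: a synthetic single-axis sensor log of 10,000 samples
raw = np.sin(np.linspace(0, 100, 10_000))
segments = window_and_downsample(raw, window_size=256, decimate=4)
print(segments.shape)  # (39, 64)
```

Each output row could then be written as one line of a CSV for import, which I believe matches the one-signal-per-line format the Studio expects, though I would appreciate confirmation on that.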
If anyone has experience training on large datasets with NanoEdge AI Studio, your guidance or references would be greatly appreciated.
Thank you in advance for your support.
Best regards,
Vijepradeepan
