Visitor II
January 14, 2022
Question

How do I upload data to AWS S3 that does not fit in memory?


I need to upload large data files (around 200 MB) from my STM32L485 board to AWS S3 storage. I was planning to use a direct S3 upload via an HTTP request. However, my data cannot be loaded into RAM, so I was thinking of sending it in chunks. But I have seen that the HTTP library used in the examples, coreHTTP from the AWS C SDK, does not support streaming. Because of this, I'm not sure that uploading my data via HTTP is the best way to go.

I am new to these kinds of challenges, so any advice on how to solve this problem would be much appreciated!

Thanks in advance,

D.

    This topic has been closed for replies.

    1 reply

    ST Employee
    January 27, 2022

    Have you looked at the S3 multipart upload API? I am not sure whether it is compatible with the coreHTTP library, but it could relax the memory constraint if you have to stick to HTTP: each part is uploaded as its own request, so only one part ever needs to be staged at a time.

    If your application is connected to AWS IoT Core, another option could be:

    1. stream file chunks over MQTT (check the max message length of your MQTT client library);
    2. connect an AWS IoT Rule to push them to S3;
    3. have a Lambda function assemble the chunks as they arrive.

    There are also some threads on this topic on the freertos.org forums.