Hi MicroWebSrv people,

This bug seems to be specific to the Pyboard 'D', but I thought I'd report it here in case anyone else is seeing large files that appear to time out on MicroWebSrv. Basically, I noticed during testing that MicroWebSrv sometimes did not fully transfer large image files.
I tested directly with curl, as shown below:
```
curl -X GET http://192.168.110.143:8000/thumb.png > thumb.png
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 71268  100 71268    0     0   205k      0 --:--:-- --:--:-- --:--:--  205k
nherriot@Zenbook-UX32A ~ $ curl -X GET http://192.168.110.143:8000/thumb.png > thumb.png
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 71268  100 71268    0     0   220k      0 --:--:-- --:--:-- --:--:--  220k
nherriot@Zenbook-UX32A ~ $ curl -X GET http://192.168.110.143:8000/thumb.png > thumb.png
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
 10 71268   10  7403    0     0   2199      0  0:00:32  0:00:03  0:00:29  2199
curl: (18) transfer closed with 63865 bytes remaining to read
```
You can see where the last curl request fails.
The output from the microWebSrv on the Pyboard 'D' looks like:
```
Accepted 'client': <socket state=3 timeout=-1 incoming=0 off=0> and 'client address': ('192.168.110.147', 42094)
Processing request HTTP Method: GET Path: /thumb.png Version: HTTP/1.1
Server writing file: www//thumb.png of size: 71268 of type: image/png to host: ('192.168.110.147', 42094)
Last: 612 octets being sent
Accepted 'client': <socket state=3 timeout=-1 incoming=0 off=0> and 'client address': ('192.168.110.147', 42152)
Processing request HTTP Method: GET Path: /thumb.png Version: HTTP/1.1
Server writing file: www//thumb.png of size: 71268 of type: image/png to host: ('192.168.110.147', 42152)
Last: 612 octets being sent
Connected on IP: 192.168.110.143
Accepted 'client': <socket state=3 timeout=-1 incoming=0 off=0> and 'client address': ('192.168.110.147', 42178)
Processing request HTTP Method: GET Path: /thumb.png Version: HTTP/1.1
Server writing file: www//thumb.png of size: 71268 of type: image/png to host: ('192.168.110.147', 42178)
```
Again we can see that the last file never gets fully sent, and no except clause is ever triggered in the WriteResponseFile method. The code looks like:
```python
def WriteResponseFile(self, filepath, contentType=None, headers=None):
    """ A method to write a file to the client. It takes the path of the file, calculates its size
    and copies the file to the client in chunks of 1024 octets via the low-level socket interface.
    The method first builds the first line of the HTTP response and provides headers, content type,
    reason code and size of the file. It then sends the file in 1024-octet chunks to the client.
    If there is a failure in reading the file a WriteResponseNotFound is sent. If there is a
    failure in sending the file to the client a WriteResponseInternalServerError is sent.
    :param filepath: (e.g. www/style.css)
    :param contentType: (e.g. text/html)
    :param headers: (e.g. {'Cache-Control': 'max-age=315360000', 'Last-Modified': 'Fri, 1 Jan 2018 23:42:00 GMT'})
    :return: Boolean
    """
    try:
        size = stat(filepath)[6]
        print("Server writing file: {} of size: {} of type: {} to host: {}".format(filepath, size, contentType, self._client._addr))
        if size > 0:
            with open(filepath, 'rb') as file:  # open the file for reading in binary mode
                self._writeBeforeContent(200, headers, contentType, None, size)  # write our HTTP header
                try:
                    buf = bytearray(1024)
                    while size > 0:
                        x = file.readinto(buf)
                        if x < len(buf):
                            buf = memoryview(buf)[:x]
                            print("Last: {} octets being sent".format(x))
                        self._write(buf)  # call the low-level socket write function
                        size -= x
                    return True
                except:
                    self.WriteResponseInternalServerError()
                    return False
    except:
        pass
    self.WriteResponseNotFound()
    return False
```
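One theory that would fit these symptoms (just a guess on my part) is that the low-level socket write only pushes part of a 1024-octet chunk when the TCP send buffer fills up, and the remainder is silently dropped because the return value of the write is never checked. A minimal sketch of a send loop that guards against partial writes on a MicroPython stream socket could look like this (write_all is a hypothetical helper, not the actual MicroWebSrv code):

```python
# Hypothetical helper, not the actual MicroWebSrv code: send a whole buffer
# over a MicroPython stream socket, retrying while the socket reports that
# only part of the data (or none of it) was written.
def write_all(sock, data):
    mv = memoryview(data)
    total = len(mv)
    sent = 0
    while sent < total:
        n = sock.write(mv[sent:])  # may return a partial count, or None when a
        if n is None:              # non-blocking socket is not ready yet
            continue
        sent += n
    return sent
```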
The file I'm trying to transfer is here - it's not even that large!
I've taken the liberty of adding a lot of print statements and doc strings to the microWebSrv.py file to help me debug. I'll push the doc strings back here today.
If anyone wants to try this out, you can find instructions here.
I've also reported this to the MicroPython team here.
Things to note:

- I have threading set to True, but it only ever uses a single thread, so I have no idea why this would be an issue.
- I do use an IRQ timer that fires every 10 seconds and calls a callback to check WiFi status (a rough sketch of that setup follows this list).
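For reference, the timer setup is roughly of the following shape (a minimal sketch with an arbitrary timer number, not my exact code). Hard-IRQ callbacks on the pyboard can't allocate memory, so the callback only sets a flag and the actual WiFi check runs in the main loop:

```python
# Minimal sketch of the 10-second WiFi check (timer 4 is an arbitrary choice).
import pyb
import network

wlan = network.WLAN(network.STA_IF)
wifi_check_due = False

def _tick(timer):
    global wifi_check_due
    wifi_check_due = True          # IRQ context: just set a flag, nothing else

tim = pyb.Timer(4, freq=0.1)       # 0.1 Hz -> one tick every 10 seconds
tim.callback(_tick)

while True:                        # stand-in for wherever the server loop runs
    if wifi_check_due:
        wifi_check_due = False
        print("WiFi connected:", wlan.isconnected())
    pyb.delay(100)
```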
If anyone has clever ways to debug on the embedded board, please let me know. :-)
If anyone has an idea why this is happening, or has seen this behaviour before, please let me know! :-)
Kind regards, Nicholas.
Hello @nherriot and thank you for your issue 👍
I've just committed a fix in the write socket function (commit 6f36702).
Could you retry your code and tell me if the problem occurs again, please? :)
Thanks!
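For anyone retesting after the fix, a small client-side loop makes short transfers easy to spot without hand-running curl. This is just a sketch that reuses the address and file size from the report above; adjust them for your own setup:

```python
# Repeatedly download the image and report short or failed transfers.
# URL and EXPECTED are taken from the report above; adjust for your setup.
import requests

URL = "http://192.168.110.143:8000/thumb.png"
EXPECTED = 71268

for i in range(20):
    try:
        body = requests.get(URL, timeout=30).content
        ok = len(body) == EXPECTED
        print("attempt {}: {} bytes ({})".format(i + 1, len(body), "OK" if ok else "SHORT"))
    except requests.RequestException as exc:
        print("attempt {}: transfer failed ({})".format(i + 1, exc))
```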