Category
Other
Hardware
Not Applicable
Firmware Version
2.3.7.30fbcab
Description
There is a stack buffer overflow when decompressing ATAK messages here:
firmware/src/modules/AtakPluginModule.cpp, lines 114, 119, 126 and 133 (commit 75dc8cc)
The function unishox2_decompress_simple does not take the size of the output buffer into account, always assuming it has enough space for the result. Therefore, if an ATAK message contains a compressed field that decompresses into more bytes than are reserved in a meshtastic_TAKPacket, then this will write out of the bounds of that uncompressed field, overwriting other values on the stack (potentially even a pushed return address).

This is actually the same issue as what triggered #3573, except there it was compression that triggered the overflow. The user who reported it sent a base64-encoded message near the maximum size. Due to the high entropy of the message, the unishox2 compression actually increased its length from 220 to 245 bytes, also causing an out-of-bounds write on the stack. Before #3606 was merged, it looks like decompression there was affected by received messages too.
The correct functions to call would be unishox2_decompress/unishox2_compress, making sure that UNISHOX_API_WITH_OUTPUT_LEN is defined as 1. (#3606 disabled the use of unishox2 for "app" messages, but I'm not sure whether not doing the same in the ATAK plugin was an oversight.)

This is a vulnerability that could be exploited by sending a malicious message to a Meshtastic node (the ATAK plugin does not need to be actively used; just sending a message of this type on a channel the node is on is enough). In the firmware I looked at (for the LilyGo T3-S3), this would not be directly exploitable to gain RCE due to the use of stack cookies, but I don't know if that's the case for all supported hardware. Even without knowing the stack cookie value, this can be used to repeatedly crash a node.
Relevant log output
No response