I’m downloading a raw byte array to be used as a Texture2D, using a technique found on Stack Overflow.
UnityWebRequest www = UnityWebRequest.Get("myurl");
DownloadHandler handle = www.downloadHandler;
yield return www.Send();
if (www.isNetworkError)
{
    UnityEngine.Debug.Log("Error while Receiving: " + www.error);
}
else
{
    UnityEngine.Debug.Log("Success");
    // Placeholder size; LoadImage is supposed to resize the texture to match the image.
    Texture2D texture2d = new Texture2D(8, 8);
    texture2d.LoadImage(handle.data);
}
LoadImage, however, can’t parse the byte array and gives me the ‘red question mark’ fallback texture.
So the issue was that LoadImage couldn’t parse the image’s header. The function starts reading the bytes as image data, but first encounters the header and doesn’t know what to do with it.
In this case, the image was hosted on an S3 bucket, which assigned a header to it by default. Removing that header on S3 solved the issue.
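A quick way to confirm this kind of problem is to log the first few bytes Unity actually received and compare them against the normal image signatures (a PNG starts with 0x89 0x50 0x4E 0x47, a JPEG with 0xFF 0xD8). A rough sketch, reusing the www request from the snippet above:

byte[] data = www.downloadHandler.data;
// Hex dump of the first few bytes, e.g. "89-50-4E-47-0D-0A-1A-0A" for a PNG.
string preview = System.BitConverter.ToString(data, 0, System.Math.Min(8, data.Length));
UnityEngine.Debug.Log("First bytes: " + preview + " (total " + data.Length + " bytes)");

// If this is false, something is sitting in front of the actual image data.
bool looksLikePng = data.Length >= 4 && data[0] == 0x89 && data[1] == 0x50 && data[2] == 0x4E && data[3] == 0x47;
UnityEngine.Debug.Log("Looks like a PNG: " + looksLikePng);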
I know I am reviving a necro-thread, but I am having a similar issue. I have a PNG on S3 that I am accessing via an API endpoint, which retrieves the raw binary through a Node.js call to the AWS SDK’s getObject(). What is returned also fails to parse: I get a byte array on the Unity side fine, but the LoadImage(byteArray) call returns false. I am a bit stuck; I’m not sure how to remove a header, or even check whether one is there.
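One way to at least see what the endpoint is sending, before trying to strip anything, is to compare the response headers against the byte count Unity actually received. A rough sketch, where uwr stands for the completed UnityWebRequest and the header names are just the standard HTTP ones:

// What the server claims to be sending.
UnityEngine.Debug.Log("Content-Type: " + uwr.GetResponseHeader("Content-Type"));
UnityEngine.Debug.Log("Content-Length: " + uwr.GetResponseHeader("Content-Length"));

// What actually arrived. If this is much larger than Content-Length (or than the
// object size shown in S3), the body is being wrapped or re-encoded somewhere.
UnityEngine.Debug.Log("Bytes received: " + uwr.downloadHandler.data.Length);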
*
EDIT: I solved my issue, but I have no idea why I had to do it this way. I will share here for others in case they have a similar issue, and maybe those more knowledgeable than me can point out why I ran into this and potentially suggest better solutions.
*
For background, I am using Node.js Express to expose API endpoints that allow image access. Within this Node code I use the AWS SDK’s getObject() function to fetch the PNG and send it back like so (bucket junk in the params):
*
s3.getObject(params, function(err, data) {
    // data.Body holds the raw PNG bytes fetched from S3.
    res.write(Buffer.from(data.Body));
    res.end(null);
});
*
Theoretically, the Buffer.from(data.Body) here should give me the byte array I need for LoadImage() in Unity, and sure enough, if I check the length in Node for my test image, it is exactly the number of bytes I expect, so it seems to be correct. When I actually hit the endpoint from Unity, however, it comes out as a much longer byte array that won’t parse. On the Unity side I was using the following on a successful request (where uwr is the UnityWebRequest that succeeded):
*
byte[] binaryImageWeb = uwr.downloadHandler.data;
*
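As an aside, one quick way to see where the extra bytes come from is to log the received length next to a short preview of the body read as text; if the payload is being wrapped or re-encoded on the way out of Node, that usually shows up immediately. A rough sketch with the same uwr:

byte[] raw = uwr.downloadHandler.data;
UnityEngine.Debug.Log("Received " + raw.Length + " bytes");

// Peek at the start of the payload as text: a raw PNG shows up as binary junk
// beginning with "PNG", while a JSON- or string-wrapped body reads cleanly.
string text = uwr.downloadHandler.text;
UnityEngine.Debug.Log(text.Substring(0, System.Math.Min(64, text.Length)));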
My workaround (which I don’t love, because I don’t understand why it is needed, but it works) is to more explicitly return the byte array as a stringified JSON array, and then process it on the Unity side by deserializing the text response rather than using the provided .data byte array. The new Node code is as follows:
*
s3.getObject(params, function(err, data) {
    // Buffer#toJSON() returns { type: 'Buffer', data: [ ...byte values ] },
    // so .data is a plain array of numbers that stringifies cleanly.
    res.write(JSON.stringify(Buffer.from(data.Body).toJSON().data));
    res.end(null);
});
*
And the Unity code changes to this:
*
byte[] binaryImage = JsonConvert.DeserializeObject<byte[]>(uwr.downloadHandler.text);
*
Using this approach I get a byte array of the expected length, and it successfully parses into a Texture2D.
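For what it’s worth, if the endpoint can be changed to send back the raw PNG bytes directly (the original intent), Unity’s built-in texture download handler skips both the manual LoadImage call and the JSON round-trip. It won’t help while the body is still wrapped, but once the raw file is served it is a bit less code. A rough sketch, with the URL and class name standing in for whatever your real endpoint and script are:

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class TextureFetcher : MonoBehaviour
{
    IEnumerator Start()
    {
        // Placeholder URL; swap in the real API endpoint.
        UnityWebRequest uwr = UnityWebRequestTexture.GetTexture("https://example.com/api/image.png");
        yield return uwr.SendWebRequest();

        if (uwr.isNetworkError || uwr.isHttpError)
        {
            Debug.Log("Error while receiving: " + uwr.error);
        }
        else
        {
            // DownloadHandlerTexture decodes the PNG/JPEG into a Texture2D for us.
            Texture2D texture = DownloadHandlerTexture.GetContent(uwr);
            Debug.Log("Got texture " + texture.width + "x" + texture.height);
        }
    }
}

DownloadHandlerTexture decodes from the bytes themselves, so it will fail in exactly the same way as LoadImage if anything extra is prepended to the image data.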