A couple of thoughts to try to explain the memory usage you are seeing.
Firstly, as stated, your current jpgs are not 'power-of-two' (POT), which means internally Unity will create two versions, both POT in size. One simply copies the source image in at 1:1, leaving empty space where the source image ends. The other rescales to the nearest POT, so the image is resampled and detail is lost. At least I think it downsamples; I've never really looked into it and can see arguments for both methods: downsample to use less memory, upsample to keep detail, though either way it is still going to be slightly blurred.
So your 1280x1656 image becomes 2048x2048 and 1024x1024, which at 24-bit are 12MB and 3MB respectively. That's 15MB in total, or 20MB if Unity stores them as 32-bit.
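As a quick sanity check on that arithmetic, here's a sketch you could run in the editor (the class and method names are mine; Mathf.NextPowerOfTwo is the real Unity call):

```csharp
using UnityEngine;

public static class PotMemoryCheck
{
    // Rough uncompressed size of the two POT copies Unity may create for a
    // w x h source, at a given bytes-per-pixel (3 = 24-bit RGB, 4 = 32-bit RGBA).
    public static void Estimate(int w, int h, int bytesPerPixel)
    {
        int pot = Mathf.NextPowerOfTwo(Mathf.Max(w, h));         // 1280x1656 -> 2048
        long up   = (long)pot * pot * bytesPerPixel;             // 2048*2048*3 ~ 12MB
        long down = (long)(pot / 2) * (pot / 2) * bytesPerPixel; // 1024*1024*3 ~ 3MB
        Debug.Log(((up + down) / (1024f * 1024f)) + " MB total");
    }
}
```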
Another copy may be kept if the default for downloading an image via WWW is to mark the texture as 'readable', meaning Unity keeps a system-memory copy alongside the one on the gpu. Depending on what Unity keeps in the case of a non-POT jpg, that could be an additional 6MB (the decoded 1280x1656 image at 24-bit) up to 20MB (both POT versions at 32-bit).
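For what it's worth, the legacy WWW API exposes both flavours; www.texture hands back a readable texture, while www.textureNonReadable (if your Unity version has it) should skip the cpu-side copy:

```csharp
using System.Collections;
using UnityEngine;

public class TextureFetch : MonoBehaviour
{
    IEnumerator Fetch(string url)
    {
        WWW www = new WWW(url);
        yield return www;

        // Readable: Unity keeps a system-memory copy alongside the gpu one.
        Texture2D readable = www.texture;

        // Non-readable: should avoid keeping the cpu-side copy.
        Texture2D slim = www.textureNonReadable;

        GetComponent<Renderer>().material.mainTexture = slim;
    }
}
```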
So even at the lowest estimate, your ~6MB of decoded image data could be taking up as much as 21MB, or 40MB in the worst case. Though just because the worst case here matches what you are seeing, I don't think we can assume it's correct. There are likely to be other factors that increase memory usage beyond what I've mentioned here, for example mipmaps if Unity keeps them in cpu memory, or the WWW download being cached in case you ask for it again.
It should then be possible to reduce the memory requirements, though it will take some effort.
Resize all jpgs to 2048x2048, but do not rescale the image; simply copy it into the 2048x2048 canvas. You'll have a border around the image which you'll have to hide through a material, a shader, or UV mapping on a plane, for example (see the sketch below). This avoids Unity making two copies, so in theory it saves the memory of the 1024x1024 version.
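A minimal sketch of the material approach, assuming the 1280x1656 image was copied into the top-left corner of the 2048x2048 canvas (mainTextureScale/mainTextureOffset are standard Material properties; the numbers are just this example):

```csharp
using UnityEngine;

public class CropPadding : MonoBehaviour
{
    void Start()
    {
        const float srcW = 1280f, srcH = 1656f, canvas = 2048f;
        Material mat = GetComponent<Renderer>().material;

        // Only sample the region the real image occupies...
        mat.mainTextureScale = new Vector2(srcW / canvas, srcH / canvas);

        // ...and shift up to skip the empty rows, since Unity's UV origin is
        // bottom-left but the image was pasted at the top of the canvas.
        mat.mainTextureOffset = new Vector2(0f, 1f - srcH / canvas);
    }
}
```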
Create the texture without mipmaps; note that just setting point sampling on an existing texture won't strip a mip chain that has already been allocated. Though I'm unsure whether that will really save cpu memory, since I can't think of a reason for Unity to keep mipmaps around once they've been uploaded to the gpu. (This is folded into the sketch after the next point.)
Try using Apply(false, true) to force Unity to ditch the cpu-side texture data once it is uploaded to the gpu (you may have to get/set a pixel first so there is something to apply).
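Putting the last few points together, a sketch under those assumptions (pre-padded 2048x2048 jpgs, no mipmaps, and releasing the cpu copy after upload; whether Unity actually frees everything here is worth verifying in the profiler):

```csharp
using System.Collections;
using UnityEngine;

public class LoadAndRelease : MonoBehaviour
{
    IEnumerator Load(string url)
    {
        WWW www = new WWW(url);
        yield return www;

        // Create the target without mipmaps (the final 'false'), which
        // LoadImageIntoTexture should respect when it decodes the jpg.
        Texture2D tex = new Texture2D(2048, 2048, TextureFormat.RGB24, false);
        www.LoadImageIntoTexture(tex);

        // Upload to the gpu and mark the texture no-longer-readable so the
        // system-memory copy can be ditched: Apply(updateMipmaps, makeNoLongerReadable).
        tex.Apply(false, true);

        GetComponent<Renderer>().material.mainTexture = tex;
    }
}
```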