Are there options for more realistic down-sampling of textures?

I am currently working on an application that simulates barcodes in 3D space. I take full-resolution camera captures and send the images to a program that decodes the barcodes. One issue I am having is that when I move the camera too far away from the barcode texture, the texture starts to morph in an unrealistic way. Instead of blurring more (the way a real camera would capture a far-away barcode), the texture becomes distorted in a way that makes the code unreadable. I am trying to figure out how to make a more “true to life” visual simulation of a far-away barcode that more accurately portrays what a camera would really see. Performance is NOT a concern, and I know a lot of Unity’s antialiasing and downsampling work trades realism for performance. What are my options here?
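
For context, here is a rough sketch of the capture step I described, assuming the simulated camera renders into a RenderTexture (the class name, method, and PNG hand-off to the decoder are just illustrative, not my exact code):

```csharp
using System.IO;
using UnityEngine;

// Illustrative sketch of the capture step: render the simulated camera
// into a RenderTexture at full resolution, read the pixels back to the
// CPU, and write a PNG for the external barcode decoder to consume.
public class FrameCapture : MonoBehaviour
{
    [SerializeField] private Camera simCamera; // the camera viewing the barcode

    public void CaptureToPng(string path, int width, int height)
    {
        var rt = new RenderTexture(width, height, 24);
        simCamera.targetTexture = rt;
        simCamera.Render();

        // Read the rendered frame back from the GPU.
        RenderTexture.active = rt;
        var frame = new Texture2D(width, height, TextureFormat.RGB24, false);
        frame.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        frame.Apply();

        // Clean up and hand the image off as a PNG.
        RenderTexture.active = null;
        simCamera.targetTexture = null;
        rt.Release();
        File.WriteAllBytes(path, frame.EncodeToPNG());
        Destroy(frame);
    }
}
```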

Sorry, but I have no idea what you mean by “morphing in unrealistic ways”. The sampling is done by your GPU, and how good the result is depends on whether you use mipmapping and on your anisotropic filtering level. Do you have an example of the “morphing” you’re talking about? The actual source texture and a screenshot of the “morphed” view would help.
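
For what it’s worth, here is a minimal sketch of the settings being pointed at, assuming the barcode lives on a standard Texture2D. Note that the mipmaps themselves must be generated at import time (“Generate Mip Maps” in the texture’s import settings); only the sampling settings below can be changed at runtime:

```csharp
using UnityEngine;

// Minimal sketch: configure GPU sampling so a distant barcode blurs
// smoothly instead of aliasing. Assumes "Generate Mip Maps" is enabled
// in the texture's import settings.
public class BarcodeSamplingSetup : MonoBehaviour
{
    [SerializeField] private Texture2D barcodeTexture; // assign in the Inspector

    void Start()
    {
        // Force anisotropic filtering on for all textures, overriding
        // per-texture import settings.
        QualitySettings.anisotropicFiltering = AnisotropicFiltering.ForceEnable;

        // Trilinear filtering blends between neighboring mip levels,
        // avoiding hard transitions as the camera pulls away.
        barcodeTexture.filterMode = FilterMode.Trilinear;

        // Maximum anisotropy (1-16) sharpens sampling at glancing angles.
        barcodeTexture.anisoLevel = 16;
    }
}
```

Without mipmaps, the GPU samples the full-resolution texture directly, so a distant barcode aliases into exactly the kind of unreadable pattern described in the question.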