If I want to take a picture with the camera and later show it in an ImageView that covers the screen 100% horizontally and, let's say, 80% vertically, should I set the camera picture size to the same size as my view, or let the image adapt by using gravity?
But in that case, wouldn't the resulting image be distorted?
Setting the camera output size according to the screen dimensions looks complicated, if not impossible, to me.
How do I cope with different screen sizes?
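Something like the following sketch is what I have in mind for the "match the camera size to the view" option. It uses the old android.hardware.Camera API, and the helper name and the targetRatio parameter are just illustrative: it picks the supported picture size whose aspect ratio is closest to the view's, rather than trying to match exact pixel dimensions.

```java
import android.hardware.Camera;
import java.util.List;

public class CameraSizeHelper {

    // Returns the supported picture size whose aspect ratio is closest to
    // targetRatio (e.g. viewWidth / viewHeight). Returns null if the list is empty.
    public static Camera.Size closestPictureSize(Camera.Parameters params, double targetRatio) {
        List<Camera.Size> sizes = params.getSupportedPictureSizes();
        Camera.Size best = null;
        double bestDiff = Double.MAX_VALUE;
        for (Camera.Size size : sizes) {
            double ratio = (double) size.width / size.height;
            double diff = Math.abs(ratio - targetRatio);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = size;
            }
        }
        return best;
    }
}
```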
In another scenario, I have a very big image, let's say its dimensions are 3872*2592. My thought is to use loadbitmapsample to resample the image and save memory, giving as width the same size as the ImageView, and a height calculated proportionally so the aspect ratio is maintained, with this formula:
originalHeight / originalWidth * imageViewWidth = newHeight
Is that correct?
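For reference, this is roughly the pattern I mean by loadbitmapsample: a minimal sketch based on the BitmapFactory inJustDecodeBounds / inSampleSize approach, where the method and parameter names are just illustrative and reqWidth is assumed to be the ImageView width.

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class BitmapLoader {

    // Decodes a subsampled bitmap whose width is close to reqWidth, keeping the
    // original aspect ratio for the height.
    public static Bitmap decodeSampledBitmap(String path, int reqWidth) {
        // First pass: decode only the bounds to learn the original dimensions.
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;
        BitmapFactory.decodeFile(path, options);

        // Maintain the aspect ratio: newHeight = originalHeight / originalWidth * reqWidth
        int reqHeight = (int) ((double) options.outHeight / options.outWidth * reqWidth);

        // Pick the largest power-of-two sample size that still keeps both decoded
        // dimensions at least as large as the requested ones.
        int inSampleSize = 1;
        while (options.outWidth / (inSampleSize * 2) >= reqWidth
                && options.outHeight / (inSampleSize * 2) >= reqHeight) {
            inSampleSize *= 2;
        }

        // Second pass: decode the subsampled bitmap.
        options.inJustDecodeBounds = false;
        options.inSampleSize = inSampleSize;
        return BitmapFactory.decodeFile(path, options);
    }
}
```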
Thanks