Load the bitmap with Bitmap.Initialize2 and save it with Bitmap.WriteToStream, with the last parameter set to "JPEG".
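For example, a minimal sketch of that conversion (Dir and FileName are placeholders for wherever the photo was saved):

```b4x
' Load the photo (HEIC or otherwise) into a Bitmap, then re-save it as JPEG.
Dim bmp As Bitmap
bmp.Initialize2(File.OpenInput(Dir, FileName))
Dim out As OutputStream = File.OpenOutput(File.DirTemp, "photo.jpg", False)
bmp.WriteToStream(out, 90, "JPEG") ' quality 90, format "JPEG"
out.Close
```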
On looking closely at the code, it uses LLCamera to take a photo - which is returned as a bitmap - so it shouldn't be affected by "High Efficiency". Or should it?
I have been testing my app on AWS Device Farm using an iPhone X (iOS 12) and an iPhone 11 Pro Max (iOS 13.1.3).
It worked fine on both. I also tested a rigged version of the app that logs some details of the image produced; it indicated a proper bitmap was produced under both camera format settings on both phones.
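The logging in the rigged build was along these lines (a rough sketch - the Sub is illustrative, not the actual code):

```b4x
' Log basic details of the bitmap returned by the camera.
Sub LogBitmapDetails(bmp As Bitmap)
    If bmp.IsInitialized Then
        Log("Bitmap OK: " & bmp.Width & "x" & bmp.Height)
    Else
        Log("Bitmap was not initialized")
    End If
End Sub
```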
So I got in touch with my tester (actually my son-in-law, who works in IT and is no dope) and got him to test again - it still fails when the camera format is "High Efficiency" but works with "Most Compatible" (his iPhone is on iOS 13.3).
When the app fails, it generates a low-level log which is posted to an AWS S3 bucket - see the attachment for what my son-in-law produced.
The log indicates "The network connection was lost." - this occurs in a step after the image processing, where I upload the photo to an AWS S3 bucket.
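For anyone following along, the upload step is roughly like this (a simplified sketch - I'm assuming a pre-signed S3 PUT URL, so PresignedUrl is a placeholder, and a recent iHttpUtils2 with HttpJob.PutBytes):

```b4x
' Upload the JPEG bytes to S3 and log the outcome.
Sub UploadPhoto(Data() As Byte, PresignedUrl As String)
    Dim job As HttpJob
    job.Initialize("", Me)
    job.PutBytes(PresignedUrl, Data)
    job.GetRequest.SetContentType("image/jpeg")
    Wait For (job) JobDone(job As HttpJob)
    If job.Success Then
        Log("Upload OK")
    Else
        Log("Upload failed: " & job.ErrorMessage) ' e.g. "The network connection was lost."
    End If
    job.Release
End Sub
```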
I'm stumped!
I can test it with an iPhone 11.
I may have to take you up on this.
If you could give me an email address, I can send you a digital ticket which will give you access to the App Store version of the app.