I am using the Android Camera2 API to take photos in an app, and I can’t get the flash to work properly on most Samsung Galaxy devices (Galaxy S6 Edge, Galaxy S7, Galaxy J7). I believe my flash logic is implemented correctly, because Google’s Camera app that used to be on the Play Store exhibits the same behavior. The Galaxy S8 does work fairly well with flash, although the results definitely have inconsistent lighting.
The issue on the Galaxy J7 is that when I take a picture with flash (either with flash locked on, or with auto flash in a scene that requires it), the flash stays on for a long time, the preview locks up, and after maybe 7 seconds the image is finally captured, without the flash lighting the shot.
I have the following method handling flash modes:
private void setAutoFlash(CaptureRequest.Builder requestBuilder)
This flash logic is called every time a CaptureRequest.Builder is needed. The three commented-out lines are a suggestion I read that was supposed to help the flash work properly, but they don’t seem to do anything for me.
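For readers who don’t have the sample handy: in Google’s Camera2Basic example, which my code is based on, this method simply requests auto-flash when the device reports a flash unit (mFlashSupported is the sample’s flag derived from FLASH_INFO_AVAILABLE). The commented-out lines mentioned above are omitted here since I can’t reproduce them verbatim.

```java
// From Google's Camera2Basic sample: enable auto-flash only when the
// device actually has a flash unit (FLASH_INFO_AVAILABLE was true).
private void setAutoFlash(CaptureRequest.Builder requestBuilder) {
    if (mFlashSupported) {
        requestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
                CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
    }
}
```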
Samsung has their own camera API (http://developer.samsung.com/galaxy/camera), but I read that it is just a wrapper over Camera2, so I am worried that their API won’t fix my issue either.
- I have tried some camera apps from the play store, some work and others don’t.
- ZCamera works fine with flash, which made me think it uses Samsung’s camera API, but then I noticed that ZCamera’s touch metering doesn’t work on Samsung devices, which is another issue I came across while debugging my Camera2 implementation.
- Flash works fine if I stick with the deprecated Camera API.
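For comparison, this is roughly what the working path with the deprecated android.hardware.Camera API looks like; there flash is a single parameter with no trigger sequencing (a minimal sketch, not my full capture code):

```java
// Deprecated android.hardware.Camera API: flash is just a parameter,
// so the driver handles the precapture/metering sequence internally.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
params.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
camera.setParameters(params);
camera.startPreview();
camera.takePicture(null, null, jpegCallback); // flash fires correctly here
```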
Any help on how to get a working flash would be greatly appreciated.
After working at this on and off for a while, I realized a few things. I mentioned that ZCamera (from the Play Store) works with flash, and I thought they accomplished this by using the Samsung SDK. I inspected the app, and it does not use the Samsung SDK.
I also incorporated the Samsung SDK into my app, and that didn’t change anything. The Samsung SDK really is just a wrapper around Google’s Camera2 that lets you add some Samsung-specific features; adding it to your project won’t fix any Samsung compatibility issues.
What I finally realized was that the touch-metering flow I had written myself (touch to focus, then take a photo) worked very differently from the logic that runs when a photo is taken without touch to focus. The regular photo logic was borrowed from Google’s Camera2 API sample code, and it wasn’t working properly.
The trick to getting the flash to fire on Samsung devices (or at least what worked for me) was to first trigger the AE precapture sequence, and only once it converges, start the auto-focus trigger. If flash is enabled, this fires the flash to meter AE levels and to focus, and then fires it once more to take the photo.
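The reordered sequence can be sketched as below. Field names like mPreviewRequestBuilder, mCaptureSession, mCaptureCallback, and mState are assumptions borrowed from the Camera2Basic sample (which runs AF first, then precapture); the point here is only the swapped order, not a drop-in implementation.

```java
// Step 1: trigger AE precapture FIRST, so the flash fires for metering.
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
        CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
mState = STATE_WAITING_PRECAPTURE;
mCaptureSession.capture(mPreviewRequestBuilder.build(),
        mCaptureCallback, mBackgroundHandler);

// Step 2: in the CaptureCallback, wait until CONTROL_AE_STATE reports
// AE_STATE_CONVERGED (or AE_STATE_FLASH_REQUIRED), THEN start auto-focus.
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
        CaptureRequest.CONTROL_AF_TRIGGER_START);
mState = STATE_WAITING_LOCK;
mCaptureSession.capture(mPreviewRequestBuilder.build(),
        mCaptureCallback, mBackgroundHandler);

// Step 3: once CONTROL_AF_STATE reaches AF_STATE_FOCUSED_LOCKED (or
// AF_STATE_NOT_FOCUSED_LOCKED), issue the still-capture request; the
// flash fires again for the actual shot.
```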