Java Question: Need to understand some Java code

XverhelstX

Well-Known Member
Licensed User
Longtime User
Hey everyone,

I don't know if this is the right place to post this topic, as it's partially related to Basic4Android, Java, and library development.

So I found an interesting piece of code here: CameraView.java - camdroiduni - stream from Android camera to pc - Google Project Hosting

And it's about live streaming from your device's camera to your browser.

But I want to know how the code actually works.

These are the things I want to understand:

1) How do they stream to the web browser? I understand that they send an index.html file to the IP address of the device (on Wi-Fi), and that this file reloads the page every second. But how do they send the index.html file to the desired IP address with sockets?

2) They mention they are using video, but I am still convinced they take pictures and send them, as I don't see the MediaRecorder anywhere.

Now my question is how they keep sending AND saving those images to the SD folder (I think). I believe it's done with this code, but how does it work? With c.takePicture(), it takes long to save and start previewing again, so that's no option for a live stream.

B4X:
public synchronized byte[] getPicture()
    {
        try
        {
            // Block until the camera preview is running.
            while (!isPreviewOn) wait();
            isDecoding = true;
            // Ask the camera to hand the next preview frame to onPreviewFrame().
            mCamera.setOneShotPreviewCallback(this);
            // Block until onPreviewFrame() has finished encoding the frame.
            while (isDecoding) wait();
        }
        catch (Exception e)
        {
            return null;
        }
        return mCurrentFrame;
    }
    
    private LayoutParams calcResolution(int origWidth, int origHeight,
                                        int aimWidth, int aimHeight)
    {
        double origRatio = (double) origWidth / (double) origHeight;
        double aimRatio = (double) aimWidth / (double) aimHeight;

        if (aimRatio > origRatio)
            return new LayoutParams(origWidth, (int) (origWidth / aimRatio));
        else
            return new LayoutParams((int) (origHeight * aimRatio), origHeight);
    }
    
    // Despite its name, raw2jpg() does not produce a JPEG: it converts the
    // NV21 (YUV420 semi-planar) preview frame in 'raw' to 32-bit ARGB pixels
    // in 'rgb'. The JPEG encoding happens later, in onPreviewFrame().
    private void raw2jpg(int[] rgb, byte[] raw, int width, int height)
    {
        final int frameSize = width * height;

        for (int j = 0, yp = 0; j < height; j++)
        {
            // In NV21 the Y plane (frameSize bytes) is followed by interleaved
            // V/U bytes, one pair shared by every 2x2 block of pixels.
            int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++)
            {
                int y = 0;
                if (yp < raw.length)
                {
                    y = (0xff & ((int) raw[yp])) - 16;
                }
                if (y < 0) y = 0;
                if ((i & 1) == 0)
                {
                    if (uvp < raw.length)
                    {
                        v = (0xff & raw[uvp++]) - 128;
                        u = (0xff & raw[uvp++]) - 128;
                    }
                }

                // Fixed-point YUV -> RGB conversion (1192 is roughly 1.164 * 1024).
                int y1192 = 1192 * y;
                int r = (y1192 + 1634 * v);
                int g = (y1192 - 833 * v - 400 * u);
                int b = (y1192 + 2066 * u);

                // Clamp to the 18-bit fixed-point range before shifting down.
                if (r < 0) r = 0; else if (r > 262143) r = 262143;
                if (g < 0) g = 0; else if (g > 262143) g = 262143;
                if (b < 0) b = 0; else if (b > 262143) b = 262143;

                rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000)
                        | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
            }
        }
    }
    
    
    @Override
    public synchronized void onPreviewFrame(byte[] data, Camera camera)
    {
        int width = mSettings.PictureW();
        int height = mSettings.PictureH();

        // // API 8 and above
        // YuvImage yuvi = new YuvImage(data, ImageFormat.NV21, width, height, null);
        // Rect rect = new Rect(0, 0, yuvi.getWidth(), yuvi.getHeight());
        // OutputStream out = new ByteArrayOutputStream();
        // yuvi.compressToJpeg(rect, 10, out);
        // byte[] ref = ((ByteArrayOutputStream) out).toByteArray();

        // API 7: convert the NV21 frame to ARGB ourselves, then JPEG-encode it.
        int[] temp = new int[width * height];
        OutputStream out = new ByteArrayOutputStream();
        Bitmap bm = null;

        raw2jpg(temp, data, width, height);
        bm = Bitmap.createBitmap(temp, width, height, Bitmap.Config.RGB_565);
        bm.compress(CompressFormat.JPEG, mSettings.PictureQ(), out);
        mCurrentFrame = ((ByteArrayOutputStream) out).toByteArray();

        // Wake the wait() in getPicture(). Both methods are synchronized on
        // this object, so notify() targets the monitor getPicture() waits on.
        isDecoding = false;
        notify();
    }
}

It would also be great if someone could help me implement this in a library.

I really hope someone can explain these things as well as possible. That would be much appreciated.

Thanks!
XverhelstX
 
Last edited:

agraham

Expert
Licensed User
Longtime User
1) I assume the browser asks for it from the server just like a normal page.
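
(For illustration, here is a minimal sketch, not the camdroid WebServer, of how a socket-based HTTP server answers a browser. The server never pushes index.html to a configured address; it listens on a port and writes bytes back down whichever browser connection arrives at http://<device-ip>:8080/. The class name and the hard-coded page are invented for the example.)

B4X:
import java.io.*;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal sketch of a socket-based HTTP server (hypothetical, for
// illustration only; the real camdroid WebServer is more involved).
public class TinyHttpServer {
    public static void main(String[] args) throws IOException {
        ServerSocket server = new ServerSocket(8080); // the :8080 in the URL
        while (true) {
            Socket client = server.accept(); // blocks until a browser connects
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream()));
            String requestLine = in.readLine(); // e.g. "GET /index.html HTTP/1.1"
            System.out.println("Request: " + requestLine);
            // Reply with the page; the browser's own connection tells us where
            // to send it, so no destination IP is ever configured here.
            byte[] body = "<html><body>hello</body></html>".getBytes("UTF-8");
            PrintStream out = new PrintStream(client.getOutputStream());
            out.print("HTTP/1.0 200 OK\r\n");
            out.print("Content-Type: text/html\r\n");
            out.print("Content-Length: " + body.length + "\r\n\r\n");
            out.write(body); // the "file" is just bytes written to the socket
            out.flush();
            client.close();
        }
    }
}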

2) The code repeatedly takes a snapshot from the camera preview using setOneShotPreviewCallback() to call onPreviewFrame(). The frame is delivered in YUV format, so raw2jpg() converts it into 32-bit ARGB for the JPEG encoder. NV21 is a YUV planar format, as described here.
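
(The commented-out "API 8 and above" block in onPreviewFrame() above hints at a simpler route: from API 8 the platform's YuvImage class does the NV21-to-JPEG conversion itself. A sketch of that block tidied into a standalone helper; the class and method names are mine:)

B4X:
import java.io.ByteArrayOutputStream;

import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;

public class Nv21ToJpeg {
    // API 8+: let YuvImage do the NV21 -> JPEG work that raw2jpg() plus
    // Bitmap.compress() do by hand on API 7.
    public static byte[] toJpeg(byte[] nv21, int width, int height, int quality) {
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, width, height), quality, out);
        return out.toByteArray();
    }
}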
 

XverhelstX

Well-Known Member
Licensed User
Longtime User
Thank you very much, agraham. You really helped me a lot, although I still have some questions:

1) I assume the browser asks for it from the server just like a normal page.

It is sent to the LAN IP address, for example http://192.168.1.100:8080.
My question is how they do it (with web sockets?) and why they use a port in it?

2) The code repeatedly takes a snapshot from the camera preview using setOneShotPreviewCallback() to call onPreviewFrame(). The frame is delivered in YUV format, so raw2jpg() converts it into 32-bit ARGB for the JPEG encoder. NV21 is a YUV planar format, as described here.

1) So is getPicture needed in the code, as setOneShotPreviewCallback() takes the picture from onPreviewFrame? EDIT: Oh OK, setOneShotPreviewCallback is in getPicture.
So when and how do I have to call getPicture in the library?

So in the main activity: currentJPEG = cameraFrame.getPicture().
So it is still in "byte form", and in the web service it is sent with ps.write?
B4X:
@Override
public void run()
{
    isMainRunning(true);
    double deltaT0 = 0;
    boolean recState = true;
    double deltaT1 = 0;
    double fps_cnt = 0;

    setText(osdState, " R E C ");
    setText(osdFps, " 0 fps ");

    while (isMainRunning())
    {
        final double dateT1 = new Date().getTime();
        currentJPEG = cameraFrame.getPicture();    // <-- Here
        final double dateT2 = new Date().getTime();
2) So the pictures are converted from YUV to JPEG(?). Then where are the pictures saved?

3) Is it possible to put this in a library?

4) What do you think would be the easier / more convenient way to capture the audio?
Take pictures together with the AudioRecorder library, or capture a video stream and live stream it?

5) Your link probably leads here: YUV pixel formats. Thanks for this.

XverhelstX
 
Last edited:

agraham

Expert
Licensed User
Longtime User
1) I am no expert on HTTP but I don't think an HTTP server can actually send anything. I think it can only reply to an IP address that sent a request - but I may be wrong here.

2) getPicture() is called, presumably by the application, and produces the JPEG data for the image in the private byte array mCurrentFrame and returns that array. What happens to it afterwards is not in that code fragment. Note that getPicture() does a couple of wait()s. This is because the image acquisition code is running in a separate thread to that of the application.

That code is for a View class that will be displayed in an activity by the application. I think you could put it straight into a library as a static class and wrap it using anywheresoftware.b4a.objects.ViewWrapper.

I've no idea about audio, I'm not at all interested in media.
 

XverhelstX

Well-Known Member
Licensed User
Longtime User
OK, I just figured out the getPicture() part.
Edited the post above.

Although I still don't understand how the picture IS displayed in your browser, as you don't send any picture, just bytes?

B4X:
public static void doMJPGStreaming(OutputStream os) {
    PrintStream ps = new PrintStream(os);
    try {
        Worker.printMJPEGHeaders(ps);
    } catch (IOException e1) {
        e1.printStackTrace();
        return;
    }

    while (WebServer.isServiceRunning()) {
        try {
            Worker.printJPGHeaders(ps);
            ps.write(main.currentJPEG);
        } catch (IOException e) {
            e.printStackTrace();
            break;
        }
    }
}
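
(printMJPEGHeaders() and printJPGHeaders() are not shown in this fragment. An MJPEG-over-HTTP stream is conventionally a multipart/x-mixed-replace response, so they presumably write something like the following; the exact signatures and boundary string here are guesses, not the camdroid code:)

B4X:
import java.io.PrintStream;

public class MjpegHeadersSketch {
    static final String BOUNDARY = "frame"; // arbitrary part separator

    // Written once, when the browser first opens the stream URL.
    static void printMJPEGHeaders(PrintStream ps) {
        ps.print("HTTP/1.0 200 OK\r\n");
        ps.print("Content-Type: multipart/x-mixed-replace; boundary=" + BOUNDARY + "\r\n\r\n");
    }

    // Written before every JPEG frame. The browser replaces the previous
    // part with each new one, which is what turns the stills into "video".
    static void printJPGHeaders(PrintStream ps, int frameLength) {
        ps.print("--" + BOUNDARY + "\r\n");
        ps.print("Content-Type: image/jpeg\r\n");
        ps.print("Content-Length: " + frameLength + "\r\n\r\n");
    }
}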

2) Do you have some documentation on B4A about ViewWrapper and what it does?

How do I make it a static class?

public static class SnSCamera extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback { ?

EDIT: Doing this: public static class SnSCamera extends AbsObjectWrapper<SurfaceView> implements SurfaceHolder.Callback, PreviewCallback { gives me an error on SnSCamera saying that only final, etc. are allowed.

Thanks!

XverhelstX
 
Last edited:

agraham

Expert
Licensed User
Longtime User
It seems to be all done by the magic of HTTP about which I know almost nothing.

There is no documentation of ViewWrapper or of Basic4android internal structures that I know of. I know about it because I tested for Erel during its development, and even then I didn't see the source code but used a decompiler, JD-GUI, to poke around in the jars to see what was happening. I'm quite good at reverse engineering software.

There's an outline ViewWrapper example in the attachment here but no guarantees as to whether I have got something wrong or not.
 

Attachments

  • ViewWrapperExample.zip
    703 bytes · Views: 390

XverhelstX

Well-Known Member
Licensed User
Longtime User
pff, this is really hard :s

So you don't know how the picture is displayed in the browser?
Is it converted in the web service or in the browser?

Maybe you can take a look at it.
I really don't have a clue whether I succeeded :p
I don't know how to handle it, though.
You can import it into your library folder too.

XverhelstX
 

Attachments

  • LiveStream.zip
    8.2 KB · Views: 377

agraham

Expert
Licensed User
Longtime User
I keep telling you that I know nothing about HTTP so I can't answer your browser questions no matter how often you ask!

Your library won't work as it stands. You need to implement the original CameraView as a static class in your library source code and then wrap it like the template I posted. You need to do your own research to understand what you need to do.
 

XverhelstX

Well-Known Member
Licensed User
Longtime User
I keep telling you that I know nothing about HTTP so I can't answer your browser questions no matter how often you ask!

Ok, I'm sorry.

Your library won't work as it stands. You need to implement the original CameraView as a static class in your library source code and then wrap it like the template I posted. You need to do your own research to understand what you need to do.

I will try to do it in a different way.
So if I understand it correctly:

The camera takes a picture each time with getPicture. getPicture calls onPreviewFrame via c.setOneShotPreviewCallback(this). onPreviewFrame writes it to a raw byte array with raw2jpg? Is this correct?

Then why doesn't this code work every 10 seconds?:
B4X:
Sub TI_Tick
    s = BytesToString(camera1.Picture, 0, 20, "UTF-8")
    Msgbox(s, "")
End Sub

I got a java.lang.NullPointerException with these logs:

LogCat connected to: 43423541314355304246
--------- beginning of /dev/log/system


--------- beginning of /dev/log/main


** Activity (main) Create, isFirst = true **
** Activity (main) Resume **
main_ti_tick (java line: 331)
java.lang.NullPointerException
at java.lang.String.<init>(String.java:316)
at java.lang.String.<init>(String.java:257)
at anywheresoftware.b4a.keywords.Common.BytesToString(Common.java:793)
at com.rootsoft.streamnationstudio.main._ti_tick(main.java:331)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:507)
at anywheresoftware.b4a.BA.raiseEvent2(BA.java:104)
at anywheresoftware.b4a.objects.Timer$TickTack.run(Timer.java:101)
at android.os.Handler.handleCallback(Handler.java:587)
at android.os.Handler.dispatchMessage(Handler.java:92)
at android.os.Looper.loop(Looper.java:123)
at android.app.ActivityThread.main(ActivityThread.java:3652)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:507)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:862)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:620)
at dalvik.system.NativeStart.main(Native Method)
java.lang.NullPointerException



XverhelstX
 
Last edited:

agraham

Expert
Licensed User
Longtime User
onPreviewFrame writes it to a raw byte array with raw2jpg? Is this correct?
No. Look at the code; as I said above, it produces a byte array that contains a JPEG-encoded image.

Then why doesn't this code work every 10 seconds?
Look at the code for getPicture. It returns null if an exception occurs, which is probably what is happening.
 

XverhelstX

Well-Known Member
Licensed User
Longtime User
Oh yes, sorry, my mistake.

Look at the code for getPicture. It returns null if an exception occurs, which is probably what is happening.

What is wrong then? :s
I really want to find the error, but I cannot seem to find it.

Is there a difference between "public synchronized byte[] getPicture()" and "public byte[] getPicture()"?
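
(For reference, yes: synchronized on an instance method means the caller takes the object's monitor lock before the body runs, and wait()/notify() are only legal while holding that monitor. Calling wait() without it throws IllegalMonitorStateException, which getPicture()'s catch block would silently turn into a null return. A minimal illustration:)

B4X:
// Standard Java monitor semantics, shown on a toy class.
public class MonitorDemo {
    private boolean ready = false;

    public synchronized void waitUntilReady() throws InterruptedException {
        while (!ready) wait(); // legal: we hold this object's monitor
    }

    public void brokenWait() throws InterruptedException {
        while (!ready) wait(); // throws IllegalMonitorStateException at runtime
    }

    public synchronized void markReady() {
        ready = true;
        notify(); // wakes the wait() in waitUntilReady()
    }
}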

XverhelstX
 
Last edited:

XverhelstX

Well-Known Member
Licensed User
Longtime User
Nope, I don't know how to identify such an exception.

Attached is a program.

XverhelstX

EDIT:

I emailed the authors of camdroid, and they gave me the following explanation of the web server:
In the WebServer class we catch requests for test.jpg and motion.jpg.
test.jpg is the image displayed on index.html; if the browser sends a request for one of these resources, we put the header together with the current picture and send it to the browser.
What happens on a specific request is in the WebServer class at lines 283, 295 and 303.
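
(A hypothetical reconstruction of what the authors describe; the real logic is in camdroid's WebServer class at the lines they mention, and the names below are made up:)

B4X:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintStream;

public class RequestDispatchSketch {
    // Sketch only: parse the GET path and answer /test.jpg with HTTP
    // headers plus the latest JPEG frame, as the authors describe.
    static void handle(BufferedReader in, PrintStream ps, byte[] currentJPEG)
            throws IOException {
        String requestLine = in.readLine(); // e.g. "GET /test.jpg HTTP/1.1"
        String path = requestLine.split(" ")[1];
        if (path.equals("/test.jpg") || path.equals("/motion.jpg")) {
            // "put the header with the current picture together and send it"
            ps.print("HTTP/1.0 200 OK\r\n");
            ps.print("Content-Type: image/jpeg\r\n");
            ps.print("Content-Length: " + currentJPEG.length + "\r\n\r\n");
            ps.write(currentJPEG); // raw JPEG bytes; the browser renders them
        }
        // Any other path would get index.html, which re-requests test.jpg
        // every second; that refresh loop is the whole "live stream".
    }
}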
 

Attachments

  • livestream.zip
    7.2 KB · Views: 329
Last edited:

agraham

Expert
Licensed User
Longtime User
There's no point in me looking at it while we know it throws an unidentified exception.

Either remove the try .. catch in getPicture and let the exception pass to the Basic4android code, or log the exception message and the stack trace in the catch block of getPicture.
B4X:
catch (Exception e)
{
    Log.e("B4A", e.toString());   // the type of exception
    Log.e("B4A", e.getMessage()); // the message, if any
    Log.e("B4A", "", e);          // the stack trace
    return null;
}
 

XverhelstX

Well-Known Member
Licensed User
Longtime User
Maybe the error is related, as a byte array is a prime candidate for an OutputStream. So I could make a socket connection to a server expecting the raw data, point the OutputStream at the socket, and let the server use some other API to reassemble the data stream into picture frames to render as streaming video.
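
(A minimal sketch of that idea, under the assumption that a server is listening and that some framing, here a simple length prefix of my own choosing, tells it where each picture ends:)

B4X:
import java.io.DataOutputStream;
import java.net.Socket;

public class FrameSender {
    // Open a socket to a server and write each JPEG frame to its
    // OutputStream. The length prefix is an assumption, not part of the
    // camdroid code: without some framing, the receiver cannot split the
    // continuous byte stream back into individual pictures.
    public static void sendFrames(String host, int port, byte[][] frames)
            throws Exception {
        Socket socket = new Socket(host, port);
        DataOutputStream out = new DataOutputStream(socket.getOutputStream());
        for (byte[] jpeg : frames) {
            out.writeInt(jpeg.length); // simple length-prefix framing
            out.write(jpeg);           // the JPEG bytes themselves
        }
        out.flush();
        socket.close();
    }
}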

EDIT: Ok, I will try that too.

Thanks,

XverhelstX
 

XverhelstX

Well-Known Member
Licensed User
Longtime User
I really cannot seem to find it.

- Deleted the try/catch; that didn't work.
- Added the logs. No more error, but the app blocks and doesn't show any error anymore.

XverhelstX

EDIT:

So I have this code and I log some things to check it out.
B4X:
public synchronized byte[] getPicture()
{
    try
    {
        Log.i("B4A", "Try Mode.");
        while (!isPreviewOn) wait();
        Log.i("B4A", "Preview is on.");
        isDecoding = true;
        Log.i("B4A", "Decoding is true.");
        c.setOneShotPreviewCallback(this);
        Log.i("B4A", "Callback: Done.");
        while (isDecoding) wait();
        Log.i("B4A", "Done.");
    }
    catch (Exception e)
    {
        Log.e("B4A", e.toString());   // the type of exception
        Log.e("B4A", e.getMessage()); // the message, if any
        Log.e("B4A", "", e);          // the stack trace
        Log.i("B4A", "Catch Mode.");
        return null;
    }
    return mCurrentFrame;
}

In my logcat, I only see "Try Mode." and "Preview is on.".
So basically, I don't get a callback to onPreviewFrame. Why is that, and how do I solve it?
I use this with my existing camera library.

This is the other code under it.
 
Last edited:

agraham

Expert
Licensed User
Longtime User
You need to apply some cold hard logic to what you see. There won't be any magic involved, it's just that you don't at the moment understand what is, or is not, happening.

What you report does not make logical sense so there is something else going on. Go back to the state when you were getting a null returned and try changing one thing at a time and analyse the difference in behaviour, if any, before you try the next change.
 

XverhelstX

Well-Known Member
Licensed User
Longtime User
OK, I didn't really understand what you wrote here, but it was just to check whether onPreviewFrame and raw2jpg are called. Apparently not.

So I think I might have found it, and I will try to explain as well as possible.

So I changed the code to the following:

B4X:
public byte[] getPicture()
{
    try
    {
        Log.i("B4A", "Try Mode.");
        while (!isPreviewOn) wait();
        Log.i("B4A", "IsPreviewOn is True");
        isDecoding = true;
        Log.i("B4A", "Decoding is true.");
        c.setOneShotPreviewCallback(mPreviewCallback);
        Log.i("B4A", "Callback Done.");
        while (isDecoding) wait();
        Log.i("B4A", "Outside isDecoding.");
    }
    catch (Exception e)
    {
        Log.i("B4A", "An error occured.");
        return null;
    }
    Log.i("B4A", "Returning current frame.");
    return mCurrentFrame;
}

PreviewCallback mPreviewCallback = new PreviewCallback() {

    @Override
    public synchronized void onPreviewFrame(byte[] data, Camera camera)
    {
        Log.i("B4A", "In onPreviewFrame");
        int width = 20;
        int height = 10;

        // // API 8 and above
        // YuvImage yuvi = new YuvImage(data, ImageFormat.NV21, width, height, null);
        // Rect rect = new Rect(0, 0, yuvi.getWidth(), yuvi.getHeight());
        // OutputStream out = new ByteArrayOutputStream();
        // yuvi.compressToJpeg(rect, 10, out);
        // byte[] ref = ((ByteArrayOutputStream) out).toByteArray();

        // API 7
        int[] temp = new int[width * height];
        OutputStream out = new ByteArrayOutputStream();
        Bitmap bm = null;
        Log.i("B4A", "Initialize raw2jpg");
        raw2jpg(temp, data, width, height);
        bm = Bitmap.createBitmap(temp, width, height, Bitmap.Config.RGB_565);
        bm.compress(CompressFormat.JPEG, 100, out);
        mCurrentFrame = ((ByteArrayOutputStream) out).toByteArray();
        isDecoding = false;
        notify();
        Log.i("B4A", "Decoding is false.");
    }
};

Here, you should note that I changed:
B4X:
public synchronized byte[] getPicture()
into
public byte[] getPicture()
This seems to work, as onPreviewFrame and raw2jpg are now called, as you can see in my log:

Try Mode.
IsPreviewOn is True
Decoding is true.
Callback Done.
An error occured.
main_ti_tick (java line: 317)
java.lang.NullPointerException
at anywheresoftware.b4a.agraham.byteconverter.ByteConverter.HexFromBytes(ByteConverter.java:237)
at com.rootsoft.streamnationstudio.main._ti_tick(main.java:317)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:507)
at anywheresoftware.b4a.BA.raiseEvent2(BA.java:104)
at anywheresoftware.b4a.objects.Timer$TickTack.run(Timer.java:101)
at android.os.Handler.handleCallback(Handler.java:587)
at android.os.Handler.dispatchMessage(Handler.java:92)
at android.os.Looper.loop(Looper.java:123)
at android.app.ActivityThread.main(ActivityThread.java:3652)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:507)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:862)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:620)


at dalvik.system.NativeStart.main(Native Method)
java.lang.NullPointerException
In onPreviewFrame
Initialize raw2jpg
In raw2jpg.
Done with raw2jpg
Decoding is false.

Now the problem is that the wait() is never woken by notify() (because getPicture is not synchronized), so it never gets past:
B4X:
while (isDecoding) wait();
Log.i("B4A","Outside isDecoding.");

So that is basically the problem now. I need to find a way to notify it, as I can only get this far when getPicture is not synchronized.

Note that it skips:
B4X:
while (!isPreviewOn) wait();
Log.i("B4A", "IsPreviewOn is True");

because isPreviewOn is already true and it doesn't have to wait.
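
(For what it's worth, the classic cure for this wait()/notify() mismatch is to make both sides use the same monitor explicitly. Moving onPreviewFrame into an anonymous PreviewCallback gave it its own monitor, so its notify() no longer targets the object getPicture() waits on. A sketch with a shared lock object follows; this is a suggested fix, not code from the thread, and the isPreviewOn wait and the JPEG conversion are elided:)

B4X:
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;

public class FrameGrabber {
    private final Object frameLock = new Object(); // ONE monitor for both sides
    private boolean isDecoding;
    private byte[] mCurrentFrame;
    private Camera c; // assumed: an opened camera with the preview running

    public byte[] getPicture() {
        try {
            synchronized (frameLock) {
                isDecoding = true;
                c.setOneShotPreviewCallback(mPreviewCallback);
                while (isDecoding) frameLock.wait(); // legal: frameLock is held
            }
        } catch (InterruptedException e) {
            return null;
        }
        return mCurrentFrame;
    }

    private final PreviewCallback mPreviewCallback = new PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            mCurrentFrame = data; // the NV21 -> JPEG conversion goes here, as before
            synchronized (frameLock) {
                isDecoding = false;
                frameLock.notify(); // wakes the wait() in getPicture()
            }
        }
    };
}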

Erel, agraham, or anyone else, could you please help me with this?

Thanks,
XverhelstX
 