Tuesday, April 26, 2011

Android Modules

An important step of this development is getting all the necessary Android phone resources into our hands.
For our purposes we need to handle the Camera, finger touches, the Bitmap/display, and the GUI layout.

The Camera Module 
In Android, the photo data is delivered through the Camera.PictureCallback interface, whose method is:
public abstract void onPictureTaken (byte[] data, Camera camera)
This method is called when the picture data becomes available after the photo is taken, handing us the byte[] data for further processing.
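
As a minimal sketch of wiring this up, assuming an Activity that already holds an open Camera instance (the jpegCallback field name and the decode step are illustrative, not our final pipeline):

// A sketch of receiving the photo data; decoding to a Bitmap is optional,
// we can also keep working on the raw byte[] directly.
private final Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // data holds the compressed JPEG bytes delivered by the camera
        Bitmap photo = BitmapFactory.decodeByteArray(data, 0, data.length);
        // ... hand photo (or the raw data) over to the processing code
        camera.startPreview();  // the preview stops after a capture, so restart it
    }
};

// Triggering the capture (we skip the shutter and raw callbacks):
// camera.takePicture(null, null, jpegCallback);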

The Hand Draw Module
There are several APIs that allow the user to draw sketch images. They are all based on event callbacks.
The first possible choice is using a GestureOverlayView in the Android layout:

<android.gesture.GestureOverlayView
    android:id="@+id/gestures"
    android:layout_width="fill_parent"
    android:layout_height="0dip"
    android:layout_weight="1.0" />

Add this to res/layout/your_choice.xml and load the layout in the application; we then get an area that allows the user to draw gestures on it.
Then we can implement a callback function to get the user's hand-drawn information such as strokes, history points, and even predictions.
In addition, we must look up this view from the layout and register a listener for the event:


GestureOverlayView gestures =
        (GestureOverlayView) findViewById(R.id.gestures);
gestures.addOnGesturePerformedListener(this);


The following callback function will then be called whenever the user draws in the GestureOverlayView area:
public abstract void onGesturePerformed (GestureOverlayView overlay, Gesture gesture)
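
A minimal sketch of that callback, in an Activity that implements GestureOverlayView.OnGesturePerformedListener, could look like the following; the gestureLibrary field is an assumption here (it would be loaded from a prebuilt gesture file elsewhere), the rest is the standard android.gesture API:

@Override
public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    // The strokes the user drew, each one a list of timed points
    ArrayList<GestureStroke> strokes = gesture.getStrokes();
    Log.d("Gesture", "strokes: " + strokes.size() + ", length: " + gesture.getLength());

    // If a GestureLibrary has been loaded (assumed field), ask for predictions
    if (gestureLibrary != null) {
        ArrayList<Prediction> predictions = gestureLibrary.recognize(gesture);
        if (!predictions.isEmpty()) {
            Log.d("Gesture", "best match: " + predictions.get(0).name);
        }
    }
}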



Another way is to create a canvas for the user to draw on, which is even better for our implementation, except that the GUI is more primitive and requires extra effort. In my next diary I will explain how to use the canvas and touch events instead of the GestureOverlayView.

The Bitmap Module & Display
All of our work is based on processing image buffers. In Android the wrapper for these is the Bitmap class.
A Bitmap instance can be created from a byte[] buffer (e.g. with BitmapFactory.decodeByteArray()), and its pixels can be filled from a ByteBuffer/IntBuffer via copyPixelsFromBuffer().
Its pixels can also be copied back out into a Buffer with copyPixelsToBuffer(), and the Bitmap itself can be uploaded as an OpenGL texture.
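
As a rough sketch of these conversions (the helper names are made up; the calls are the standard Bitmap/BitmapFactory API):

// Decode a compressed byte[] (e.g. from the camera) into a Bitmap
static Bitmap bitmapFromJpeg(byte[] data) {
    return BitmapFactory.decodeByteArray(data, 0, data.length);
}

// Copy the raw pixels out so they can be modified or handed to OpenGL
static ByteBuffer bufferFromBitmap(Bitmap bitmap) {
    ByteBuffer buffer = ByteBuffer.allocateDirect(bitmap.getRowBytes() * bitmap.getHeight());
    bitmap.copyPixelsToBuffer(buffer);
    buffer.rewind();
    return buffer;
}

// Write modified pixels back into an existing (mutable) Bitmap
static void pixelsIntoBitmap(Bitmap target, ByteBuffer pixels) {
    target.copyPixelsFromBuffer(pixels);
}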

Given these natural qualities of the Bitmap class, we can modify it easily through byte[] arrays, and we can display it in at least two ways (sketches of both follow after this list):
The first way is using an android.graphics.Canvas. A Canvas is handed to a View when it draws itself, and it can draw anything inside its area: e.g. a Bitmap or a circle.
The second way is using an OpenGL texture. Since we can bind the Bitmap to a texture, we will be able to draw it easily.
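
A minimal sketch of the first way, a custom View that paints the Bitmap with its Canvas (the class name BitmapView is just illustrative):

class BitmapView extends View {
    private final Bitmap bitmap;
    private final Paint paint = new Paint();

    BitmapView(Context context, Bitmap bitmap) {
        super(context);
        this.bitmap = bitmap;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        // The Canvas covers this View's area and can draw anything inside it
        canvas.drawBitmap(bitmap, 0, 0, paint);
        canvas.drawCircle(50, 50, 20, paint);  // e.g. an extra primitive on top
    }
}

And a sketch of the second way, assuming a current OpenGL ES 2.0 context (e.g. inside a GLSurfaceView.Renderer), where GLUtils.texImage2D() uploads the Bitmap as a texture:

// Upload the Bitmap into a new GL texture; assumes a current GLES 2.0 context
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);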

Conclusion
With the modules above, the required functionality for our application is mostly covered.
One input byte[] comes from the camera, and another comes from the hand drawing. We can manipulate these byte[] arrays directly, or we can bind them as textures and modify them in GLSL.
After that we can create Bitmaps, mostly for saving into files, since OpenGL can already display the texture before it is read back into a Bitmap on the host.
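
A rough sketch of that read-back step (assuming a current GLES 2.0 context; the helper name is made up):

// Read the rendered framebuffer back into a Bitmap, e.g. for saving to a file
static Bitmap readFramebufferToBitmap(int width, int height) {
    ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
            .order(ByteOrder.nativeOrder());
    GLES20.glReadPixels(0, 0, width, height,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
    pixels.rewind();

    Bitmap result = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    result.copyPixelsFromBuffer(pixels);
    // Note: GL's origin is the bottom-left corner, so the image comes back
    // vertically flipped compared to Android's Bitmap convention.
    return result;
}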
