Learning to Use Android NDK With a Neat Augmented Reality Example

I’ve been moving some Flash SWF files over to AIR for Android, and while looking into how best to optimize ActionScript code for it, I got sidetracked. After reading about the obvious optimizations for memory and CPU resources, I decided to take a look at the AIR .apk file that is installed on my Android phone. I found that the magic was happening in

\lib\armeabi\libCore.so

Ahh, the Adobe AIR runtime kind of runs beside the Dalvik virtual machine rather than completely inside of it.

The libCore.so file is a shared object that uses the Java Native Interface (JNI) http://en.wikipedia.org/wiki/JNI. JNI lets code written in C or C++ access platform-specific features and ‘touch the metal’ from a Java application.

Android provides the Android NDK so you can use native code.
http://developer.android.com/sdk/ndk/index.html

Some words of wisdom from the Android developer page 🙂

The NDK will not benefit most applications. As a developer, you need to balance its benefits against its drawbacks; notably, using native code does not result in an automatic performance increase, but always increases application complexity. In general, you should only use native code if it is essential to your application, not just because you prefer to program in C/C++.

http://developer.android.com/sdk/ndk/overview.html

I wanted to try compiling and running some native code on my phone (Android 2.2) so that I have that tool in my toolbox when I need it. I found a great example at Qualcomm.
https://ar.qualcomm.com/qdevnet/sdk

The sample application that comes with the SDK lets you detect and track image targets in 3D using your phone’s camera, then renders a floating teapot (http://www.sjbaker.org/wiki/index.php?title=The_History_of_The_Teapot) over your target. The NDK is required to compile the native C++ code, which uses OpenGL ES http://en.wikipedia.org/wiki/OpenGL_ES. The SDK includes step-by-step instructions that show you how to build the native C++ source files with the NDK, then use Eclipse to build the Java sources and create the APK package that can be deployed to the phone.

This is very much like the augmented reality on the new Nintendo 3DS
http://www.siliconera.com/2011/03/04/lose-your-nintendo-3ds-ar-cards-just-print-out-new-ones/

If you are having a hard time with the NDK, try the sample that comes with it in \samples\hello-jni, along with the tutorial at http://www.pocketmagic.net/?p=1332, for a very basic example that will help you understand the proper procedure.
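The native half of that procedure is driven by an Android.mk makefile that ndk-build reads. A minimal sketch, with module and file names matching the hello-jni sample (adjust to your own project):

```makefile
# Android.mk — lives in the project's jni/ directory, read by ndk-build
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE    := hello-jni     # produces libhello-jni.so
LOCAL_SRC_FILES := hello-jni.c
include $(BUILD_SHARED_LIBRARY)
```

Running ndk-build from the project root compiles the sources and drops libhello-jni.so under libs\armeabi\ — the same kind of location where I found Adobe’s libCore.so. The Java side then picks it up with System.loadLibrary("hello-jni").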

I have some broken antique clocks around the house, so I made image targets out of their faces and now have a teapot floating in front of them. Next time I get a little break in the action, I’ll change that teapot to a semi-transparent digital clock, so my clocks will no longer be broken — provided I run the program on my phone and point the camera at them 🙂

Clock Face

The NDK combined with current hardware capabilities certainly seems to be enough to build sophisticated and engaging toolkits and engines for game developers and artists.

Anyway, enjoy playing with the code as that is how you learn.
