[android-developers] Augmented reality: using camera feed as texture
Hi,
Posted on this a while back but I've been doing some more work on it
since. The aim of the augmented reality app I'm developing is to
overlay geographical data (roads, hiking trails) on a camera feed for
navigation. Following other examples on the web, I've decided to use
the raw camera feed as an OpenGL texture and overlay geographic data
on top of the texture within a GLSurfaceView. The feed will be black
and white for the moment.
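For the black-and-white case the texture data is cheap to get at, by the way: in NV21 (the default Android preview format) the first width*height bytes of the preview buffer are the full-resolution Y (luminance) plane, which can be uploaded directly as a GL_LUMINANCE texture. A minimal sketch of the extraction (plain Java, no Android dependencies; the class and method names are just mine):

```java
/**
 * Pulls the Y (luminance) plane out of an NV21 preview buffer.
 * NV21 layout: width*height luminance bytes, followed by
 * (width*height)/2 interleaved V/U chroma bytes. For a
 * black-and-white feed only the Y plane is needed.
 */
public class Nv21 {
    public static byte[] extractLuminance(byte[] nv21, int width, int height) {
        byte[] y = new byte[width * height];
        // Luminance is a contiguous prefix of the buffer, so a
        // straight copy is enough; chroma bytes are left behind.
        System.arraycopy(nv21, 0, y, 0, width * height);
        return y;
    }

    public static void main(String[] args) {
        int w = 4, h = 2;
        // Fake NV21 frame: 8 Y bytes followed by 4 interleaved VU bytes.
        byte[] frame = new byte[w * h + (w * h) / 2];
        for (int i = 0; i < frame.length; i++) frame[i] = (byte) i;
        byte[] y = extractLuminance(frame, w, h);
        System.out.println(y.length);          // 8
        System.out.println(y[0] + " " + y[7]); // 0 7
    }
}
```

The resulting byte array maps one-to-one onto a GL_LUMINANCE / GL_UNSIGNED_BYTE texture upload, which keeps the per-frame cost to a single copy.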
The only problem is that it seems to be impossible to capture camera
data without creating a SurfaceView to show the preview. However, since
you're not supposed to layer SurfaceViews on top of each other, this
appears to mean I have to create an invisible SurfaceView just to be
able to capture camera data. The other solutions I've seen on the web
(nhenze.net and AndAR) do layer SurfaceViews on top of each other.
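For what it's worth, option (b) can at least be kept small: a tiny SurfaceView added to the layout purely so setPreviewDisplay() has a surface to point at, with the frames I actually want arriving through setPreviewCallback(). A sketch of how I understand that workaround (pre-3.0 Camera API; the class and field names are my own, and whether a zero-visibility surface satisfies every device is exactly the open question):

```java
import java.io.IOException;

import android.hardware.Camera;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

// Sketch of the "dummy SurfaceView" workaround (option b): the
// SurfaceView exists only so setPreviewDisplay() is satisfied; the
// raw frames we actually use arrive via onPreviewFrame().
public class DummyPreviewCapture
        implements SurfaceHolder.Callback, Camera.PreviewCallback {

    private Camera camera;

    public DummyPreviewCapture(SurfaceView dummy) {
        dummy.getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        try {
            // On most devices startPreview() silently does nothing
            // unless a display surface has been set first.
            camera.setPreviewDisplay(holder);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        camera.setPreviewCallback(this);
        camera.startPreview();
    }

    @Override
    public void onPreviewFrame(byte[] nv21, Camera cam) {
        // nv21 holds the raw frame (default NV21 format); hand the
        // Y plane to the GLSurfaceView renderer here for upload.
    }

    @Override
    public void surfaceChanged(SurfaceHolder h, int fmt, int w, int ht) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder h) {
        camera.setPreviewCallback(null);
        camera.stopPreview();
        camera.release();
        camera = null;
    }
}
```

It works, but it's still a wasted view in the hierarchy, which is why I'm asking whether anything cleaner exists.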
Is it therefore true that (prior to Android 3.0) there is simply no
elegant solution to the problem? The only solutions people have come
up with are either:
a) overlay two SurfaceViews on top of each other (not guaranteed to
work);
b) create a "dummy" SurfaceView for the preview just to get
PreviewCallback working; or
c) reimplement GLSurfaceView from scratch to incorporate a camera
feed?
Thanks,
Nick
--
You received this message because you are subscribed to the Google
Groups "Android Developers" group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to
android-developers+unsubscribe@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en