1. marcelo.emmerich's Avatar
    First of all, thanks to everyone who contributed to this thread; it helped me a lot in implementing an app that samples the camera preview and does meaningful, AR-ish things with the raw pixels.

    Mmm... but I think takePicture() is not as fast as the streamed video; I will run some tests...

    Is there any other way to take a sample of the camera without using a private API?
    According to this thread, there is a way to sample the camera preview using the generic UIImagePickerController, but no details are presented because the developer is under NDA. He does give a couple of hints, though.
    2010-01-05 10:08 AM
  2. supreme1's Avatar
    Hi rilocr,

    Is there any way to get YUV image data from coreSurfaceBuffer ?

    Thank you.

    Hi natios,

    Try releasing the coreSurfaceBuffer variable with CFRelease(coreSurfaceBuffer).

    That's the only additional code I have.
    Last edited by supreme1; 2010-04-08 at 11:40 PM. Reason: Automerged Doublepost
    2010-04-08 11:40 PM
  3. rilocr's Avatar
    Hi,

    Yes, but you have to find a function to convert RGB to YUV, which I don't have... maybe ask Google.

    surface.baseAddress;

    returns a buffer encoded as RGB32, and I am sure there are functions to convert one buffer format to another.
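
    The RGB-to-YUV conversion rilocr mentions can be sketched in plain C. This is my own sketch, not code from the thread: the function name is hypothetical, and the choice of full-range BT.601 coefficients is an assumption (the exact coefficients you want depend on which YUV variant your consumer expects).

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Hypothetical helper: converts one pixel of an RGB32 buffer
       (e.g. what surface.baseAddress points at) to full-range
       BT.601 YUV. Coefficient choice is an assumption. */
    static void rgb_to_yuv(uint8_t r, uint8_t g, uint8_t b,
                           uint8_t *y, uint8_t *u, uint8_t *v)
    {
        /* +0.5 rounds; U and V are biased by 128 so they fit in a byte */
        int yy = (int)( 0.299 * r + 0.587 * g + 0.114 * b + 0.5);
        int uu = (int)(-0.169 * r - 0.331 * g + 0.500 * b + 128.5);
        int vv = (int)( 0.500 * r - 0.419 * g - 0.081 * b + 128.5);

        /* clamp to 0..255 in case rounding pushes a value out of range */
        *y = (uint8_t)(yy < 0 ? 0 : yy > 255 ? 255 : yy);
        *u = (uint8_t)(uu < 0 ? 0 : uu > 255 ? 255 : uu);
        *v = (uint8_t)(vv < 0 ? 0 : vv > 255 ? 255 : vv);
    }
    ```

    Looping this over every pixel of the preview buffer is slow in floating point; a real implementation would typically use fixed-point integer math, but the structure is the same.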

    Regards
    2010-04-09 06:01 AM
  4. iphone8130hak's Avatar
    Nice
    Keithsta
    2010-04-17 11:48 PM