Augmented Reality apps

Hi. Does anyone know of any recent augmented reality apps, such as the ones where you can point your camera at a restaurant, store, etc., and get detailed information, user ratings, and so on? I am already aware of Acrossair, Yelp, and Layar. Thank you.

Anyone?

Similar Messages

  • Best Augmented Reality Apps

    What are your favorite augmented reality apps?
    I enjoy:
    Peaks - great for knowing what mountain you are looking at
    Yelp - awesome for finding a good place to eat
    acrossair - just fun to play with
    What are some other cool ones?

    Anyone?

  • A Sneak Peek at HP Support's Augmented Reality App

    Get a behind-the-scenes glimpse of HP's new augmented reality application being developed for Android mobile devices. See how this exciting new app will help you change the ink cartridges in your HP Officejet 6500a Plus (e710n) printer. Help make this beta app even better -- and help us expand the app to other products -- by trying it and completing a short survey about your augmented reality experience.
    For other helpful videos go to hp.com/supportvideos or youtube.com/hpsupport. More support options for your printer are available at hp.com/support.
    This video was produced by HP.

    I hope you find this video informative.

  • Need info about Augmented Reality Apps for SmartPhones

    Greetings,
    We're trying to see if CS5 could be used to create a smartphone AR (augmented reality) application.
    Is there an SDK we could use to ease the process?
    It must work with iPhone, Android and Windows Phone 7.
    Thanks!


  • Augmented reality for iOS

    Does anybody have a good link on developing an augmented reality app for iOS?
    The only thing I can find is this...
    http://www.adobe.com/devnet/flash/articles/augmented_reality.html
    ...which is outdated.
    I cannot find anything out there with folks using AIR 3.

    Could you please open a new bug report on this over at bugbase.adobe.com?  Any chance we could get a copy of your app?  If you'd like to keep it private, please feel free to email it to me directly at [email protected]
    Thanks,
    Chris

  • Queries regarding Flash Builder and Augmented Reality.

    I am Sarat from India. I'm a software engineer with a working knowledge of Java, so Flash ActionScript and OOP are understandable to me. I am working on an augmented reality project. I am quite new to Flash and the Adobe Community, and I have some queries regarding augmented reality and Flash Builder:
    1. Flash Builder 4.6 comes with a default Flex 4.6 SDK. However, the Flex 4.6 SDK wasn't allowing me to compile and run some example files, so I downloaded the Flex 4.0A version from Adobe.com. Now the examples run fine, but would there be any problem if I deploy such projects on a website or as a desktop app? Once the code is compiled into a SWF file, the Flex framework used doesn't make much difference, does it?
    2. Would the AR project run effectively on a website, given the various internet/processor speeds worldwide? Would the effectiveness of an AR project deployed on a website depend on the number of triangles in the 3D models (i.e. the DAE files)? As per my understanding, heavier models mean more time to download the Flash app into the local browser and more time for the Papervision3D engine to render them, right?
    3. Can we develop a standalone desktop AR app using Flash Builder? Using Adobe AIR we can, I guess. Please point me to a tutorial, if possible.
    4. I've seen that we can implement multiple-marker-tracking AR using Vectors/Arrays in ActionScript. Would there be any performance issues depending on the size of the Vectors/Arrays used?
    5. Can someone please mention some tips to improve the performance of an AR app (desktop app and web app)?
    6. What would the approximate cost of the FLARManager and FLARToolkit commercial versions be, if you have any idea? I've gone through their website but they do not mention the costs.
    7. Would applying bitmap materials to the DAE models pull down the web app/mobile app/desktop app performance, given some 4 to 5 DAE models in the scene?
    8. Is it advisable to use multiple markers with multiple DAE models, or a single marker with a Flash-based GUI option to load different models onto the same marker?
    It would be very helpful for me if someone could answer my queries.
    Sarat.

    #1, If it compiles then you have no issue. There's no reason at this point not to use 4.6. You should bundle a captive runtime so that the user's computer won't need to have AIR installed at all.
    #2, Papervision is old. Use Stage3D and/or a wrapper framework. As for the generic "if I download lots of data, will it take the user more time to load it?", well, of course. Just don't make the loading experience painful: entertain users while they wait, or find ways of displaying data sooner rather than later. Whether it's desirable on the web has more to do with the context of the app and the device displaying it. In other words, a phone user would find it easy, but obviously not a desktop user.
    #3, Definitely referring you to Google on that one.
    #4, Size always matters; it's common sense: the more you process, the harder it is. While I haven't done AR, I've used the Microsoft Kinect SDK and ANE, and tracking was extremely fast but limited. From what I've seen, and with the basic built-in location and direction hardware on any mobile device, you shouldn't have much trouble. It depends on what you're doing (see the sketch after this list).
    #5, This discussion would be way too large for a forum. You'd need to consult a firm experienced in AR development.
    #6, "Applications using the commercial license do not have to provide source code, but must pay a licensing fee. Contact ARToolworks at [email protected] for more information." They will base your price on your product; there is no single price.
    #7, The models could be huge and elaborate or tiny and simple, which changes the answer. See the answer to #4. Ultimately most people are on fast networks on mobile and very fast ones on desktop/Wi-Fi. Size matters a lot less than it did three years ago.
    #8, Depends on what you're doing. You'd have to explain it.
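    Regarding #4, here is a minimal, hypothetical sketch of the Vector-based multiple-marker idea, assuming one FLARSingleMarkerDetector and one FLARBaseNode per marker pattern; it would live inside a document class like the ARTester example further down this page, with the matching FLARToolkit imports. The point is that the per-frame cost grows linearly with the length of the Vector:

    // Assumed to be set up elsewhere: one detector and one container per pattern,
    // plus the shared raster and transform-result objects.
    private var detectors:Vector.<FLARSingleMarkerDetector>;
    private var containers:Vector.<FLARBaseNode>;
    private var raster:FLARRgbRaster_BitmapData;
    private var trans:FLARTransMatResult;

    // Call once per frame, after drawing the camera frame into the raster's BitmapData.
    private function checkMarkers():void {
        for (var i:int = 0; i < detectors.length; i++) {
            try {
                // Each detector adds roughly the same work per frame, so the cost
                // scales with the number of patterns being tracked.
                if (detectors[i].detectMarkerLite(raster, 80) && detectors[i].getConfidence() > 0.5) {
                    detectors[i].getTransformMatrix(trans);
                    containers[i].setTransformMatrix(trans);
                    containers[i].visible = true;
                } else {
                    containers[i].visible = false;
                }
            } catch (err:Error) {
                containers[i].visible = false;
            }
        }
    }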

  • OJ 6500A Plus and Augmented Reality

    Do you have an Officejet 6500A Plus?
    If so, check out the Augmented Reality support app for replacing cartridges.
    http://h10025.www1.hp.com/ewfrf/wc/document?cc=us&lc=en&docname=c03668358
    Although I am an HP employee, I am speaking for myself and not for HP.
    Twitter: @Ciara_B_HP

    Star Wars fans might like this
    https://www.youtube.com/watch?v=fvA6fmfbCps&feature=youtu.be
    Although I am an HP employee, I am speaking for myself and not for HP.
    Twitter: @Ciara_B_HP

  • SWFLoader and Augmented Reality

    Hi. I created a mobile ActionScript project and have IN2AR augmented reality working with it. I compiled a .swf file and used SWFLoader to bring it into a Flex mobile project in Flash Builder. However, when I do this, the captured video from the camera does not display on screen like it did in the original project. My trace statements show that the test image is being detected for augmented reality, but nothing is displaying on screen in the Flex app other than the IN2AR logo and the bottom bar that shows something has been detected. Any ideas for solutions?
    This is my code for the Flex app:
    <?xml version="1.0" encoding="utf-8"?>
    <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
                   xmlns:s="library://ns.adobe.com/flex/spark"
                   xmlns:mx="library://ns.adobe.com/flex/mx"
                   minWidth="1024" minHeight="600">
        <mx:SWFLoader id="loader1" source="../ar/AReality.swf"/>
    </s:Application>
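    Not a fix, but one hypothetical way to narrow the problem down is to hook SWFLoader's complete event and confirm the AR swf actually loads and reports a sensible size, which helps separate a loading problem from a rendering problem. The handler name below is made up:

    <?xml version="1.0" encoding="utf-8"?>
    <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
                   xmlns:s="library://ns.adobe.com/flex/spark"
                   xmlns:mx="library://ns.adobe.com/flex/mx"
                   minWidth="1024" minHeight="600">
        <fx:Script>
            <![CDATA[
                import flash.events.Event;

                // Hypothetical handler: logs what was loaded and how big it is.
                private function onSwfComplete(event:Event):void {
                    trace("AReality.swf loaded:", loader1.content,
                          loader1.contentWidth, "x", loader1.contentHeight);
                }
            ]]>
        </fx:Script>
        <mx:SWFLoader id="loader1" source="../ar/AReality.swf" complete="onSwfComplete(event)"/>
    </s:Application>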


  • No Augmented Reality Because the Compass *****?

    The Android platform has several Augmented Reality programs, while the iPhone seems to have none.
    Is this because the compass in the iPhone works so poorly that AR cannot be implemented reliably?
    And why is OS 3.1 not listed on the pull-down menu below?

    My understanding is that the Apple SDK currently doesn't allow these apps to be built for iPhone...
    The compass in the G2 is no more reliable than the 3GS's...the "compass mode" of Google Maps on my friend's new G2 is very sub-par as well...very glitchy and doesn't register movement well...similar to the issues with the 3GS compass...can't be perfect lol
    This was most likely first done due to the graphics limitations of the 2G and 3G iPhones...although the graphics were good, Apple doesn't like apps that don't portray the iPhone beautifully
    With the 3GS the graphics are much, much better, and yet it seems the SDK rules still don't allow them...maybe that'll change soon...
    3.1 will be there soon enough...or maybe because they know there is another update on 9/25 for MMS to go live they are not bothering to put 3.1 in until after 9/25 when maybe it will be 3.1.1
    Until then give feedback...
    http://www.apple.com/feedback/iphone.html

  • Augmented Reality in ios

    I want to make an augmented reality application in which, when I put my hand under the iPhone camera, the hand appears on the screen; I then just have to select a ring displayed in the app and it gets worn on my finger.
    For an example of the feature I have in mind, see the link below.
    http://www.youtube.com/watch?gl=SG&hl=en-GB&v=WYFMDQ92x_8
    The video should give you an idea of the type of application I need to build for iPhone.
    Can anyone please guide me with my application?
    I also want to know which framework should be used and whether there is any library I can use for the app.
    Thanking you all!

    I have also asked this question on https://developer.apple.com/devforums/. My question's link is below.
    https://devforums.apple.com/thread/203793
    But I can't get a proper reply there.
    Please help me.
    Thanks

  • Augmented reality on mobile devices (android and ios)

    Hi, is there already some sort of extension for AIR 3 capable of identifying visual tags for augmented reality applications on Android and on iOS?

    Hi CK,
    I think we may need to create a policy in Network Policies. Please follow the steps below:
    Right-click Network Policies, click New.
    Enter the policy name, click Next.
    Click Add, select Day and Time Restrictions, click Add.
    In the Day and Time Restrictions, choose Permitted for all, click OK.
    Click Next five times (leaving everything at the default), then click Finish.
    Move the policy to the top and try to connect with your device.
    If the issue persists, please make sure that the Connection Request Policies have been configured properly.
    For detailed information about how to create a network policy, please refer to the link below,
    Configuring NPS network policies
    http://technet.microsoft.com/en-us/library/dd441006.aspx
    Best Regards.
    Steven Lee
    TechNet Community Support

  • Starting and stopping Video File that is Attached to Augmented Reality

    I'm trying to do a little augmented reality. I'm using ActionScript to call a video and play it. The problem I'm having is that the sound starts the moment the window opens, and the video plays when the marker is shown. I want the sound and video to play only when the marker is seen. I would like it to pause when the marker is removed and then resume where it left off when the marker is seen again. Here is a copy of my code; could someone look it over and tell me what I'm doing wrong? PLEASE PLEASE PLEASE just correct my mistake. I don't understand enough about coding for you to just say NetStream.pause() and NetStream.resume(); I don't know where to put that information. ANY HELP is appreciated greatly.....
    package {
        import flash.display.Sprite;
        import flash.events.Event;
        import flash.display.BitmapData;
        import flash.display.Loader;
        import flash.net.URLRequest;
        import flash.media.Video;
        import flash.net.NetConnection;
        import flash.net.NetStream;
        //////////// PAPERVISION3D FOR COLLADA ////////////
        import org.papervision3d.lights.PointLight3D;
        import org.papervision3d.materials.MovieMaterial;
        import org.papervision3d.materials.WireframeMaterial;
        import org.papervision3d.materials.shaders.EnvMapShader;
        import org.papervision3d.materials.shaders.ShadedMaterial;
        import org.papervision3d.materials.shaders.PhongShader;
        import org.papervision3d.materials.BitmapFileMaterial;
        import org.papervision3d.materials.VideoStreamMaterial;
        import org.papervision3d.materials.utils.MaterialsList;
        import org.papervision3d.objects.primitives.Plane;
        // import org.papervision3d.objects.parsers.Collada;

        public class test1 extends PV3DARApp {

            private var _plane:Plane;
            private var _plane2:Plane;
            // private var _piso:Plane;

            private var videoStreamMaterial:VideoStreamMaterial;
            private var quality:uint = 8;
            private var netConnection:NetConnection;
            private var video:Video;
            private var netStream:NetStream;

            private var videoStreamMaterialPiso:VideoStreamMaterial;
            private var netConnectionPiso:NetConnection;
            private var videoPiso:Video;
            private var netStreamPiso:NetStream;

            //////////// COLLADA FILE VAR ////////////
            // private var loader:Loader;

            public function test1() {
                // Pass the camera calibration file and the marker pattern file, then initialize.
                this.init('Data/camera_para.dat', 'Data/flarlogo.pat');
            }

            protected override function onInit():void {
                super.onInit(); // Always be sure to call this.

                // Apply filters
                // Make a Plane the same size as the marker.
                var wmat:WireframeMaterial = new WireframeMaterial(0xff0000, 0, 0); // With wire frame.
                this._plane = new Plane(wmat, 80, 80);
                this._plane.rotationX = 180;
                this._baseNode.addChild(this._plane);

                var light:PointLight3D = new PointLight3D();
                light.x = 0;
                light.y = 1000;
                light.z = -1000;

                //////////// FLV FILE HERE ////////////
                var customClient:Object = new Object();
                customClient.onMetaData = metaDataHandler;

                netConnection = new NetConnection();
                netConnection.connect(null);
                netStream = new NetStream(netConnection);
                netStream.client = customClient;
                netStream.play("FNF_Intro.m4v");

                video = new Video();
                video.smoothing = true;
                video.attachNetStream(netStream);

                videoStreamMaterial = new VideoStreamMaterial(video, netStream);
                videoStreamMaterial.doubleSided = true;

                _plane2 = new Plane(videoStreamMaterial, 130, 130, quality, quality);
                this._plane2.z = 50;
                this._plane2.rotationX = -90;
                this._plane2.rotationY = 180;
                this._baseNode.addChild(_plane2);
            }

            private function metaDataHandler(infoObject:Object):void {
                trace('metaDataHandler', infoObject);
            }
        }
    }
    Helpmeun,
    I am still learning PV3D and AR, but from looking at this, it is the only thing I can think of.... I am not exactly sure what you are doing, but if you only want the video to play when the face of the plane with the video is visible, try something like this....
    In your AR code, most people add a listener to the stage or application for the ENTER_FRAME event and use that for doing their marker detection (I use a timer, but either way it is a modification to your event handler function)... something like....
    // In your constructor or creationCompleteHandler()
    // FLAR initialization code
    // PV3D initialization code
    stage.addEventListener(Event.ENTER_FRAME, updateRenderHandler, false, 0, true); // This would probably go in your onInit override
    The code you want to change is the code in the handler function where it detects or doesn't detect the pattern....
    private function updateRenderHandler(hEvent:Event):void {
        // Capture the video as a bitmap for pattern recognition
        m_hBitmapRenderer.draw(m_hVideo);
        try {
            // Redetect and see if it is high confidence.... then update and show
            if (m_hFLARdetector.detectMarkerLite(m_hFLARraster, 120) && (m_hFLARdetector.getConfidence() > 0.75)) {
                // Get and set the transform matrix from the detector and update the model
                m_hFLARdetector.getTransformMatrix(m_hFLARtransform);
                m_hFLARcontainer3D.setTransformMatrix(m_hFLARtransform);
                // Re-render the scene
                m_hRenderer.renderScene(m_hScene, m_hFLARcamera, m_hViewport);
                // Show the viewport
                m_hViewport.visible = true;
                // !!!!!!! YOUR CODE SHOULD BE
                m_hNetstream.resume(); // resume() continues where the video left off; play() would restart it
            } else {
                // The pattern is not recognized in the video
                m_hViewport.visible = false;
                // !!!!!!! YOUR CODE SHOULD BE
                m_hNetstream.pause();
            }
        } catch (err:Error) {}
    }
    Hope that helps.....
    JJ
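    Translating JJ's idea into the variable names from the question, here is a minimal, hypothetical sketch that could be dropped into the test1 class above; it assumes you can get a per-frame "marker found" boolean out of PV3DARApp (for example, from wherever it already updates _baseNode), and that netStream is the NetStream created in onInit():

    // Call netStream.pause() immediately after netStream.play("FNF_Intro.m4v") in onInit()
    // so the clip does not start until the marker is first seen.
    private var markerWasVisible:Boolean = false;

    // Call this once per frame with the result of the marker detection.
    private function updateVideoForMarker(markerFound:Boolean):void {
        if (markerFound && !markerWasVisible) {
            netStream.resume();  // picks up where it left off
        } else if (!markerFound && markerWasVisible) {
            netStream.pause();   // freezes both video and sound
        }
        markerWasVisible = markerFound;
    }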

  • Flex and Augmented Reality, Preview not working

    Hi
    I have been following the tutorial "Introduction to Augmented Reality" on gotoandlearn (http://www.gotoandlearn.com/play.php?id=105).
    I have followed the coding exactly, and yet nothing happens when I preview it. I have even copied the code from the provided download (ar.zip) into my project and had no result. When I run the application inside Flex, the Flash Player opens with the default Flex colouring but nothing happens inside it, e.g. the camera is not called up.
    I am using Flex Builder 3 with Flex 3.5 and Flash Player 9.
    Here is the code (basically pulled straight from the downloadables):
    package {
        import flash.display.BitmapData;
        import flash.display.Sprite;
        import flash.events.Event;
        import flash.media.Camera;
        import flash.media.Video;
        import flash.utils.ByteArray;

        import org.libspark.flartoolkit.core.FLARCode;
        import org.libspark.flartoolkit.core.param.FLARParam;
        import org.libspark.flartoolkit.core.raster.rgb.FLARRgbRaster_BitmapData;
        import org.libspark.flartoolkit.core.transmat.FLARTransMatResult;
        import org.libspark.flartoolkit.detector.FLARSingleMarkerDetector;
        import org.libspark.flartoolkit.pv3d.FLARBaseNode;
        import org.libspark.flartoolkit.pv3d.FLARCamera3D;

        import org.papervision3d.lights.PointLight3D;
        import org.papervision3d.materials.shadematerials.FlatShadeMaterial;
        import org.papervision3d.materials.utils.MaterialsList;
        import org.papervision3d.objects.primitives.Cube;
        import org.papervision3d.render.BasicRenderEngine;
        import org.papervision3d.scenes.Scene3D;
        import org.papervision3d.view.Viewport3D;

        [SWF(width="640", height="480", frameRate="30", backgroundColor="#FFFFFF")]
        public class ARTester extends Sprite {

            [Embed(source="map.pat", mimeType="application/octet-stream")]
            private var pattern:Class;

            [Embed(source="camera_para.dat", mimeType="application/octet-stream")]
            private var params:Class;

            private var fparams:FLARParam;
            private var mpattern:FLARCode;
            private var vid:Video;
            private var cam:Camera;
            private var bmd:BitmapData;
            private var raster:FLARRgbRaster_BitmapData;
            private var detector:FLARSingleMarkerDetector;

            private var scene:Scene3D;
            private var camera:FLARCamera3D;
            private var container:FLARBaseNode;
            private var vp:Viewport3D;
            private var bre:BasicRenderEngine;
            private var trans:FLARTransMatResult;

            public function ARTester() {
                setupFLAR();
                setupCamera();
                setupBitmap();
                setupPV3D();
                addEventListener(Event.ENTER_FRAME, loop);
            }

            // Load the camera parameters and the 16x16 marker pattern.
            private function setupFLAR():void {
                fparams = new FLARParam();
                fparams.loadARParam(new params() as ByteArray);
                mpattern = new FLARCode(16, 16);
                mpattern.loadARPatt(new pattern());
            }

            // Attach the webcam to a Video object on the display list.
            private function setupCamera():void {
                vid = new Video(640, 480);
                cam = Camera.getCamera();
                cam.setMode(640, 480, 30);
                vid.attachCamera(cam);
                addChild(vid);
            }

            // Create the BitmapData raster the detector reads from.
            private function setupBitmap():void {
                bmd = new BitmapData(640, 480);
                bmd.draw(vid);
                raster = new FLARRgbRaster_BitmapData(bmd);
                detector = new FLARSingleMarkerDetector(fparams, mpattern, 80);
            }

            // Build the Papervision3D scene: three cubes stacked along z.
            private function setupPV3D():void {
                scene = new Scene3D();
                camera = new FLARCamera3D(fparams);
                container = new FLARBaseNode();
                scene.addChild(container);

                var pl:PointLight3D = new PointLight3D();
                pl.x = 1000;
                pl.y = 1000;
                pl.z = -1000;

                var ml:MaterialsList = new MaterialsList({all: new FlatShadeMaterial(pl)});
                var cube1:Cube = new Cube(ml, 30, 30, 30);
                var cube2:Cube = new Cube(ml, 30, 30, 30);
                cube2.z = 50;
                var cube3:Cube = new Cube(ml, 30, 30, 30);
                cube3.z = 100;
                container.addChild(cube1);
                container.addChild(cube2);
                container.addChild(cube3);

                bre = new BasicRenderEngine();
                trans = new FLARTransMatResult();
                vp = new Viewport3D();
                addChild(vp);
            }

            private function loop(e:Event):void {
                bmd.draw(vid);
                try {
                    if (detector.detectMarkerLite(raster, 80) && detector.getConfidence() > 0.5) {
                        detector.getTransformMatrix(trans);
                        container.setTransformMatrix(trans);
                        bre.renderScene(scene, camera, vp);
                    }
                } catch (err:Error) {}
            }
        }
    }
    Any help would be much appreciated.
    Thanks
    Daniel.
    Also, sorry, I wasn't sure how to neatly display my code in the post.

    You didn't post the original, before adding the +5 exposure.
    I played around in LR5 with some of my photos, and I found this one that exhibits similar results on adding +5 exposure, but it starts off as underexposed and low contrast.  Anything a bit more exposed or with a slightly brighter area (e.g. make dog's front leg a tiny bit brighter) does clip.  (Note - this is the original photo without adding any exposure).

  • Live Video editing using Augmented Reality

    I saw the Disney "I am in a band" video. I have a similar project in augmented reality based on enhancing the retail experience.
    I am having difficulty finding ActionScript code for live video editing (specifically for placing a different background in the back and front layers); if any of you can provide it, I would be very thankful.
    See the link to get a better understanding:
    http://www.youtube.com/user/psans1#p/a/u/0/oU2AowSgnGw
    Best
    Nitish Meena
    India

    The Apple people probably didn't understand that you wanted a live mix. FCP can't help you there.
    MultiCam is an FCP feature for "post-switching" multicamera recordings to get that live-to-tape look. It is fairly complex, and especially with only two "cameras" you would be much better off starting by editing them on the timeline.
    As for a live switch, you need a video switcher. That is outside the realm of FCP, but http://bhphotovideo.com is a great place to do research.
    Then, if you're really serious, you can record both feeds plus, optionally, the live-switched output for further manipulation in FCP or elsewhere.

  • About augmented reality SDKs for handhelds

    Greetings,
    We're trying to see if CS5 could be used to create a smartphone AR (augmented reality) application.
    Is there an SDK we could use to ease the process?
    It must work with iPhone, Android and Windows Phone 7.
    Thanks!

    One thing: the class files are all compiled into the swf, and therefore they are not used at runtime. You do not need to upload them to the server or anything, so that is one less thing to consider when you debug your issue.
    As for the paths, Flash will use the host HTML location as the base path once embedded, so it's best to use loaderInfo.url (always the location of the swf) to construct your paths relative to the swf; see the sketch below.
    Kenneth Kawamoto
    http://www.materiaprima.co.uk/
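    A minimal, hypothetical sketch of the loaderInfo.url approach Kenneth describes, assuming the class is the swf's document class (so loaderInfo is available in the constructor) and using a made-up asset path for illustration:

    package {
        import flash.display.Sprite;
        import flash.net.URLLoader;
        import flash.net.URLRequest;

        public class PathExample extends Sprite {
            public function PathExample() {
                // loaderInfo.url is always the location of the swf itself,
                // e.g. http://example.com/swfs/app.swf -> http://example.com/swfs/
                var swfDir:String = loaderInfo.url.substring(0, loaderInfo.url.lastIndexOf("/") + 1);

                // Build asset paths relative to the swf, not the host HTML page.
                var loader:URLLoader = new URLLoader();
                loader.load(new URLRequest(swfDir + "Data/camera_para.dat"));
            }
        }
    }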
