
Photo sphere mixed with 3D

Discussion in 'AR/VR (XR) Discussion' started by jk15, Oct 26, 2016.

  1. jk15

    jk15

    Joined:
    Jun 23, 2011
    Posts:
    49
    Hi All,

    I'm attempting to add a photo sphere to my VR game. Imagine you are standing on a 3D balcony in the app, I have a photo wrapped to the inside of a sphere, which is centered where your head is in VR. I'm having issues trying to align everything to look correct.

    Any thoughts? Thanks.
     
  2. Martinez-Vargas

    Martinez-Vargas

    Joined:
    Jun 24, 2016
    Posts:
    39
Hi, try resetting the position of the sphere in the Inspector, and resetting the position of the camera in the Inspector as well. The VR camera must sit at the center of the sphere so that everything stays aligned for you.
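The idea above can be sketched outside Unity as plain vector math (the names here are illustrative stand-ins, not Unity API): each frame, snap the photo sphere's position to the camera's position so the image stays centered on the viewer's head, while leaving the sphere's rotation fixed in world space.

```python
from dataclasses import dataclass

@dataclass
class Transform:
    # Minimal stand-in for a scene transform; only position matters here.
    position: tuple = (0.0, 0.0, 0.0)

def recenter_sphere(camera: Transform, sphere: Transform) -> None:
    # Run once per frame (Unity's LateUpdate would be the natural place):
    # the photo sphere must share the camera's position at all times.
    sphere.position = camera.position

camera = Transform(position=(1.2, 1.7, -0.4))
sphere = Transform()
recenter_sphere(camera, sphere)
assert sphere.position == camera.position
```

In Unity itself this amounts to parenting the sphere to the camera (position only) or copying the camera's position in LateUpdate.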
     
  3. jk15

    jk15

    Joined:
    Jun 23, 2011
    Posts:
    49
Thanks for the info - I did do this; it's just that items in the 3D foreground and the image never line up correctly. Items in the photo appear much closer than they should.
     
  4. JDMulti

    JDMulti

    Joined:
    Jan 4, 2012
    Posts:
    384
It's hard to align 3D objects within a photosphere. For real estate, however, I've done this by placing stereo 360 renders inside a cube rather than a sphere. I did this because the cube let me add extra effects that were impossible with a sphere. Even so, I found that past a certain distance, things no longer mix with a stereo image.

For example, I had a door at roughly 2-3 meters; I could add a 3D object to my scene and have it appear in the open door at the correct depth. Past about 3-4 meters, however, this effect is gone. Everything closer than 4 meters appears fine and mixes well with the stereo image (by "mix" I mean the sense of depth).

In certain situations you can go beyond 4 meters and it will still mix, but only if no other objects in your scene are closer than the 3D object you're placing. For example, I had an underwater scene with a big whale swimming at 10 meters and some huge sharks at 5 meters. They mixed well, but that was because the underwater scene had a far viewing distance and far-away objects in its stereo image.

So if you have a photo with a balcony and some objects within about 1 meter, you'll only be able to mix in 3D objects up to about 3-4 meters. If the first item in your photo appears beyond 3 meters, then you can go a little past that 3-4 meter range with 3D objects.

It's weird; it comes down to how you perceive things, whether they mix or not. I could place VR buttons in 3D space and successfully have them appear at the correct depth relative to the image, but mostly not beyond 3-4 meters; past that you won't sense the depth anymore. Beyond that distance you only place objects because you want GUI elements to overlap each other. Good luck experimenting. I have yet to see a stereo photo or render mix well with elements past 4 meters, because it won't happen.
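The 3-4 meter limit described above matches the geometry of binocular disparity: the vergence angle the eyes subtend for a point straight ahead shrinks roughly as IPD/distance, so the depth cue fades quickly past a few meters. A rough sketch of the numbers (the IPD value is an assumed typical figure, not from this thread):

```python
import math

IPD = 0.064  # typical interpupillary distance in meters (assumed)

def vergence_deg(distance_m: float) -> float:
    # Vergence angle for a point straight ahead at the given distance:
    # theta = 2 * atan(IPD / (2 * d)). The perceivable stereo depth cue is
    # the *difference* in vergence between two depths, which collapses as
    # both angles become small.
    return math.degrees(2 * math.atan(IPD / (2 * distance_m)))

for d in [1, 2, 4, 10]:
    print(f"{d:>2} m -> {vergence_deg(d):.3f} deg")
```

The angle falls off roughly as 1/distance, so the difference between, say, 4 m and 10 m is a fraction of the difference between 1 m and 2 m, which is consistent with depth mixing working well up close and vanishing past a few meters.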

Oh, BTW, one big thing: your brain will fool you massively. If your photo contains a lot of close content that makes your eyes converge near, your brain will say: "Hey, this space is small, so no need to focus far away." And if the main content fools your brain like that, you can't place something at 4 or 10 meters, because the main content keeps your brain from focusing past it.

A good example: take a stereo photo of a door in front of you at 2 meters. Place a 3D object past that 2 meters, at the door, and you'll see it cross-eyed. Place it closer than 2 meters and it will appear at the correct depth.
Now open the door, put someone in the doorway, and place a 3D object behind him, farther away in depth but not overlapping any closer photo content. You'll still see it cross-eyed, because it's too small and the main content is mostly closer.
Theory doesn't say much in this case; user experience tells a different story. Nonetheless, it's interesting to run these kinds of experiments. I did this a lot since I had to for work, and in the end we had some really nice stereo 360 real-estate content that gets used a lot for sales marketing ;)
     
    jk15 likes this.
  5. jk15

    jk15

    Joined:
    Jun 23, 2011
    Posts:
    49
  6. jakeabby

    jakeabby

    Joined:
    Jul 26, 2013
    Posts:
    1
    @JDMulti - I would be very interested to hear more about your process that you used for the real estate, and see some examples if you would share!

    We're working towards something similar (combining 360 images with 3D elements) and I'm struggling with some of the concepts!

    You can catch me on Skype: jakeabby or message me on here

    Thank you!
     
  7. JDMulti

    JDMulti

    Joined:
    Jan 4, 2012
    Posts:
    384
At the moment I have nothing to show in public, as what I've done is a private application.
But the idea is quite simple: I had an ocean as a stereo cubemap with ships, pipes, and some other stuff lying around. I just added a shark, a whale, and fish to make the scene more lifelike. I did the same with a room and a butterfly flying around.
It needs some tweaking to have them appear at the correct depth, but it's doable. =)
     
  8. Martinez-Vargas

    Martinez-Vargas

    Joined:
    Jun 24, 2016
    Posts:
    39
Hi, if you're interested in examples of virtual reality for real estate, I have a demo that was working but was never finished. If you're interested, you can take a look and tell me what you think.
    Email: jhan321798@hotmail.es
     
  9. igorpavlov

    igorpavlov

    Joined:
    Dec 24, 2016
    Posts:
    2
Is there any chance this happens because you shot the stereoscopy with parallel cameras, which means the zero-parallax point is at infinity? Could that cause the issue with aligning 3D meshes to the stereoscopy?
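For reference, with a parallel stereo rig the on-screen disparity of a point at depth Z is roughly b*f/Z (baseline b, focal length f in pixels), which only reaches zero as Z goes to infinity, so every scene point sits in front of the zero-parallax plane. Converging the cameras (or shifting the sensors) moves zero parallax to a finite distance, where rendered 3D geometry can then match the photo's depth. A quick numeric sketch (all values are illustrative assumptions):

```python
def disparity_px(Z_m, baseline_m=0.064, focal_px=1000.0, convergence_m=None):
    # Parallel rig: disparity = b * f / Z; zero only at infinity.
    # Converged/shifted rig: subtract the disparity at the convergence
    # distance, so points there land exactly on the zero-parallax plane.
    d = baseline_m * focal_px / Z_m
    if convergence_m is not None:
        d -= baseline_m * focal_px / convergence_m
    return d

print(disparity_px(2.0))                     # parallel rig: 32.0 px, never zero
print(disparity_px(2.0, convergence_m=2.0))  # converged at 2 m: 0.0 px
```

Under this model, stereo content shot with parallel cameras would indeed place everything "closer" than rendered geometry expects, which is consistent with the alignment problem described earlier in the thread.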