FAQ Category

Technical Implementation & WebXR

Developer-focused guidance on building custom AR/3D experiences for real estate.

15 articles



How to completely fix tripod shadows and the nadir in a 360 real estate virtual tour?

The nadir (the blind spot at the bottom of the sphere where the tripod appears) can be fixed in several ways. Platforms like Matterport patch it automatically with AI-based nadir tools. For manual workflows, use Photoshop's clone stamp on the equirectangular projection, or capture an extra handheld shot looking straight down from each position and composite it in. Some photographers also use nadir caps: pre-made branded images that cover the tripod position.


Why are the vertical lines converging in my real estate 360 panoramic interior photos?

Converging verticals in 360 panoramas are caused by lens distortion and imprecise leveling of the camera during capture. The 360 camera must be perfectly level (use a leveling base or a tripod with a bubble level). In post-processing, correct pitch and roll (horizon leveling) in PTGui or your stitching software. Some cameras offer in-camera horizon leveling that automatically corrects minor misalignment during capture.


How to optimize a large 3D real estate Gaussian Splat model to load faster on mobile browsers?

Gaussian Splatting models can be massive. Optimize by: reducing splat count through decimation while preserving visual quality, implementing level-of-detail (LOD) streaming that loads nearby detail first, compressing the point cloud data with quantization, and using progressive loading so users see a coarse preview within seconds while full detail streams in the background. Three.js and custom WebGL shaders are the primary tools for browser-based splat rendering.
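The quantization step can be sketched in plain JavaScript. The function names and the 16-bit target are illustrative assumptions, not any particular library's API: positions are normalized into the model's bounding box and stored as 16-bit integers, halving each coordinate from 4 bytes to 2.

```javascript
// Sketch: quantize splat center positions from 32-bit floats to 16-bit
// integers within the model's bounding box (assumes a non-degenerate box).
function quantizePositions(positions, min, max) {
  const out = new Uint16Array(positions.length);
  for (let i = 0; i < positions.length; i += 3) {
    for (let a = 0; a < 3; a++) {
      const norm = (positions[i + a] - min[a]) / (max[a] - min[a]);
      out[i + a] = Math.round(norm * 65535); // full uint16 range
    }
  }
  return out; // 2 bytes per component instead of 4
}

// Inverse mapping, run on the client before uploading to the GPU.
function dequantizePositions(quantized, min, max) {
  const out = new Float32Array(quantized.length);
  for (let i = 0; i < quantized.length; i += 3) {
    for (let a = 0; a < 3; a++) {
      out[i + a] = min[a] + (quantized[i + a] / 65535) * (max[a] - min[a]);
    }
  }
  return out;
}
```

The same idea extends to splat scales, rotations, and spherical-harmonic color coefficients, which is where most of the file-size win comes from.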


What causes severe stitching errors when capturing a 3D virtual tour in a narrow hallway?

Narrow hallways create parallax errors because nearby surfaces shift dramatically between the camera's multiple lenses. The stitching algorithm struggles when objects are very close to the camera (under 1–2 meters). Solutions: position the camera in the center of the hallway, use a camera with a smaller inter-lens parallax offset, capture more positions spaced closer together, and in post-production use parallax-aware stitching software that handles close-range geometry.


How to correct lighting exposure problems between bright windows and dark rooms in 360 photos?

Use HDR (High Dynamic Range) bracketing: capture multiple exposures at each position (typically 3–7 brackets spanning the full dynamic range) and merge them in software. Most professional 360 cameras offer automatic HDR bracketing. In post-processing, tone-map the HDR image to balance bright window light with dark interior shadows. Shooting during overcast conditions or twilight also naturally reduces the dynamic range challenge.
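The merge step can be sketched as a weighted per-pixel average in plain JavaScript (grayscale, values in [0,1]). The Debevec-style "hat" weight and the function names are illustrative assumptions; real tools also recover the camera response curve and then tone-map the result.

```javascript
// Trust mid-tones most; distrust values near the clipped black/white ends.
function hatWeight(v) {
  return v <= 0.5 ? 2 * v : 2 * (1 - v);
}

// Merge bracketed exposures into one radiance estimate per pixel.
// exposures: array of Float32Array frames; exposureTimes: seconds per frame.
function mergeHDR(exposures, exposureTimes) {
  const n = exposures[0].length;
  const radiance = new Float32Array(n);
  for (let p = 0; p < n; p++) {
    let num = 0, den = 0;
    for (let e = 0; e < exposures.length; e++) {
      const v = exposures[e][p];
      const w = hatWeight(v);
      num += w * (v / exposureTimes[e]); // scale back to scene radiance
      den += w;
    }
    radiance[p] = den > 0 ? num / den : 0;
  }
  return radiance;
}
```

A pixel that clips to 1.0 in the long exposure gets zero weight there, so its value comes entirely from the shorter brackets, which is exactly how bright windows survive the merge.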


Can I use the WebXR API to embed an augmented reality property tour directly on a webpage?

Yes. The WebXR Device API enables AR experiences directly in mobile browsers without app installation. Using frameworks like A-Frame, Three.js, or Babylon.js with WebXR, you can build web pages that activate the user's camera, detect surfaces, and overlay 3D property models or furniture in augmented reality. This is particularly powerful for marketing — buyers tap a link and instantly enter an AR experience.
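A minimal sketch of the entry point, assuming Three.js or similar handles rendering afterwards. The xr parameter stands in for navigator.xr so the flow can also be exercised with a stub outside a browser; in a real page you would call startARTour(navigator.xr) from a user gesture.

```javascript
// Launch a browser AR session via the WebXR Device API (sketch).
async function startARTour(xr) {
  if (!xr || !(await xr.isSessionSupported('immersive-ar'))) {
    throw new Error('WebXR immersive-ar not available on this device');
  }
  // hit-test lets us place the property model on detected surfaces
  const session = await xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
  });
  session.addEventListener('end', () => {
    // tear down the renderer / return the page to 2D mode here
  });
  return session; // hand off to e.g. renderer.xr.setSession(session)
}
```

Because requestSession must be triggered by a user gesture, wire this to a button's click handler rather than calling it on page load.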


How to natively integrate SLAM technology into a custom real estate augmented reality web app?

SLAM (Simultaneous Localization and Mapping) in the browser relies on WebXR's built-in tracking capabilities rather than implementing SLAM from scratch. Mobile browsers on ARKit/ARCore-capable devices expose surface detection and spatial tracking through the WebXR API. Your Three.js or A-Frame application accesses these capabilities via XRSession with 'immersive-ar' mode, letting the browser handle the heavy SLAM computation natively.


Why is my Zillow 3D Home tour consistently failing to process the interactive floor plan?

Common causes: insufficient overlap between panoramic capture positions (aim for 50%+ visual overlap), inconsistent lighting between rooms causing stitching failures, reflective surfaces (mirrors, large windows) confusing the depth estimation, or using a camera model not fully supported by Zillow's processing pipeline. Ensure you follow Zillow's recommended capture pattern — methodical, consistent spacing, with the camera at chest height in each room's center.


How to programmatically add clickable hotspots and multimedia pop-ups in a Three.js 3D environment?

In Three.js, create hotspot markers as THREE.Sprite objects or as HTML overlay elements positioned in 3D space with CSS2DRenderer. Use raycasting (THREE.Raycaster) to detect mouse/touch intersections with hotspot objects. On click, trigger pop-up overlays containing images, video, text, or links. For VR mode, implement gaze-based interaction: hovering the reticle on a hotspot for a threshold duration triggers the same action.
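What THREE.Raycaster does for sphere-shaped hotspots can be illustrated with plain ray-sphere math. This is a sketch with hypothetical names; in a real scene you would call raycaster.intersectObjects and read the nearest result.

```javascript
// Intersect a ray (origin + unit direction) with each hotspot's bounding
// sphere and return the nearest hit, or null if the tap hit empty space.
function pickHotspot(origin, dir, hotspots) {
  let nearest = null;
  for (const h of hotspots) {
    const oc = [h.center[0] - origin[0], h.center[1] - origin[1], h.center[2] - origin[2]];
    const t = oc[0] * dir[0] + oc[1] * dir[1] + oc[2] * dir[2]; // closest approach along ray
    if (t < 0) continue; // hotspot is behind the camera
    const cx = origin[0] + t * dir[0] - h.center[0];
    const cy = origin[1] + t * dir[1] - h.center[1];
    const cz = origin[2] + t * dir[2] - h.center[2];
    if (cx * cx + cy * cy + cz * cz <= h.radius * h.radius) {
      if (!nearest || t < nearest.distance) nearest = { hotspot: h, distance: t };
    }
  }
  return nearest;
}
```

Sorting by distance matters when hotspots overlap on screen: the user expects the tap to select the marker in front, not the one behind it.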


What are the optimal export mesh settings for importing a 3D architectural model into Unity for VR?

For real-time VR performance in Unity: target under 500K polygons for the full scene, use FBX format for best Unity compatibility, bake high-poly detail into normal maps, limit texture atlases to 2K–4K resolution, and ensure proper UV unwrapping. Set the import scale to match Unity's 1 unit = 1 meter convention. For large properties, implement occlusion culling and LOD groups so only visible geometry renders each frame.


How to implement a virtual ray hit test to place 3D furniture using WebXR JavaScript?

Use the WebXR Hit Test API: request an XRHitTestSource from the XR session, perform hit tests each frame against detected surfaces, visualize the candidate placement point with a reticle mesh, and on user tap read the hit result's pose and place the Three.js furniture model there (optionally creating an XRAnchor so the placement stays locked as tracking refines). The model then holds its real-world position as the user moves. This is the core mechanic behind browser-based AR furniture placement apps.
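The placement step reduces to plain matrix math: an XRPose exposes a column-major 4x4 matrix (XRRigidTransform.matrix), and the hit point is that transform applied to the local origin. The model object shape and helper names below are illustrative assumptions.

```javascript
// Apply a column-major 4x4 transform (WebGL/WebXR layout) to a point [x, y, z].
function transformPoint(m, p) {
  return [
    m[0] * p[0] + m[4] * p[1] + m[8] * p[2] + m[12],
    m[1] * p[0] + m[5] * p[1] + m[9] * p[2] + m[13],
    m[2] * p[0] + m[6] * p[1] + m[10] * p[2] + m[14],
  ];
}

// Move the furniture model to the surface point the hit test found.
// In WebXR the matrix comes from hitTestResult.getPose(refSpace).transform.matrix.
function placeAtHitPose(model, poseMatrix) {
  model.position = transformPoint(poseMatrix, [0, 0, 0]);
  return model;
}
```

With Three.js you would normally skip the hand-rolled math and call object.matrix.fromArray(poseMatrix) followed by matrix decomposition, but the underlying operation is the same.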


Can I create a performant 3D real estate viewer using only Vanilla JavaScript and Three.js?

Absolutely. Three.js with vanilla JavaScript is a lightweight, framework-free approach that produces highly performant 3D viewers. Load GLTF/GLB models with GLTFLoader, implement OrbitControls for navigation, and add environment maps for realistic material rendering. Without framework overhead (React, Vue, etc.), the initial bundle stays small and rendering performance is maximized, which is critical for mobile devices viewing heavy 3D property scenes.


How to generate a seamless day-to-night live panorama transition in 3D Vista Virtual Tour Pro?

3D Vista's "Live Panorama" feature captures multiple exposures at different times of day from the same position. In the editor, assign these as time layers and configure the transition slider. Visitors can then drag a timeline to smoothly transition between daylight, golden hour, and nighttime views of the property. This feature is particularly powerful for showcasing how a property's ambiance changes throughout the day.


What is the best method to extract high-quality 2D still photos from a 360 video walkthrough?

Use the 360 video software's built-in "reframe" or "keyframe" export feature (available in Insta360 Studio, GoPro Player, etc.) to select the exact angle and field of view, then export as a high-resolution still image. For maximum quality, shoot 360 video at the highest available resolution (8K+) and extract stills at specific timestamps. Post-process the extracted crops in Lightroom for professional-grade output.
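The math behind reframing can be sketched as the mapping from a 3D view direction to the equirectangular pixel it samples: longitude maps to x, latitude to y. The axis convention below (-Z forward, Y up, as in Three.js) is one common choice; real tools add bilinear interpolation and field-of-view handling on top of this lookup.

```javascript
// Map a unit view direction to pixel coordinates in a width x height
// equirectangular frame. Returns [u, v] with (0, 0) at the top-left.
function directionToEquirect(dir, width, height) {
  const [x, y, z] = dir;
  const lon = Math.atan2(x, -z);                        // -PI..PI, 0 = straight ahead
  const lat = Math.asin(Math.max(-1, Math.min(1, y)));  // -PI/2..PI/2, up is positive
  const u = (lon / (2 * Math.PI) + 0.5) * width;
  const v = (0.5 - lat / Math.PI) * height;
  return [u, v];
}
```

A reframe export simply evaluates this for every pixel of the output rectilinear view, which is why source resolution matters so much: a narrow crop samples only a small fraction of the 8K sphere.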


How to configure an interactive project generator using npm create @iwsdk for an AR app?

Run 'npm create @iwsdk@latest' to scaffold a new WebXR project. The CLI walks you through selecting AR vs VR mode, choosing a rendering engine (Three.js or Babylon.js), and configuring build tools. The generated project includes hit testing, surface detection, and 3D model loading pre-configured. From there, customize the scene to load your property models and implement interaction handlers specific to your real estate use case.
