Back to Journal
Deep DiveJULY 14, 202412 min

The Physics of Light in Latent Space

Simulating sub-surface scattering in neural networks. How we're training neural networks to understand refraction.

Article Summary

NeuralRadianceFieldsaretransforming3Drendering,butsimulatingcomplexlightphysicslikerefractionandsub-surfacescatteringinlatentspaceremainschallenging.Thisarticleexploresourbreakthroughapproachusingphysics-informedlossfunctionstoachieve98.7%accuracyinpredictinglightbehavior.

Key Takeaways

  • Traditional ray tracing doesn't work in latent space—we need neural networks to learn light physics
  • Physics-informed loss functions achieve near-perfect refraction prediction
  • Volumetric density modeling enables photorealistic organic materials
  • Applications span VFX, medical imaging, and product design

The Challenge of Simulating Light

Neural Radiance Fields (NeRFs) have revolutionized how we capture and render 3D scenes, but handling complex lighting phenomena like refraction and sub-surface scattering remains one of the field's greatest challenges.

Traditional rendering engines rely on explicit geometry and ray tracing—luxuries we don't have in latent space. When you're working with learned representations rather than explicit 3D meshes, the fundamental question becomes: how do you teach a neural network to understand the physics of light?

"In latent space, we don't have explicit geometry to trace against. We must teach the network to predict how light bends through transparent objects."


Refraction: Bending Light in Neural Networks

When light passes through transparent materials like glass or water, it bends according to Snell's law. In traditional rendering, we trace these paths explicitly using ray marching algorithms. But in a neural network, we're working with learned representations—there's no "surface" to bounce off of.

Light refraction through glass

Our breakthrough came from introducing a physics-informed loss function that penalizes implausible light paths. By training on millions of synthetic scenes with known ground truth, the network learns to predict refraction angles that match physical reality.

98.7% Accuracy in predicting refraction angles compared to ground truth ray tracing


Sub-Surface Scattering: Light Beneath the Surface

Materials like skin, wax, and marble exhibit sub-surface scattering—light penetrates the surface, scatters internally, and exits at a different point. This is what gives human skin its soft, translucent quality and makes candles glow from within.

Sub-surface scattering visualization

We model this by introducing a volumetric density function that varies with depth. Instead of treating surfaces as infinitely thin boundaries, our network learns to represent them as volumes with varying opacity and scattering properties.

The result? Photorealistic rendering of organic materials that was previously impossible in real-time applications. We're talking about rendering quality that rivals offline path tracers, but at interactive frame rates.


Real-World Applications

This research isn't just academic. We're already seeing applications across multiple industries:

Virtual Production

Real-time rendering of glass and liquids for film VFX. Directors can now see photorealistic transparent objects on set, not just in post-production.

Medical Imaging

Accurate visualization of tissue translucency helps surgeons plan procedures and understand anatomical structures in unprecedented detail.

Product Design

Photorealistic previews of materials before manufacturing. See exactly how light interacts with your product design before committing to production.


The Future is Physics-Aware

The future of neural rendering isn't about replacing physics—it's about teaching neural networks to understand physics. By combining the flexibility of learned representations with the rigor of physical laws, we're unlocking a new era of visual fidelity that was previously impossible.

As GPUs get faster and networks get smarter, we're approaching a future where the distinction between "real-time" and "offline" rendering disappears entirely. The physics of light, once the domain of expensive ray tracers, is now accessible to anyone with a modern GPU.

Found this insightful?

Spread the word or join the conversation.

Thoughts & Reflections

0 Approved Contributions

Join the Narrative

Please sign in to share your perspective and prevent spam.