View Synthesis in Casually Captured Scenes Using a Cylindrical Neural Radiance Field With Exposure Compensation
Event Type: Poster
Interest Areas: Artificial Intelligence/Machine Learning, Deep Learning, Photography, Virtual Reality, Research & Education
Description: We extend Neural Radiance Fields (NeRF) with a cylindrical parameterization and a learned exposure compensation technique that enables rendering photorealistic novel views of casually captured 360-degree, outward-facing scenes.
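The abstract names two components: a cylindrical parameterization of the scene and per-image exposure compensation. As a rough illustration only (the poster does not specify the implementation, and all function names here are hypothetical), a cylindrical parameterization can map 3D sample points to (radius, angle, height) coordinates, while exposure compensation can scale rendered colors by a learned per-image factor:

```python
import numpy as np

def cylindrical_parameterization(points):
    """Map 3D points (x, y, z) to cylindrical coordinates
    (radius, angle, height). A hypothetical sketch, not the
    authors' exact parameterization."""
    x, y, z = points[..., 0], points[..., 1], points[..., 2]
    r = np.sqrt(x**2 + y**2)      # distance from the cylinder axis
    theta = np.arctan2(y, x)      # azimuthal angle in [-pi, pi]
    return np.stack([r, theta, z], axis=-1)

def apply_exposure(rgb, log_exposure):
    """Per-image exposure compensation: scale linear RGB by
    exp(log_exposure), one learned scalar per captured image.
    This is one plausible form; the poster may use another."""
    return rgb * np.exp(log_exposure)
```

In a NeRF-style pipeline, the cylindrical coordinates would replace the raw Cartesian inputs to the radiance field, and the exposure scalars would be optimized jointly with the network so that frames captured at different exposures remain consistent.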
The authors of this Poster have been invited to participate in the Technical Papers Summary and Q&A: Video Editing 2, Thursday, 12 August, 5-6 pm.