Synthesizing novel views for Street View experience

dc.contributor.author Lazorenko, Anastasiia
dc.date.accessioned 2021-09-07T07:39:36Z
dc.date.available 2021-09-07T07:39:36Z
dc.date.issued 2021
dc.identifier.citation Lazorenko, Anastasiya. Synthesizing novel views for Street View experience: Bachelor Thesis: manuscript / Anastasiya Lazorenko; Supervisor: Philipp Kofman; Ukrainian Catholic University, Department of Computer Sciences. – Lviv: 2021. – 30 p.: ill. uk
dc.identifier.uri https://er.ucu.edu.ua/handle/1/2854
dc.description.abstract Navigation applications often suffer from restricted, coarse-grained movement caused by limited capture of real-world locations. Even the largest collections of street photos, such as Street-View, Mapillary [31], and SPED, offer broad geographical coverage rather than dense, high-quality capture of specific scenes. A possible solution to this problem is post-processing of the available image collections to generate new photos that restore the missing viewpoints. This is the task of novel view synthesis, a well-established area of computer graphics and vision that has shown impressive results over the last several years [26], [27], [33]. However, reconstructing real-world outdoor scenes remains the most challenging setting and is still a subject of active research. In this work we explore different approaches to novel view synthesis and evaluate some of them on sparse real-world imagery from the Street-View dataset. uk
dc.language.iso en uk
dc.subject Novel View Synthesis uk
dc.subject cinematography uk
dc.subject virtual reality uk
dc.subject visual effects uk
dc.subject Google Maps’ Street View uk
dc.subject Google Earth uk
dc.title Synthesizing novel views for Street View experience uk
dc.type Preprint uk
dc.status Published for the first time uk