Liu, S.W., Fischer, M., Yoo, P. and Ritschel, T. (2024) Neural bounding. ACM SIGGRAPH 2024 Conference.
3641519.3657442.pdf - Published Version of Record (restricted to repository staff only)
Abstract
Bounding volumes are an established concept in computer graphics and vision tasks but have seen little change since their early inception. In this work, we study the use of neural networks as bounding volumes. Our key observation is that bounding, which so far has primarily been considered a problem of computational geometry, can be redefined as a problem of learning to classify space into free or occupied. This learning-based approach is particularly advantageous in high-dimensional spaces, such as animated scenes with complex queries, where neural networks are known to excel. However, unlocking neural bounding requires a twist: allowing – but also limiting – false positives, while ensuring that the number of false negatives is strictly zero. We enable such tight and conservative results using a dynamically-weighted asymmetric loss function. Our results show that our neural bounding produces up to an order of magnitude fewer false positives than traditional methods. In addition, we propose an extension of our bounding method using early exits that accelerates query speeds by 25 %. We also demonstrate that our approach is applicable to non-deep learning models that train within seconds. Our project page is at https://wenxin-liu.github.io/neural_bounding/.
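The central technical idea in the abstract, a dynamically-weighted asymmetric loss that drives false negatives towards zero while only limiting false positives, can be illustrated with a short sketch. This is a hypothetical example and not the authors' implementation: the network `BoundingMLP`, the helper `asymmetric_loss`, the toy sphere data, and the re-weighting schedule (`fn_weight` grown by 1.05 when false negatives remain, relaxed by 0.99 otherwise) are all assumptions made for illustration, and unlike the paper's method this sketch does not guarantee strictly zero false negatives.

```python
# Hypothetical sketch (not the authors' code) of learning a neural bound by
# classifying space into free vs. occupied with an asymmetric, dynamically
# re-weighted loss that penalises false negatives far more than false positives.
import torch
import torch.nn as nn

class BoundingMLP(nn.Module):
    """Small MLP that predicts a logit: >= 0 means 'possibly occupied'."""
    def __init__(self, dim=3, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def asymmetric_loss(logits, occupied, fn_weight):
    # Binary cross-entropy where occupied (positive) samples carry a larger
    # weight, so missing them (a false negative) costs more than over-covering
    # free space (a false positive).
    weights = 1.0 + (fn_weight - 1.0) * occupied
    return nn.functional.binary_cross_entropy_with_logits(
        logits, occupied, weight=weights)

model = BoundingMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
fn_weight = torch.tensor(1.0)

for step in range(2000):
    # Toy data: points in [0,1]^3, occupied inside a sphere of radius 0.3.
    x = torch.rand(1024, 3)
    occupied = ((x - 0.5).norm(dim=-1) < 0.3).float()
    logits = model(x)
    loss = asymmetric_loss(logits, occupied, fn_weight)
    opt.zero_grad(); loss.backward(); opt.step()

    # Crude dynamic re-weighting: if any occupied point is still classified
    # as free, grow the false-negative weight; otherwise relax it slightly.
    with torch.no_grad():
        has_fn = ((logits < 0) & (occupied > 0.5)).any()
        fn_weight = fn_weight * 1.05 if has_fn else torch.clamp(fn_weight * 0.99, min=1.0)
```

At query time, a point would be treated as possibly occupied whenever its logit is non-negative; the tightness of the resulting bound is then measured by how few genuinely free points are flagged this way, which is the false-positive rate the paper reports improving by up to an order of magnitude.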
Metadata
| Item Type: | Article |
| --- | --- |
| Additional Information: | SIGGRAPH '24: Special Interest Group on Computer Graphics and Interactive Techniques Conference, Denver, CO, USA, 27 July - 1 August 2024. ISBN: 9798400705250 |
| School: | Birkbeck Faculties and Schools > Faculty of Science > School of Computing and Mathematical Sciences |
| Depositing User: | Paul Yoo |
| Date Deposited: | 15 Jul 2024 13:11 |
| Last Modified: | 16 Jul 2024 00:02 |
| URI: | https://eprints.bbk.ac.uk/id/eprint/53824 |