Optimizing neural radiance fields: a comprehensive review of the impact of different optimizers on neural radiance fields

Latika Pinjarkar, Aditya Nittala, Mahantesh P. Mattada, Vedant Pinjarkar, Bhumika Neole, Manisha Kogundi Math

Abstract


A neural radiance field (NeRF) is a deep learning model that represents a 3D scene from a collection of photographs. NeRF has been shown to render photorealistic images of novel views of a scene even from a small number of input images. However, the optimizer used during training can have a significant impact on the quality of the final reconstruction, and choosing an effective optimizer is one of the main challenges in training NeRF models. The optimizer updates the model's parameters to minimize the discrepancy between the model's predictions and the ground-truth data. In this study, we survey the optimizers that have been used to train NeRF models, present results comparing their effectiveness, and examine the benefits and drawbacks of each. Four optimizers are evaluated for training NeRF models: adaptive moment estimation (Adam), AdamW, root mean square propagation (RMSProp), and adaptive gradient (Adagrad). The most effective optimizer for a given task depends on several factors, including the size of the dataset, the complexity of the scene, and the required level of accuracy.
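To make the optimizer comparison concrete, the sketch below shows how the four optimizers could be instantiated and swapped in a NeRF-style training step. It is a minimal PyTorch sketch, not the paper's implementation: the tiny MLP standing in for the NeRF network, the learning rates, the weight decay, and the mean-squared photometric loss are all illustrative assumptions.

import torch
import torch.nn as nn

# Tiny MLP standing in for a NeRF density/color network (an assumption;
# the architecture and hyperparameters used in the paper may differ).
model = nn.Sequential(nn.Linear(3, 256), nn.ReLU(), nn.Linear(256, 4))

# The four optimizers compared in the study; learning rates are illustrative.
optimizers = {
    "Adam": torch.optim.Adam(model.parameters(), lr=5e-4),
    "AdamW": torch.optim.AdamW(model.parameters(), lr=5e-4, weight_decay=1e-2),
    "RMSProp": torch.optim.RMSprop(model.parameters(), lr=5e-4),
    "Adagrad": torch.optim.Adagrad(model.parameters(), lr=1e-2),
}

def train_step(optimizer, points, targets):
    # One step: update parameters to reduce the discrepancy between the
    # model's predictions and the reference (ground-truth) values.
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(points), targets)
    loss.backward()
    optimizer.step()
    return loss.item()

In a fair comparison, each optimizer would be run on a fresh copy of the model with its own tuned learning rate, rather than sharing one set of parameters as in this compressed sketch.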

Keywords


Adagrad; Adam; AdamW; Deep learning; NeRF; RMSProp

DOI: https://doi.org/10.11591/eei.v14i1.8315



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Bulletin of Electrical Engineering and Informatics (BEEI)
ISSN: 2089-3191, e-ISSN: 2302-9285
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).