
Original Article

JJCIT. 2020; 6(4): 434-445


Improved Deep Learning Architecture For Depth Estimation

Suhaila Farhan Abuowaida, Huah Yong Chan.




Abstract

Depth estimation from a single image has garnered attention in recent years owing to its numerous applications in medicine, robotics, video games, and 3D reality. Although human vision accomplishes this closely related recovery of the third dimension with ease, the task remains challenging for computer vision: differences in scene geometry and texture, occlusions at scene boundaries, and the inherent ambiguity arising from the minimal information that can be gathered from a single image all complicate the problem. This paper therefore proposes a novel architecture that manages depth estimation from a single RGB image. The proposed encoder-decoder architecture builds on an improved DenseNet and extracts the feature maps of an image using a skip-connection technique. The paper also adopts the reverse Huber (berHu) loss function, which suits our architecture and is driven by the value distributions commonly present in depth maps. Experimental results on the NYU Depth v2 dataset indicate that the proposed architecture outperforms other state-of-the-art methods while having fewer parameters and requiring less training time.
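The reverse Huber (berHu) loss mentioned in the abstract behaves like L1 for small residuals and like a shifted L2 for large ones, with the switch-over threshold set adaptively per batch. A minimal NumPy sketch follows; the `scale=0.2` threshold (20% of the maximum absolute residual) is a common choice in the depth-estimation literature, not a value stated in this abstract.

```python
import numpy as np

def berhu_loss(pred, target, scale=0.2):
    """Reverse Huber (berHu) loss between two depth maps.

    L1 for residuals with |diff| <= c, quadratic above, where
    c = scale * max |diff| is chosen adaptively from the batch.
    """
    diff = np.abs(pred - target)
    c = scale * diff.max()
    if c == 0.0:               # identical maps: loss is exactly zero
        return 0.0
    l1 = diff                              # linear region, |diff| <= c
    l2 = (diff ** 2 + c ** 2) / (2.0 * c)  # quadratic region, |diff| > c
    return float(np.mean(np.where(diff <= c, l1, l2)))
```

Because the threshold tracks the largest residual, the quadratic branch always penalizes the worst errors more heavily, while the L1 branch keeps gradients well-behaved for the many small residuals typical of depth maps.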

Key words: Deep Learning; Depth Estimation; Encoder-Decoder
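The encoder-decoder structure with skip connections described above can be sketched in miniature. The sketch below is illustrative only (single-channel NumPy arrays with pooling/upsampling standing in for DenseNet blocks); the function names are hypothetical, not from the paper.

```python
import numpy as np

def encode(x):
    # encoder stage: 2x2 average pooling halves the spatial resolution
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def decode(x):
    # decoder stage: nearest-neighbour upsampling doubles the resolution
    return x.repeat(2, axis=0).repeat(2, axis=1)

def decode_with_skip(bottleneck, encoder_feat):
    # skip connection: stack the upsampled decoder feature with the
    # encoder feature of matching resolution along a channel axis,
    # letting fine spatial detail bypass the bottleneck
    up = decode(bottleneck)
    return np.stack([up, encoder_feat], axis=0)

# image -> encoder feature -> bottleneck -> decoder output with skip
image = np.arange(16, dtype=float).reshape(4, 4)
feat = encode(image)                      # (2, 2) encoder feature map
bottleneck = encode(feat)                 # (1, 1) bottleneck
out = decode_with_skip(bottleneck, feat)  # (2, 2, 2): [upsampled, skip]
```

In the full architecture the same idea applies at every resolution level: each decoder stage concatenates the corresponding encoder feature map before further processing, which is what the skip-connection technique refers to.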







The articles in Bibliomed are open access articles licensed under Creative Commons Attribution 4.0 International License (CC BY), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.