This work establishes a robocentric framework built around a nonlinear Model Predictive Controller (NMPC) for autonomous navigation of quadrotors in tunnel-like environments. The proposed framework enables obstacle-free navigation on resource-constrained platforms, without any prior knowledge of the environment, in areas posing critical challenges such as darkness, textureless surfaces, and self-similar geometries. The core contribution stems from merging perception dynamics into a model-based optimization approach, aligning the vehicle's heading with the tunnel's open space, expressed as the x-axis image coordinate of the most distant visible area. Moreover, the aerial vehicle is treated as a free-flying object that plans its actions using egocentric onboard sensing. The proposed method can be deployed in both fully illuminated indoor corridors and featureless dark tunnels, leveraging visual processing from either RGB-D or monocular sensors to generate direction commands that keep the vehicle flying in the proper direction. Multiple experimental field trials demonstrate the effectiveness of the proposed method in challenging environments.
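To make the heading-alignment idea concrete, the following is a minimal illustrative sketch (not the authors' implementation) of how the x-axis image coordinate of the most distant area could be turned into a heading reference for the NMPC. The function name, the `far_fraction` threshold, and the use of a pinhole camera model with intrinsics `fx`, `cx` are all assumptions for this example.

```python
import numpy as np

def heading_reference_from_depth(depth, fx, cx, far_fraction=0.05):
    """Illustrative sketch: locate the deepest image region (the tunnel's
    open space) and convert its x-coordinate into a yaw reference angle.

    depth : HxW array of depth values in meters (NaN/0 = invalid).
    fx, cx : horizontal focal length and principal point (pinhole model).
    far_fraction : assumed fraction of farthest pixels kept as "open space".
    """
    valid = np.isfinite(depth) & (depth > 0)
    if not valid.any():
        return 0.0  # no depth information: hold the current heading

    # Keep only the farthest `far_fraction` of valid pixels.
    thresh = np.quantile(depth[valid], 1.0 - far_fraction)
    far_mask = valid & (depth >= thresh)

    # Centroid x-coordinate of the far region, in pixels.
    x_far = np.nonzero(far_mask)[1].mean()

    # Horizontal bearing toward the open space; this angle would be
    # fed to the NMPC as the heading (yaw) reference.
    return float(np.arctan2(x_far - cx, fx))
```

A positive return value steers the vehicle right of the optical axis, a negative one left; the NMPC would then track this reference while respecting the platform's dynamics.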