
Updated to V0.3 beta

- RGB-D compatibility: the RGB-D examples have been adapted to the new version.

- KITTI and TUM dataset compatibility: these examples have been adapted to the new version.

- ROS compatibility: the old references in the code have been updated to work with this version.

- Config file parser: the YAML file contains the session configuration, and a wrong parametrization could previously break the execution without any information on how to solve it. This version parses the file, reads all the fields and reports a proper error message if a field is wrongly defined or does not exist (a minimal sketch of this validation pattern is shown below the commit details).

- Fixed minor bugs.
Author: Richard Elvira, 4 years ago
Commit: ef9784101f
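For reference, here is a minimal sketch of the validation pattern the new parser uses (mirroring the `ParseViewerParamFile`/`ParseCamParamFile` functions added in this commit): each parameter is looked up as a `cv::FileNode`, checked for existence and type, and a descriptive error is printed instead of silently breaking the execution. The file name `settings.yaml` and the helper `ReadRealParam` are illustrative only; `Viewer.KeyFrameSize` is one of the fields actually read in the diff.

```cpp
// Sketch of the field-validation pattern introduced in this version,
// assuming OpenCV and a settings.yaml containing the field below.
#include <opencv2/core.hpp>
#include <iostream>

// Read a real-valued parameter; report a descriptive error if it is
// missing or not a real number (same checks as in the commit).
static bool ReadRealParam(const cv::FileStorage &fs, const std::string &name, float &value)
{
    cv::FileNode node = fs[name];
    if(!node.empty() && node.isReal())
    {
        value = node.real();
        return true;
    }
    std::cerr << "*" << name << " parameter doesn't exist or is not a real number*" << std::endl;
    return false;
}

int main()
{
    cv::FileStorage fSettings("settings.yaml", cv::FileStorage::READ);

    float keyFrameSize = 0.f;
    if(!ReadRealParam(fSettings, "Viewer.KeyFrameSize", keyFrameSize))
    {
        std::cerr << "**ERROR in the config file, the format is not correct**" << std::endl;
        return 1;
    }
    std::cout << "Viewer.KeyFrameSize = " << keyFrameSize << std::endl;
    return 0;
}
```

In the actual commit this pattern is repeated for every camera, ORB and IMU parameter in `Tracking::ParseCamParamFile`, `ParseORBParamFile` and `ParseIMUParamFile`, and for the viewer settings in `MapDrawer` and `Viewer`.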

+ 1 - 0
.gitignore

@@ -46,3 +46,4 @@ cmake_modules/
 cmake-build-debug/
 
 *.pyc
+*.osa

+ 26 - 0
Changelog.md

@@ -0,0 +1,26 @@
+# ORB-SLAM3
+Details of changes between the different versions.
+
+### V0.3: Beta version, 4 Sep 2020
+
+- RGB-D compatibility: the RGB-D examples have been adapted to the new version.
+
+- KITTI and TUM dataset compatibility: these examples have been adapted to the new version.
+
+- ROS compatibility: the old references in the code have been updated to work with this version.
+
+- Config file parser: the YAML file contains the session configuration, and a wrong parametrization could previously break the execution without any information on how to solve it. This version parses the file, reads all the fields and reports a proper error message if a field is wrongly defined or does not exist.
+
+- Fixed minor bugs.
+
+
+### V0.2: Beta version, 7 Aug 2020
+Initial release. It has these capabilities:
+
+- Multi-map capability: it is able to handle multiple maps in the same session and merge them seamlessly when a common area is detected.
+
+- Inertial sensor: the IMU initialization takes 2 seconds to achieve a scale error below 5% and is refined over the next 10 seconds until it is around 1%. Inertial measurements are integrated at frame rate to estimate scale, gravity and velocity, improving visual feature detection and making the system robust to temporary occlusions.
+
+- Fisheye sensor: fisheye sensors are now fully supported in monocular and stereo.
+
+

+ 1 - 1
Dependencies.md

@@ -1,5 +1,5 @@
 ##List of Known Dependencies
-###ORB-SLAM3 version 0.2
+###ORB-SLAM3 version 0.3
 
 In this document we list all the pieces of code included  by ORB-SLAM3 and linked libraries which are not property of the authors of ORB-SLAM3.
 

+ 3 - 0
Examples/Monocular/EuRoC.yaml

@@ -16,6 +16,9 @@ Camera.k2: 0.07395907
 Camera.p1: 0.00019359
 Camera.p2: 1.76187114e-05
 
+Camera.width: 752
+Camera.height: 480
+
 # Camera frames per second 
 Camera.fps: 20.0
 

+ 4 - 0
Examples/Monocular/KITTI00-02.yaml

@@ -22,6 +22,10 @@ Camera.fps: 10.0
 # Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
 Camera.RGB: 1
 
+# Camera resolution
+Camera.width: 1241
+Camera.height: 376
+
 #--------------------------------------------------------------------------------------------
 # ORB Parameters
 #--------------------------------------------------------------------------------------------

+ 4 - 0
Examples/Monocular/KITTI03.yaml

@@ -22,6 +22,10 @@ Camera.fps: 10.0
 # Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
 Camera.RGB: 1
 
+# Camera resolution
+Camera.width: 1241
+Camera.height: 376
+
 #--------------------------------------------------------------------------------------------
 # ORB Parameters
 #--------------------------------------------------------------------------------------------

+ 4 - 0
Examples/Monocular/KITTI04-12.yaml

@@ -22,6 +22,10 @@ Camera.fps: 10.0
 # Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
 Camera.RGB: 1
 
+# Camera resolution
+Camera.width: 1241
+Camera.height: 376
+
 #--------------------------------------------------------------------------------------------
 # ORB Parameters
 #--------------------------------------------------------------------------------------------

+ 4 - 0
Examples/Monocular/TUM1.yaml

@@ -23,6 +23,10 @@ Camera.fps: 30.0
 # Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
 Camera.RGB: 1
 
+# Camera resolution
+Camera.width: 640
+Camera.height: 480
+
 #--------------------------------------------------------------------------------------------
 # ORB Parameters
 #--------------------------------------------------------------------------------------------

+ 4 - 0
Examples/Monocular/TUM2.yaml

@@ -23,6 +23,10 @@ Camera.fps: 30.0
 # Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
 Camera.RGB: 1
 
+# Camera resolution
+Camera.width: 640
+Camera.height: 480
+
 #--------------------------------------------------------------------------------------------
 # ORB Parameters
 #--------------------------------------------------------------------------------------------

+ 4 - 0
Examples/Monocular/TUM3.yaml

@@ -22,6 +22,10 @@ Camera.fps: 30.0
 # Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
 Camera.RGB: 1
 
+# Camera resolution
+Camera.width: 640
+Camera.height: 480
+
 #--------------------------------------------------------------------------------------------
 # ORB Parameters
 #--------------------------------------------------------------------------------------------

+ 3 - 0
Examples/RGB-D/TUM2.yaml

@@ -35,6 +35,9 @@ ThDepth: 40.0
 # Deptmap values factor 
 DepthMapFactor: 5208.0
 
+Camera.width: 640
+Camera.height: 480
+
 #--------------------------------------------------------------------------------------------
 # ORB Parameters
 #--------------------------------------------------------------------------------------------

+ 3 - 0
Examples/RGB-D/TUM3.yaml

@@ -34,6 +34,9 @@ ThDepth: 40.0
 # Deptmap values factor
 DepthMapFactor: 5000.0
 
+Camera.width: 640
+Camera.height: 480
+
 #--------------------------------------------------------------------------------------------
 # ORB Parameters
 #--------------------------------------------------------------------------------------------

+ 1 - 1
Examples/Stereo-Inertial/EuRoC.yaml

@@ -29,7 +29,7 @@ Camera.bf: 47.90639384423901
 Camera.RGB: 1
 
 # Close/Far threshold. Baseline times.
-ThDepth: 35 # 35
+ThDepth: 35.0 # 35
 
 # Transformation from camera 0 to body-frame (imu)
 Tbc: !!opencv-matrix

+ 6 - 3
Examples/Stereo-Inertial/TUM_512.yaml

@@ -39,8 +39,11 @@ Tlr: !!opencv-matrix
             -0.000656143613644,  -0.046896036240590,   0.998899560146304,   0.001015350132563]
 
 # Lapping area between images
-Lapping.left: 0
-Lapping.right: 511
+Camera.lappingBegin: 0
+Camera.lappingEnd: 511
+
+Camera2.lappingBegin: 0
+Camera2.lappingEnd: 511
 
 # Camera resolution
 Camera.width: 512
@@ -53,7 +56,7 @@ Camera.fps: 20.0
 Camera.RGB: 1
 
 # Close/Far threshold. Baseline times.
-ThDepth: 40
+ThDepth: 40.0
 Camera.bf: 19.3079
 
 

+ 6 - 3
Examples/Stereo-Inertial/TUM_512_outdoors.yaml

@@ -39,8 +39,11 @@ Tlr: !!opencv-matrix
             -0.000656143613644,  -0.046896036240590,   0.998899560146304,   0.001015350132563]
 
 # Lapping area between images
-Lapping.left: 0
-Lapping.right: 511
+Camera.lappingBegin: 0
+Camera.lappingEnd: 511
+
+Camera2.lappingBegin: 0
+Camera2.lappingEnd: 511
 
 # Camera resolution
 Camera.width: 512
@@ -53,7 +56,7 @@ Camera.fps: 20.0
 Camera.RGB: 1
 
 # Close/Far threshold. Baseline times.
-ThDepth: 40
+ThDepth: 40.0
 Camera.bf: 19.3079
 
 

+ 1 - 1
Examples/Stereo/EuRoC.yaml

@@ -31,7 +31,7 @@ Camera.bf: 47.90639384423901
 Camera.RGB: 1
 
 # Close/Far threshold. Baseline times.
-ThDepth: 35
+ThDepth: 35.0
 
 #--------------------------------------------------------------------------------------------
 # Stereo Rectification. Only if you need to pre-rectify the images.

+ 1 - 1
Examples/Stereo/KITTI00-02.yaml

@@ -31,7 +31,7 @@ Camera.bf: 386.1448
 Camera.RGB: 1
 
 # Close/Far threshold. Baseline times.
-ThDepth: 35
+ThDepth: 35.0
 
 #--------------------------------------------------------------------------------------------
 # ORB Parameters

+ 1 - 3
Examples/Stereo/KITTI03.yaml

@@ -16,8 +16,6 @@ Camera.k2: 0.0
 Camera.p1: 0.0
 Camera.p2: 0.0
 
-Camera.bFishEye: 0
-
 Camera.width: 1241
 Camera.height: 376
 
@@ -31,7 +29,7 @@ Camera.bf: 387.5744
 Camera.RGB: 1
 
 # Close/Far threshold. Baseline times.
-ThDepth: 40
+ThDepth: 40.0
 
 #--------------------------------------------------------------------------------------------
 # ORB Parameters

+ 1 - 1
Examples/Stereo/KITTI04-12.yaml

@@ -31,7 +31,7 @@ Camera.bf: 379.8145
 Camera.RGB: 1
 
 # Close/Far threshold. Baseline times.
-ThDepth: 40
+ThDepth: 40.0
 
 #--------------------------------------------------------------------------------------------
 # ORB Parameters

+ 9 - 3
Examples/Stereo/TUM_512.yaml

@@ -46,10 +46,16 @@ Tlr: !!opencv-matrix
             -0.000823363992158,   0.998899461915674,   0.046895490788700,   0.001946204678584,
             -0.000656143613644,  -0.046896036240590,   0.998899560146304,   0.001015350132563]
 
+# Camera resolution
+Camera.width: 512
+Camera.height: 512
+
 # Lapping area between images
-Lapping.left: 0
-Lapping.right: 511
+Camera.lappingBegin: 0
+Camera.lappingEnd: 511
 
+Camera2.lappingBegin: 0
+Camera2.lappingEnd: 511
 
 # Camera frames per second
 Camera.fps: 20.0
@@ -58,7 +64,7 @@ Camera.fps: 20.0
 Camera.RGB: 1
 
 # Close/Far threshold. Baseline times.
-ThDepth: 40
+ThDepth: 40.0
 
 Camera.bf: 19.3079
 

+ 3 - 1
README.md

@@ -1,8 +1,10 @@
 # ORB-SLAM3
 
-### V0.3: Beta version, 7 Aug 2020
+### V0.3: Beta version, 4 Sep 2020
 **Authors:** Carlos Campos, Richard Elvira, Juan J. Gómez Rodríguez, [José M. M. Montiel](http://webdiis.unizar.es/~josemari/), [Juan D. Tardos](http://webdiis.unizar.es/~jdtardos/).
 
+The [Changelog](https://github.com/UZ-SLAMLab/ORB_SLAM3/Changelog.md) describes the features of each version.
+
 ORB-SLAM3 is the first real-time SLAM library able to perform **Visual, Visual-Inertial and Multi-Map SLAM** with **monocular, stereo and RGB-D** cameras, using **pin-hole and fisheye** lens models. In all sensor configurations, ORB-SLAM3 is as robust as the best systems available in the literature, and significantly more accurate. 
 
 We provide examples to run ORB-SLAM3 in the [EuRoC dataset](http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets) using stereo or monocular, with or without IMU, and in the [TUM-VI dataset](https://vision.in.tum.de/data/datasets/visual-inertial-dataset) using fisheye stereo or monocular, with or without IMU. Videos of some example executions can be found at [ORB-SLAM3 channel](https://www.youtube.com/channel/UCXVt-kXG6T95Z4tVaYlU80Q).

+ 2 - 0
include/MapDrawer.h

@@ -47,6 +47,8 @@ public:
 
 private:
 
+    bool ParseViewerParamFile(cv::FileStorage &fSettings);
+
     float mKeyFrameSize;
     float mKeyFrameLineWidth;
     float mGraphLineWidth;

+ 5 - 0
include/Tracking.h

@@ -62,6 +62,11 @@ public:
 
     ~Tracking();
 
+    // Parse the config file
+    bool ParseCamParamFile(cv::FileStorage &fSettings);
+    bool ParseORBParamFile(cv::FileStorage &fSettings);
+    bool ParseIMUParamFile(cv::FileStorage &fSettings);
+
     // Preprocess the input and call Track(). Extract features and performs stereo matching.
     cv::Mat GrabImageStereo(const cv::Mat &imRectLeft,const cv::Mat &imRectRight, const double &timestamp, string filename);
     cv::Mat GrabImageRGBD(const cv::Mat &imRGB,const cv::Mat &imD, const double &timestamp, string filename);

+ 2 - 0
include/Viewer.h

@@ -61,6 +61,8 @@ public:
     bool both;
 private:
 
+    bool ParseViewerParamFile(cv::FileStorage &fSettings);
+
     bool Stop();
 
     System* mpSystem;

+ 2 - 2
src/Frame.cc

@@ -1047,8 +1047,8 @@ Frame::Frame(const cv::Mat &imLeft, const cv::Mat &imRight, const double &timeSt
     mvInvLevelSigma2 = mpORBextractorLeft->GetInverseScaleSigmaSquares();
 
     // ORB extraction
-    thread threadLeft(&Frame::ExtractORB,this,0,imLeft,0,511);
-    thread threadRight(&Frame::ExtractORB,this,1,imRight,0,511);
+    thread threadLeft(&Frame::ExtractORB,this,0,imLeft,static_cast<KannalaBrandt8*>(mpCamera)->mvLappingArea[0],static_cast<KannalaBrandt8*>(mpCamera)->mvLappingArea[1]);
+    thread threadRight(&Frame::ExtractORB,this,1,imRight,static_cast<KannalaBrandt8*>(mpCamera2)->mvLappingArea[0],static_cast<KannalaBrandt8*>(mpCamera2)->mvLappingArea[1]);
     threadLeft.join();
     threadRight.join();
     std::chrono::steady_clock::time_point t2 = std::chrono::steady_clock::now();

+ 86 - 6
src/MapDrawer.cc

@@ -31,13 +31,93 @@ MapDrawer::MapDrawer(Atlas* pAtlas, const string &strSettingPath):mpAtlas(pAtlas
 {
     cv::FileStorage fSettings(strSettingPath, cv::FileStorage::READ);
 
-    mKeyFrameSize = fSettings["Viewer.KeyFrameSize"];
-    mKeyFrameLineWidth = fSettings["Viewer.KeyFrameLineWidth"];
-    mGraphLineWidth = fSettings["Viewer.GraphLineWidth"];
-    mPointSize = fSettings["Viewer.PointSize"];
-    mCameraSize = fSettings["Viewer.CameraSize"];
-    mCameraLineWidth = fSettings["Viewer.CameraLineWidth"];
+    bool is_correct = ParseViewerParamFile(fSettings);
 
+    if(!is_correct)
+    {
+        std::cerr << "**ERROR in the config file, the format is not correct**" << std::endl;
+        try
+        {
+            throw -1;
+        }
+        catch(exception &e)
+        {
+
+        }
+    }
+}
+
+bool MapDrawer::ParseViewerParamFile(cv::FileStorage &fSettings)
+{
+    bool b_miss_params = false;
+
+    cv::FileNode node = fSettings["Viewer.KeyFrameSize"];
+    if(!node.empty())
+    {
+        mKeyFrameSize = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.KeyFrameSize parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    node = fSettings["Viewer.KeyFrameLineWidth"];
+    if(!node.empty())
+    {
+        mKeyFrameLineWidth = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.KeyFrameLineWidth parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    node = fSettings["Viewer.GraphLineWidth"];
+    if(!node.empty())
+    {
+        mGraphLineWidth = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.GraphLineWidth parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    node = fSettings["Viewer.PointSize"];
+    if(!node.empty())
+    {
+        mPointSize = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.PointSize parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    node = fSettings["Viewer.CameraSize"];
+    if(!node.empty())
+    {
+        mCameraSize = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.CameraSize parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    node = fSettings["Viewer.CameraLineWidth"];
+    if(!node.empty())
+    {
+        mCameraLineWidth = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.CameraLineWidth parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    return !b_miss_params;
 }
 
 void MapDrawer::DrawMapPoints()

+ 737 - 191
src/Tracking.cc

@@ -55,14 +55,229 @@ Tracking::Tracking(System *pSys, ORBVocabulary* pVoc, FrameDrawer *pFrameDrawer,
     // Load camera parameters from settings file
     cv::FileStorage fSettings(strSettingPath, cv::FileStorage::READ);
 
-    cv::Mat DistCoef = cv::Mat::zeros(4,1,CV_32F);
+    bool b_parse_cam = ParseCamParamFile(fSettings);
+    if(!b_parse_cam)
+    {
+        std::cout << "*Error with the camera parameters in the config file*" << std::endl;
+    }
+
+    // Load ORB parameters
+    bool b_parse_orb = ParseORBParamFile(fSettings);
+    if(!b_parse_orb)
+    {
+        std::cout << "*Error with the ORB parameters in the config file*" << std::endl;
+    }
+
+    initID = 0; lastID = 0;
+
+    // Load IMU parameters
+    bool b_parse_imu = true;
+    if(sensor==System::IMU_MONOCULAR || sensor==System::IMU_STEREO)
+    {
+        b_parse_imu = ParseIMUParamFile(fSettings);
+        if(!b_parse_imu)
+        {
+            std::cout << "*Error with the IMU parameters in the config file*" << std::endl;
+        }
+
+        mnFramesToResetIMU = mMaxFrames;
+    }
+
+    mbInitWith3KFs = false;
+
+
+    //Rectification parameters
+    /*mbNeedRectify = false;
+    if((mSensor == System::STEREO || mSensor == System::IMU_STEREO) && sCameraName == "PinHole")
+    {
+        cv::Mat K_l, K_r, P_l, P_r, R_l, R_r, D_l, D_r;
+        fSettings["LEFT.K"] >> K_l;
+        fSettings["RIGHT.K"] >> K_r;
+
+        fSettings["LEFT.P"] >> P_l;
+        fSettings["RIGHT.P"] >> P_r;
+
+        fSettings["LEFT.R"] >> R_l;
+        fSettings["RIGHT.R"] >> R_r;
+
+        fSettings["LEFT.D"] >> D_l;
+        fSettings["RIGHT.D"] >> D_r;
+
+        int rows_l = fSettings["LEFT.height"];
+        int cols_l = fSettings["LEFT.width"];
+        int rows_r = fSettings["RIGHT.height"];
+        int cols_r = fSettings["RIGHT.width"];
+
+        if(K_l.empty() || K_r.empty() || P_l.empty() || P_r.empty() || R_l.empty() || R_r.empty() || D_l.empty() || D_r.empty()
+                || rows_l==0 || cols_l==0 || rows_r==0 || cols_r==0)
+        {
+            mbNeedRectify = false;
+        }
+        else
+        {
+            mbNeedRectify = true;
+            // M1r y M2r son los outputs (igual para l)
+            // M1r y M2r son las matrices relativas al mapeo correspondiente a la rectificación de mapa en el eje X e Y respectivamente
+            //cv::initUndistortRectifyMap(K_l,D_l,R_l,P_l.rowRange(0,3).colRange(0,3),cv::Size(cols_l,rows_l),CV_32F,M1l,M2l);
+            //cv::initUndistortRectifyMap(K_r,D_r,R_r,P_r.rowRange(0,3).colRange(0,3),cv::Size(cols_r,rows_r),CV_32F,M1r,M2r);
+        }
+
+
+    }
+    else
+    {
+        int cols = 752;
+        int rows = 480;
+        cv::Mat R_l = cv::Mat::eye(3, 3, CV_32F);
+    }*/
+
+    mnNumDataset = 0;
+
+    if(!b_parse_cam || !b_parse_orb || !b_parse_imu)
+    {
+        std::cerr << "**ERROR in the config file, the format is not correct**" << std::endl;
+        try
+        {
+            throw -1;
+        }
+        catch(exception &e)
+        {
+
+        }
+    }
+
+    //f_track_stats.open("tracking_stats"+ _nameSeq + ".txt");
+    /*f_track_stats.open("tracking_stats.txt");
+    f_track_stats << "# timestamp, Num KF local, Num MP local, time" << endl;
+    f_track_stats << fixed ;*/
+
+#ifdef SAVE_TIMES
+    f_track_times.open("tracking_times.txt");
+    f_track_times << "# ORB_Ext(ms), Stereo matching(ms), Preintegrate_IMU(ms), Pose pred(ms), LocalMap_track(ms), NewKF_dec(ms)" << endl;
+    f_track_times << fixed ;
+#endif
+}
+
+Tracking::~Tracking()
+{
+    //f_track_stats.close();
+#ifdef SAVE_TIMES
+    f_track_times.close();
+#endif
+}
+
+bool Tracking::ParseCamParamFile(cv::FileStorage &fSettings)
+{
+    mDistCoef = cv::Mat::zeros(4,1,CV_32F);
+    cout << endl << "Camera Parameters: " << endl;
+    bool b_miss_params = false;
 
     string sCameraName = fSettings["Camera.type"];
-    if(sCameraName == "PinHole"){
-        float fx = fSettings["Camera.fx"];
-        float fy = fSettings["Camera.fy"];
-        float cx = fSettings["Camera.cx"];
-        float cy = fSettings["Camera.cy"];
+    if(sCameraName == "PinHole")
+    {
+        float fx, fy, cx, cy;
+
+        // Camera calibration parameters
+        cv::FileNode node = fSettings["Camera.fx"];
+        if(!node.empty() && node.isReal())
+        {
+            fx = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.fx parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        node = fSettings["Camera.fy"];
+        if(!node.empty() && node.isReal())
+        {
+            fy = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.fy parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        node = fSettings["Camera.cx"];
+        if(!node.empty() && node.isReal())
+        {
+            cx = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.cx parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        node = fSettings["Camera.cy"];
+        if(!node.empty() && node.isReal())
+        {
+            cy = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.cy parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        // Distortion parameters
+        node = fSettings["Camera.k1"];
+        if(!node.empty() && node.isReal())
+        {
+            mDistCoef.at<float>(0) = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.k1 parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        node = fSettings["Camera.k2"];
+        if(!node.empty() && node.isReal())
+        {
+            mDistCoef.at<float>(1) = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.k2 parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        node = fSettings["Camera.p1"];
+        if(!node.empty() && node.isReal())
+        {
+            mDistCoef.at<float>(2) = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.p1 parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        node = fSettings["Camera.p2"];
+        if(!node.empty() && node.isReal())
+        {
+            mDistCoef.at<float>(3) = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.p2 parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        node = fSettings["Camera.k3"];
+        if(!node.empty() && node.isReal())
+        {
+            mDistCoef.resize(5);
+            mDistCoef.at<float>(4) = node.real();
+        }
+
+        if(b_miss_params)
+        {
+            return false;
+        }
 
         vector<float> vCamCalib{fx,fy,cx,cy};
 
@@ -70,98 +285,354 @@ Tracking::Tracking(System *pSys, ORBVocabulary* pVoc, FrameDrawer *pFrameDrawer,
 
         mpAtlas->AddCamera(mpCamera);
 
-        DistCoef.at<float>(0) = fSettings["Camera.k1"];
-        DistCoef.at<float>(1) = fSettings["Camera.k2"];
-        DistCoef.at<float>(2) = fSettings["Camera.p1"];
-        DistCoef.at<float>(3) = fSettings["Camera.p2"];
+
+        std::cout << "- Camera: Pinhole" << std::endl;
+        std::cout << "- fx: " << fx << std::endl;
+        std::cout << "- fy: " << fy << std::endl;
+        std::cout << "- cx: " << cx << std::endl;
+        std::cout << "- cy: " << cy << std::endl;
+        std::cout << "- k1: " << mDistCoef.at<float>(0) << std::endl;
+        std::cout << "- k2: " << mDistCoef.at<float>(1) << std::endl;
+
+
+        std::cout << "- p1: " << mDistCoef.at<float>(2) << std::endl;
+        std::cout << "- p2: " << mDistCoef.at<float>(3) << std::endl;
+
+        if(mDistCoef.rows==5)
+            std::cout << "- k3: " << mDistCoef.at<float>(4) << std::endl;
+
+        mK = cv::Mat::eye(3,3,CV_32F);
+        mK.at<float>(0,0) = fx;
+        mK.at<float>(1,1) = fy;
+        mK.at<float>(0,2) = cx;
+        mK.at<float>(1,2) = cy;
+
     }
-    if(sCameraName == "KannalaBrandt8"){
-        float fx = fSettings["Camera.fx"];
-        float fy = fSettings["Camera.fy"];
-        float cx = fSettings["Camera.cx"];
-        float cy = fSettings["Camera.cy"];
+    else if(sCameraName == "KannalaBrandt8")
+    {
+        float fx, fy, cx, cy;
+        float k1, k2, k3, k4;
 
-        float K1 = fSettings["Camera.k1"];
-        float K2 = fSettings["Camera.k2"];
-        float K3 = fSettings["Camera.k3"];
-        float K4 = fSettings["Camera.k4"];
+        // Camera calibration parameters
+        cv::FileNode node = fSettings["Camera.fx"];
+        if(!node.empty() && node.isReal())
+        {
+            fx = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.fx parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+        node = fSettings["Camera.fy"];
+        if(!node.empty() && node.isReal())
+        {
+            fy = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.fy parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
 
-        std::cout << "K1 = " << K1 << std::endl;
-        std::cout << "K2 = " << K2 << std::endl;
-        std::cout << "K3 = " << K3 << std::endl;
-        std::cout << "K4 = " << K4 << std::endl;
+        node = fSettings["Camera.cx"];
+        if(!node.empty() && node.isReal())
+        {
+            cx = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.cx parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
 
-        vector<float> vCamCalib{fx,fy,cx,cy,K1,K2,K3,K4};
+        node = fSettings["Camera.cy"];
+        if(!node.empty() && node.isReal())
+        {
+            cy = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.cy parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
 
-        mpCamera = new KannalaBrandt8(vCamCalib);
+        // Distortion parameters
+        node = fSettings["Camera.k1"];
+        if(!node.empty() && node.isReal())
+        {
+            k1 = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.k1 parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+        node = fSettings["Camera.k2"];
+        if(!node.empty() && node.isReal())
+        {
+            k2 = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.k2 parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
 
-        mpAtlas->AddCamera(mpCamera);
+        node = fSettings["Camera.k3"];
+        if(!node.empty() && node.isReal())
+        {
+            k3 = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.k3 parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        node = fSettings["Camera.k4"];
+        if(!node.empty() && node.isReal())
+        {
+            k4 = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.k4 parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
+
+        if(!b_miss_params)
+        {
+            vector<float> vCamCalib{fx,fy,cx,cy,k1,k2,k3,k4};
+            mpCamera = new KannalaBrandt8(vCamCalib);
+
+            std::cout << "- Camera: Fisheye" << std::endl;
+            std::cout << "- fx: " << fx << std::endl;
+            std::cout << "- fy: " << fy << std::endl;
+            std::cout << "- cx: " << cx << std::endl;
+            std::cout << "- cy: " << cy << std::endl;
+            std::cout << "- k1: " << k1 << std::endl;
+            std::cout << "- k2: " << k2 << std::endl;
+            std::cout << "- k3: " << k3 << std::endl;
+            std::cout << "- k4: " << k4 << std::endl;
+
+            mK = cv::Mat::eye(3,3,CV_32F);
+            mK.at<float>(0,0) = fx;
+            mK.at<float>(1,1) = fy;
+            mK.at<float>(0,2) = cx;
+            mK.at<float>(1,2) = cy;
+        }
+
+        if(mSensor==System::STEREO || mSensor==System::IMU_STEREO){
+            // Right camera
+            // Camera calibration parameters
+            cv::FileNode node = fSettings["Camera2.fx"];
+            if(!node.empty() && node.isReal())
+            {
+                fx = node.real();
+            }
+            else
+            {
+                std::cerr << "*Camera2.fx parameter doesn't exist or is not a real number*" << std::endl;
+                b_miss_params = true;
+            }
+            node = fSettings["Camera2.fy"];
+            if(!node.empty() && node.isReal())
+            {
+                fy = node.real();
+            }
+            else
+            {
+                std::cerr << "*Camera2.fy parameter doesn't exist or is not a real number*" << std::endl;
+                b_miss_params = true;
+            }
+
+            node = fSettings["Camera2.cx"];
+            if(!node.empty() && node.isReal())
+            {
+                cx = node.real();
+            }
+            else
+            {
+                std::cerr << "*Camera2.cx parameter doesn't exist or is not a real number*" << std::endl;
+                b_miss_params = true;
+            }
+
+            node = fSettings["Camera2.cy"];
+            if(!node.empty() && node.isReal())
+            {
+                cy = node.real();
+            }
+            else
+            {
+                std::cerr << "*Camera2.cy parameter doesn't exist or is not a real number*" << std::endl;
+                b_miss_params = true;
+            }
+
+            // Distortion parameters
+            node = fSettings["Camera2.k1"];
+            if(!node.empty() && node.isReal())
+            {
+                k1 = node.real();
+            }
+            else
+            {
+                std::cerr << "*Camera2.k1 parameter doesn't exist or is not a real number*" << std::endl;
+                b_miss_params = true;
+            }
+            node = fSettings["Camera2.k2"];
+            if(!node.empty() && node.isReal())
+            {
+                k2 = node.real();
+            }
+            else
+            {
+                std::cerr << "*Camera2.k2 parameter doesn't exist or is not a real number*" << std::endl;
+                b_miss_params = true;
+            }
 
-        if(sensor==System::STEREO || sensor==System::IMU_STEREO){
-            //Right camera
-            fx = fSettings["Camera2.fx"];
-            fy = fSettings["Camera2.fy"];
-            cx = fSettings["Camera2.cx"];
-            cy = fSettings["Camera2.cy"];
+            node = fSettings["Camera2.k3"];
+            if(!node.empty() && node.isReal())
+            {
+                k3 = node.real();
+            }
+            else
+            {
+                std::cerr << "*Camera2.k3 parameter doesn't exist or is not a real number*" << std::endl;
+                b_miss_params = true;
+            }
 
-            K1 = fSettings["Camera2.k1"];
-            K2 = fSettings["Camera2.k2"];
-            K3 = fSettings["Camera2.k3"];
-            K4 = fSettings["Camera2.k4"];
+            node = fSettings["Camera2.k4"];
+            if(!node.empty() && node.isReal())
+            {
+                k4 = node.real();
+            }
+            else
+            {
+                std::cerr << "*Camera2.k4 parameter doesn't exist or is not a real number*" << std::endl;
+                b_miss_params = true;
+            }
 
-            cout << endl << "Camera2 Parameters: " << endl;
-            cout << "- fx: " << fx << endl;
-            cout << "- fy: " << fy << endl;
-            cout << "- cx: " << cx << endl;
-            cout << "- cy: " << cy << endl;
 
-            vector<float> vCamCalib2{fx,fy,cx,cy,K1,K2,K3,K4};
+            int leftLappingBegin = -1;
+            int leftLappingEnd = -1;
 
-            mpCamera2 = new KannalaBrandt8(vCamCalib2);
+            int rightLappingBegin = -1;
+            int rightLappingEnd = -1;
+
+            node = fSettings["Camera.lappingBegin"];
+            if(!node.empty() && node.isInt())
+            {
+                leftLappingBegin = node.operator int();
+            }
+            else
+            {
+                std::cout << "WARNING: Camera.lappingBegin not correctly defined" << std::endl;
+            }
+            node = fSettings["Camera.lappingEnd"];
+            if(!node.empty() && node.isInt())
+            {
+                leftLappingEnd = node.operator int();
+            }
+            else
+            {
+                std::cout << "WARNING: Camera.lappingEnd not correctly defined" << std::endl;
+            }
+            node = fSettings["Camera2.lappingBegin"];
+            if(!node.empty() && node.isInt())
+            {
+                rightLappingBegin = node.operator int();
+            }
+            else
+            {
+                std::cout << "WARNING: Camera2.lappingBegin not correctly defined" << std::endl;
+            }
+            node = fSettings["Camera2.lappingEnd"];
+            if(!node.empty() && node.isInt())
+            {
+                rightLappingEnd = node.operator int();
+            }
+            else
+            {
+                std::cout << "WARNING: Camera2.lappingEnd not correctly defined" << std::endl;
+            }
 
-            mpAtlas->AddCamera(mpCamera2);
+            node = fSettings["Tlr"];
+            if(!node.empty())
+            {
+                mTlr = node.mat();
+                if(mTlr.rows != 3 || mTlr.cols != 4)
+                {
+                    std::cerr << "*Tlr matrix have to be a 3x4 transformation matrix*" << std::endl;
+                    b_miss_params = true;
+                }
+            }
+            else
+            {
+                std::cerr << "*Tlr matrix doesn't exist*" << std::endl;
+                b_miss_params = true;
+            }
 
-            int leftLappingBegin = fSettings["Camera.lappingBegin"];
-            int leftLappingEnd = fSettings["Camera.lappingEnd"];
+            if(!b_miss_params)
+            {
+                static_cast<KannalaBrandt8*>(mpCamera)->mvLappingArea[0] = leftLappingBegin;
+                static_cast<KannalaBrandt8*>(mpCamera)->mvLappingArea[1] = leftLappingEnd;
 
-            int rightLappingBegin = fSettings["Camera2.lappingBegin"];
-            int rightLappingEnd = fSettings["Camera2.lappingEnd"];
+                mpFrameDrawer->both = true;
 
-            static_cast<KannalaBrandt8*>(mpCamera)->mvLappingArea[0] = leftLappingBegin;
-            static_cast<KannalaBrandt8*>(mpCamera)->mvLappingArea[1] = leftLappingEnd;
+                vector<float> vCamCalib2{fx,fy,cx,cy,k1,k2,k3,k4};
+                mpCamera2 = new KannalaBrandt8(vCamCalib2);
 
-            static_cast<KannalaBrandt8*>(mpCamera2)->mvLappingArea[0] = rightLappingBegin;
-            static_cast<KannalaBrandt8*>(mpCamera2)->mvLappingArea[1] = rightLappingEnd;
+                static_cast<KannalaBrandt8*>(mpCamera2)->mvLappingArea[0] = rightLappingBegin;
+                static_cast<KannalaBrandt8*>(mpCamera2)->mvLappingArea[1] = rightLappingEnd;
 
-            fSettings["Tlr"] >> mTlr;
-            cout << "- mTlr: \n" << mTlr << endl;
-            mpFrameDrawer->both = true;
-        }
-    }
+                std::cout << "- Camera1 Lapping: " << leftLappingBegin << ", " << leftLappingEnd << std::endl;
 
-    float fx = fSettings["Camera.fx"];
-    float fy = fSettings["Camera.fy"];
-    float cx = fSettings["Camera.cx"];
-    float cy = fSettings["Camera.cy"];
+                std::cout << std::endl << "Camera2 Parameters:" << std::endl;
+                std::cout << "- Camera: Fisheye" << std::endl;
+                std::cout << "- fx: " << fx << std::endl;
+                std::cout << "- fy: " << fy << std::endl;
+                std::cout << "- cx: " << cx << std::endl;
+                std::cout << "- cy: " << cy << std::endl;
+                std::cout << "- k1: " << k1 << std::endl;
+                std::cout << "- k2: " << k2 << std::endl;
+                std::cout << "- k3: " << k3 << std::endl;
+                std::cout << "- k4: " << k4 << std::endl;
 
-    cv::Mat K = cv::Mat::eye(3,3,CV_32F);
-    K.at<float>(0,0) = fx;
-    K.at<float>(1,1) = fy;
-    K.at<float>(0,2) = cx;
-    K.at<float>(1,2) = cy;
-    K.copyTo(mK);
+                std::cout << "- mTlr: \n" << mTlr << std::endl;
 
-    const float k3 = fSettings["Camera.k3"];
+                std::cout << "- Camera2 Lapping: " << rightLappingBegin << ", " << rightLappingEnd << std::endl;
+            }
+        }
 
-    if(k3!=0)
+        if(b_miss_params)
+        {
+            return false;
+        }
+
+        mpAtlas->AddCamera(mpCamera);
+        mpAtlas->AddCamera(mpCamera2);
+    }
+    else
     {
-        DistCoef.resize(5);
-        DistCoef.at<float>(4) = k3;
+        std::cerr << "*Not Supported Camera Sensor*" << std::endl;
+        std::cerr << "Check an example configuration file with the desired sensor" << std::endl;
     }
 
-    DistCoef.copyTo(mDistCoef);
+    if(mSensor==System::STEREO || mSensor==System::IMU_STEREO)
+    {
+        cv::FileNode node = fSettings["Camera.bf"];
+        if(!node.empty() && node.isReal())
+        {
+            mbf = node.real();
+        }
+        else
+        {
+            std::cerr << "*Camera.bf parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
 
-    mbf = fSettings["Camera.bf"];
+    }
 
     float fps = fSettings["Camera.fps"];
     if(fps==0)
@@ -171,24 +642,6 @@ Tracking::Tracking(System *pSys, ORBVocabulary* pVoc, FrameDrawer *pFrameDrawer,
     mMinFrames = 0;
     mMaxFrames = fps;
 
-    cout << endl << "Camera Parameters: " << endl;
-    cout << "- fx: " << fx << endl;
-    cout << "- fy: " << fy << endl;
-    cout << "- cx: " << cx << endl;
-    cout << "- cy: " << cy << endl;
-    cout << "- bf: " << mbf << endl;
-    cout << "- k1: " << DistCoef.at<float>(0) << endl;
-    cout << "- k2: " << DistCoef.at<float>(1) << endl;
-
-
-    cout << "- p1: " << DistCoef.at<float>(2) << endl;
-    cout << "- p2: " << DistCoef.at<float>(3) << endl;
-
-    if(DistCoef.rows==5)
-        cout << "- k3: " << DistCoef.at<float>(4) << endl;
-
-
-
     cout << "- fps: " << fps << endl;
 
 
@@ -200,144 +653,237 @@ Tracking::Tracking(System *pSys, ORBVocabulary* pVoc, FrameDrawer *pFrameDrawer,
     else
         cout << "- color order: BGR (ignored if grayscale)" << endl;
 
-    // Load ORB parameters
+    if(mSensor==System::STEREO || mSensor==System::RGBD || mSensor==System::IMU_STEREO)
+    {
+        float fx = mpCamera->getParameter(0);
+        cv::FileNode node = fSettings["ThDepth"];
+        if(!node.empty()  && node.isReal())
+        {
+            mThDepth = node.real();
+            mThDepth = mbf*mThDepth/fx;
+            cout << endl << "Depth Threshold (Close/Far Points): " << mThDepth << endl;
+        }
+        else
+        {
+            std::cerr << "*ThDepth parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
 
-    int nFeatures = fSettings["ORBextractor.nFeatures"];
-    float fScaleFactor = fSettings["ORBextractor.scaleFactor"];
-    int nLevels = fSettings["ORBextractor.nLevels"];
-    int fIniThFAST = fSettings["ORBextractor.iniThFAST"];
-    int fMinThFAST = fSettings["ORBextractor.minThFAST"];
 
-    mpORBextractorLeft = new ORBextractor(nFeatures,fScaleFactor,nLevels,fIniThFAST,fMinThFAST);
+    }
 
-    if(sensor==System::STEREO || sensor==System::IMU_STEREO)
-        mpORBextractorRight = new ORBextractor(nFeatures,fScaleFactor,nLevels,fIniThFAST,fMinThFAST);
+    if(mSensor==System::RGBD)
+    {
+        cv::FileNode node = fSettings["DepthMapFactor"];
+        if(!node.empty() && node.isReal())
+        {
+            mDepthMapFactor = node.real();
+            if(fabs(mDepthMapFactor)<1e-5)
+                mDepthMapFactor=1;
+            else
+                mDepthMapFactor = 1.0f/mDepthMapFactor;
+        }
+        else
+        {
+            std::cerr << "*DepthMapFactor parameter doesn't exist or is not a real number*" << std::endl;
+            b_miss_params = true;
+        }
 
-    if(sensor==System::MONOCULAR || sensor==System::IMU_MONOCULAR)
-        mpIniORBextractor = new ORBextractor(5*nFeatures,fScaleFactor,nLevels,fIniThFAST,fMinThFAST);
+    }
 
-    initID = 0; lastID = 0;
+    if(b_miss_params)
+    {
+        return false;
+    }
 
-    cout << endl << "ORB Extractor Parameters: " << endl;
-    cout << "- Number of Features: " << nFeatures << endl;
-    cout << "- Scale Levels: " << nLevels << endl;
-    cout << "- Scale Factor: " << fScaleFactor << endl;
-    cout << "- Initial Fast Threshold: " << fIniThFAST << endl;
-    cout << "- Minimum Fast Threshold: " << fMinThFAST << endl;
+    return true;
+}
 
-    if(sensor==System::STEREO || sensor==System::RGBD || sensor==System::IMU_STEREO)
+bool Tracking::ParseORBParamFile(cv::FileStorage &fSettings)
+{
+    bool b_miss_params = false;
+    int nFeatures, nLevels, fIniThFAST, fMinThFAST;
+    float fScaleFactor;
+
+    cv::FileNode node = fSettings["ORBextractor.nFeatures"];
+    if(!node.empty() && node.isInt())
+    {
+        nFeatures = node.operator int();
+    }
+    else
     {
-        mThDepth = mbf*(float)fSettings["ThDepth"]/fx;
-        cout << endl << "Depth Threshold (Close/Far Points): " << mThDepth << endl;
+        std::cerr << "*ORBextractor.nFeatures parameter doesn't exist or is not an integer*" << std::endl;
+        b_miss_params = true;
     }
 
-    if(sensor==System::RGBD)
+    node = fSettings["ORBextractor.scaleFactor"];
+    if(!node.empty() && node.isReal())
     {
-        mDepthMapFactor = fSettings["DepthMapFactor"];
-        if(fabs(mDepthMapFactor)<1e-5)
-            mDepthMapFactor=1;
-        else
-            mDepthMapFactor = 1.0f/mDepthMapFactor;
+        fScaleFactor = node.real();
+    }
+    else
+    {
+        std::cerr << "*ORBextractor.scaleFactor parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
     }
 
-    if(sensor==System::IMU_MONOCULAR || sensor==System::IMU_STEREO)
+    node = fSettings["ORBextractor.nLevels"];
+    if(!node.empty() && node.isInt())
+    {
+        nLevels = node.operator int();
+    }
+    else
     {
-        cv::Mat Tbc;
-        fSettings["Tbc"] >> Tbc;
-        cout << endl;
+        std::cerr << "*ORBextractor.nLevels parameter doesn't exist or is not an integer*" << std::endl;
+        b_miss_params = true;
+    }
 
-        cout << "Left camera to Imu Transform (Tbc): " << endl << Tbc << endl;
+    node = fSettings["ORBextractor.iniThFAST"];
+    if(!node.empty() && node.isInt())
+    {
+        fIniThFAST = node.operator int();
+    }
+    else
+    {
+        std::cerr << "*ORBextractor.iniThFAST parameter doesn't exist or is not an integer*" << std::endl;
+        b_miss_params = true;
+    }
 
-        float freq, Ng, Na, Ngw, Naw;
-        fSettings["IMU.Frequency"] >> freq;
-        fSettings["IMU.NoiseGyro"] >> Ng;
-        fSettings["IMU.NoiseAcc"] >> Na;
-        fSettings["IMU.GyroWalk"] >> Ngw;
-        fSettings["IMU.AccWalk"] >> Naw;
+    node = fSettings["ORBextractor.minThFAST"];
+    if(!node.empty() && node.isInt())
+    {
+        fMinThFAST = node.operator int();
+    }
+    else
+    {
+        std::cerr << "*ORBextractor.minThFAST parameter doesn't exist or is not an integer*" << std::endl;
+        b_miss_params = true;
+    }
 
-        const float sf = sqrt(freq);
-        cout << endl;
-        cout << "IMU frequency: " << freq << " Hz" << endl;
-        cout << "IMU gyro noise: " << Ng << " rad/s/sqrt(Hz)" << endl;
-        cout << "IMU gyro walk: " << Ngw << " rad/s^2/sqrt(Hz)" << endl;
-        cout << "IMU accelerometer noise: " << Na << " m/s^2/sqrt(Hz)" << endl;
-        cout << "IMU accelerometer walk: " << Naw << " m/s^3/sqrt(Hz)" << endl;
+    if(b_miss_params)
+    {
+        return false;
+    }
 
-        mpImuCalib = new IMU::Calib(Tbc,Ng*sf,Na*sf,Ngw/sf,Naw/sf);
+    mpORBextractorLeft = new ORBextractor(nFeatures,fScaleFactor,nLevels,fIniThFAST,fMinThFAST);
 
-        mpImuPreintegratedFromLastKF = new IMU::Preintegrated(IMU::Bias(),*mpImuCalib);
+    if(mSensor==System::STEREO || mSensor==System::IMU_STEREO)
+        mpORBextractorRight = new ORBextractor(nFeatures,fScaleFactor,nLevels,fIniThFAST,fMinThFAST);
 
-        mnFramesToResetIMU = mMaxFrames;
-    }
+    if(mSensor==System::MONOCULAR || mSensor==System::IMU_MONOCULAR)
+        mpIniORBextractor = new ORBextractor(5*nFeatures,fScaleFactor,nLevels,fIniThFAST,fMinThFAST);
 
-    mbInitWith3KFs = false;
+    cout << endl << "ORB Extractor Parameters: " << endl;
+    cout << "- Number of Features: " << nFeatures << endl;
+    cout << "- Scale Levels: " << nLevels << endl;
+    cout << "- Scale Factor: " << fScaleFactor << endl;
+    cout << "- Initial Fast Threshold: " << fIniThFAST << endl;
+    cout << "- Minimum Fast Threshold: " << fMinThFAST << endl;
 
+    return true;
+}
 
-    //Rectification parameters
-    /*mbNeedRectify = false;
-    if((mSensor == System::STEREO || mSensor == System::IMU_STEREO) && sCameraName == "PinHole")
+bool Tracking::ParseIMUParamFile(cv::FileStorage &fSettings)
+{
+    bool b_miss_params = false;
+
+    cv::Mat Tbc;
+    cv::FileNode node = fSettings["Tbc"];
+    if(!node.empty())
     {
-        cv::Mat K_l, K_r, P_l, P_r, R_l, R_r, D_l, D_r;
-        fSettings["LEFT.K"] >> K_l;
-        fSettings["RIGHT.K"] >> K_r;
+        Tbc = node.mat();
+        if(Tbc.rows != 4 || Tbc.cols != 4)
+        {
+            std::cerr << "*Tbc matrix have to be a 4x4 transformation matrix*" << std::endl;
+            b_miss_params = true;
+        }
+    }
+    else
+    {
+        std::cerr << "*Tbc matrix doesn't exist*" << std::endl;
+        b_miss_params = true;
+    }
 
-        fSettings["LEFT.P"] >> P_l;
-        fSettings["RIGHT.P"] >> P_r;
+    cout << endl;
 
-        fSettings["LEFT.R"] >> R_l;
-        fSettings["RIGHT.R"] >> R_r;
+    cout << "Left camera to Imu Transform (Tbc): " << endl << Tbc << endl;
 
-        fSettings["LEFT.D"] >> D_l;
-        fSettings["RIGHT.D"] >> D_r;
+    float freq, Ng, Na, Ngw, Naw;
 
-        int rows_l = fSettings["LEFT.height"];
-        int cols_l = fSettings["LEFT.width"];
-        int rows_r = fSettings["RIGHT.height"];
-        int cols_r = fSettings["RIGHT.width"];
+    node = fSettings["IMU.Frequency"];
+    if(!node.empty() && node.isInt())
+    {
+        freq = node.operator int();
+    }
+    else
+    {
+        std::cerr << "*IMU.Frequency parameter doesn't exist or is not an integer*" << std::endl;
+        b_miss_params = true;
+    }
 
-        if(K_l.empty() || K_r.empty() || P_l.empty() || P_r.empty() || R_l.empty() || R_r.empty() || D_l.empty() || D_r.empty()
-                || rows_l==0 || cols_l==0 || rows_r==0 || cols_r==0)
-        {
-            mbNeedRectify = false;
-        }
-        else
-        {
-            mbNeedRectify = true;
-            // M1r y M2r son los outputs (igual para l)
-            // M1r y M2r son las matrices relativas al mapeo correspondiente a la rectificación de mapa en el eje X e Y respectivamente
-            //cv::initUndistortRectifyMap(K_l,D_l,R_l,P_l.rowRange(0,3).colRange(0,3),cv::Size(cols_l,rows_l),CV_32F,M1l,M2l);
-            //cv::initUndistortRectifyMap(K_r,D_r,R_r,P_r.rowRange(0,3).colRange(0,3),cv::Size(cols_r,rows_r),CV_32F,M1r,M2r);
-        }
+    node = fSettings["IMU.NoiseGyro"];
+    if(!node.empty() && node.isReal())
+    {
+        Ng = node.real();
+    }
+    else
+    {
+        std::cerr << "*IMU.NoiseGyro parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
 
+    node = fSettings["IMU.NoiseAcc"];
+    if(!node.empty() && node.isReal())
+    {
+        Na = node.real();
+    }
+    else
+    {
+        std::cerr << "*IMU.NoiseAcc parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
 
+    node = fSettings["IMU.GyroWalk"];
+    if(!node.empty() && node.isReal())
+    {
+        Ngw = node.real();
     }
     else
     {
-        int cols = 752;
-        int rows = 480;
-        cv::Mat R_l = cv::Mat::eye(3, 3, CV_32F);
-    }*/
+        std::cerr << "*IMU.GyroWalk parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
 
-    mnNumDataset = 0;
+    node = fSettings["IMU.AccWalk"];
+    if(!node.empty() && node.isReal())
+    {
+        Naw = node.real();
+    }
+    else
+    {
+        std::cerr << "*IMU.AccWalk parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
 
-    //f_track_stats.open("tracking_stats"+ _nameSeq + ".txt");
-    /*f_track_stats.open("tracking_stats.txt");
-    f_track_stats << "# timestamp, Num KF local, Num MP local, time" << endl;
-    f_track_stats << fixed ;*/
+    if(b_miss_params)
+    {
+        return false;
+    }
 
-#ifdef SAVE_TIMES
-    f_track_times.open("tracking_times.txt");
-    f_track_times << "# ORB_Ext(ms), Stereo matching(ms), Preintegrate_IMU(ms), Pose pred(ms), LocalMap_track(ms), NewKF_dec(ms)" << endl;
-    f_track_times << fixed ;
-#endif
-}
+    const float sf = sqrt(freq);
+    cout << endl;
+    cout << "IMU frequency: " << freq << " Hz" << endl;
+    cout << "IMU gyro noise: " << Ng << " rad/s/sqrt(Hz)" << endl;
+    cout << "IMU gyro walk: " << Ngw << " rad/s^2/sqrt(Hz)" << endl;
+    cout << "IMU accelerometer noise: " << Na << " m/s^2/sqrt(Hz)" << endl;
+    cout << "IMU accelerometer walk: " << Naw << " m/s^3/sqrt(Hz)" << endl;
 
-Tracking::~Tracking()
-{
-    //f_track_stats.close();
-#ifdef SAVE_TIMES
-    f_track_times.close();
-#endif
+    mpImuCalib = new IMU::Calib(Tbc,Ng*sf,Na*sf,Ngw/sf,Naw/sf);
+
+    mpImuPreintegratedFromLastKF = new IMU::Preintegrated(IMU::Bias(),*mpImuCalib);
+
+
+    return true;
 }
 
 void Tracking::SetLocalMapper(LocalMapping *pLocalMapper)

+ 85 - 10
src/Viewer.cc

@@ -31,25 +31,100 @@ Viewer::Viewer(System* pSystem, FrameDrawer *pFrameDrawer, MapDrawer *pMapDrawer
 {
     cv::FileStorage fSettings(strSettingPath, cv::FileStorage::READ);
 
+    bool is_correct = ParseViewerParamFile(fSettings);
+
+    if(!is_correct)
+    {
+        std::cerr << "**ERROR in the config file, the format is not correct**" << std::endl;
+        try
+        {
+            throw -1;
+        }
+        catch(exception &e)
+        {
+
+        }
+    }
+
+    mbStopTrack = false;
+}
+
+bool Viewer::ParseViewerParamFile(cv::FileStorage &fSettings)
+{
+    bool b_miss_params = false;
+
     float fps = fSettings["Camera.fps"];
     if(fps<1)
         fps=30;
     mT = 1e3/fps;
 
-    mImageWidth = fSettings["Camera.width"];
-    mImageHeight = fSettings["Camera.height"];
-    if(mImageWidth<1 || mImageHeight<1)
+    cv::FileNode node = fSettings["Camera.width"];
+    if(!node.empty())
+    {
+        mImageWidth = node.real();
+    }
+    else
     {
-        mImageWidth = 640;
-        mImageHeight = 480;
+        std::cerr << "*Camera.width parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
     }
 
-    mViewpointX = fSettings["Viewer.ViewpointX"];
-    mViewpointY = fSettings["Viewer.ViewpointY"];
-    mViewpointZ = fSettings["Viewer.ViewpointZ"];
-    mViewpointF = fSettings["Viewer.ViewpointF"];
+    node = fSettings["Camera.height"];
+    if(!node.empty())
+    {
+        mImageHeight = node.real();
+    }
+    else
+    {
+        std::cerr << "*Camera.height parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
 
-    mbStopTrack = false;
+    node = fSettings["Viewer.ViewpointX"];
+    if(!node.empty())
+    {
+        mViewpointX = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.ViewpointX parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    node = fSettings["Viewer.ViewpointY"];
+    if(!node.empty())
+    {
+        mViewpointY = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.ViewpointY parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    node = fSettings["Viewer.ViewpointZ"];
+    if(!node.empty())
+    {
+        mViewpointZ = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.ViewpointZ parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    node = fSettings["Viewer.ViewpointF"];
+    if(!node.empty())
+    {
+        mViewpointF = node.real();
+    }
+    else
+    {
+        std::cerr << "*Viewer.ViewpointF parameter doesn't exist or is not a real number*" << std::endl;
+        b_miss_params = true;
+    }
+
+    return !b_miss_params;
 }
 
 void Viewer::Run()

+ 12 - 12
tum_vi_examples.sh

@@ -4,43 +4,43 @@ pathDatasetTUM_VI='/Datasets/TUM_VI' #Example, it is necesary to change it by th
 #------------------------------------
 # Monocular Examples
 echo "Launching Room 1 with Monocular sensor"
-./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room1_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room1_512.txt dataset-room1_512_mono
+./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room1_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room1_512.txt dataset-room1_512_mono
 
 echo "Launching Room 2 with Monocular sensor"
-./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room2_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room2_512.txt dataset-room2_512_mono
+./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room2_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room2_512.txt dataset-room2_512_mono
 
 echo "Launching Room 3 with Monocular sensor"
-./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room3_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room3_512.txt dataset-room3_512_mono
+./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room3_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room3_512.txt dataset-room3_512_mono
 
 echo "Launching Room 4 with Monocular sensor"
-./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room4_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room4_512.txt dataset-room4_512_mono
+./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room4_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room4_512.txt dataset-room4_512_mono
 
 echo "Launching Room 5 with Monocular sensor"
-./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room5_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room5_512.txt dataset-room5_512_mono
+./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room5_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room5_512.txt dataset-room5_512_mono
 
 echo "Launching Room 6 with Monocular sensor"
-./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room6_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room6_512.txt dataset-room6_512_mono
+./Examples/Monocular/mono_tum_vi Vocabulary/ORBvoc.txt Examples/Monocular/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room6_512_16/mav0/cam0/data Examples/Monocular/TUM_TimeStamps/dataset-room6_512.txt dataset-room6_512_mono
 
 
 #------------------------------------
 # Stereo Examples
 echo "Launching Room 1 with Stereo sensor"
-./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room1_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room1_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room1_512.txt dataset-room1_512_stereo
+./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room1_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room1_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room1_512.txt dataset-room1_512_stereo
 
 echo "Launching Room 2 with Stereo sensor"
-./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room2_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room2_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room2_512.txt dataset-room2_512_stereo
+./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room2_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room2_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room2_512.txt dataset-room2_512_stereo
 
 echo "Launching Room 3 with Stereo sensor"
-./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room3_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room3_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room3_512.txt dataset-room3_512_stereo
+./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room3_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room3_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room3_512.txt dataset-room3_512_stereo
 
 echo "Launching Room 4 with Stereo sensor"
-./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room4_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room4_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room4_512.txt dataset-room4_512_stereo
+./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room4_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room4_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room4_512.txt dataset-room4_512_stereo
 
 echo "Launching Room 5 with Stereo sensor"
-./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room5_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room5_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room5_512.txt dataset-room5_512_stereo
+./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room5_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room5_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room5_512.txt dataset-room5_512_stereo
 
 echo "Launching Room 6 with Stereo sensor"
-./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_VI.yaml "$pathDatasetTUM_VI"/dataset-room6_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room6_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room6_512.txt dataset-room6_512_stereo
+./Examples/Stereo/stereo_tum_vi Vocabulary/ORBvoc.txt Examples/Stereo/TUM_512.yaml "$pathDatasetTUM_VI"/dataset-room6_512_16/mav0/cam0/data "$pathDatasetTUM_VI"/dataset-room6_512_16/mav0/cam1/data Examples/Stereo/TUM_TimeStamps/dataset-room6_512.txt dataset-room6_512_stereo
 
 
 #------------------------------------