Please use this identifier to cite or link to this item: http://20.198.91.3:8080/jspui/handle/123456789/8615
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Basak, Piyali | -
dc.contributor.author | Mondal, Supriya | -
dc.date.accessioned | 2025-09-16T06:18:24Z | -
dc.date.available | 2025-09-16T06:18:24Z | -
dc.date.issued | 2022 | -
dc.date.submitted | 2022 | -
dc.identifier.other | DC3587 | -
dc.identifier.uri | http://20.198.91.3:8080/jspui/handle/123456789/8615 | -
dc.description.abstract | Visual object tracking is one of the most rapidly emerging areas in the field of computer vision, and significant progress has been achieved in recent years. Current research in this field focuses mainly on data acquisition for creating new benchmark databases suitable for evaluating the performance of various tracking applications, on designing new tracking methods, and so on. The detection ability of any tracking method depends largely on the database on which it is trained. To train tracking methods efficiently and to evaluate their tracking capability properly, suitable benchmark databases containing videos or image frames with various attributes (e.g., occlusion, blurriness) are required. The work conducted in this thesis focuses mainly on effective benchmarking of the tracking capabilities of multiple methods. The main contribution of this thesis lies in the design of a novel database, VTrack: a visual object tracking benchmark database comprising 25 videos containing thousands of frames with various attributes. | en_US
dc.format.extent | [x], 54p. | en_US
dc.language.iso | en | en_US
dc.publisher | Jadavpur University, Kolkata, West Bengal | en_US
dc.subject | VTrack Database | en_US
dc.title | VTrack: a novel visual object tracking benchmark database | en_US
dc.type | Text | en_US
dc.department | Jadavpur University, School of Bioscience and Engineering | en_US
Appears in Collections:Dissertation

Files in This Item:
File | Description | Size | Format
M.E. (Biomedical Engineering) Supriya Mondal.pdf | | 4.44 MB | Adobe PDF


Items in IR@JU are protected by copyright, with all rights reserved, unless otherwise indicated.