

    Please use this identifier to cite or link to this item: http://asiair.asia.edu.tw/ir/handle/310904400/102107


    Title: VEHICLE VERIFICATION IN TWO NONOVERLAPPED VIEWS USING SPARSE REPRESENTATION
    Authors: 徐士中 (Hsu, Shih-Chung); 黃仲陵 (Huang, Chung-Lin)*
    Contributors: Department of M-Commerce and Multimedia Applications (行動商務與多媒體應用學系)
    Date: 2016-12
    Issue Date: 2017-03-01 13:54:59 (UTC+8)
    Abstract: Vehicle verification across two different camera views can be applied to Intelligent Transportation Systems. However, matching object appearance across two different views is difficult. The vehicle images captured in the two views are represented as a feature pair, which can be classified as a same or different pair. Sparse representation (SR) has been applied to reconstruction, recognition, and verification; however, the SR dictionary may not guarantee feature sparsity and effective representation. In this paper, we propose a Boost-KSVD method that generates the SR dictionary without using initial random atoms, and apply it to object verification with very good accuracy. We then develop a discriminative criterion to decide the SR dictionary size. Finally, experiments show that our method achieves better verification accuracy than competing methods.
    Relation: Asia-Pacific Signal and Information Processing Association Annual Summit and Conference 2016
    Appears in Collections: [Department of M-Commerce and Multimedia Applications] Journal Articles
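
The abstract rests on coding each view's feature vector sparsely over a learned dictionary and comparing the resulting codes to decide same/different pairs. The sketch below is not the authors' Boost-KSVD; it is a minimal, generic illustration of SR-based pair verification, assuming a fixed random unit-norm dictionary, a hand-rolled Orthogonal Matching Pursuit (OMP) coder, and a hypothetical cosine-similarity threshold on the sparse codes.

```python
import numpy as np

def omp(D, x, k):
    """Greedy OMP: sparse code of x over dictionary D with at most k nonzeros."""
    residual = x.copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit of x on the selected atoms
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        coef[:] = 0.0
        coef[support] = sol
        residual = x - D @ coef
    return coef

def verify_pair(D, x1, x2, k=3, thresh=0.9):
    """Declare a 'same' pair when the two sparse codes are close in cosine similarity.
    The 0.9 threshold is an illustrative assumption, not a value from the paper."""
    a1, a2 = omp(D, x1, k), omp(D, x2, k)
    sim = (a1 @ a2) / (np.linalg.norm(a1) * np.linalg.norm(a2) + 1e-12)
    return sim >= thresh

# Toy demo: a 3-sparse signal and a lightly perturbed copy of it.
rng = np.random.default_rng(0)
D = rng.standard_normal((16, 32))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
x = D[:, [1, 5, 9]] @ np.array([1.0, 0.7, 0.4])
print("same pair verified:", verify_pair(D, x, x + 0.01 * rng.standard_normal(16)))
```

In the actual paper the dictionary is learned (Boost-KSVD) rather than random, and the same/different decision is made by a trained classifier on the feature pair rather than a fixed threshold; the sketch only shows the sparse-coding-and-compare skeleton.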

    Files in This Item:

    index.html (0 Kb, HTML, 147 views)

    All items in ASIAIR are protected by copyright, with all rights reserved.

