
    Please use this identifier to cite or link to this item: http://asiair.asia.edu.tw/ir/handle/310904400/18932

    Title: Video Attention Ranking using Visual and Contextual Attention Model for Content Driven Sports Videos Mining
    Authors: 黃仲陵;Huang, Chung-Lin
    Contributors: Department of Information and Multimedia Applications (資訊多媒體應用學系)
    Date: 2009-02
    Issue Date: 2012-11-26 15:10:52 (UTC+8)
    Abstract: In this paper, we propose a new video attention modeling and content-driven mining strategy that enables client users to browse video according to their preferences. By integrating the object-based visual attention model (VAM) with the contextual attention model (CAM), the proposed scheme not only exploits human perceptual characteristics more reliably but also effectively discriminates which video contents may attract users' attention. In addition, extending the Google PageRank algorithm, which ranks web pages by importance, we introduce the so-called content-based attention rank (AR) to effectively measure the user interest (UI) level of each video frame. Users' feedback is treated as enhanced query data to further improve retrieval accuracy. The proposed algorithm is evaluated on commercial baseball game sequences and produces promising results.
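    The attention rank (AR) described in the abstract extends PageRank from web pages to video frames. As a rough illustration only (not the paper's actual method), the sketch below applies standard PageRank power iteration to a frame-to-frame link graph; the `similarity` matrix and the `attention_rank` function name are hypothetical stand-ins for the attention-derived weights the paper would supply.

    ```python
    import numpy as np

    def attention_rank(similarity, damping=0.85, iters=100, tol=1e-9):
        """Rank nodes (here: video frames) by PageRank-style power iteration.

        `similarity` is an (n, n) non-negative matrix; entry [i, j] is the
        weight of the link from frame i to frame j (a hypothetical input --
        the paper derives its weights from visual/contextual attention).
        """
        n = similarity.shape[0]
        # Normalize each row so outgoing weights form a probability distribution.
        out = similarity.sum(axis=1, keepdims=True)
        out[out == 0] = 1.0                  # avoid division by zero for dangling frames
        transition = (similarity / out).T    # transition[i, j] = P(j -> i)
        rank = np.full(n, 1.0 / n)
        for _ in range(iters):
            new_rank = (1 - damping) / n + damping * transition @ rank
            if np.abs(new_rank - rank).sum() < tol:
                rank = new_rank
                break
            rank = new_rank
        return rank / rank.sum()

    # Toy 4-frame graph: most links point at frame 2, so it ranks highest.
    sim = np.array([[0, 1, 3, 0],
                    [1, 0, 2, 0],
                    [1, 1, 0, 1],
                    [0, 0, 2, 0]], dtype=float)
    scores = attention_rank(sim)
    print(scores.argmax())
    ```

    In this toy graph the ranking rewards frames that many other frames link to, which mirrors how the AR measure is meant to surface frames of high user interest.
    
    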
    Relation: IEEE Transactions on Multimedia
    Appears in Collections: [Department of Mobile Commerce and Multimedia Applications (行動商務與多媒體應用學系)] Journal Articles

    All items in ASIAIR are protected by copyright, with all rights reserved.

    DSpace Software Copyright © 2002-2004 MIT & Hewlett-Packard / Enhanced by NTU Library IR team