ASIA University: Item 310904400/13129


    Please use this permanent URL to cite or link to this item: http://asiair.asia.edu.tw/ir/handle/310904400/13129


    Title: Automatic shoe-pattern boundary extraction by image processing techniques
    Authors: 王玲玲;Wang, Ling-Ling
    Contributors: 資訊傳播學系
    Date: 2006-06
    Date Uploaded: 2012-11-22 16:46:11 (UTC+8)
    Abstract: A footwear designer digitizes shoe patterns to extract their boundaries once the 3D shoe-model design, last-bottom flattening, and shoe-pattern making are finished. The shoe-pattern boundaries are then imported into cutting machines to cut material into shoe-pattern shapes. During shoe-pattern making, a designer may draw many arcs, lines, and pigment marks on a shoe pattern, so its surface carries smudges, stains, and marker-pen drawings. This makes it difficult to automatically digitize and extract a shoe-pattern boundary. This study develops an effective image-processing method to automatically extract the boundary of a shoe pattern. We first use a histogram thresholding technique to segment the shoe pattern from the scanned input image. Boundary extraction is then applied to the segmented image to detect and smooth the shoe-pattern boundary. Finally, the proposed method is tested and its performance evaluated. Experimental results indicate that the proposed method is effective for automatic shoe-pattern boundary extraction.
    Relation: International Conference on Computers and Industrial Engineering
    Appears in Collections: [資訊傳播學系] Conference Papers
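
    The abstract describes a two-stage pipeline: histogram thresholding to segment the pattern from the scanned image, then boundary extraction and smoothing. Below is a minimal sketch of such a pipeline, assuming OpenCV and NumPy are available; the Otsu threshold, the dark-pattern-on-light-background assumption, the largest-external-contour selection, and the moving-average smoothing are illustrative choices, not the paper's exact method.

    import cv2
    import numpy as np

    def extract_shoe_pattern_boundary(image_path):
        # Load the scanned shoe pattern as a grayscale image.
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

        # Histogram-based (Otsu) thresholding; assumes the pattern is darker
        # than the scanner background, so THRESH_BINARY_INV makes it foreground.
        _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

        # Keep only the largest outer contour as the shoe-pattern boundary;
        # marker drawings and stains inside the pattern do not affect it.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        boundary = max(contours, key=cv2.contourArea).squeeze().astype(np.float64)

        # Smooth the closed boundary with a small moving-average filter,
        # wrapping a few points around so the ends stay continuous.
        kernel = np.ones(5) / 5.0
        smoothed = np.column_stack([
            np.convolve(np.concatenate([boundary[:, i], boundary[:4, i]]),
                        kernel, mode="valid")
            for i in range(2)
        ])
        return smoothed  # N x 2 array of (x, y) boundary points

    # Example call (hypothetical file name):
    # xy = extract_shoe_pattern_boundary("pattern_scan.png")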

    Files in This Item:

    File: index.html (0 Kb, HTML)

