NCU Institutional Repository (中大機構典藏) - Item 987654321/68807


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/68807


    Title: 即時手勢辨識系統及其於戰場情資顯示平台之應用;A Real-Time Hand Gesture Recognition System and its Application in a Battlefield Information Platform
    Authors: 羅冠中; Lo, Kuan-zhong
    Contributors: 資訊工程學系 (Department of Computer Science and Information Engineering)
    Keywords: depth camera; human-computer interaction; hand gesture recognition
    Date: 2015-07-28
    Issue Date: 2015-09-23 14:30:27 (UTC+8)
    Publisher: 國立中央大學 (National Central University)
    Abstract: In recent years, hand gesture recognition has attracted experts from many fields to research on human-computer interaction; common applications include game control, robotic arm operation, robot control, and home appliance control. Its simple and intuitive style of operation will gradually replace traditional remote controls and input devices.
    This thesis proposes a real-time hand gesture recognition system based on depth images and applies it to the control of a battlefield information platform built on NASA World Wind. Besides displaying basic world-map information, the platform we developed provides a wide range of geographical and environmental information that meets the needs of front-line military units.
    The system is implemented as follows. The depth images and skeleton information read from the depth camera first undergo the necessary preprocessing to isolate the arm region, and a palm segmentation algorithm then extracts the palm. The distance curve from each point on the palm contour to the palm centroid is used as the feature describing the hand shape. Because this curve is affected by the palm's rotation angle and size, it is unsuitable as a recognition feature on its own, so the thesis samples the curve and transforms it with the Fast Fourier Transform into a set of coefficients in the frequency domain. Since different hand shapes yield different coefficient values, a decision tree over these coefficients is used to recognize six hand shapes. These hand shapes are then combined with up, down, left, and right swipes of both hands to form the six commands needed to control the battlefield information platform.
    The goal of this work is to provide military personnel with a new-generation battlefield environment platform that, in addition to traditional keyboard and mouse control, adopts the latest human-computer interface technology and replaces conventional input devices with gesture control. Finally, every gesture-control function of the system was verified through a variety of experiments: the hand-shape recognition experiment reached an accuracy of 96.1%, and the combined-gesture detection and recognition experiment reached an overall average accuracy of 97.9%, even at different viewing angles.
    For the past few years, gesture recognition research in human-computer interaction has attracted experts' attention in various fields; common applications include gaming control, robotic arm operation, robot control, household appliance control, and so on. Owing to its convenience and intuitive manipulation, the hand-gesture-based controller will gradually replace traditional remote controls and input devices.
    This thesis presents a real-time hand gesture recognition system based on depth images and applies it to a battlefield information platform built on NASA World Wind. The platform not only displays world-map information but also offers a range of geographical and environmental information that satisfies the demands of the military.
    The proposed hand gesture recognition system is implemented as follows. First, we locate the arm region in an image captured by the Kinect through several necessary depth-image preprocessing and skeleton-tracking operators. Second, a hand segmentation algorithm extracts the hand shape from the arm region. We then use the distance curve between the hand boundary and the hand center as a feature describing the hand shape. However, this curve is unsuitable for direct recognition because it is still affected by hand rotation and hand size, so we use the frequency-domain coefficients obtained by applying the Fast Fourier Transform to the sampled distance curve. Since different hand shapes produce different Fourier coefficients, a decision tree over these coefficients is adopted to recognize six hand shapes. Finally, combinations of these hand shapes and two-handed swipe motions are used as commands to control the battlefield information platform.
    The aim of this system is to provide a brand-new battlefield platform for the military: in addition to operation by traditional keyboard and mouse, it introduces the latest human-computer interaction technology. Finally, several experiments were designed to evaluate the functionalities of the proposed real-time hand gesture recognition system. In the hand-shape experiments the recognition rate is 97.1%, and even at different angles the recognition rate in the two-handed command experiments reaches 97.42%.
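    The abstract above describes the recognition pipeline only at a high level. As a rough illustration of the centroid-distance curve, its FFT-based normalization, and the decision-tree step it mentions, the following minimal Python sketch shows one plausible realization; the function names, sampling length, number of coefficients, split value, and class labels are illustrative assumptions, not details taken from the thesis.

    import numpy as np

    def hand_shape_descriptor(contour, centroid, num_samples=64, num_coeffs=8):
        """Rotation- and scale-tolerant descriptor of a segmented palm.

        contour  : (N, 2) array of boundary points of the extracted palm.
        centroid : length-2 array, the palm's centre of mass.
        Returns the first num_coeffs FFT magnitudes of the centroid-distance
        curve, normalized by the DC term.
        """
        # Distance curve: distance from every contour point to the palm centroid.
        distances = np.linalg.norm(np.asarray(contour, float) - np.asarray(centroid, float), axis=1)

        # Resample the curve to a fixed length so every hand yields an FFT of the same size.
        positions = np.linspace(0.0, len(distances) - 1, num_samples)
        curve = np.interp(positions, np.arange(len(distances)), distances)

        # FFT magnitudes do not depend on where the curve starts on the contour,
        # which makes the descriptor tolerant to in-plane rotation of the hand.
        spectrum = np.abs(np.fft.fft(curve))

        # Dividing by the DC component removes the dependence on hand size.
        return spectrum[1:num_coeffs + 1] / (spectrum[0] + 1e-9)

    def classify_hand_shape(descriptor, split=0.25):
        """Toy stand-in for the thesis's decision tree over the FFT coefficients.

        The real tree distinguishes six hand shapes; the split value and the two
        class labels below are placeholders, not values from the thesis.
        """
        low_freq_energy = float(np.sum(descriptor[:3]))
        return "open-palm-like" if low_freq_energy > split else "fist-like"

    In the actual system the contour and centroid would come from the Kinect depth image after the preprocessing and palm-segmentation steps described above (for example via OpenCV's cv2.findContours and cv2.moments), and the decision-tree splits would be learned from labelled samples of the six hand shapes before being combined with two-handed swipe detection into the platform's command set.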
    Appears in Collections: [Graduate Institute of Computer Science and Information Engineering] Electronic Thesis & Dissertation

    Files in This Item:

    File: index.html (0 KB, HTML)


    All items in NCUIR are protected by copyright, with all rights reserved.
