NCU Institutional Repository: Item 987654321/78699


    Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/78699


    Title: Main Project and Sub-project 1: Real-Time Motion Interpretation for Detailed Musical Expression via Deep Learning on Instrument Fingering
    Authors: 施國琛;施皇嘉惠霖;孫士韋;許輝煌
    Contributors: Department of Computer Science and Information Engineering, National Central University
    Keywords: Finger Tracking; Deep Learning; Virtual Musical Instruments; Point Cloud
    Date: 2018-12-19
    Date uploaded: 2018-12-20 13:43:40 (UTC+8)
    Publisher: Ministry of Science and Technology
    Abstract: Main Project: Augmented Theater for Interactive Art and Open Performance

    This is a cross-discipline joint project that combines state-of-the-art HCI technologies, contemporary art and modern performance techniques, and automatic robot control to design a complete system: an augmented-reality theater for open performances. The goal of the joint project is to deliver an integrated system that can be used in different theaters for a wide variety of drama and dance performances. We plan to use two or more kinds of cameras for posture and gesture tracking of the performers, together with motion sensors mounted on wearable costumes. Special visual and audio effects will be designed and triggered by the dancers in real time. In addition, a set of virtual musical instruments will be developed, and a virtual music band built from them will take part in the performances.

    The joint project includes the following four sub-projects:
    1. Dancer-Robot Synchronization and Interaction for Real-time Concerted Performance
    2. Real-time Dancer Visualization with Holographic Costume Projection
    3. Augmented 3D Audio/Visual Performance System using Smart Glasses
    4. Real-time Motion Interpretation for Detailed Musical Expression via Deep Learning on Instrument Fingering

    We hope this unique system can be demonstrated to the public every semester, with official performances arranged each year. The contributions of the joint project include several deep learning models for gesture and posture tracking and recognition, as well as motion recognition via wearable motion sensors. We expect that some of these technologies can be further applied in industry.

    Sub-project 4: Real-time Motion Interpretation for Detailed Musical Expression via Deep Learning on Instrument Fingering

    Finger tracking and labeling are difficult problems in computer vision. Precise, fast finger tracking is usually limited in accuracy by improper sensor positioning, capture noise, and fast hand motions. In this project we will use deep learning techniques to achieve precise and fast finger tracking, and apply the developed technique to virtual musical instruments. To capture finger motion in enough detail for complicated musical expression, we propose developing a separate deep learning model for each instrument. We must therefore design a data collection tool, combined with a MIDI tool, to train several deep learning models for the different instruments.
    We will use LSTM/GRU and CNN networks as the basis, with a RealSense SR 300 camera, to implement at least four different instruments: piano, xylophone, cello, and drums. We expect this set of instruments to be used by a virtual music band organized by our students, and that band will join the other sub-projects in open performances. This project is one sub-project of the joint project entitled "Augmented Theater for Interactive Art and Open Performance."
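    As an illustration of the data collection tool the abstract describes, the following Python sketch records RealSense depth frames alongside time-stamped MIDI note events, so that fingering video can later be labeled with the notes actually played. This is a minimal sketch under our own assumptions: the pyrealsense2 and mido libraries stand in for whatever capture stack the project actually used, and names such as record_session are hypothetical.

```python
# Hypothetical data-collection sketch: capture RealSense depth frames and
# MIDI note events against a shared clock so that each frame can later be
# labeled with the keys/notes being played. All names here are illustrative.
import time
import numpy as np
import mido
import pyrealsense2 as rs

def record_session(duration_s=10.0, midi_port_name=None):
    # Start a 640x480 depth stream at 30 fps on the RealSense camera.
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    pipeline.start(config)

    frames, events = [], []
    inport = mido.open_input(midi_port_name)  # default input port if None
    t0 = time.time()
    try:
        while time.time() - t0 < duration_s:
            frameset = pipeline.wait_for_frames()
            depth = frameset.get_depth_frame()
            frames.append((time.time() - t0,
                           np.asanyarray(depth.get_data()).copy()))
            # Drain any MIDI messages that arrived since the last frame.
            for msg in inport.iter_pending():
                if msg.type in ("note_on", "note_off"):
                    events.append((time.time() - t0,
                                   msg.type, msg.note, msg.velocity))
    finally:
        pipeline.stop()
        inport.close()
    return frames, events
```

    Aligning both streams against one wall clock is the simplest design; a production tool would more likely use the camera's hardware timestamps to avoid USB buffering jitter.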
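    Likewise, one plausible reading of the "LSTM/GRU and CNN" pipeline is a per-frame CNN encoder followed by a recurrent layer over the frame sequence and a multi-label output head. The PyTorch sketch below scores 88 piano keys per frame; the layer sizes, the 88-key output, and the BCE loss are our illustrative assumptions, not the project's published design.

```python
# Hypothetical CNN+LSTM sketch: a small CNN encodes each depth frame, an
# LSTM models the fingering motion over time, and a linear head predicts
# which of 88 piano keys are pressed in each frame. Sizes are assumptions.
import torch
import torch.nn as nn

class FingeringNet(nn.Module):
    def __init__(self, num_keys=88, hidden=256):
        super().__init__()
        self.cnn = nn.Sequential(                        # per-frame encoder
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.lstm = nn.LSTM(32 * 4 * 4, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_keys)          # per-key logits

    def forward(self, x):                  # x: (batch, time, 1, H, W) depth
        b, t = x.shape[:2]
        f = self.cnn(x.flatten(0, 1))      # encode every frame independently
        f = f.flatten(1).view(b, t, -1)    # back to (batch, time, features)
        out, _ = self.lstm(f)              # temporal fingering context
        return self.head(out)              # (batch, time, num_keys) logits

# Multi-label targets (several keys can sound at once), so train with
# BCEWithLogitsLoss against 0/1 key-press labels derived from the MIDI log.
model = FingeringNet()
logits = model(torch.randn(2, 30, 1, 480, 640))    # two 30-frame clips
loss = nn.BCEWithLogitsLoss()(logits, torch.zeros_like(logits))
```

    Swapping nn.LSTM for nn.GRU gives the GRU variant the abstract mentions; the multi-label head reflects the fact that several keys can sound simultaneously on instruments like the piano.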
    Relation: Science and Technology Policy Research and Information Center, National Applied Research Laboratories
    Appears in Collections: [Department of Computer Science and Information Engineering] Research Project

    Files in This Item:

    File         Description   Size   Format   Views
    index.html                 0Kb    HTML     286     View/Open


    All items in NCUIR are protected by original copyright.

