

    Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/93537


    Title: Channel Spatial Attention-based Transformer for Image Super-Resolution
    Author: 陳家偉 (CHEN, CHIA-WEI)
    Contributor: Department of Computer Science and Information Engineering
    Keywords: Super Resolution; Transformer
    Date: 2024-01-25
    Upload time: 2024-09-19 17:11:35 (UTC+8)
    Publisher: National Central University
    Abstract: As the demand for audio-visual media continues to grow, the significance of the
    super-resolution field is increasingly recognized. In particular, Transformer models
    have garnered widespread attention in computer vision due to their exceptional
    performance, leading to their growing application in this area. However, we observed
    that although Transformers can address the issue of limited feature learning through
    various attention mechanisms, some textures and structures may still be lost during
    training. To preserve the initial features and structures as fully as possible, we
    propose a system, named Integrated Attention Transformer (IAT), that integrates
    Residual Connection, Attention Mechanism, and Upscaling Technique. To confirm the
    efficacy of IAT, we conducted experiments on five different datasets and compared the
    results with current state-of-the-art (SOTA) super-resolution models. The results show
    that the proposed IAT surpasses the current SOTA models.
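
    The abstract names three building blocks: a residual connection, an attention mechanism, and an upscaling technique. The following framework-free toy sketch (with hypothetical helper names; not the thesis's IAT implementation) illustrates each idea in isolation, using nested lists as stand-ins for feature maps:

    ```python
    import math

    # Toy illustrations of the three components the abstract combines.
    # All names and shapes here are illustrative assumptions, not IAT code.

    def channel_attention(channels):
        """Re-weight each channel (a 2-D grid) by a softmax over channel means."""
        means = [sum(sum(row) for row in ch) / (len(ch) * len(ch[0]))
                 for ch in channels]
        exps = [math.exp(m) for m in means]
        total = sum(exps)
        weights = [e / total for e in exps]
        return [[[w * v for v in row] for row in ch]
                for w, ch in zip(weights, channels)]

    def residual_add(x, fx):
        """Residual connection: element-wise x + f(x), preserving the input signal."""
        return [[[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(c1, c2)]
                for c1, c2 in zip(x, fx)]

    def upscale_nearest(channel, factor=2):
        """Nearest-neighbour upscaling of a single 2-D channel by `factor`."""
        out = []
        for row in channel:
            wide = [v for v in row for _ in range(factor)]
            out.extend([list(wide) for _ in range(factor)])
        return out
    ```

    The residual add is what lets early features survive deep stacks of attention blocks; in practice all three pieces would be learned layers (e.g. convolutional attention and sub-pixel upscaling) rather than these fixed operations.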
    Appears in Collections: [Graduate Institute of Computer Science and Information Engineering] Electronic Theses & Dissertations

    Files in This Item:

    File        Description  Size  Format  Visits
    index.html               0Kb   HTML    20


    All items in NCUIR are protected by original copyright.

