Free download: Smooth sparse coding via marginal regression for learning sparse representations

Persian title
Smooth sparse coding via marginal regression for learning sparse representations
English title
Smooth sparse coding via marginal regression for learning sparse representations
Pages (Persian version)
0
Pages (English version)
13
Year of publication
2016
Publisher
Elsevier
English paper format
PDF
Product code
E2218
Fields related to this paper
Computer engineering
Specializations related to this paper
Computer programming
Journal
Artificial Intelligence
University
Department of Statistics, University of Wisconsin-Madison, USA
Keywords
Sparse coding, dictionary learning, vision
Abstract

We propose and analyze a novel framework for learning sparse representations based on two statistical techniques: kernel smoothing and marginal regression. The proposed approach provides a flexible framework for incorporating feature similarity or temporal information present in data sets via non-parametric kernel smoothing. We provide generalization bounds for dictionary learning using smooth sparse coding and show how the sample complexity depends on the L1 norm of the kernel function used. Furthermore, we propose using marginal regression for obtaining sparse codes, which significantly improves speed and allows one to scale to large dictionary sizes easily. We demonstrate the advantages of the proposed approach, in terms of both accuracy and speed, through extensive experiments on several real data sets. In addition, we demonstrate how the proposed approach can be used to improve semi-supervised sparse coding.
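The speed claim can be made concrete with a minimal NumPy sketch (ours, not the authors' released code): marginal regression obtains each sparse code from a single matrix–vector product followed by hard thresholding, with no iterative lasso solve. The dimensions and sparsity level `s` below are arbitrary toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: dictionary D (p x K) with unit-norm atoms and a signal x that
# is an exact s-sparse combination of atoms.
p, K, s = 20, 50, 3
D = rng.standard_normal((p, K))
D /= np.linalg.norm(D, axis=0)            # unit-norm columns

w_true = np.zeros(K)
support = rng.choice(K, size=s, replace=False)
w_true[support] = rng.standard_normal(s)
x = D @ w_true

# Marginal regression: one matrix-vector product gives the marginal
# coefficient of every atom; hard-threshold to keep the s largest in
# magnitude, zeroing out the rest.
corr = D.T @ x
w_hat = np.zeros(K)
top = np.argsort(np.abs(corr))[-s:]
w_hat[top] = corr[top]
```

Whether the thresholded marginal coefficients recover the true support depends on the incoherence of the dictionary, which is one motivation for the incoherence constraint imposed during dictionary learning.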

8. Discussion and future work


We propose a simple framework for incorporating similarity in feature space, space, or time into sparse coding. We also propose modifying sparse coding by replacing the lasso optimization stage with marginal regression and adding a constraint that enforces incoherent dictionaries. The resulting algorithm is significantly faster, which facilitates scaling the sparse coding framework to large dictionaries, a regime usually ruled out by intractable computation. This work leads to several interesting follow-up directions. On the theoretical side, local convergence of lasso-based sparse coding has been analyzed recently; preliminary examination suggests that the proposed marginal-regression-based sparse coding algorithm might be more amenable to such a local convergence analysis. It would also be interesting to derive tighter generalization error bounds by directly analyzing the solutions of the marginal-regression-based iterative algorithm. Methodologically, it is worth exploring whether an adaptive or non-constant kernel bandwidth leads to higher accuracy. Furthermore, alternative methods for imposing incoherence constraints that may lead to easier optimization are an interesting direction to investigate.
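The "similarity in feature space, space, or time" enters through kernel smoothing of the coding targets. Here is a hedged sketch of that idea, assuming a Gaussian kernel over sample indices for temporal similarity; the bandwidth `h` and the helper name are our illustrative choices, not the paper's notation.

```python
import numpy as np

def gaussian_weights(i, n, h):
    """Normalized Gaussian kernel weights centred on sample index i."""
    idx = np.arange(n)
    w = np.exp(-0.5 * ((idx - i) / h) ** 2)
    return w / w.sum()

rng = np.random.default_rng(1)
n, p = 8, 5
X = rng.standard_normal((n, p))           # n samples, one per row

# Replace each sample by a kernel-weighted average of its neighbours in
# time; the coding step (e.g. marginal regression) is then run on these
# smoothed targets instead of the raw samples.
h = 1.0
X_smooth = np.vstack([gaussian_weights(i, n, h) @ X for i in range(n)])
```

A non-constant bandwidth, as suggested above, would simply make `h` a function of the sample index rather than a fixed scalar.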

