This work focuses on using machine learning to interpret and understand the brainwaves recorded by electroencephalography (EEG) during a grasping task. EEG was used to acquire neural signals reflecting thought activity, in order to lay out a control strategy for robotic hand and finger movements. This is done by decoding, in real time, the neural activity associated with finger motions. The results are used for training dexterous robotic hands, and may allow people with spinal cord injury, brainstem stroke, or amyotrophic lateral sclerosis (ALS) to control a robotic prosthesis by thinking about movements. The project is novel in that it detects the features of a human grasp using Principal Component Analysis (PCA), and then learns these features for recognition applications.
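As an illustration of the PCA step, the following minimal sketch reduces a multichannel feature matrix (e.g., windowed EEG or hand sensory features) to a few leading components. The channel count, window count, and number of retained components here are assumptions for demonstration, not the paper's actual values.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 32))      # 200 feature windows x 32 channels (hypothetical)

# Center the data, then diagonalize the sample covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (Xc.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# Keep the k leading principal components (largest-variance directions).
k = 5
W = eigvecs[:, ::-1][:, :k]             # top-k eigenvectors as columns
Z = Xc @ W                              # reduced features, shape (200, 5)
print(Z.shape)
```

In practice a library routine such as `sklearn.decomposition.PCA` would be used, but the eigendecomposition above makes the dimensionality-reduction step explicit.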
This article has elaborated a concept for building intelligent grasping behavior in a robotic hand prosthesis, based on electroencephalography. Because of the enormous volume of sensory and hand-prosthesis data to be analyzed, the article presented a reduction in the dimensionality and size of the hand sensory data using PCA. The reduced hand features are then used as stimuli to a neuro-fuzzy architecture. The stimuli of this decision-based learning architecture are hand and finger configurations, wrenches (forces and torques), and behaviors related to a particular grasp. The learned behaviors are no-grasp, start-to-grasp, fair, soft, and power grasps, with multiple levels of hand-prosthesis intelligence. The article presented the details of the designed intelligent robot hand prosthesis, which learns the human's intended behavior through the use of EEG brainwaves.
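To make the fuzzy side of such a neuro-fuzzy classifier concrete, the sketch below maps a single normalized grip-force feature to the five learned behaviors using triangular membership functions and picks the behavior with the highest membership. The membership breakpoints and the use of one scalar input are illustrative assumptions; the paper's architecture takes richer stimuli (configurations, wrenches) and learned parameters.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return max(min((x - a) / (b - a + 1e-12),
                   (c - x) / (c - b + 1e-12)), 0.0)

# Hypothetical behavior labels and membership breakpoints over a force in [0, 1].
BEHAVIORS = ["no-grasp", "start-to-grasp", "fair", "soft", "power"]
CENTERS   = [(-0.25, 0.0, 0.25), (0.0, 0.25, 0.5), (0.25, 0.5, 0.75),
             (0.5, 0.75, 1.0), (0.75, 1.0, 1.25)]

def classify(force):
    """Return the grasp behavior with the highest fuzzy membership."""
    memberships = [tri(force, *abc) for abc in CENTERS]
    return BEHAVIORS[int(np.argmax(memberships))]

print(classify(0.05))   # low force -> "no-grasp"
print(classify(0.90))   # high force -> "power"
```

In a full neuro-fuzzy system, the membership parameters would be tuned by the learning component rather than fixed by hand.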