:: Volume 12, Issue 1 (6-2021) ::
IJOR 2021, 12(1): 0-0
Efficiency Evaluation of a Large-Scale Data Set: A Cerebellar Model Articulation Controller Neural Network
Dalal Modhej*, Adel Dahimavi
Islamic Azad University, Sosangerd Branch , modhej83@gmail.com
Abstract:
Data Envelopment Analysis (DEA) is a nonparametric approach for evaluating the relative efficiency of a homogeneous set of Decision Making Units (DMUs). To evaluate the relative efficiency of all DMUs, the DEA model must be solved once for each DMU; therefore, as the number of DMUs increases, so do the computational requirements. The Cerebellar Model Articulation Controller (CMAC) is a neural network that resembles the part of the brain known as the cerebellum. Despite its simple structure, the CMAC network is capable of estimating nonlinear functions, system modelling, and pattern recognition, and it offers fast learning convergence and local generalization in comparison to other networks. The present paper assesses the efficiency of DMUs with the CMAC neural network for the first time. The proposed approach is applied to a large set of 600 Iranian bank branches, and the efficiency results are analyzed and compared with the outcomes of a Multi-Layer Perceptron (MLP) network. The results show that the DEA-CMAC scores are similar in accuracy to those of DEA-MLP, while the Mean Squared Error (MSE) in DEA-CMAC decreases much faster than in DEA-MLP: DEA-CMAC takes 1008 and 1107 iterations to reach MSE values of 2.03×10^(-4) and 6.01×10^(-4), respectively, whereas DEA-MLP takes 1190 iterations with the MSE remaining at 2.07×10^(-1). Moreover, DEA-CMAC requires far less CPU time than DEA-MLP.
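To make the computational scaling concrete, the following is a minimal Python sketch, not taken from the paper, of the standard input-oriented CCR envelopment model: one linear program is built and solved per DMU, so the work grows with the number of DMUs. The function name ccr_efficiency, the use of scipy.optimize.linprog, and the toy branch data are illustrative assumptions.

import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Return the input-oriented CCR efficiency score theta for every DMU."""
    n, m = X.shape                      # n DMUs, m inputs
    s = Y.shape[1]                      # s outputs
    scores = np.zeros(n)
    for o in range(n):                  # one LP per DMU: cost scales with n
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0                      # minimize theta
        # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                      bounds=[(0, None)] * (1 + n))
        scores[o] = res.x[0]
    return scores

# Toy example: 4 hypothetical bank branches with 2 inputs and 1 output
X = np.array([[20.0, 300], [30, 200], [40, 100], [20, 200]])
Y = np.array([[100.0], [80], [120], [90]])
print(np.round(ccr_efficiency(X, Y), 3))

With 600 branches, as in the paper's data set, this loop solves 600 linear programs, which is the cost that the neural-network approximation aims to reduce.

The abstract attributes the speed-up to CMAC's local generalization and fast learning convergence. The sketch below of a CMAC-style tile-coding regressor, again an illustrative assumption rather than the authors' DEA-CMAC model, shows why: each input activates only a handful of cells in a few overlapping tilings, and the LMS update touches only those cells. The class name TinyCMAC and its parameters are hypothetical.

import numpy as np

class TinyCMAC:
    """CMAC-style tile coding: a few overlapping tilings over a weight
    table, with an LMS update applied to the active cells only."""
    def __init__(self, n_tilings=8, n_bins=16, n_inputs=2, lr=0.2, seed=0):
        self.n_tilings, self.n_bins, self.lr = n_tilings, n_bins, lr
        rng = np.random.default_rng(seed)
        # Each tiling is shifted by a small random offset
        self.offsets = rng.uniform(0, 1.0 / n_bins, (n_tilings, n_inputs))
        self.w = np.zeros((n_tilings,) + (n_bins,) * n_inputs)

    def _active(self, x):               # x is assumed scaled to [0, 1]
        idx = np.clip(((x + self.offsets) * self.n_bins).astype(int),
                      0, self.n_bins - 1)
        return [(t,) + tuple(idx[t]) for t in range(self.n_tilings)]

    def predict(self, x):
        return sum(self.w[c] for c in self._active(x))

    def update(self, x, target):        # local LMS step: only active cells change
        err = target - self.predict(x)
        for c in self._active(x):
            self.w[c] += self.lr * err / self.n_tilings

# Toy usage: approximate a smooth 2-D function with online LMS updates
rng = np.random.default_rng(1)
net = TinyCMAC()
for _ in range(2000):
    x = rng.uniform(size=2)
    net.update(x, np.sin(3 * x[0]) + x[1])
print(round(float(net.predict(np.array([0.4, 0.6]))), 3))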
Keywords: Data Envelopment Analysis, Cerebellar Model Articulation Controller, Neural Networks, Efficiency, Bank Branch
     
Type of Study: Original | Subject: Mathematical Modeling and Applications of OR
Received: 2022/03/15 | Accepted: 2021/06/12 | Published: 2021/06/12
Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Iranian Journal of Operations Research (Journal of the Iranian Operations Research Society)