:: Search published articles ::
Showing 2 results for Modhej

Dr. Dalal Modhej, Dr. Adel Dahimavi,
Volume 12, Issue 1 (6-2021)
Abstract

Data Envelopment Analysis (DEA) is a nonparametric approach for evaluating the relative efficiency of a homogeneous set of Decision Making Units (DMUs). To evaluate the relative efficiency of all DMUs, the DEA model must be solved once for each DMU, so the computational burden grows with the number of DMUs. The Cerebellar Model Articulation Controller (CMAC) is a neural network that resembles the part of the brain known as the cerebellum. Despite its simple structure, the CMAC network can approximate nonlinear functions, model systems and recognize patterns, and it offers fast learning convergence and local generalization compared with other networks. The present paper assesses the efficiency of DMUs with the CMAC neural network for the first time. The proposed approach is applied to a large set of 600 Iranian bank branches, and the efficiency results are analyzed and compared with the outcomes of a Multi-layer Perceptron (MLP) network. Based on the results, the DEA-CMAC scores are similar in accuracy to those of DEA-MLP, while the Mean Squared Error (MSE) of DEA-CMAC decreases much faster than that of DEA-MLP. The DEA-CMAC model takes 1008 and 1107 iterations to reach MSE values of 2.03×10^(-4) and 6.01×10^(-4), respectively, whereas the DEA-MLP model takes 1190 iterations with the MSE remaining at 2.07×10^(-1). Moreover, DEA-CMAC requires far less CPU time than DEA-MLP.
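The per-DMU cost that motivates the paper can be made concrete with a small sketch. The code below is not the authors' implementation; it assumes the standard input-oriented CCR envelopment model and SciPy's linprog solver, with illustrative toy data rather than the bank-branch dataset, and simply shows that one linear program is solved for every DMU.

    # Minimal sketch (assumption: input-oriented CCR envelopment model),
    # solved once per DMU with scipy.optimize.linprog.
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y):
        """X: inputs (m x n), Y: outputs (s x n); returns CCR efficiency of each of the n DMUs."""
        m, n = X.shape
        s, _ = Y.shape
        scores = np.empty(n)
        for o in range(n):                      # one LP per DMU -- the cost the paper targets
            c = np.zeros(n + 1); c[0] = 1.0     # minimize theta; variables = [theta, lambda_1..lambda_n]
            # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
            A_in = np.hstack([-X[:, [o]], X])
            b_in = np.zeros(m)
            # outputs: -sum_j lambda_j * y_rj <= -y_ro
            A_out = np.hstack([np.zeros((s, 1)), -Y])
            b_out = -Y[:, o]
            res = linprog(c,
                          A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.concatenate([b_in, b_out]),
                          bounds=[(None, None)] + [(0, None)] * n,
                          method="highs")
            scores[o] = res.x[0]
        return scores

    # Toy usage: 2 inputs, 1 output, 4 DMUs (illustrative data, not from the paper).
    X = np.array([[2.0, 3.0, 4.0, 5.0],
                  [3.0, 2.0, 5.0, 4.0]])
    Y = np.array([[1.0, 1.0, 1.0, 1.0]])
    print(ccr_efficiency(X, Y))                 # efficient units score 1.0

Replacing this LP loop with a trained network (MLP or CMAC) that maps input/output data directly to efficiency scores is the kind of speed-up the abstract describes.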
Dr. Dalal Modhej, Dr. Adel Dahimavi,
Volume 13, Issue 1 (6-2022)
 


Iranian Journal of Operations Research (Journal of the Iranian Operations Research Society)