  • List of Articles


      • Open Access Article

        1 - Improving Opinion Aspect Extraction Using Domain Knowledge and Term Graph
        Mohammadreza Shams, Ahmad Baraani, Mahdi Hashemi
        With the advancement of technology, analyzing and assessing user opinions, and determining users' attitudes toward various aspects, have become a challenging and crucial issue. Opinion mining is the process of recognizing people's attitudes from textual comments at three different levels: document level, sentence level, and aspect level. Aspect-based opinion mining analyzes people's viewpoints on various aspects of a subject. Its most important subtask is aspect extraction, which is addressed in this paper. Most previous methods require labeled data or extensive language resources to extract aspects from the corpus, which can be time-consuming and costly to prepare. In this paper, we propose an unsupervised approach for aspect extraction that uses topic modeling and the Word2vec technique to integrate semantic information and domain knowledge based on a term graph. The evaluation results show that the proposed method not only outperforms previous methods in terms of aspect extraction accuracy, but also automates all steps and thus eliminates the need for user intervention. Furthermore, because it does not rely on language resources, it can be applied to a wide range of languages.
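The core idea of combining embeddings with domain knowledge can be illustrated with a minimal sketch: rank candidate aspect terms by their embedding similarity to domain seed terms. This is not the paper's algorithm; the toy 2-d vectors, the seed term, and the candidate list are all invented for illustration (real vectors would come from Word2vec).

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_aspects(candidates, embeddings, seed_terms):
    """Score candidate aspect terms by their maximum embedding
    similarity to a set of domain seed terms, best first."""
    scores = {t: max(cosine(embeddings[t], embeddings[s]) for s in seed_terms)
              for t in candidates}
    return sorted(scores, key=scores.get, reverse=True)

# Toy 2-d embeddings (in practice these would come from Word2vec).
emb = {"battery": [0.9, 0.1], "screen": [0.8, 0.3],
       "movie": [0.1, 0.9], "laptop": [1.0, 0.2]}
ranked = rank_aspects(["battery", "screen", "movie"], emb, ["laptop"])
```

Terms close to the domain seed ("laptop") rise to the top, while off-domain terms ("movie") sink.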
      • Open Access Article

        2 - Emerging Technologies in Future Generations of High-Performance Computing: Introduction, Taxonomy and Future Research Directions
        Mahmood Nematollahzadeh, Ehsan Arianyan, Masoud Hayeri Khyavi, Niloofar Gholipoor, Abdollah Sepahvand
        Due to the rapid growth of science and technology, the need for high-performance computing is increasing every day. So far, the majority of the world's high-performance computing needs have been met by conventional silicon-based technologies, but the end of the silicon age is near, and this fact has led scientists toward emerging technologies such as quantum computing, bio-computing, and optical computing. Although some of these technologies are not new, and their initial introduction dates back several decades, they have been neglected to date because of the attractiveness of classical silicon-based computing and the speed of its development. Recently, however, these technologies have begun to be used to build scalable high-performance computers. In this paper, we introduce these technologies, how they contribute to the field of high-performance computing, their current and future status, and their challenges. A taxonomy of each technology from a computational point of view, together with its research topics, is also presented, which can be utilized for future research in this field.
      • Open Access Article

        3 - Valuation of digital services in Iran: Empirical proof for Google and Instagram
        Farhad Asghari Estiar, Amir Mohammadzadeh, Ebrahim Abbasi
        This article examines the fundamental value of digital platforms such as Instagram and Google. Although digital technologies are pervasive, valuing digital services is challenging because their usage is free of charge. Applying the methodology of discrete choice experiments, we estimated the value of free digital goods. For the first time in the literature, we obtained data on willingness-to-pay and willingness-to-accept together with socio-economic variables. Customers value these free digital services at, on average, 4.9 million Rials per week for Google and 3.27 million Rials per week for Instagram. This paper corroborates that Instagram and Google have an intrinsic value to users, despite the fact that their services are free of charge. This is a starting point for the valuation of free services in Iran such as Shad, Rubika, and Zarebeen, which have played a significant role in the communication industry since the beginning of the Covid-19 pandemic; in the context of the national information network, the market value of the provider companies will be very important.
      • Open Access Article

        4 - A novel metaheuristic algorithm and its discrete form for influence maximization in complex networks
        Vahideh Sahargahi, Vahid Majidnezhad, Saeed Taghavi Afshord, Bagher Jafari
        In light of the No Free Lunch (NFL) theorem, which establishes the inherent limitations of meta-heuristic algorithms in universally efficient problem solving, the quest for greater diversity and efficiency prompts the introduction of novel algorithms each year. This research presents IWOGSA, a meta-heuristic algorithm tailored to continuous optimization problems. IWOGSA combines principles from the invasive weed optimization algorithm and the gravitational search algorithm, capitalizing on their synergies. Its key innovation is a dual sample-generation strategy: a subset of samples follows a normal distribution, while others follow the velocities and accelerations of the gravitational search algorithm; in addition, a selective transfer of samples from distinct classes contributes to succeeding generations. Building on this foundation, a discrete variant of IWOGSA, termed DIWOGSA, is introduced to tackle discrete optimization problems. The efficacy of DIWOGSA is demonstrated on the influence maximization problem, where it distinguishes itself with an astute population-initialization strategy and a local search operator to expedite convergence. Empirical validation comprises a rigorous assessment of IWOGSA against established benchmark functions, composite functions, and real-world engineering structural design problems. IWOGSA outperforms both contemporary and traditional methods, and this superiority is statistically affirmed by the Friedman rank test. DIWOGSA is likewise evaluated on different networks for the influence maximization problem and shows acceptable results in terms of influence spread and computational time compared to conventional algorithms.
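For readers unfamiliar with the influence maximization problem itself, a conventional greedy baseline (the kind DIWOGSA is compared against) can be sketched as follows. This is not DIWOGSA; it is the classic greedy seed selection under a Monte-Carlo Independent Cascade model, with an invented toy graph and propagation probability.

```python
import random

def simulate_ic(adj, seeds, p=0.1, trials=200, rng=None):
    """Monte-Carlo estimate of expected spread under Independent Cascade."""
    rng = rng or random.Random(0)
    total = 0
    for _ in range(trials):
        active, frontier = set(seeds), list(seeds)
        while frontier:
            nxt = []
            for u in frontier:
                for v in adj.get(u, []):
                    # Each newly active node tries once to activate each neighbor.
                    if v not in active and rng.random() < p:
                        active.add(v)
                        nxt.append(v)
            frontier = nxt
        total += len(active)
    return total / trials

def greedy_im(adj, k, p=0.1):
    """Pick k seed nodes one at a time, each maximizing marginal spread."""
    seeds = []
    nodes = set(adj) | {v for vs in adj.values() for v in vs}
    for _ in range(k):
        best = max((n for n in nodes if n not in seeds),
                   key=lambda n: simulate_ic(adj, seeds + [n], p))
        seeds.append(best)
    return seeds

# Toy directed graph: node 0 has the highest out-degree.
graph = {0: [1, 2, 3], 1: [2], 2: [0], 3: [4], 4: []}
chosen = greedy_im(graph, 2)
```

The greedy baseline is simple but expensive (one Monte-Carlo estimate per candidate per round), which is exactly the cost that metaheuristic approaches such as DIWOGSA aim to reduce.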
      • Open Access Article

        5 - Fuzzy Multicore Clustering of Big Data in the Hadoop MapReduce Framework
        Seyed Omid Azarkasb, Seyed Hossein Khasteh, Mostafa Amiri
        A logical way to account for overlapping clusters is to assign a set of membership degrees to each data point. Fuzzy clustering, owing to its reduced partitions and smaller search space, generally incurs lower computational overhead and easily handles ambiguous, noisy, and outlier data; it is therefore considered an advanced clustering method. However, fuzzy clustering methods often struggle with non-linear data relationships. This paper proposes a method that utilizes multicore learning within the Hadoop MapReduce framework to identify linearly inseparable clusters in complex big data structures. The multicore learning model is capable of capturing complex relationships among data, while Hadoop lets us interact with a logical cluster of processing and storage nodes instead of with individual operating systems and processors. In summary, the paper presents the modeling of non-linear data relationships using multicore learning, the determination of appropriate values for the fuzzification and feasibility parameters, and an algorithm within the Hadoop MapReduce model. The experiments were conducted on one of the commonly used datasets from the UCI Machine Learning Repository, as well as on a dataset implemented with the CloudSim simulator, and satisfactory results were obtained. According to published studies, the UCI Machine Learning Repository is suitable for regression and clustering purposes in analyzing large-scale datasets, while the CloudSim dataset is specifically designed for simulating cloud computing scenarios, calculating time delays, and task scheduling.
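The membership degrees mentioned above are the defining feature of fuzzy clustering. As a minimal single-machine sketch (plain fuzzy c-means, not the paper's multicore MapReduce method; the data and the deterministic seeding are invented for illustration):

```python
import math

def fcm(points, c=2, m=2.0, iters=50):
    """Plain fuzzy c-means with fuzzifier m; centers are seeded with the
    first and last points for reproducibility (c=2 in this sketch)."""
    centers = [points[0], points[-1]][:c]
    n_dim = len(points[0])
    for _ in range(iters):
        # Membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1)).
        u = []
        for x in points:
            d = [max(math.dist(x, v), 1e-12) for v in centers]
            u.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(c))
                      for j in range(c)])
        # Center update: weighted mean with weights u_ij^m.
        for j in range(c):
            w = [u[i][j] ** m for i in range(len(points))]
            tot = sum(w)
            centers[j] = tuple(
                sum(w[i] * points[i][dim] for i in range(len(points))) / tot
                for dim in range(n_dim))
    return u, centers

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 4.9)]
memberships, centers = fcm(pts)
```

Each point receives a membership degree in every cluster (rows sum to 1), which is exactly what distinguishes fuzzy from hard clustering; a MapReduce version would distribute the per-point membership step across mappers.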
      • Open Access Article

        6 - Information Technology, Strategy Implementation, Information Systems, Strategic Planning, Input-Process-Outcome Framework
        Mona Jami Pour, Shahnaz Akbari Emami, Safora Firozeh
        IT strategy is a key factor in improving the process and performance of companies in using IT. Many companies have a strategic planning process, but only a few succeed in implementing their strategies efficiently. The purpose of this study is therefore to design a process framework for implementing IT strategy, identifying the drivers, processes, and consequences of IT strategy implementation in organizations. The present study is qualitative research with a phenomenological approach; to collect data, open and in-depth interviews were conducted with 10 IT experts selected by theoretical sampling. The results of the analysis show that the inputs of IT strategy implementation include environmental requirements of business continuity, structural-systemic cohesion, technology-oriented human resources, IT strategic leadership, skill requirements, and shared values. The second aspect of the IT strategy implementation model includes the dimensions of IT program monitoring and communication, structural appropriateness, development of support policies, budgeting and resource allocation, appropriate training, and development of a supportive culture. Finally, the outcomes of implementing an IT strategy were categorized into those related to finance, internal processes, customers, and growth and learning.
      • Open Access Article

        7 - Automatic Lung Diseases Identification using Discrete Cosine Transform-based Features in Radiography Images
        Shamim Yousefi, Samad Najjar-Ghabel
        Using raw radiographs for lung disease identification does not achieve acceptable performance; machine learning can help identify diseases more accurately. Extensive studies have applied classical and deep learning-based disease identification, but these methods either lack acceptable accuracy and efficiency or require large amounts of training data. In this paper, a new method is presented for automatic interstitial lung disease identification in radiographs to address these challenges. In the first step, patient information is removed from the images and the remaining pixels are standardized for more precise processing. In the second step, the reliability of the proposed method is improved by the Radon transform, extraneous data is removed using the top-hat filter, and the detection rate is increased by the Discrete Wavelet Transform and the Discrete Cosine Transform. The number of final features is then reduced with Locality Sensitive Discriminant Analysis. In the third step, the processed images are divided into training and test sets, and different models are created from the training data. Finally, the best model is selected using the test data. Simulation results on the NIH dataset show that the decision tree provides the most accurate model, improving the harmonic mean of sensitivity and accuracy by up to 1.09 times compared to similar approaches.
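The DCT step is useful for feature extraction because it concentrates the energy of smooth signals into a few low-frequency coefficients. A naive O(N²) DCT-II sketch (not the paper's full pipeline; the input "image row" is invented, and real code would use a fast transform such as scipy.fft.dct):

```python
import math

def dct2(signal):
    """Naive DCT-II: X[k] = sum_n x[n] * cos(pi * (n + 0.5) * k / N)."""
    N = len(signal)
    return [sum(x * math.cos(math.pi * (n + 0.5) * k / N)
                for n, x in enumerate(signal))
            for k in range(N)]

def top_k_features(signal, k):
    """Keep the k DCT coefficients with largest magnitude as (index, value)."""
    coeffs = dct2(signal)
    return sorted(enumerate(coeffs), key=lambda c: -abs(c[1]))[:k]

# A slowly varying "image row": one cosine cycle over 16 samples.
row = [math.cos(2 * math.pi * n / 16) for n in range(16)]
feats = top_k_features(row, 2)
```

For this single-frequency input, almost all energy lands in one coefficient (index 2, since DCT-II index k corresponds to k/2 cycles), which is why keeping only the top coefficients is an effective dimensionality reduction before classification.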
      • Open Access Article

        8 - The main components of evaluating the credibility of users according to organizational goals in the life cycle of big data
        Sogand Dehghan, Shahriyar Mohammadi, Rojiar Pirmohamadiani
        Social networks have become one of the most important inputs to decision-making in organizations because of the speed with which they publish events and the large amount of information they carry. For this reason, they are also central to assessing the validity of information: the accuracy, reliability, and value of information are clarified by these networks. The validity of information can be checked with the features of these networks at three levels: user, content, and event. The user level is the most reliable, because a credible user usually publishes credible content. Despite the importance of this topic and the various studies conducted in this field, important components of the process of evaluating the credibility of social network information have received little attention. Hence, this research identifies, collects, and examines the relevant components through a narrative review of 30 important and original articles in this field. Articles in this field are typically comparable along three dimensions: credibility analysis approaches, content topic detection, and feature selection methods; these dimensions have been investigated accordingly. Finally, an initial framework was presented focusing on evaluating the credibility of users as information sources. This article is a suitable guide for calculating user credibility in the decision-making process.
      • Open Access Article

        9 - Predicting the workload of virtual machines in order to reduce energy consumption in cloud data centers using a combination of deep learning models
        Zeinab Khodaverdian, Hossein Sadr, Mojdeh Nazari Soleimandarabi, Seyed Ahmad Edalatpanah
        Cloud computing service models are growing rapidly, and inefficient use of resources in cloud data centers leads to high energy consumption and increased costs. Resource allocation schemes that aim to reduce energy consumption in cloud data centers rely on live migration of Virtual Machines (VMs) and their consolidation onto a small number of Physical Machines (PMs). However, selecting the appropriate VM for migration is an important challenge. To solve this issue, VMs can be classified according to the pattern of user requests into Delay-Sensitive (Interactive) or Delay-Insensitive classes, and suitable VMs can then be selected for migration. This is made possible by VM workload prediction; in effect, workload prediction is a pre-migration step. In this paper, a hybrid model based on a Convolutional Neural Network (CNN) and a Gated Recurrent Unit (GRU) is proposed to classify VMs in the Microsoft Azure cloud service. The Microsoft Azure dataset is labeled, with each VM workload marked as either Delay-Sensitive (Interactive) or Delay-Insensitive, but the distribution of samples is imbalanced: most samples belong to the Delay-Insensitive class. Therefore, the Random Over-Sampling (ROS) method is used in this paper to overcome this challenge. Based on the empirical results, the proposed model obtained an accuracy of 94.42%, which clearly demonstrates its superiority over other existing models.
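The ROS step mentioned above is straightforward to sketch: duplicate minority-class samples at random until every class matches the majority count. This is a minimal stand-in (real pipelines would typically use imbalanced-learn's RandomOverSampler); the sample values and labels below are invented.

```python
import random
from collections import Counter

def random_over_sample(samples, labels, seed=0):
    """Duplicate minority-class samples at random until classes are balanced."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    out_s, out_l = list(samples), list(labels)
    for cls, n in counts.items():
        pool = [s for s, l in zip(samples, labels) if l == cls]
        for _ in range(target - n):
            out_s.append(rng.choice(pool))
            out_l.append(cls)
    return out_s, out_l

# Imbalanced toy data: 3 delay-insensitive vs 1 interactive sample.
X = [[0.2], [0.3], [0.4], [0.9]]
y = ["insensitive", "insensitive", "insensitive", "interactive"]
Xb, yb = random_over_sample(X, y)
```

After oversampling, both classes have equal counts, so the classifier no longer sees a skewed label distribution during training.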
      • Open Access Article

        10 - The effect of Internet of Things (IoT) implementation on the rail freight industry: A futures study approach
        Noureddin Taraz Monfared, Ali Shayan, Ali Rajabzadeh Ghotri
        The rail freight industry in Iran faces several challenges that affect its performance. Although the Internet of Things is rapidly being adopted in railway industries elsewhere as a proven solution, Iran's rail freight industry has not yet been involved, and no related research or experiments were identified in Iran. The aim of this survey is to identify the effects of implementing the Internet of Things in Iran's rail freight industry. To gather the data, the Delphi method was selected, and the snowball technique was used to organize a panel of twenty experts. To evaluate the outcomes, the IQR, binomial tests, and means were calculated. Several statements were identified, and there was broad consensus on most of them, confirming that their implementation affects the Iranian rail freight industry, albeit to different degrees. Finally, the results were organized in the Balanced Scorecard format. The internal business process perspective was affected more than the other perspectives by the approved statements. Eleven recognized elements are affected to different degrees across the Internal Business Process, Financial, Learning and Growth, and Customer perspectives. The Financial perspective showed the least consensus, while the Internal Business Process perspective received the strongest consensus. The research outcomes can be used to improve the strategic planning of the Iranian rail freight industry by applying the achievements of information technology in practice.
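The Delphi consensus check described above (IQR plus mean per statement) can be sketched in a few lines. The panel ratings and the IQR threshold of 1.0 are invented for illustration; the paper does not state its exact cutoff.

```python
import statistics

def delphi_summary(ratings, iqr_threshold=1.0):
    """Summarize one Delphi statement: mean, IQR, and a simple
    consensus flag (IQR at or below the threshold)."""
    q = statistics.quantiles(ratings, n=4, method="inclusive")
    iqr = q[2] - q[0]  # Q3 - Q1
    return {"mean": statistics.mean(ratings),
            "iqr": iqr,
            "consensus": iqr <= iqr_threshold}

# 5-point Likert ratings from a hypothetical 10-expert panel.
panel = [4, 5, 4, 4, 5, 4, 4, 3, 4, 5]
result = delphi_summary(panel)
```

A tight IQR signals that the experts' ratings cluster, i.e. the panel has reached consensus on that statement, while the mean ranks how strongly the statement is endorsed.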
      • Open Access Article

        11 - Design of Distributed Consensus Controller for Leader-Follower Singular Multi-Agent Systems in the Presence of Sensor Fault
        Saeid Poormirzaee, Hamidreza Ahmadzadeh, Masoud Shafiee
        In this paper, the problem of sensor fault estimation and the design of a distributed fault-tolerant controller to guarantee leader-follower consensus for homogeneous singular multi-agent systems is investigated for the first time. First, a novel augmented model of the system is proposed, and it is shown that, unlike in some similar research works, the proposed model is regular and impulse-free. Based on this model, the state and the sensor fault of the system are estimated simultaneously by designing a distributed singular observer, which is also able to estimate time-varying sensor faults. Then, a distributed controller is designed to guarantee leader-follower consensus using the estimates of the state and the sensor fault. Sufficient conditions ensuring the stability of the observer dynamics and the consensus dynamics are derived in terms of linear matrix inequalities (LMIs). The observer and controller gains are computed by solving these conditions in MATLAB. Finally, the validity and efficiency of the proposed control system for leader-follower consensus of singular multi-agent systems exposed to sensor faults are illustrated by computer simulations. The simulation results show that the proposed control strategy for dealing with sensor faults in singular multi-agent systems is effective.
      • Open Access Article

        12 - Indigenous model of commercialization of complex technologies based on partnership in the ICT sector
        Mahdi Fardinia, Fatemeh Saghafi, Jalal Haghighat Monfared
        The ICT industry is one of the most complex industries with advanced technologies. Sustainable growth of companies in this industry is ensured by successful commercialization, and given the complexity of the field, knowledge sharing between companies is essential. A careful review of the literature showed that there is no model for how to succeed in commercialization and its relationship with inter-organizational participation in the ICT sector, so this was set as the goal of the research. By reviewing the research background, factors affecting the success of participation-based commercialization, including internal and external drivers, participation, resources, dynamic capabilities, executive mechanisms, and performance, were drawn together into a conceptual model. Then, through a multi-case study of technological projects of the ICT Research Institute (the Padvish antivirus, the native search engine project, the native SOC operations center, and POTN communication equipment) and content analysis, the main and secondary themes of the model were extracted. Using a focus group of experts, the results were validated and the themes (propositions) were confirmed; the relationships between the components were also confirmed in the expert panel. The final model combines these factors and, according to Iran's indigenous experience, has led to commercialization success; it can serve as a basis for policy-making for successful knowledge-based products.
      • Open Access Article

        13 - Improving energy consumption in the Internet of Things using the Krill Herd optimization algorithm and mobile sink
        Shayesteh Tabatabaei
        Internet of Things (IoT) technology involves a large number of sensor nodes that generate large amounts of data. Optimal energy consumption of sensor nodes is a major challenge in this type of network. Clustering sensor nodes into separate categories and exchanging information through cluster heads is one way to improve energy consumption. This paper introduces a new clustering-based routing protocol called KHCMSBA. The proposed protocol uses the fast and efficient search behavior of the Krill Herd optimization algorithm, inspired by krill feeding behavior, to cluster the sensor nodes. It also uses a mobile sink to prevent the hot-spot problem. The clustering process at the base station is performed by a centralized control algorithm that is aware of the energy levels and positions of the sensor nodes. Unlike the protocols in other research, KHCMSBA assumes a realistic energy model in the network. The protocol is tested in the Opnet simulator and the results are compared with AFSRP (Artificial Fish Swarm Routing Protocol). The simulation results show better performance of the proposed method in terms of energy consumption by 12.71%, throughput by 14.22%, end-to-end delay by 76.07%, and signal-to-noise ratio by 82.8246% compared to the AFSRP protocol.
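The centralized clustering step (the base station knows every node's energy and position) can be sketched with a simple energy-first cluster-head selection. This is a generic illustration, not the Krill Herd-based KHCMSBA procedure; the node table is invented.

```python
import math

def select_cluster_heads(nodes, n_heads):
    """Centralized selection at the base station: pick the highest-energy
    nodes as cluster heads, then assign each member to the nearest head."""
    heads = sorted(nodes, key=lambda n: -nodes[n]["energy"])[:n_heads]
    clusters = {h: [] for h in heads}
    for name, info in nodes.items():
        if name in clusters:
            continue
        nearest = min(heads,
                      key=lambda h: math.dist(info["pos"], nodes[h]["pos"]))
        clusters[nearest].append(name)
    return clusters

# Hypothetical node table: position and residual energy per node.
nodes = {
    "a": {"pos": (0, 0), "energy": 0.9},
    "b": {"pos": (1, 0), "energy": 0.4},
    "c": {"pos": (10, 10), "energy": 0.8},
    "d": {"pos": (9, 10), "energy": 0.3},
}
clusters = select_cluster_heads(nodes, 2)
```

Favoring high-residual-energy heads spreads the relay burden, and nearest-head assignment keeps intra-cluster transmission distances (and hence radio energy) small; a metaheuristic such as Krill Herd searches over such assignments instead of using this greedy rule.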
      • Open Access Article

        14 - Liquidity Risk Prediction Using News Sentiment Analysis
        Hamed Mirashk, Albadvi, Mehrdad Kargari, Mohammad Ali Rastegar, Mohammad Talebi
        One of the main problems of Iranian banks is the lack of a risk management process with a forward-looking approach, and one of the most important risks for banks is liquidity risk. Predicting liquidity risk has therefore become an important issue for banks. Conventional methods of measuring liquidity risk are complex, time-consuming, and expensive, which puts timely prediction far out of reach; yet predicting liquidity risk at the right time can prevent serious problems or crises in a bank. This study provides an innovative solution for predicting bank liquidity risk and its leading scenarios using news sentiment analysis. Sentiment analysis of news about one of the Iranian banks is used to identify dynamic and effective qualitative factors in liquidity risk, yielding a simpler and more efficient method for predicting the liquidity risk trend. The proposed method provides practical scenarios for real-world banking risk decision-makers. The obtained liquidity risk scenarios are evaluated against the scenarios that occurred in the bank, according to the guidelines of the Basel Committee and the opinion of banking experts, to verify the correctness and alignment of the predictions. Periodic evaluation of the studied scenarios indicates relatively high accuracy: 95.5% for possible scenarios derived from the Basel Committee and 75% for scenarios derived from experts' opinions.
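The general shape of turning news sentiment into a risk signal can be sketched with a toy lexicon approach. This is far simpler than the paper's method; the word lists, headlines, and threshold are all invented for illustration.

```python
NEGATIVE = {"default", "withdrawal", "downgrade", "loss", "sanction"}
POSITIVE = {"growth", "profit", "upgrade", "inflow"}

def sentiment_score(headline):
    """Net sentiment of one headline: (#positive - #negative) / #tokens."""
    tokens = headline.lower().split()
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / max(len(tokens), 1)

def risk_signal(headlines, threshold=-0.05):
    """Flag elevated liquidity risk when average news sentiment
    for the period drops below the threshold."""
    avg = sum(sentiment_score(h) for h in headlines) / len(headlines)
    return avg, avg < threshold

news = ["bank faces deposit withdrawal after downgrade",
        "quarterly profit growth reported"]
avg, elevated = risk_signal(news)
```

Aggregating per-period sentiment into a time series is what allows the trend (and hence leading scenarios) to be predicted, rather than any single headline.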
      • Open Access Article

        15 - Identifying and analyzing decision points and key players in the procurement process in EPC companies
        Seyedeh Motahareh Hosseini, Mohammad Aghdasim
        Correct and timely decisions have a significant impact on a company's performance and the achievement of its goals; in other words, business process management depends on making and implementing rational decisions. As information systems become more integrated in organizations, tools such as process mining provide a platform for data analysis approaches and better analysis of decisions, allowing managers to act with agility. Selecting a supplier in the purchasing process of complex projects is one of the basic, key decisions that affect the quality, cost, and performance of a project. In this article, taking a process perspective, the decision points in the purchasing process of a complex construction project in an EPC company are discovered, and the key players in the execution of the process are identified and analyzed through social network analysis. The results reveal the decision points in the process, the performance of those decision points, and the key people in decision-making, which can be used to improve the company's future performance.
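In social network analysis, "key players" are usually found via a centrality measure. The abstract does not say which measure the authors used; degree centrality is the simplest common choice and can be sketched as follows. The hand-over edges below are hypothetical, standing in for edges mined from a purchasing event log.

```python
from collections import Counter

def degree_centrality(edges):
    """Degree centrality of an undirected hand-over network:
    degree / (n - 1), so a node linked to everyone scores 1.0."""
    deg = Counter()
    nodes = set()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
        nodes |= {u, v}
    n = len(nodes)
    return {x: deg[x] / (n - 1) for x in nodes}

# Hypothetical hand-over edges mined from a purchasing event log.
edges = [("buyer", "pm"), ("buyer", "engineer"),
         ("buyer", "finance"), ("pm", "engineer")]
centrality = degree_centrality(edges)
key_player = max(centrality, key=centrality.get)
```

The actor with the highest centrality (here the buyer, who hands work to everyone) is a candidate key player whose decisions most directly shape process performance.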
      • Open Access Article

        16 - Fake Websites Detection Improvement Using Multi-Layer Artificial Neural Network Classifier with Ant Lion Optimizer Algorithm
        Farhang Padidaran Moghaddam, Mahshid Sadeghi B.
        In phishing attacks, a fake site is forged from the main site and looks very similar to the original. To direct users to these sites, phishers or online thieves usually put fake links in emails, send them to their victims, and try to deceive users with social engineering methods, persuading them to click the fake links. Phishing attacks cause significant financial losses, and most attacks focus on banks and financial gateways. Machine learning methods are an effective way to detect phishing attacks, but their success depends on selecting optimal features. Feature selection allows only important features to be considered as learning input and reduces the detection error of phishing attacks. In the proposed method, a multilayer artificial neural network classifier is used to reduce the detection error, and the feature selection phase is performed by the Ant Lion Optimizer (ALO) algorithm. Evaluations and experiments on the Rami dataset, a phishing dataset, show that the proposed method has an accuracy of about 98.53% and less error than the multilayer artificial neural network alone. The proposed method detects phishing attacks more accurately than BPNN, SVM, NB, C4.5, RF, and kNN learning methods with a PSO-based feature selection mechanism.
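The wrapper idea behind metaheuristic feature selection, i.e. scoring each candidate feature subset by the accuracy of a classifier trained on it, can be shown at toy scale. Here an exhaustive search over binary masks with a leave-one-out 1-NN fitness stands in for the ALO search; the 3-feature dataset is invented (feature 0 is informative, the rest are noise).

```python
import itertools
import math

def loo_knn_accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy using only features where mask is 1."""
    idx = [j for j, m in enumerate(mask) if m]
    if not idx:
        return 0.0
    correct = 0
    for i in range(len(X)):
        nearest = min((j for j in range(len(X)) if j != i),
                      key=lambda j: math.dist([X[i][k] for k in idx],
                                              [X[j][k] for k in idx]))
        correct += y[nearest] == y[i]
    return correct / len(X)

def best_subset(X, y, n_features):
    """Exhaustive wrapper search over feature masks (a stand-in for
    the ALO search, which samples masks instead of enumerating them)."""
    masks = itertools.product([0, 1], repeat=n_features)
    return max(masks, key=lambda m: loo_knn_accuracy(X, y, m))

# Feature 0 separates the classes; features 1-2 are noise.
X = [[0.0, 5.0, 1.0], [0.1, 1.0, 9.0], [1.0, 5.5, 8.5], [1.1, 0.5, 1.5]]
y = [0, 0, 1, 1]
mask = best_subset(X, y, 3)
```

The search correctly keeps only the informative feature; with dozens of features the 2^n enumeration becomes infeasible, which is exactly where a metaheuristic like ALO replaces the exhaustive loop.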
      • Open Access Article

        17 - Extracting Innovation Strategies and Requirements for Telecommunication Companies: Case Study of Telecommunications Infrastructure Company
        Alireza Esmaeeli, Alireza Asgharian, Takavash Bahreini, Nasrin Dastranj, Mahshid Ghaffarzadegan, Kolsoum Abbasi-Shahkooh, Mandana Farzaneh, Homeyra Moghadami
        The purpose of this study is to identify effective innovation strategies for a governmental, mission-oriented organization in the field of information and communication technology. The characteristics of the company under study are: governmental ownership, a monopoly market, scattered innovation efforts, managers interested in organizational innovation, and a clear, up-to-date strategy and structure. In this paper, innovation strategies were collected through comparative studies of similar international companies. Then, using thematic analysis on data obtained from semi-structured interviews, the strengths and weaknesses related to innovation were identified. By matching these two categories of information, a number of appropriate strategies were proposed and their implementation considerations expressed based on the specific characteristics of this company. Accordingly, suggestions for the company's future research were presented, to identify appropriate methods of implementing organizational innovation in similar circumstances.
      • Open Access Article

        18 - Using limited memory to store the most recent action in XCS learning classifier systems in maze problems
        Ali Yousefi, Kambiz Badie, Mohamad Mehdi Ebadzade, Arash Sharifi
        Nowadays, learning classifier systems receive attention in various robotics applications, such as sensory robots, humanoid robots, intelligent search-and-rescue systems, and the control of physical robots in discrete and continuous environments. Usually, an evolutionary algorithm or heuristic methods are combined with a learning process to search the space of existing rules and assign the appropriate action to a classifier. An important challenge in increasing the speed and accuracy of reaching the goal in maze problems is choosing actions that keep the agent on the right path instead of repeatedly hitting the surrounding obstacles. For this purpose, this article augments the accuracy-based learning classifier system (XCS) with a limited memory: based on the inputs, the actions applied to the environment, and the agent's reaction, the relevant rules are identified and added as a new classifier set to the XCS algorithm in subsequent steps. Among the achievements of this method are a reduction in the number of steps required and an increase in the speed with which the agent reaches the target compared to the standard XCS algorithm.
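The limited-memory idea, remembering the most recent action taken in a given state so the agent does not re-explore moves that led into obstacles, can be sketched with a small bounded buffer. This is only an illustrative component, not the full XCS integration; the capacity, state names, and actions are invented.

```python
from collections import deque

class RecentActionMemory:
    """Bounded memory of (state, action) pairs; recalls the most recent
    action taken in a previously seen state. Old entries are evicted
    automatically once capacity is exceeded."""
    def __init__(self, capacity=8):
        self.buffer = deque(maxlen=capacity)

    def remember(self, state, action):
        self.buffer.append((state, action))

    def recall(self, state):
        # Scan newest-to-oldest so the most recent action wins.
        for s, a in reversed(self.buffer):
            if s == state:
                return a
        return None

mem = RecentActionMemory(capacity=3)
mem.remember("cell_2_3", "up")
mem.remember("cell_2_4", "right")
mem.remember("cell_2_3", "left")   # newer action for the same state
mem.remember("cell_5_5", "down")   # exceeds capacity, evicts the oldest entry
```

In an XCS loop, a hit in this memory can seed or bias the classifier set for the current state, so the agent reuses recent experience instead of re-deriving it through trial and error against the maze walls.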