• List of Articles


      • Open Access Article

        1 - An Individual-Oriented Shuffled Frog Leaping Algorithm for Solving the Vehicle Routing Problem
        Soheila Shafiezadeh Zahra Beheshti
        The Vehicle Routing Problem (VRP) is one of the most important problems in supply chain management, because the optimal allocation of vehicles has a significant impact on reducing costs. VRP belongs to the class of NP-hard problems, and exact algorithms cannot find the best solution in an acceptable time; hence, meta-heuristic algorithms can be employed to solve it. The Shuffled Frog Leaping Algorithm (SFLA) is an efficient meta-heuristic algorithm, but in some cases its population diversity decreases rapidly and the algorithm falls into local optima. In this study, an Individual-Oriented Shuffled Frog Leaping Algorithm (IO-SFLA) is proposed to enhance the exploration and exploitation of SFLA by exchanging global and local information. Several VRP instances of different dimensions are used to evaluate the performance of IO-SFLA. The efficiency of IO-SFLA is compared with several improved shuffled frog leaping algorithms, Simulated Annealing (SA), and a Genetic Algorithm (GA). The results show that IO-SFLA significantly outperforms the competing algorithms: IO-SFLA achieves an average best path cost of 1130.442, followed by SA with an average of 1228.725; the remaining algorithms rank lower by a wide margin.
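        As context for readers unfamiliar with SFLA, the sketch below shows the classic memeplex update that IO-SFLA builds on; it is a generic illustration (the real-valued encoding and all names are our assumptions), not the individual-oriented variant of the paper.

```python
import random

def sfla_step(population, fitness, n_memeplexes, lower, upper):
    """One shuffled-frog-leaping step on a real-valued population.

    Minimal sketch of classic SFLA: sort frogs by cost, shuffle them
    into memeplexes, and leap each memeplex's worst frog toward its
    best frog (falling back to the global best, then a random reset).
    """
    # Sort frogs from best (lowest cost) to worst, shuffle into memeplexes.
    order = sorted(range(len(population)), key=lambda i: fitness(population[i]))
    memeplexes = [order[m::n_memeplexes] for m in range(n_memeplexes)]
    global_best = population[order[0]]

    for mem in memeplexes:
        best, worst = population[mem[0]], population[mem[-1]]
        # Leap the worst frog toward the memeplex best.
        candidate = [max(lower, min(upper, w + random.random() * (b - w)))
                     for b, w in zip(best, worst)]
        if fitness(candidate) >= fitness(worst):
            # No improvement: try leaping toward the global best instead.
            candidate = [max(lower, min(upper, w + random.random() * (g - w)))
                         for g, w in zip(global_best, worst)]
        if fitness(candidate) >= fitness(worst):
            # Still no improvement: replace with a random frog.
            candidate = [random.uniform(lower, upper) for _ in worst]
        population[mem[-1]] = candidate
    return population
```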
      • Open Access Article

        2 - A Novel Model based on Encoder-Decoder Architecture and Attention Mechanism for Automatic Abstractive Text Summarization
        Hasan Aliakbarpor Mohammadtaghi Manzouri Amirmasoud Rahmani
        With the expansion of the Web and the availability of a large amount of textual information, the development of automatic text summarization models, an important task in natural language processing, has attracted many researchers. With the growth of deep learning methods in text processing, abstractive text summarization has entered a new phase of development and has made significant progress in recent years. Even so, the full potential of deep learning has not yet been exploited for this task, and there is still a need for progress in the field, as well as for incorporating human cognition into the summarization model. In this regard, this paper proposes an encoder-decoder architecture equipped with auxiliary attention that not only uses a combination of linguistic features and embedding vectors as the input of the learning model, but also, unlike previous studies that commonly employ the attention mechanism in the decoder, applies an auxiliary attention mechanism in the encoder to imitate human cognition in summary generation. With the proposed attention mechanism, only the most important parts of the text, rather than the whole input, are encoded and then sent to the decoder to generate the summary. The proposed model also uses a switch with a threshold in the decoder to overcome the rare-words problem. The model was evaluated on the CNN/Daily Mail and DUC-2004 datasets. Based on the empirical results and the ROUGE evaluation metric, the proposed model achieved higher accuracy than existing methods for generating abstractive summaries on both datasets.
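        As a rough illustration of encoder-side attention that selects only the most important parts of the input, here is a hypothetical NumPy sketch; the query vector, the top-k selection rule, and all names are our assumptions, not the paper's exact architecture.

```python
import numpy as np

def encoder_auxiliary_attention(hidden_states, query, k):
    """Score encoder hidden states and keep only the top-k most salient.

    Hypothetical sketch of the idea in the abstract: attention applied
    on the encoder side so only important input positions reach the
    decoder. Shapes: hidden_states (T, d), query (d,).
    """
    d = hidden_states.shape[1]
    # Scaled dot-product attention scores over the T input positions.
    scores = hidden_states @ query / np.sqrt(d)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Keep the k positions with the highest attention weight.
    keep = np.argsort(weights)[-k:]
    keep.sort()  # preserve the original word order
    return hidden_states[keep], weights[keep]
```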
      • Open Access Article

        3 - Exploring the antecedents and consequences of smart good governance using the fuzzy Delphi method (FDM)
        Seyed Abdolrasoul Hosseini Mohammad Ghasemi Nour Mohammad Yaghubi Habibollah Salarzehi
        Background: The present study aimed to identify the antecedents and consequences of smart good governance using the fuzzy Delphi method (FDM). Method: This mixed-method study was conducted using a deductive-inductive approach. It is an applied study in terms of its objectives and a descriptive survey in terms of design and methodology. The research population included all experts in the field of governance, from whom 26 experts were selected based on the principle of theoretical adequacy using purposive sampling. In the qualitative part of the study, semi-structured interviews were conducted to collect data; in the quantitative part, a researcher-made questionnaire was used. The validity and reliability of the questionnaire were confirmed via content validity and the test-retest method. The qualitative interview data were analyzed using ATLAS.ti software and the indexing method, while in the quantitative part the antecedents and consequences of smart good governance were ranked using the fuzzy Delphi method. Results: A comparison of the antecedents of smart good governance showed that smart technology and data, electronic and smart interaction, rule of law, competent and committed authorities, cyber and smart security, a good smart society and citizens, smart management and executive leadership, e-architecture, smart e-government, the quality of laws and regulations, transparent governance, democratic and smart infrastructure, strategies and political stability, the strengthening of civil society, and public awareness were the most important factors, in that order. It was also shown that efficiency and effectiveness, entrepreneurship, improved quality of life and sustainable development, ethics and morality, equality and inclusive justice, downsizing, successful crisis management, immediate response to challenges, greater closeness and compassion between government and people, reduced corruption, technological knowledge, reduced working time and administrative processes, and the elimination of bureaucracy were the most important consequences of smart good governance, in that order. The insights from this study can contribute to establishing smart good governance.
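        For readers unfamiliar with FDM, the sketch below illustrates a common screening variant using triangular fuzzy numbers; the aggregation rule, centroid defuzzification, and the 0.7 threshold are conventional assumptions, not necessarily the study's exact settings.

```python
def fuzzy_delphi_screen(ratings, threshold=0.7):
    """Screen factors with the fuzzy Delphi method (FDM).

    Each expert rating is a triangular fuzzy number (l, m, u) in [0, 1];
    ratings per factor are aggregated as (min l, mean m, max u),
    defuzzified by the centroid (l + m + u) / 3, and a factor is kept
    if its crisp score reaches the threshold.
    """
    results = {}
    for factor, tfns in ratings.items():
        l = min(t[0] for t in tfns)
        m = sum(t[1] for t in tfns) / len(tfns)
        u = max(t[2] for t in tfns)
        crisp = (l + m + u) / 3
        results[factor] = (crisp, crisp >= threshold)
    return results

# Example: two factors rated by three experts.
print(fuzzy_delphi_screen({
    "rule of law": [(0.5, 0.75, 1.0), (0.75, 1.0, 1.0), (0.5, 0.75, 1.0)],
    "downsizing": [(0.0, 0.25, 0.5), (0.25, 0.5, 0.75), (0.0, 0.25, 0.5)],
}))
```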
      • Open Access Article

        4 - A Multi-Objective Differential Evolutionary Algorithm-based Approach for Resource Allocation in Cloud Computing Environment
        Saeed Bakhtiari Mahan Khosroshahi
        In recent years, the cloud computing model has received a lot of attention due to its high scalability, reliability, information sharing, and low cost compared to separate machines. In the cloud environment, scheduling and optimal allocation of tasks affect the effective use of system resources. Common methods for scheduling in the cloud computing environment are currently based on traditional approaches such as Min-Min or meta-heuristics such as ant colony optimization (ACO). These methods focus on optimizing a single objective and do not consider multiple objectives at the same time. The main purpose of this research is to consider several objectives (total execution time, service level agreement, and energy consumption) in cloud data centers when scheduling and allocating tasks. This research uses a multi-objective differential evolution algorithm (DEA), chosen for its simple structure and few adjustable parameters. The proposed method presents a new DEA-based approach to the allocation problem in the cloud, aiming to improve resource efficiency with respect to objectives such as time, migration, and energy by defining a multi-objective function and employing mutation and crossover vectors. The proposed method was evaluated in the CloudSim simulator using the workload of more than a thousand virtual machines from PlanetLab. The simulation results show that, compared with the IqrMc, LrMmt, and FA algorithms, the proposed method improves energy consumption by an average of 23%, the number of migrations by an average of 29%, total execution time by an average of 29%, and service level agreement violations (SLAV) by an average of 1%. The use of the proposed approach in cloud centers will thus lead to better and more appropriate services for customers in various fields such as education, engineering, manufacturing, and services.
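        As background, the following is a minimal sketch of the classic DE/rand/1/bin mutation and crossover vectors the abstract refers to; the scalar fitness and parameter values are illustrative assumptions, whereas the paper uses a multi-objective function over time, migrations, and energy.

```python
import random

def de_generation(pop, fitness, F=0.5, CR=0.9):
    """One generation of classic DE/rand/1/bin (minimization).

    pop is a list of real-valued vectors (at least 4 individuals).
    """
    dim = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        # Mutation: combine three distinct random individuals.
        a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(dim)]
        # Binomial crossover between the target and mutant vectors.
        j_rand = random.randrange(dim)
        trial = [mutant[d] if (random.random() < CR or d == j_rand) else target[d]
                 for d in range(dim)]
        # Greedy selection: keep the better of trial and target.
        new_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return new_pop
```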
      • Open Access Article

        5 - Using Sentiment Analysis and Combining Classifiers for Spam Detection in Twitter
        Mehdi Salkhordeh Haghighi Aminolah Kermani
        The popularity of social networks, especially Twitter, has posed a new challenge to researchers: spam. Numerous approaches to dealing with spam have been presented. In this study, we attempt to enhance the accuracy of spam detection by applying one of the latest spam detection techniques combined with sentiment analysis. Using word embeddings, we feed the tweet text to a convolutional neural network (CNN) whose output classifies the text as spam or normal. In parallel, we extract suitable features from the Twitter network and apply machine learning methods to them to obtain a separate spam prediction. Finally, we feed the outputs of both approaches to a meta-classifier whose output determines the final decision on whether the tweet is spam. We employ both balanced and imbalanced datasets to examine the impact of the proposed model on both types of data. The results indicate an increase in the accuracy of the proposed method on both datasets.
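        The fusion step can be illustrated with a small sketch: the two base detectors (the text CNN and the feature-based model) each emit a spam probability, and a meta-classifier learns the final decision from the pair. The choice of logistic regression here is our assumption; the paper only specifies a meta-classifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_meta_classifier(p_cnn, p_features, labels):
    """Combine two spam detectors with a meta-classifier.

    p_cnn: spam probabilities from the text CNN; p_features: spam
    probabilities from the feature-based model; labels: 0/1 ground
    truth. In practice, held-out predictions should be used to train
    the meta-classifier to avoid leakage.
    """
    X = np.column_stack([p_cnn, p_features])
    meta = LogisticRegression()
    meta.fit(X, labels)
    return meta

# Usage: meta.predict(np.column_stack([p_cnn_new, p_feat_new]))
# yields the final 0/1 spam decisions.
```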
      • Open Access Article

        6 - Outage and Throughput Analysis of Bidirectional Cognitive Amplify-and-Forward Relaying Networks with Wireless Power Transfer
        Ehsan Soleimani Nasab
        Cognitive radio is a promising technology that aims to achieve better utilization of the frequency spectrum, while wireless energy harvesting can supply the extra energy required at the nodes. Two scenarios in a two-way network are considered: in the first, the relay harvests its required energy from the end-sources of the secondary network in the presence of the cognitive radio network; in the second, both end-sources harvest energy from the relay in the secondary network. Both the Nakagami-m fading caused by signal propagation and the interference at the relay caused by primary users in the cognitive radio network are taken into account. Closed-form expressions are derived for the outage probability and throughput of a bidirectional cognitive amplify-and-forward relaying network using energy harvesting and wireless power transfer over independent and non-identically distributed (i.n.i.d.) Nakagami-m fading channels. The analytical derivations are validated by Monte Carlo simulations, which demonstrate that the first scenario always outperforms the second, while both scenarios perform better than the case without energy harvesting.
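        As a flavor of the Monte Carlo validation mentioned in the abstract, here is a toy outage/throughput simulation for a single Nakagami-m link; the paper's full two-way, energy-harvesting relay model with primary-user interference is considerably more involved, and all parameter values below are illustrative.

```python
import numpy as np

def outage_probability_mc(m, omega, avg_snr_db, rate, n=200_000, seed=0):
    """Monte Carlo outage probability of a single Nakagami-m link.

    The channel power gain |h|^2 of a Nakagami-m channel with spread
    omega is Gamma-distributed with shape m and scale omega/m.
    """
    rng = np.random.default_rng(seed)
    gain = rng.gamma(shape=m, scale=omega / m, size=n)
    snr = 10 ** (avg_snr_db / 10) * gain
    # Outage: instantaneous capacity log2(1 + SNR) below the target rate.
    p_out = np.mean(np.log2(1 + snr) < rate)
    throughput = (1 - p_out) * rate  # delay-limited throughput
    return p_out, throughput

print(outage_probability_mc(m=2, omega=1.0, avg_snr_db=10, rate=1.0))
```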
      • Open Access Article

        7 - Explaining the adoption process of Software-Defined Networking (SDN) using grounded theory and a systems approach
        Elham Ziaeipour Ali Rajabzadeh Ghotri Alireza Taghizadeh
        Software-Defined Networking (SDN) is one of the technologies with the most promising role in digital transformation. The dynamic structure of SDN can adapt to the ever-changing nature of future networks and their users. The important impact of this technology on the intelligence, agility, management, and control of current network devices, as well as upcoming communication technologies, reduces expenses and creates innovative businesses. Although service providers are very interested in deploying SDN to transform their static infrastructures into a dynamic and programmable platform, they do not consider it one of their priorities and still depend on traditional methods to manage their networks. Therefore, this study highlights the factors affecting the acceptance of the SDN architecture and its application by national telecom operators, and proposes a comprehensive new paradigm model using a systems approach and grounded theory (the Strauss and Corbin model). The model was developed by systematically reviewing the theoretical foundations and conducting in-depth interviews with managers and experts in the telecom industry. During the modeling process, more than a thousand initial codes were identified; based on expert opinion on these codes, a total of 73 open codes, 12 axial codes, and 6 main categories were extracted.
      • Open Access Article

        8 - High Performance Computing: Next Generation Requirements and Research Axes
        Ehsan Arianyan MohammadMahdi Esnaashari Fatemeh Ehsani Boshla Shaghayeghsadat Hossieni Bayan Masoud Dehyadegari Behnam Samadi
        Nowadays, increasing the processing power of supercomputers is a worldwide race, which signifies the importance of supercomputers in the current era. They are engines of technological progress in almost all scientific areas, such as computational biology, earth sciences, cosmology, fluid dynamics, and plasma modeling, to name a few. The next generation of supercomputers can be divided into two broad categories: 1) emerging technologies such as neuromorphic and quantum computing, and 2) Exascale computing. Emerging technologies will be the future of supercomputing, though not in the near future. Therefore, this paper focuses on Exascale computing and tries to provide a comprehensive overview of the main requirements for this technology to be achieved and become available. Requirements are reviewed from different aspects: hardware, software, artificial intelligence, and cloud computing. In addition, we attempt to provide a complete taxonomy of hot research topics within this area.
      • Open Access Article

        9 - Reducing network load by application mapping in network-on-chip using the discrete Harris hawks algorithm
        Elham Hajebi Vahid Sattari-Naeini
        Reducing load and power consumption in network-on-chip systems is very important, and one of the key issues in increasing the efficiency of a network-on-chip is the mapping of an application onto it. Solving the application mapping problem to find the best mapping is complex and time-consuming, and it has a huge impact on network latency and power consumption. In this paper, using the Harris hawks algorithm, we provide a method for mapping processing cores onto the network-on-chip that reduces the network load, thereby reducing congestion in the links and improving network performance. The simulation results show that this algorithm performs better than the baseline algorithms.
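        To make the optimization target concrete, the sketch below shows a common cost function for application mapping on a 2D mesh NoC (traffic volume weighted by Manhattan hop distance); this objective formulation is a standard assumption on our part, not taken from the paper.

```python
def mapping_cost(mapping, traffic, mesh_width):
    """Communication cost of a core-to-tile mapping on a 2D mesh NoC.

    The quantity a discrete optimizer such as the Harris hawks
    algorithm would minimize: total traffic weighted by Manhattan hop
    distance, a common proxy for link load and latency. mapping[c] is
    the tile index of core c; traffic[(a, b)] is the volume from a to b.
    """
    def xy(tile):
        return tile % mesh_width, tile // mesh_width

    cost = 0
    for (src, dst), volume in traffic.items():
        (x1, y1), (x2, y2) = xy(mapping[src]), xy(mapping[dst])
        hops = abs(x1 - x2) + abs(y1 - y2)
        cost += volume * hops
    return cost

# Example: three cores on a 2x2 mesh; cost = 10*1 + 5*1 = 15.
print(mapping_cost({0: 0, 1: 1, 2: 3}, {(0, 1): 10, (1, 2): 5}, mesh_width=2))
```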
      • Open Access Article

        10 - Pathology of software export development in Iran
        Maryam Saleh Gorgory Mohsen Gerami Vahid Yazdanian
        Today, new developments in the world economy, including severe fluctuations in the price of raw materials, rising wages, and increasing costs of transportation, storage, and other production factors, have led many developing countries to consider entering the production and trade of goods that depend as little as possible on high-risk economic components. The software industry is one such industry: in addition to having high added value, it has minimal need for raw materials and other cost-generating components. In fact, the software industry is a pure knowledge industry based on research and development. Our country can also take steps toward exporting software in order to benefit from this industry. Accordingly, this research addresses the pathology of software export development. The statistical population consists of software companies that are members of the software exporters' union, and the study investigates demand, national vision and strategy, international trust and communication, characteristics of the software industry, infrastructure, and internal factors. The results show that the main obstacles to software export are the lack of demand, software infrastructure, and internal factors.
      • Open Access Article

        11 - Improved routing for load balancing in wireless sensor networks in the Internet of Things, based on a multiple ant colony algorithm
        Farhang Padidaran Moghaddam Hamid Maghsoudi
        An important issue in dynamic computer networks such as the Internet, where the cost of connections varies continuously, is balancing the traffic load and increasing the transmission speed of packets so that data packets travel along paths with minimal congestion. Consequently, one of the main approaches to routing and load balancing is based on ant-based algorithms. Using a novel approach based on multiple ant colony optimization, the purpose of this research is to present a routing algorithm that shortens and improves paths with respect to end-to-end delay, packet loss rate, bandwidth, and energy consumption for data transmission in Internet of Things systems. The method was implemented in MATLAB, and the experimental results show improvements in the aforementioned parameters.
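        As a generic illustration of ant-based routing (not the paper's exact multiple-colony design), the sketch below shows probabilistic next-hop selection plus the pheromone evaporation/deposit update; the parameter values and table layout are our assumptions.

```python
import random

def choose_next_hop(current, neighbors, pheromone, heuristic, alpha=1.0, beta=2.0):
    """Pick a next hop with probability ~ pheromone^alpha * heuristic^beta.

    In a multiple-colony scheme each colony keeps its own pheromone
    table; the heuristic can fold in delay, congestion, and residual
    energy. Tables are dicts keyed by (node, neighbor) edges.
    """
    weights = [pheromone[(current, n)] ** alpha * heuristic[(current, n)] ** beta
               for n in neighbors]
    r, acc = random.uniform(0, sum(weights)), 0.0
    for n, w in zip(neighbors, weights):
        acc += w
        if r <= acc:
            return n
    return neighbors[-1]

def evaporate_and_deposit(pheromone, path, quality, rho=0.1):
    """Standard evaporation plus a deposit proportional to path quality."""
    for edge in pheromone:
        pheromone[edge] *= (1 - rho)
    for edge in zip(path, path[1:]):
        pheromone[edge] = pheromone.get(edge, 0.0) + quality
```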
      • Open Access Article

        12 - Two Fuzzy Virtual Force Algorithms to Improve Sensor Deployment in Wireless Sensor Networks
        Vahid Kiani
        Maximizing area coverage is an important issue in the placement of wireless network sensors, and achieving it helps to improve the monitoring power of the network. In many applications, the sensors are first randomly distributed in the sensing field and then their placement is modified. The virtual force algorithm (VFA) tries to reach a more desirable deployment from an initial one by considering repulsive and attractive forces between the sensors. In this paper, the combination of a Takagi-Sugeno fuzzy system with VFA is used to achieve a better redeployment of the sensors. Two fuzzy methods are proposed to adaptively adjust the optimal distance between sensors, and their role in improving the performance of the virtual force algorithm is analyzed. Comparison with the state of the art reveals that intelligent, adaptive adjustment of the optimal distance using a fuzzy system leads to a higher final coverage ratio than the traditional virtual force algorithm (VFA), the improved virtual force algorithm (IVFA), the fuzzy redeployment algorithm (FRED), and the GA and PSO metaheuristics. Moreover, the proposed VF-based methods require much less time to solve the problem than the GA and PSO metaheuristics.
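        For concreteness, the sketch below shows the classic pairwise virtual-force rule that the fuzzy variants adjust; the force weights and linear force magnitudes are common textbook choices, not necessarily the paper's.

```python
import math

def virtual_force(p, q, d_opt, w_attract=1.0, w_repel=1.0):
    """Virtual force exerted on sensor p by sensor q.

    Classic VFA rule: attraction when the sensors are farther apart
    than the optimal distance d_opt, repulsion when closer. In the
    paper's fuzzy variants, d_opt itself is tuned adaptively by a
    Takagi-Sugeno fuzzy system rather than kept fixed.
    """
    dx, dy = q[0] - p[0], q[1] - p[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0, 0.0
    if dist > d_opt:
        magnitude = w_attract * (dist - d_opt)   # pull toward q
    else:
        magnitude = -w_repel * (d_opt - dist)    # push away from q
    return magnitude * dx / dist, magnitude * dy / dist

# The total force on a sensor is the vector sum over all neighbors;
# the sensor then moves a bounded step along the resultant force.
```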
      • Open Access Article

        13 - Open Data Access Policymaking in Iran from the Perspective of Preserving Privacy and Personal Data Ownership
        Behrooz Eliasi Masoumeh Sadeghi Nasrin Dastranj Mehdi Hosseinpour Tahereh Mirsaeedghazi
        Enhancing access to open data promises to promote research, innovation, and the development of solutions to complex social challenges in our country. Policies offered by the OECD and other scientific associations emphasize this strategy. Implementing it certainly requires establishing governance systems, clarifying processes, and guaranteeing trust in the research and business sectors. The main part of valuable data resources is personal in nature, and gathering, storing, and processing such data in cyberspace is an enormous source of income for data-driven businesses. Among the main challenges of trust are decisions on privacy policies and ownership. In this paper, considering the complexity of the ownership concept in the personal data ecosystem, challenges in proposed policies such as the OECD reports are discussed with the aim of enhancing open data. Shortcomings in our country's e-commerce and cybercrime laws are also briefly debated. Then, with the aim of proposing a policy for access to open data and given public sensitivity to personal data, the detailed conclusions of a field study, covering the criteria of the goal and possible policy options, are first extracted using the Delphi method. This work shows that public awareness of this subject, even in an elite target community, is not at a desirable level; moreover, trust in the privacy of personal data, including confidence in effective legal enforcement against violations, is not satisfactory. Finally, through a field evaluation using the FAHP method, the policy options are measured and analyzed, and strategic requirements for implementing the selected policy are suggested.
      • Open Access Article

        14 - Introducing a new energy-optimal method for target tracking in wireless sensor networks using a hunting search algorithm
        Shayesteh Tabatabaei Hassan Nosrati Nahook
        In this paper, in order to increase the accuracy of target tracking, we try to reduce the energy consumption of the sensors with a new algorithm for distributed target tracking called the hunting search algorithm. The proposed method is compared with the DCRRP and NODIC protocols, using OPNET simulator version 11.5 to test the performance of the algorithms. The simulation results show that the proposed algorithm performs better than the other two protocols in terms of energy consumption, successful delivery rate, and throughput.
      • Open Access Article

        15 - A Collaborative Filtering Movie Recommendation System Based on Users' Correlation and Weighted K-Means with High Accuracy
        Nozar Ebrahimi Lame Fatemeh Saghafi Majid Gholipour
        A recommendation system is a business intelligence (BI) tool that uses data mining methods to guide and help the user select the best items based on his/her preferences, in the shortest time. Despite more than two decades of academic research on recommendation systems, this issue is still one of the most current research challenges. By personalizing recommendations of goods or services to site users, recommendation systems save users' time, increase their satisfaction and their loyalty to sales sites, and contribute to the development of e-commerce. Nowadays, recommendation systems have many applications in various sectors of e-commerce, especially for media products such as books, movies, and music. Famous e-commerce sites such as eBay, Amazon, and Netflix, and domestic sites such as Digikala, Divar, and Filimo, make extensive use of recommendation systems. These systems use a variety of big data filtering methods to provide appropriate recommendations; the most important and widely used is collaborative filtering (CF). In this paper, we implement three CF recommender systems based on the correlation coefficient between users, selection of the optimal number of neighbors, and calculation of weighted scores for unwatched movies. The method with the lowest error is selected as the final model. We use the MovieLens ml-latest-small 100k research dataset, with 9742 movies and 610 users, as input. The results showed an RMSE 3.29% lower than the latest research using the correlation method.
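        The general recipe in the abstract (user-user correlation, a chosen number of neighbors, similarity-weighted scores for unwatched movies) can be sketched as follows; the exact weighting and neighbor selection in the paper may differ.

```python
import numpy as np

def predict_rating(ratings, user, movie, k=10):
    """User-based CF prediction with Pearson correlation.

    ratings is a users x movies array with np.nan for unrated entries;
    the prediction is the similarity-weighted average of the k most
    correlated neighbors who rated the movie.
    """
    target = ratings[user]
    sims = []
    for other in range(ratings.shape[0]):
        if other == user or np.isnan(ratings[other, movie]):
            continue
        common = ~np.isnan(target) & ~np.isnan(ratings[other])
        if common.sum() < 2:
            continue
        r = np.corrcoef(target[common], ratings[other][common])[0, 1]
        if not np.isnan(r):
            sims.append((r, other))
    # Weight the k most correlated neighbors' ratings by similarity.
    sims.sort(reverse=True)
    top = sims[:k]
    num = sum(r * ratings[o, movie] for r, o in top)
    den = sum(abs(r) for r, _ in top)
    return num / den if den else np.nan
```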
      • Open Access Article

        16 - Architectural tactics identification in source code based on a semantic approach
        Ehsan Sharifi Ahmad Abdollahzadeh Barforoush
        Software systems are alive as long as they can be changed and used. Changing the source code without considering the consequences can lead to the architectural erosion of software systems. Architectural erosion gradually makes a system unchangeable and moves it toward deterioration. Architectural decisions in the source code are usually realized through architectural tactics: fine-grained decisions made to achieve a certain quality attribute. Recognizing these tactics in the source code allows developers to change the code while knowing where these decisions are implemented, which slows the architectural erosion process and delays the system's movement toward deterioration. Thus, this paper introduces a method based on the semantic web for recognizing the architectural tactics present in the source code. Based on this approach, the new concept of the microtactic is introduced, which increases the possibility of recognizing architectural tactics using a semantic web and ontological approach. The evaluation results show that this method recognizes tactics with higher precision and quality than other similar methods.