  • List of Articles


      • Open Access Article

        1 - Presenting a novel solution to choose a proper database for storing big data in national network services
        Mohammad Reza Ahmadi, Davood Maleki, Ehsan Arianyan
        The growing number of data-producing tools across services, together with the need to store the results of large-scale processing carried out in national information network services and the data produced by the private sector and social networks, has made migration to new database solutions with appropriate features inevitable. As the size and composition of data change and big data takes shape, traditional practices and patterns no longer meet the new requirements, so storage systems with new, scalable formats and models have become necessary. In this paper, the basic structural dimensions and functions of both traditional databases and modern storage systems are reviewed, and a new technical solution for migrating from traditional to modern databases is presented. The basic features of connecting traditional and modern databases for storing and processing data obtained from the comprehensive services of the national information network are also presented, and the parameters and capabilities of databases in both standard and Hadoop contexts are examined. In addition, as a practical example, a solution for combining traditional and modern databases is presented, evaluated and compared using the BSC method. It is shown that, across data sets of different volumes, a combined use of traditional and modern databases can be the most efficient solution.
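
        As a minimal, hypothetical illustration of what a combined deployment implies in practice, the sketch below routes each dataset either to a traditional relational store or to a Hadoop-style store; the volume threshold and backend labels are assumptions for illustration, not values or components taken from the paper.

        # Hypothetical sketch: route a dataset to a traditional (relational) or
        # Hadoop-based store depending on its volume. The 1 TB threshold and the
        # backend labels are illustrative assumptions, not values from the paper.

        TB = 1024 ** 4  # bytes in a terabyte (binary convention)

        def choose_backend(dataset_size_bytes: int, needs_transactions: bool) -> str:
            """Pick a storage backend for one dataset."""
            if needs_transactions or dataset_size_bytes < 1 * TB:
                return "relational"      # traditional RDBMS: ACID, moderate volume
            return "hadoop"              # HDFS/Hive-style store: large, append-heavy data

        if __name__ == "__main__":
            workloads = [("service_logs", 15 * TB, False), ("billing", 200 * 1024 ** 3, True)]
            for name, size, tx in workloads:
                print(name, "->", choose_backend(size, tx))
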
      • Open Access Article

        2 - Priority based Deployment of IoT Applications in Fog
        Masomeh Azimzadeh, Ali Rezaee, Somayyeh Jafarali Jassbi, MohammadMahdi Esnaashari
        Fog computing technology has emerged in response to the need of modern IoT applications for low latency, high security, and similar requirements. On the other hand, the limitations of fog computing, such as heterogeneity, distribution, and resource constraints, make service management in this environment challenging. Intelligent service placement means placing application services on fog nodes so as to ensure their QoS and the effective use of resources. Using communities to organize nodes for service placement is one approach in this area, where communities are mainly created based on the connection density of nodes and applications are placed according to a single-criterion prioritization. This leads to unbalanced communities and inefficient placement of applications. This paper presents a priority-based method for deploying applications in the fog environment: balanced communities are created, and applications are placed in them based on a multi-criteria prioritization approach, which leads to better use of network capacities and higher QoS. The simulation results show that the proposed method improves deadline satisfaction by up to 22%, increases availability by about 12%, and increases resource utilization by up to 10%.
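
        One hedged way to picture a multi-criteria prioritization of applications is a weighted sum of normalized criteria, as in the sketch below; the criteria names and weights are assumptions for illustration and are not taken from the paper.

        # Hypothetical multi-criteria priority score for fog application placement.
        # Criteria and weights are illustrative assumptions, not the paper's values.

        def priority(app: dict, weights: dict) -> float:
            """Higher score = place earlier. All criteria are pre-normalized to [0, 1]."""
            urgency = 1.0 - app["deadline_norm"]       # tighter deadline -> more urgent
            demand  = app["resource_demand_norm"]      # heavier apps scheduled earlier here
            value   = app["criticality_norm"]          # business/mission criticality
            return (weights["urgency"] * urgency
                    + weights["demand"] * demand
                    + weights["criticality"] * value)

        apps = [
            {"name": "health-monitor", "deadline_norm": 0.1, "resource_demand_norm": 0.4, "criticality_norm": 0.9},
            {"name": "video-cache",    "deadline_norm": 0.8, "resource_demand_norm": 0.9, "criticality_norm": 0.3},
        ]
        w = {"urgency": 0.5, "demand": 0.2, "criticality": 0.3}
        for a in sorted(apps, key=lambda a: priority(a, w), reverse=True):
            print(a["name"], round(priority(a, w), 3))
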
      • Open Access Article

        3 - Identifying and ranking factors affecting the digital transformation strategy in Iran's road freight transportation industry focusing on the Internet of Things and data analytics
        Mehran Ehteshami, Mohammad Hasan Cheraghali, Bita Tabrizian, Maryam Teimourian Sefidehkhan
        This research was conducted with the aim of identifying and ranking the factors affecting the digital transformation strategy in Iran's road freight transportation industry, focusing on the Internet of Things and data analytics. After reviewing the literature, semi-structured interviews were conducted with 20 academic and industry experts in Iran's road freight transportation sector, selected using purposive sampling and the saturation principle. In the quantitative part, the opinions of 170 employees of this industry, selected based on Cochran's formula and stratified sampling, were collected using a researcher-made questionnaire. The Delphi technique, literature review and coding were used to analyze the data in the qualitative part; in the quantitative part, inferential statistics and the SPSS and SmartPLS software packages were used. Finally, 40 indicators were extracted in the form of 8 factors, and the indicators and factors were ranked using factor analysis. The results show that internal factors have the highest rank, followed by software infrastructure, hardware infrastructure, economic, external, legal, cultural and penetration factors. It is therefore suggested that organizations align their human resource empowerment programs with the use of technology and digital tools.
      • Open Access Article

        4 - A Review on Hadith Text Processing Tasks
        Sepideh Baradaran, Behrooz Minaei, Mohammad Ebrahim Shenassa, Sayyed Ali Hossayni
        To achieve higher precision and lower processing time, it is recommended to evaluate the authenticity of hadith using intelligent methods. Due to the huge volume of narrative texts (hadith) and the complex concepts and relationships within them, a considerable amount of research has been conducted on automatic hadith processing. In this field, researchers have evaluated intelligent methods for Matn (text) and Isnad processing; a review of prior work shows that about 47% of studies address hadith text processing, 46% address Isnad processing, and 7% address both. An examination of 97 studies in hadith processing shows that hadiths have been evaluated with respect to the accuracy of the text, the Isnad, or both. Processing tasks can be classified into categories such as ontology construction, hadith text classification, hadith similarity, and hadith authentication. The most frequently used technique has been information retrieval applied to hadith text processing.
      • Open Access Article

        5 - A Horizon for Sentiment Analysis in Social Networks based on Interpreting Contents
        Maryam Tayefeh Mahmoudi, Amirmansour Yadegari, Parvin Ahmadi, Kambiz Badie
        Interpreting contents in social networks with the aim of analyzing the sentiment of their narrators is of particular significance. In this paper, we present a framework for this purpose that can classify the messages hidden in contents using rule-type protocols with a high level of abstraction. In this framework, items such as the prosody of a content's narrator, the context in which a content is disseminated, and the key propositions in a content's text form the condition part of a protocol, while the possible classes for the message in a content constitute its action part. Because these items are generic, the proposed rule-type protocols can equally be applied to other languages. Results of computer simulations on a variety of contents in social networks show that the proposed framework is sufficiently capable of analyzing the sentiment of the contents' narrators.
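
        To make the condition/action shape of such rule-type protocols concrete, the hedged sketch below encodes two toy rules; the field names, contexts and message classes are illustrative assumptions, not the rules used in the paper.

        # Hypothetical sketch of a rule-type protocol: condition fields drawn from the
        # narrator's prosody, the dissemination context and key propositions, and an
        # action that assigns a message class. Field names and classes are assumptions.

        RULES = [
            {
                "condition": {
                    "prosody": "agitated",
                    "context": "public_protest_hashtag",
                    "propositions": {"service outage", "no response"},
                },
                "action": "complaint/anger",
            },
            {
                "condition": {
                    "prosody": "calm",
                    "context": "product_review_group",
                    "propositions": {"works well"},
                },
                "action": "satisfaction",
            },
        ]

        def classify(content: dict) -> str:
            """Return the action of the first rule whose condition matches the content."""
            for rule in RULES:
                cond = rule["condition"]
                if (content["prosody"] == cond["prosody"]
                        and content["context"] == cond["context"]
                        and cond["propositions"] <= content["propositions"]):
                    return rule["action"]
            return "unclassified"

        print(classify({"prosody": "agitated",
                        "context": "public_protest_hashtag",
                        "propositions": {"service outage", "no response", "refund"}}))
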
      • Open Access Article

        6 - Anomaly and Intrusion Detection through Datamining and Feature Selection using PSO Algorithm
        Fereidoon Rezaei, Mohamad Ali Afshar Kazemi, Mohammad Ali Keramati
        Today, with the development of technology, the increased use of the Internet in business, and the movement of businesses from physical to virtual and online forms, attacks and anomalies have also shifted from the physical to the virtual domain. That is, instead of robbing a store or market, individuals intrude into websites and online markets through cyberattacks and disrupt them. Detecting attacks and anomalies is one of the new challenges in promoting e-commerce technologies. Detecting network anomalies and destructive activities in e-commerce can be done by analyzing the behavior of network traffic. Data mining techniques are used extensively in intrusion detection systems (IDS) to detect anomalies. Reducing the dimensionality of features plays an important role in intrusion detection, since detecting anomalies in high-dimensional network traffic features is a time-consuming process. Choosing suitable and accurate features influences the speed of the analysis and thereby improves detection speed. In this article, by using data mining algorithms such as J48 and PSO, we were able to significantly improve the accuracy of detecting anomalies and attacks.
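
        The hedged sketch below shows one way to combine PSO with a decision-tree fitness for feature selection, in the spirit of the J48 + PSO combination described above; here scikit-learn's DecisionTreeClassifier stands in for Weka's J48, and the synthetic dataset, swarm size and constants are illustrative assumptions rather than the paper's setup.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        X, y = make_classification(n_samples=300, n_features=20, n_informative=6, random_state=0)
        n_particles, n_features, iters = 10, X.shape[1], 15

        def fitness(mask: np.ndarray) -> float:
            """Cross-validated accuracy of a decision tree on the selected features."""
            if mask.sum() == 0:
                return 0.0
            clf = DecisionTreeClassifier(random_state=0)
            return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

        # Binary PSO: positions are 0/1 feature masks, velocities are real-valued.
        pos = rng.integers(0, 2, size=(n_particles, n_features))
        vel = rng.uniform(-1, 1, size=(n_particles, n_features))
        pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
        gbest = pbest[pbest_fit.argmax()].copy()

        for _ in range(iters):
            r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            prob = 1.0 / (1.0 + np.exp(-vel))            # sigmoid transfer function
            pos = (rng.random(vel.shape) < prob).astype(int)
            fits = np.array([fitness(p) for p in pos])
            improved = fits > pbest_fit
            pbest[improved], pbest_fit[improved] = pos[improved], fits[improved]
            gbest = pbest[pbest_fit.argmax()].copy()

        print("selected features:", np.flatnonzero(gbest), "accuracy:", round(pbest_fit.max(), 3))
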
      • Open Access Article

        7 - Intrusion Detection Based on Cooperation on the Permissioned Blockchain Platform in the Internet of Things Using Machine Learning
        Mohammad Mahdi Abdian, Majid Ghayori, Seyed Ahmad Eftekhari
        Intrusion detection systems seek to achieve several objectives, such as increasing the true detection rate, reducing detection time, reducing the computational load, and preserving the resulting logs so that they cannot be manipulated or deleted by unauthorized parties. This study addresses these challenges by exploiting the advantages of blockchain technology and its durability, relying on an IDS architecture based on multi-node cooperation. The proposed model is an intrusion detection engine based on the decision tree algorithm, implemented in the nodes of the architecture. The architecture consists of several nodes connected on a permissioned blockchain platform; the resulting model and logs are stored on the blockchain and cannot be manipulated. In addition to the benefits of using blockchain, the occupied memory is reduced and the speed and time of transactions are improved. Several evaluation models are designed for single-node and multi-node architectures on the blockchain platform. Finally, a proof of the architecture, possible threats to it, and defensive measures are explained. The most important advantages of the proposed scheme are the elimination of the single point of failure, maintaining trust between nodes, and ensuring the integrity of the model and the discovered logs.
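
        As a minimal sketch of the tamper-evidence property this architecture relies on, the code below appends detection logs to a hash chain and verifies it; this illustrates the integrity idea only and is not the permissioned blockchain platform or consensus protocol used in the paper.

        import hashlib, json, time

        def _hash(block: dict) -> str:
            return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

        def append_log(chain: list, detection_log: dict) -> None:
            prev = chain[-1]["hash"] if chain else "0" * 64
            block = {"index": len(chain), "time": time.time(), "log": detection_log, "prev": prev}
            chain.append({**block, "hash": _hash(block)})

        def verify(chain: list) -> bool:
            """Recompute every hash and check the prev-links; any edit breaks the chain."""
            for i, entry in enumerate(chain):
                body = {k: v for k, v in entry.items() if k != "hash"}
                if _hash(body) != entry["hash"]:
                    return False
                if i > 0 and entry["prev"] != chain[i - 1]["hash"]:
                    return False
            return True

        chain = []
        append_log(chain, {"node": "n1", "alert": "port-scan", "score": 0.93})
        append_log(chain, {"node": "n2", "alert": "benign", "score": 0.08})
        print(verify(chain))                   # True
        chain[0]["log"]["alert"] = "benign"    # tampering attempt
        print(verify(chain))                   # False
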
      • Open Access Article

        8 - Improving resource allocation in mobile edge computing using gray wolf and particle swarm optimization algorithms
        Seyed Ebrahim Dashti, Saeid Shabooei
        Mobile edge computing improves the experience of end users in obtaining appropriate services and service quality. In this paper, the problem of improving resource allocation when offloading tasks from mobile devices to edge servers in computing systems is investigated. Some tasks are processed locally and some are offloaded to edge servers. The main issue is scheduling the offloaded tasks on the virtual machines of the computing network so as to minimize computing time, service cost, wasted network resources, and the maximum connection of a task with the network. A hybrid of the particle swarm and grey wolf optimization algorithms is introduced to manage resource allocation and task scheduling and reach an optimal result in edge computing networks. The comparison results show improved waiting time and cost for the proposed approach: on average, the proposed model reduces task time by 10% and increases resource utilization by 16%.
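
        The hedged sketch below shows one common way to hybridize grey wolf and particle swarm updates for a task-to-server assignment; the encoding (continuous positions rounded to server indices), the fitness (makespan plus cost), and all constants are illustrative assumptions, not the formulation evaluated in the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n_tasks, n_servers = 12, 4
        task_len = rng.uniform(1.0, 5.0, n_tasks)        # task workloads
        speed = rng.uniform(1.0, 3.0, n_servers)         # server processing speeds
        cost_rate = rng.uniform(0.5, 2.0, n_servers)     # per-unit-work cost of each server

        def fitness(position: np.ndarray) -> float:
            """Lower is better: weighted sum of makespan and total cost."""
            assign = np.clip(np.rint(position), 0, n_servers - 1).astype(int)
            finish, cost = np.zeros(n_servers), 0.0
            for t, s in enumerate(assign):
                finish[s] += task_len[t] / speed[s]
                cost += task_len[t] * cost_rate[s]
            return finish.max() + 0.1 * cost

        pop, iters = 20, 60
        X = rng.uniform(0, n_servers - 1, (pop, n_tasks))   # wolf/particle positions
        V = np.zeros_like(X)                                 # PSO velocities
        pbest, pbest_fit = X.copy(), np.array([fitness(x) for x in X])

        for it in range(iters):
            order = np.argsort([fitness(x) for x in X])
            alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]   # GWO leaders
            a = 2.0 * (1 - it / iters)                                    # GWO parameter a -> 0
            for i in range(pop):
                # Grey-wolf guidance: candidate positions pulled toward the three leaders.
                leaders = []
                for L in (alpha, beta, delta):
                    r1, r2 = rng.random(n_tasks), rng.random(n_tasks)
                    A, C = 2 * a * r1 - a, 2 * r2
                    leaders.append(L - A * np.abs(C * L - X[i]))
                gwo_target = np.mean(leaders, axis=0)
                # PSO velocity update blended with the GWO target instead of a plain gbest.
                r3, r4 = rng.random(n_tasks), rng.random(n_tasks)
                V[i] = 0.6 * V[i] + 1.5 * r3 * (pbest[i] - X[i]) + 1.5 * r4 * (gwo_target - X[i])
                X[i] = np.clip(X[i] + V[i], 0, n_servers - 1)
                f = fitness(X[i])
                if f < pbest_fit[i]:
                    pbest[i], pbest_fit[i] = X[i].copy(), f

        best = pbest[pbest_fit.argmin()]
        print("best fitness:", round(pbest_fit.min(), 3))
        print("assignment:", np.clip(np.rint(best), 0, n_servers - 1).astype(int))
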
      • Open Access Article

        9 - Multi-level ternary quantization for improving sparsity and computation in embedded deep neural networks
        Hosna Manavi Mofrad, Seyed Ali Ansarmohammadi, Mostafa Salehi
        Deep neural networks (DNNs) have attracted great interest due to their success in various applications. However, their computational complexity and memory size are the main obstacles to deploying such models on embedded devices with limited memory and computational resources. Network compression techniques can overcome these challenges, with quantization and pruning being the most important among them. One well-known quantization method for DNNs is multi-level binary quantization, which not only exploits simple bit-wise logical operations but also reduces the accuracy gap between binary neural networks and full-precision DNNs. Since multi-level binary quantization cannot represent the zero value, it does not take advantage of sparsity. On the other hand, it has been shown that DNNs are sparse, and by pruning their parameters the amount of data stored in memory is reduced while a computation speedup is also achieved. In this paper, we propose a pruning- and quantization-aware training method for multi-level ternary quantization that takes advantage of both multi-level quantization and data sparsity. In addition to increasing accuracy compared to multi-level binary networks, it gives the network the ability to be sparse. To save memory and computation, we increase the sparsity of the quantized network by pruning until the accuracy loss becomes negligible. The results show that the potential computation speedup of our model at bit-level and word-level sparsity can reach 15x and 45x, respectively, compared to basic multi-level binary networks.
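
        The hedged sketch below shows threshold-based ternarization of a weight tensor in the spirit of single-level ternary schemes such as TWN; the 0.7 threshold factor and the single-level formulation are common textbook choices and do not reproduce the paper's multi-level, pruning-aware method.

        import numpy as np

        def ternarize(w: np.ndarray):
            """Map weights to alpha * {-1, 0, +1} using a magnitude threshold."""
            delta = 0.7 * np.abs(w).mean()                # threshold on |w|
            t = np.zeros_like(w)
            t[w > delta] = 1.0
            t[w < -delta] = -1.0
            mask = t != 0
            alpha = np.abs(w[mask]).mean() if mask.any() else 0.0   # scaling factor
            return alpha, t

        rng = np.random.default_rng(0)
        w = rng.normal(0, 0.05, size=(4, 4)).astype(np.float32)
        alpha, t = ternarize(w)
        print("alpha:", round(float(alpha), 4))
        print("sparsity:", float((t == 0).mean()))        # fraction of zeroed weights
        print("reconstruction:", np.round(alpha * t, 4))
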
      • Open Access Article

        10 - Computer security models and proposing a new perspective: A review paper
        Hadi Sadjadi, Reza Kalantari
        In this article, the use of computer security models and their benefits are first discussed in a novel way. Then, while briefly introducing the space of computer security in the form of an ontology, three perspectives on the study of patterns in this field are identified and distinguished from each other for the first time: the view of secure models, the view of security models, and the view of frameworks and systems for security models. The first and third perspectives are explained briefly, and the second is studied in detail from the standpoint of how patterns are organized, covering five types of organization: organization based on the software lifecycle, organization based on logical levels, organization based on threat classification, organization based on attack classification, and organization based on application. Presented this way, the reader acquires a comprehensive view of the discourse of computer security patterns and the knowledge needed to make better use of them. Finally, the analysis and idea of this research are presented in the form of a new type of organization intended to make it easier to use and reference patterns. It is argued that the existing categorizations are mostly static and forward-looking and lack the necessary dynamism and retrospection, whereas the proposed idea of covering all stakeholders and the security ontology can provide this property and, in addition, include agile patterns as well.
      • Open Access Article

        11 - Identifying the Key Drivers of Digital Signature Implementation in Iran (using fuzzy Delphi method)
        Ghorbanali Mehrabani, Fatemeh Zargaran Khouzani
        The purpose of this article is to identify and analyze the key drivers of digital signature implementation in Iran using a fuzzy Delphi approach. The research is applied in terms of purpose and uses a hybrid approach for information gathering. The statistical community consists of all experts and specialists in the field of information technology and digital signatures, together with the articles published in this field. The expert sample consists of 13 people selected by purposive sampling, and 30 articles were selected based on their availability, being downloadable, non-technical nature, and relevance to the topic. Data were analyzed using the fuzzy Delphi approach. Validity and reliability were calculated and confirmed using the CVR index and Cohen's kappa, with coefficients of 0.83 and 0.93, respectively. The results show that the key drivers of digital signature implementation in Iran include 5 main dimensions and 30 concepts: 1) security (information confidentiality, information security, sender authentication, document authentication, privacy protection, trust between parties); 2) business (digital business models, communication needs, staff management, organization size, organizational structure, organizational resources, organizational culture, top managers, competition ecosystem, e-governance); 3) user (perceived convenience, perceived benefit, consumer behavior, consumer literacy, consumer lifestyle); 4) technical (development of technical infrastructure, systems integration, system complexity, system tanks, design quality, speed of certificate generation and verification, resistance to hackers); and 5) legal (legal licenses, penal laws, the legislative body, e-commerce laws).
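
        As a hedged illustration of one round of fuzzy Delphi screening, the sketch below maps expert ratings of a driver to triangular fuzzy numbers, aggregates them, and compares the defuzzified value with an acceptance threshold; the linguistic scale, the 0.7 threshold and the defuzzification rule are common textbook choices, not necessarily the ones used in the paper.

        SCALE = {                      # linguistic term -> (l, m, u) triangular fuzzy number
            "very low": (0.0, 0.0, 0.25), "low": (0.0, 0.25, 0.5),
            "medium": (0.25, 0.5, 0.75), "high": (0.5, 0.75, 1.0),
            "very high": (0.75, 1.0, 1.0),
        }

        def screen(ratings, threshold=0.7):
            """Aggregate expert ratings and decide whether to keep the driver."""
            fuzzy = [SCALE[r] for r in ratings]
            l = min(f[0] for f in fuzzy)                       # pessimistic bound
            m = sum(f[1] for f in fuzzy) / len(fuzzy)          # mean of the modes
            u = max(f[2] for f in fuzzy)                       # optimistic bound
            crisp = (l + m + u) / 3.0                          # simple defuzzification
            return crisp, crisp >= threshold

        ratings = ["high", "very high", "high", "medium", "very high"]
        value, accepted = screen(ratings)
        print(round(value, 3), "keep" if accepted else "drop")
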
      • Open Access Article

        12 - Community Detection in Bipartite Networks Using HellRank Centrality Measure
        Ali Khosrozadeh, Ali Movaghar, Mohammad Mehdi Gilanian Sadeghi, Hamidreza Mahyar
        Community structure is a common and important feature in many complex networks, including bipartite networks. In recent years, community detection has received attention in many fields and many methods have been proposed for this purpose, but the heavy time consumption of some methods limits their use in large-scale networks. There are methods with lower time complexity, but they are mostly non-deterministic, which greatly reduces their applicability in the real world. The usual approach to community detection in bipartite networks is to first construct a unipartite projection of the network and then detect communities in that projection using methods designed for unipartite networks, but such projections inherently lose information. In this paper, based on the bipartite modularity measure, which quantifies the strength of partitions in bipartite networks, and using the HellRank centrality measure, a fast and deterministic method is proposed for detecting communities directly from bipartite networks, without the need for projection. The proposed method is inspired by, and simulates, the voting process in elections in human society.
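
        The hedged sketch below computes Barber's bipartite modularity, the kind of measure the abstract refers to for quantifying partition strength in bipartite networks; the HellRank-based detection step itself is not reproduced, and the toy network is an illustration only.

        import numpy as np

        def bipartite_modularity(A: np.ndarray, top_labels, bottom_labels) -> float:
            """A[i, j] = 1 if top node i links to bottom node j; labels give communities."""
            m = A.sum()                                   # number of edges
            k = A.sum(axis=1)                             # top-node degrees
            d = A.sum(axis=0)                             # bottom-node degrees
            q = 0.0
            for i in range(A.shape[0]):
                for j in range(A.shape[1]):
                    if top_labels[i] == bottom_labels[j]:
                        q += A[i, j] - k[i] * d[j] / m
            return q / m

        # Toy network: two clean communities {t0, t1, b0, b1} and {t2, t3, b2, b3}.
        A = np.array([[1, 1, 0, 0],
                      [1, 1, 0, 0],
                      [0, 0, 1, 1],
                      [0, 0, 1, 1]])
        print(round(bipartite_modularity(A, [0, 0, 1, 1], [0, 0, 1, 1]), 3))   # 0.5
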
      • Open Access Article

        13 - Generalizing The Concept of Business Processes Structural Soundness from Classic Petri-nets to BPMN2.0 Process Models
        Yahya Poursoltani, Mohammad Hassan Shirali-Shahreza, S. Alireza Hashemi Golpayegani
        The BPMN2.0 standard is a modeling language that can be understood and used by a wide range of users. However, because of its non-formal nature, models designed with it can contain structural errors such as deadlock (the impossibility of executing some process tasks) and livelock (the infinite repetition of tasks). These errors can create anomalies in the workflow of the organization. So far, some research has been conducted on the validation of these process models, and various solutions have been provided to discover some of these structural errors. The question that may be raised about these methods is whether any of them can definitely guarantee the structural correctness of a BPMN2.0 process model. To answer this question, we need a comprehensive definition of a correct BPMN2.0 process model, against which we can evaluate the comprehensiveness of validation methods and be certain that a given method can discover all structural errors of a process model. In this paper, based on the concept of soundness of process models built on classic Petri nets and the generalization of its underlying properties, i.e. liveness and boundedness, to BPMN2.0 process models, a comprehensive definition of a correct (sound) BPMN2.0 process model is provided. The comprehensiveness of the methods suggested in some of the most important related research is then evaluated against it. This definition can be used as a measure of the effectiveness of BPMN validation methods.
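
        As a hedged illustration of the Petri-net style of soundness checking the abstract builds on, the sketch below explores the reachability graph of a small workflow net and reports dead markings (deadlocks); the net and the bounded exploration are illustrative, not the paper's generalized BPMN2.0 procedure.

        from collections import deque

        # Transitions: name -> (consumed places, produced places).
        TRANSITIONS = {
            "t_start": ({"start": 1}, {"a": 1, "b": 1}),     # AND-split
            "t_a":     ({"a": 1}, {"c": 1}),
            "t_b":     ({"b": 1}, {"d": 1}),
            "t_join":  ({"c": 1, "d": 1}, {"end": 1}),       # AND-join
        }
        INITIAL = {"start": 1}
        FINAL = {"end": 1}

        def enabled(marking, pre):
            return all(marking.get(p, 0) >= n for p, n in pre.items())

        def fire(marking, pre, post):
            m = dict(marking)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return {p: n for p, n in m.items() if n > 0}

        def deadlocks(limit=10_000):
            """Return reachable markings with no enabled transition, other than FINAL."""
            seen, queue, dead = set(), deque([INITIAL]), []
            while queue and len(seen) < limit:
                m = queue.popleft()
                key = tuple(sorted(m.items()))
                if key in seen:
                    continue
                seen.add(key)
                fired = False
                for pre, post in TRANSITIONS.values():
                    if enabled(m, pre):
                        fired = True
                        queue.append(fire(m, pre, post))
                if not fired and m != FINAL:
                    dead.append(m)
            return dead

        print(deadlocks())   # [] here; changing t_join's pre to {"c": 1, "e": 1} would deadlock
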
      • Open Access Article

        14 - Presenting the ICT Policies Implementation Model of the 6th Development Using the Neural Network Method
        Nazila Mohammadi, Gholamreza Memarzadeh Tehran, Sedigheh Tootian Isfahani
        Properly managing the implementation of information and communication technology policies in a planned way is unavoidable if the country's position in the fields of science and technology is to improve. The purpose of this research is to provide a model of the factors affecting the implementation of Iran's ICT policies using the neural network technique, based on Giddens' structuration theory. In terms of how it was conducted the research is a survey, and in terms of purpose it is applied, since it attempts to use the results in the Ministry of Communication and Information Technology and the Iran Telecommunication Company. Data collection is based on library and field methods, and the data-collection tool is a researcher-made questionnaire. The statistical population consists of information and communication technology experts at the headquarters of the Iran Telecommunication Company (810 people), of whom 260 were randomly selected as a sample based on Cochran's formula. MATLAB was used for data analysis. According to the findings, the best combination for development occurs when all input variables are considered at the same time, and the worst case occurs when the infrastructure-development variable is ignored; based on the network sensitivity analysis, infrastructure development is the most important factor and content supply the least important.
      • Open Access Article

        15 - Test case Selection based on Test-Driven Development
        Zohreh Mafi, Mirian Mirian
        Test-Driven Development (TDD) is a test-first software development method in which the production of each component of the code begins with writing its test cases. The method has attracted attention due to many advantages, including readable, well-organized and short code, increased quality, productivity and reliability, and the possibility of regression testing thanks to the comprehensive set of unit tests it creates. The large number of unit test cases produced in this method is a strength for increasing the reliability of the code; however, repeatedly executing all test cases lengthens regression testing. The purpose of this article is to present an algorithm for selecting test cases so as to reduce regression-test time in TDD. Various ideas have been proposed for selecting test cases and reducing regression-test time, most of them tied to a programming language or a software development method. The idea presented in this article is based on program differencing and the nature of TDD: meaningful semantic and structural links are established between unit tests and code blocks, and test cases are selected based on these relationships.
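
        The hedged sketch below shows the general shape of change-based test selection: each unit test is linked to the code blocks it exercises, and only tests whose blocks intersect the changed blocks are re-run; the mapping here is hand-written for illustration, whereas the paper derives such links from the TDD process itself.

        TEST_TO_BLOCKS = {
            "test_create_order":  {"order.create", "order.validate"},
            "test_cancel_order":  {"order.cancel", "order.validate"},
            "test_invoice_total": {"invoice.total"},
        }

        def select_tests(changed_blocks: set) -> list:
            """Return the tests that touch at least one changed block."""
            return sorted(test for test, blocks in TEST_TO_BLOCKS.items()
                          if blocks & changed_blocks)

        # A commit that only touches order validation re-runs two of the three tests.
        print(select_tests({"order.validate"}))   # ['test_cancel_order', 'test_create_order']
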
      • Open Access Article

        16 - Survey on the Applications of the Graph Theory in the Information Retrieval
        Maryam Piroozmand, Amir Hosein Keyhanipour, Ali Moeini
        Due to its power in modeling complex relations between entities, graph theory has been widely used to deal with real-world problems. Information retrieval, in turn, has emerged as one of the major problems in the area of algorithms and computation. As graph-based information retrieval algorithms have been shown to be efficient and effective, this paper provides an analytical review of these algorithms and proposes a categorization of them. Briefly, graph-based information retrieval algorithms can be divided into three major classes: the first includes algorithms that use a graph representation of the corresponding dataset within the information retrieval process; the second contains semantic retrieval algorithms that utilize graph theory; and the third is associated with applications of graph theory to the learning-to-rank problem. The set of reviewed works is analyzed by both frequency and publication time. An interesting finding of this review is that the third category is a relatively hot research topic in which only a limited number of recent works have been conducted.
      • Open Access Article

        17 - Application identification through intelligent traffic classification
        Shaghayegh Naderi
        Traffic classification and analysis is one of the big challenges in the field of data mining and machine learning, and plays an important role in providing security, quality assurance and network management. Today, a large amount of traffic transmitted over networks is encrypted by secure communication protocols such as HTTPS. Encrypted traffic reduces the possibility of monitoring and detecting suspicious and malicious traffic in communication infrastructures (in exchange for increased security and privacy for users), and classifying it without decrypting network communications is a difficult task, because the payload information is lost and only the header information, which is itself encrypted in newer protocol versions such as TLS 1.3, is accessible. Therefore, older traffic-analysis approaches, such as the various port-based and payload-based methods, have lost their effectiveness, and new approaches based on artificial intelligence and machine learning are used to analyze encrypted traffic. In this article, after reviewing traffic-analysis methods, an operational architectural framework for intelligent traffic analysis and classification is designed. An intelligent model for traffic classification and application identification is then presented and evaluated using machine learning methods on the Kaggle141 dataset. The results show that the random forest model, in addition to being more interpretable than deep learning methods, achieves higher accuracy in traffic classification than the other machine learning methods. Finally, tips and suggestions about using machine learning methods in operational traffic classification are provided.
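
        The hedged sketch below trains a random forest on flow-level statistical features, the general approach highlighted in the abstract; the synthetic features and labels merely stand in for a labeled flow dataset such as Kaggle141, whose exact feature set is not reproduced here.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 2000
        # Illustrative flow features: mean packet size, flow duration, packet rate, byte ratio.
        X = np.column_stack([
            rng.normal(600, 150, n),        # mean packet size (bytes)
            rng.exponential(2.0, n),        # flow duration (s)
            rng.normal(50, 15, n),          # packet rate (pkt/s)
            rng.uniform(0, 1, n),           # upstream/downstream byte ratio
        ])
        # Synthetic labels (e.g. 0 = web, 1 = streaming) loosely tied to the features.
        y = ((X[:, 0] > 600) & (X[:, 2] > 50)).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
        print("accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
        print("feature importances:", np.round(clf.feature_importances_, 3))
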
      • Open Access Article

        18 - Explanation the role of standardization in the proposed solutions for privacy protection in health data
        Batool Mehrshad, Mohammad Mehraeen, Mohammad Khansari, Saeed Mortazavi
        Introduction: Given the importance of data sharing in the digital era and its two main considerations, standardization and privacy protection, this article aims to answer a critical question: does standardization play a role in the proposed solutions for health data privacy protection? Methods: The present study is a systematic review conducted by searching databases such as Web of Science, PubMed, ScienceDirect, Springer, Magiran and SID, with a time-limit filter applied. After applying the inclusion and exclusion criteria and evaluating the results, the relevant studies were selected. Findings: Articles addressing standardization and privacy protection in health data were analyzed with respect to 5 indicators. The need for standardization and its role in preserving privacy in health data are also explained by examining the findings and discussing various privacy-related laws in the health field and their relationship with standardization. The study reveals that because the technical structure of fourth- and fifth-generation health care has facilitated standardization, privacy protection can also be achieved through standardization. Finally, directions for future research on this topic are suggested. Conclusion: The results show that fourth- and fifth-generation health care systems, which are technology-oriented, are built on standards, and these standards make their evaluation possible. Thus, if laws related to health data privacy protection are developed based on standards, they will have a strong guarantee of enforcement. This further highlights the critical role of standards development organizations in this field.
      • Open Access Article

        19 - Persian Stance Detection Based On Multi-Classifier Fusion
        Mojgan Farhoodi, Abbas Toloie Eshlaghy
        Stance detection (also known as stance classification, stance prediction, and stance analysis) is a recent research topic that has become an emerging paradigm in opinion mining. The purpose of stance detection is to identify the author's viewpoint toward a specific target, which has become a key component of applications such as fake news detection, claim validation, and argument search. In this paper, we applied three approaches, namely machine learning, deep learning and transfer learning, to Persian stance detection. We then proposed a multi-classifier fusion framework for reaching a final decision on the output results, using a weighted majority voting method based on the accuracy of the classifiers to combine their results. The experimental results show that the proposed multi-classifier fusion method performs better than the individual classifiers.
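
        The hedged sketch below shows accuracy-weighted majority voting over several classifiers, the fusion rule described in the abstract; the classifier names, accuracies and label set are placeholders, not the paper's Persian stance models.

        from collections import defaultdict

        # Validation accuracy of each base classifier (used as its voting weight).
        ACCURACY = {"svm": 0.78, "bilstm": 0.82, "parsbert": 0.88}

        def fuse(predictions: dict) -> str:
            """predictions: classifier name -> predicted stance label."""
            scores = defaultdict(float)
            for clf, label in predictions.items():
                scores[label] += ACCURACY[clf]
            return max(scores, key=scores.get)

        # Two weaker models disagree with the strongest one; the weights decide.
        print(fuse({"svm": "against", "bilstm": "against", "parsbert": "favor"}))  # 'against'
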
      • Open Access Article

        20 - A Recommender System Based on the Analysis of Personality Traits in Telegram Social Network
        Mohammad Javad Shayegan, Mohadeseh Valizadeh
        Analyzing the personality traits of individuals has always been an interesting research topic, and deriving personality traits from data about individuals' behavior is a challenging issue. Many people spend much of their time on social media and may exhibit behaviors there that reflect a persona in cyberspace. Among today's many social networks is Telegram, which has a large audience in Iran and is used to communicate, interact with others, teach, introduce products and so on. This research investigates how a recommender system can be built on the personality traits of individuals. For this purpose, the personality of the users of a Telegram group is identified using three algorithms, cosine similarity, MLP and Bayes, and finally, with the help of a recommender system, Telegram channels tailored to each individual's personality are suggested. The results show that this recommender system achieved 65.42% user satisfaction.
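
        The hedged sketch below illustrates the cosine-similarity part of a personality-based recommender: users and channels are described by vectors over the same personality traits, and each user is offered the channels most similar to their profile; the trait names, profiles and channel names are illustrative assumptions, not data from the paper.

        import numpy as np

        TRAITS = ["openness", "conscientiousness", "extraversion", "agreeableness", "neuroticism"]
        CHANNELS = {
            "science_digest": np.array([0.9, 0.6, 0.3, 0.5, 0.2]),
            "comedy_clips":   np.array([0.5, 0.2, 0.9, 0.6, 0.3]),
            "daily_planner":  np.array([0.3, 0.9, 0.4, 0.5, 0.2]),
        }

        def cosine(a: np.ndarray, b: np.ndarray) -> float:
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        def recommend(user_profile: np.ndarray, k: int = 2) -> list:
            ranked = sorted(CHANNELS, key=lambda c: cosine(user_profile, CHANNELS[c]), reverse=True)
            return ranked[:k]

        user = np.array([0.8, 0.7, 0.2, 0.6, 0.3])    # a curious, organized, introverted user
        print(recommend(user))
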