Latest Journal News
  • Open Access
  • About Journal

    According to letter No. 3/4817, dated 2007/9/02, of the Research Affairs Office of the Iranian Ministry of Science, Research and Technology, and the decision of the Commission for the Review of Scientific Journals of the country on 2007/7/14, this journal has been awarded the scientific-research grade.

    The Journal of Information and Communication Technology (JICT) belongs to the "Iranian Information and Communication Technology Association" and is a peer-reviewed, open-access, bi-quarterly journal publishing high-quality scientific papers covering all aspects of information and communication technologies, including computer science and engineering, information technology, communication technology, and information technology management. It is an interdisciplinary journal devoted to the publication of original articles, review articles, and other contributions, in accordance with research ethics and academic rules and regulations.
    All papers are subject to a blind review process. Submitted manuscripts are published after a thorough review and the editorial board's approval.


    Recent Articles

    • Open Access Article

      1 - An Individual-Oriented Shuffled Frog Leaping Algorithm for Solving Vehicle Routing Problem
      Soheila Shafiezadeh, Zahra Beheshti
      Iss. 51 , Vol. 14 , Spring 2022
      The Vehicle Routing Problem (VRP) is one of the most important problems in supply chain management because the optimal allocation of vehicles has a significant impact on reducing costs. VRP is in the class of NP-hard problems, and exact algorithms cannot find the best solution in an acceptable time. Hence, meta-heuristic algorithms can be employed to solve it. The Shuffled Frog Leaping Algorithm (SFLA) is one such meta-heuristic; it is efficient, but in some cases its population diversity rapidly decreases and the algorithm falls into local optima. In this study, an Individual-Oriented Shuffled Frog Leaping Algorithm (IO-SFLA) is proposed to enhance the exploration and exploitation of SFLA by exchanging global and local information. Several VRP instances of different dimensions are used to evaluate the performance of IO-SFLA. The efficiency of IO-SFLA is compared with several improved shuffled frog leaping algorithms, Simulated Annealing (SA) and a Genetic Algorithm (GA). The results show that IO-SFLA significantly outperforms the competing algorithms. IO-SFLA achieves an average of 1130.442 for the best path cost. The next rank belongs to SA with an average of 1228.725. The other algorithms rank lower, with large differences in results.
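
      The baseline SFLA that IO-SFLA builds on partitions a sorted frog population into memeplexes and repeatedly improves each memeplex's worst frog by jumping it toward the local and global bests. Below is a minimal illustrative sketch of that baseline on a continuous test function, in Python with NumPy; it is not the paper's IO-SFLA, and the VRP-specific solution encoding, information exchange and parameter values are assumptions left out here.

```python
import numpy as np

def sfla(obj, dim=10, n_frogs=60, n_memeplexes=6, n_iters=200, bound=5.0, rng=None):
    """Basic shuffled frog leaping: sort frogs, deal them into memeplexes,
    improve each memeplex's worst frog toward the local/global best, reshuffle."""
    rng = np.random.default_rng(rng)
    frogs = rng.uniform(-bound, bound, (n_frogs, dim))
    for _ in range(n_iters):
        fitness = np.apply_along_axis(obj, 1, frogs)
        order = np.argsort(fitness)                      # best frog first
        frogs, fitness = frogs[order], fitness[order]
        global_best = frogs[0]
        for m in range(n_memeplexes):
            idx = np.arange(m, n_frogs, n_memeplexes)    # interleaved memeplex assignment
            local, local_fit = frogs[idx], fitness[idx]
            worst, best = np.argmax(local_fit), np.argmin(local_fit)
            # jump the worst frog toward the local best, then the global best,
            # and finally replace it with a random frog if neither jump helps
            new = local[worst] + rng.random(dim) * (local[best] - local[worst])
            if obj(new) >= local_fit[worst]:
                new = local[worst] + rng.random(dim) * (global_best - local[worst])
            if obj(new) >= local_fit[worst]:
                new = rng.uniform(-bound, bound, dim)
            frogs[idx[worst]] = new
    fitness = np.apply_along_axis(obj, 1, frogs)
    return frogs[np.argmin(fitness)], float(fitness.min())

sphere = lambda x: float(np.sum(x ** 2))                 # toy continuous test function
best_x, best_f = sfla(sphere, rng=0)
print(round(best_f, 6))
```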

    • Open Access Article

      2 - A Novel Model based on Encoder-Decoder Architecture and Attention Mechanism for Automatic Abstractive Text Summarization
      Hasan Aliakbarpor, Mohammadtaghi Manzouri, Amirmasoud Rahmani
      Iss. 51 , Vol. 14 , Spring 2022
      With the expansion of the Web and the availability of a large amount of textual information, the development of automatic text summarization models, as an important aspect of natural language processing, has attracted many researchers. With the growth of deep learning methods in the field of text processing, text summarization has also entered a new phase of development, and abstractive text summarization has experienced significant progress in recent years. Even so, the full potential of deep learning has not yet been exploited for this task, and there is still a need for progress in this field, as well as for incorporating human cognition into the summarization model. In this regard, an encoder-decoder architecture equipped with auxiliary attention is proposed in this paper, which not only uses a combination of linguistic features and embedding vectors as the input of the learning model but also, unlike previous studies that commonly employ the attention mechanism in the decoder, utilizes an auxiliary attention mechanism in the encoder to imitate human cognition in summary generation. With the proposed attention mechanism, only the most important parts of the text, rather than the whole input text, are encoded and then sent to the decoder to generate the summary. The proposed model also uses a switch with a threshold in the decoder to overcome the rare-words problem. The proposed model was examined on the CNN/Daily Mail and DUC-2004 datasets. Based on the empirical results and the ROUGE evaluation metric, the proposed model obtained higher accuracy than other existing methods for generating abstractive summaries on both datasets.
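
      To make the attention idea concrete, the following is a minimal sketch of an additive (Bahdanau-style) attention module of the kind such encoder-decoder summarizers rely on, written in Python with PyTorch. It only shows how encoder states are scored against a decoder state to form a context vector; the paper's encoder-side auxiliary attention, linguistic-feature inputs and rare-word switch are not reproduced, and all dimensions are hypothetical.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention over encoder states for one decoder step."""
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, enc_states, dec_state):
        # enc_states: (batch, src_len, enc_dim); dec_state: (batch, dec_dim)
        scores = self.v(torch.tanh(self.W_enc(enc_states) + self.W_dec(dec_state).unsqueeze(1)))
        weights = torch.softmax(scores, dim=1)            # attention over source positions
        context = (weights * enc_states).sum(dim=1)       # weighted sum of encoder states
        return context, weights.squeeze(-1)

# toy shapes: 2 documents, 7 encoder time steps, hypothetical dimensions
enc_states = torch.randn(2, 7, 64)
dec_state = torch.randn(2, 128)
attention = AdditiveAttention(enc_dim=64, dec_dim=128, attn_dim=32)
context, weights = attention(enc_states, dec_state)
print(context.shape, weights.shape)                       # (2, 64) and (2, 7)
```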

    • Open Access Article

      3 - Exploring the antecedents and consequences of smart good governance using the fuzzy Delphi method (FDM)
      Seyed Abdolrasoul Hosseini, Mohammad Ghasemi, Nour Mohammad Yaghubi, Habibollah Salarzehi
      Iss. 51 , Vol. 14 , Spring 2022
      Background: The present study aimed to identify the antecedents and consequences of smart good governance using the fuzzy Delphi method (FDM). Method: The present study is a mixed-method study conducted using a deductive-inductive approach. It is also an applied study in terms of its objectives and a descriptive survey study in terms of design and methodology. The research population included all experts in the field of governance. Accordingly, 26 experts were selected based on the principle of theoretical adequacy and using purposive sampling. In the qualitative part of the study, semi-structured interviews were conducted to collect data. In the quantitative part, a researcher-made questionnaire was used to collect the data. The validity and reliability of the questionnaire were confirmed via content validity and the test-retest method. In the qualitative part, the data collected from the interviews were analyzed using ATLAS.ti software and the indexing method. In the quantitative part of the study, the antecedents and consequences of smart good governance were ranked using the fuzzy Delphi method (FDM). Results: A comparison of the antecedents of smart good governance showed that smart technology and data, electronic and smart interaction, rule of law, competent and committed authorities, cyber and smart security, good smart society and citizens, smart management and executive leadership, e-architecture, smart e-government, the quality of laws and regulations, transparent governance, democratic and smart infrastructure, strategies and political stability, the strengthening of civil society, and public awareness were ranked as the most important factors, respectively. It was also shown that efficiency and effectiveness, entrepreneurship, improving quality of life and sustainable development, ethics and morality, equality and inclusive justice, downsizing, successful crisis management, immediate response to challenges, increasing closeness and compassion between government and people, reducing corruption, technological knowledge, reducing work time and administrative processes, and eliminating bureaucracy were the most important consequences of smart good governance, respectively. The insights from this study can contribute to establishing smart good governance.

    • Open Access Article

      4 - A Multi-Objective Differential Evolutionary Algorithm-based Approach for Resource Allocation in Cloud Computing Environment
      Saeed Bakhtiari, Mahan Khosroshahi
      Iss. 51 , Vol. 14 , Spring 2022
      In recent years, the cloud computing model has received a lot of attention due to its high scalability, reliability, information sharing and low cost compared to separate machines. In the cloud environment, scheduling and optimal allocation of tasks affect the effective use of system resources. Currently, common methods for scheduling in the cloud computing environment use traditional approaches such as Min-Min and meta-heuristics such as ant colony optimization (ACO). These methods focus on optimizing a single objective and do not consider multiple objectives at the same time. The main purpose of this research is to consider several objectives (total execution time, service level agreement and energy consumption) in cloud data centers through scheduling and optimal allocation of tasks. In this research, a multi-objective differential evolution algorithm (DEA) is used due to its simple structure and small number of adjustable parameters. In the proposed method, a new DEA-based approach to the allocation problem in the cloud is presented, which aims to improve resource efficiency and address objectives such as time, migration and energy by defining a multi-objective function and using mutation and crossover vectors. The proposed method has been evaluated with the CloudSim simulator on a PlanetLab workload of more than a thousand virtual machines. The simulation results show that, compared with the IqrMc, LrMmt and FA algorithms, the proposed method improves energy consumption by an average of 23%, the number of migrations by an average of 29%, total execution time by an average of 29% and service level agreement violation (SLAV) by an average of 1%. Accordingly, using the proposed approach in cloud centers will lead to better and more appropriate services for the customers of these centers in various fields such as education, engineering, manufacturing and services.
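
      As an illustration of the mutation and crossover vectors the abstract refers to, here is a minimal DE/rand/1/bin loop in Python with a weighted-sum scalarization of two toy objectives standing in for time and energy. It is a generic sketch, not the paper's allocation method: the VM-to-host encoding, the actual objective functions and the parameter settings are all assumptions.

```python
import numpy as np

def differential_evolution(objectives, weights, bounds, pop_size=30, F=0.5, CR=0.9,
                           generations=200, rng=None):
    """DE/rand/1/bin with a weighted-sum scalarization of several objectives."""
    rng = np.random.default_rng(rng)
    dim = len(bounds)
    low, high = np.array(bounds, dtype=float).T
    fitness = lambda x: sum(w * f(x) for w, f in zip(weights, objectives))
    pop = rng.uniform(low, high, (pop_size, dim))
    scores = np.array([fitness(ind) for ind in pop])
    for _ in range(generations):
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), low, high)   # mutation vector
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])        # crossover vector
            s = fitness(trial)
            if s < scores[i]:                              # greedy selection
                pop[i], scores[i] = trial, s
    best = int(np.argmin(scores))
    return pop[best], float(scores[best])

# toy objectives standing in for execution time and energy (hypothetical functions)
time_cost = lambda x: float(np.sum((x - 1.0) ** 2))
energy_cost = lambda x: float(np.sum(np.abs(x)))
x, s = differential_evolution([time_cost, energy_cost], [0.6, 0.4], [(-5, 5)] * 4, rng=1)
print(x.round(3), round(s, 4))
```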

    • Open Access Article

      5 - Using Sentiment Analysis and Combining Classifiers for Spam Detection in Twitter
      Mehdi Salkhordeh Haghighi, Aminolah Kermani
      Iss. 51 , Vol. 14 , Spring 2022
      The popularity of social networks, especially Twitter, has posed a new challenge to researchers: spam. Numerous approaches for dealing with spam have been presented. In this study, we attempt to enhance the accuracy of spam detection by applying one of the latest spam detection techniques combined with sentiment analysis. Using the word embedding technique, we give the tweet text as input to a convolutional neural network (CNN) architecture whose output indicates whether the text is spam or normal. Simultaneously, by extracting suitable features from the Twitter network and applying machine learning methods to them, we separately perform Twitter spam detection. Eventually, we feed the outputs of both approaches into a meta-classifier whose output determines whether the tweet is spam or normal. In this study, we employ both balanced and unbalanced datasets to examine the impact of the proposed model on the two types of data. The results indicate an increase in the accuracy of the proposed method on both datasets.
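
      The combination step can be pictured as a stacking ensemble: two base detectors produce probabilities that a meta-classifier learns to combine. The sketch below, in Python with scikit-learn, shows that pattern on synthetic features; the actual CNN text branch, the Twitter feature set and the choice of meta-classifier in the paper are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# synthetic stand-in: columns would be text-derived and account-based tweet features
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# two base detectors (stand-ins for the CNN text branch and the feature-based branch)
base_detectors = [("svm", SVC(probability=True, random_state=0)),
                  ("rf", RandomForestClassifier(n_estimators=200, random_state=0))]

# the meta-classifier learns to combine the base detectors' predicted probabilities
meta = StackingClassifier(estimators=base_detectors,
                          final_estimator=LogisticRegression(),
                          stack_method="predict_proba", cv=5)
meta.fit(X_train, y_train)
print("meta-classifier accuracy:", round(meta.score(X_test, y_test), 3))
```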

    • Open Access Article

      6 - Outage and Throughput Analysis of Bidirectional Cognitive Amplify-and-Forward Relaying Networks with Wireless Power Transfer
      Ehsan Soleimani Nasab
      Iss. 51 , Vol. 14 , Spring 2022
      Cognitive radio is a promising technology that aims to achieve better frequency spectrum utilization. On the other hand, wireless energy harvesting can provide the extra energy required at the nodes. Two scenarios in a two-way network are considered: in the first, the relay harvests its required energy from the end-sources of the secondary network in the presence of the cognitive radio network, and in the second, both end-sources harvest energy from the relay in the secondary network. Both the Nakagami-m fading caused by signal propagation and the interference at the relay caused by primary users in the cognitive radio network are considered. Closed-form expressions are derived for the outage probability and throughput of a bidirectional cognitive radio amplify-and-forward relaying network using energy harvesting and wireless power transfer techniques over independent and non-identically distributed (i.n.i.d.) Nakagami-m fading channels. The analytical derivations are validated by Monte Carlo simulations, which demonstrate that the first scenario always outperforms the second, while both scenarios perform better than the no-energy-harvesting case.
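
      As a flavour of the Monte Carlo validation mentioned above, the short sketch below estimates the outage probability of a single Nakagami-m link in Python with NumPy, using the fact that the power gain of a Nakagami-m amplitude is Gamma-distributed with shape m and scale Ω/m. It is a generic single-hop check, not the paper's two-way amplify-and-forward model with energy harvesting and primary-user interference; the SNR, rate and fading parameters are arbitrary.

```python
import numpy as np

def outage_probability(m, omega, snr_db, rate_bps_hz, n_samples=1_000_000, seed=0):
    """Monte Carlo outage of one Nakagami-m link: P[log2(1 + SNR*|h|^2) < rate].
    The power gain |h|^2 of a Nakagami-m amplitude is Gamma(shape=m, scale=omega/m)."""
    rng = np.random.default_rng(seed)
    gain = rng.gamma(shape=m, scale=omega / m, size=n_samples)
    snr = 10 ** (snr_db / 10) * gain
    return float(np.mean(np.log2(1 + snr) < rate_bps_hz))

for snr_db in (0, 10, 20):
    p_out = outage_probability(m=2.0, omega=1.0, snr_db=snr_db, rate_bps_hz=1.0)
    # delay-limited throughput of a fixed-rate link is roughly (1 - outage) * rate
    print(snr_db, "dB: outage =", round(p_out, 4), " throughput =", round((1 - p_out) * 1.0, 4))
```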

    • Open Access Article

      7 - Explaining the adoption process of Software Defined Networking (SDN) using grounded theory and a systems approach
      Elham Ziaeipour, Ali Rajabzadeh Ghotri, Alireza Taghizadeh
      Iss. 51 , Vol. 14 , Spring 2022
      Software Defined Networking (SDN) is one of the technologies with the most promising role in digital transformation. The dynamic structure of SDN can adapt to the ever-changing nature of future networks and their users. The significant impact of this technology on the intelligence, agility, management and control of current network devices, as well as upcoming communication technologies, reduces expenses and creates innovative businesses. Although service providers are very interested in deploying SDN to transform their static infrastructures into a dynamic and programmable platform, they do not consider it a priority and still depend on traditional methods to manage their networks. Therefore, this study highlights the factors affecting the acceptance of the SDN architecture and its application by the national telecom operators, and proposes a comprehensive new paradigm model using a systems approach and grounded theory (the Strauss and Corbin model). The model was developed by systematically reviewing the theoretical foundations and conducting in-depth interviews with managers and experts in the telecom industry. During the modeling process, more than a thousand initial codes were determined. Finally, based on expert opinion on these codes, a total of 73 open codes, 12 axial codes and 6 main categories were extracted.

    • Open Access Article

      8 - High Performance Computing: Next Generation Requirements and Research Axes
      Ehsan Arianyan, MohammadMahdi Esnaashari, Fatemeh Ehsani Boshla, Shaghayeghsadat Hossieni Bayan, Masoud Dehyadegari, Behnam Samadi
      Iss. 51 , Vol. 14 , Spring 2022
      Nowadays, increasing the processing power of supercomputers is a worldwide race. This race signifies the importance of supercomputers in the current era. They are engines of technological progress in almost all scientific areas, such as computational biology, earth sciences, cosmology, fluid dynamics, and plasma modeling, to name a few. The next generation of supercomputers can be divided into two broad categories: 1) emerging technologies such as neuromorphic and quantum computing, and 2) exascale computing. Emerging technologies will be the future of supercomputing, but not in the near future. Therefore, in this paper we focus on exascale computing and try to provide a comprehensive overview of the main requirements for this technology to be achieved and become available. The requirements are reviewed from different aspects: hardware, software, artificial intelligence, and cloud computing. In addition, we attempt to provide a complete taxonomy of hot research topics within this area.

    • Open Access Article

      9 - Reducing network load by application mapping in network-on-chip using the discrete Harris hawks algorithm
      Elham Hajebi, Vahid Sattari-Naeini
      Iss. 51 , Vol. 14 , Spring 2022
      Reducing load and power consumption in network-on-chip systems is very important, and one of the most important issues in increasing the efficiency of a network-on-chip is mapping an application onto it. Solving the application mapping problem to find the best mapping is complex and time-consuming and has a huge impact on network latency and power consumption. In this paper, using the Harris hawks algorithm, we provide a method for mapping processing cores onto the network-on-chip to reduce the network load, and thus congestion in the links, and improve network performance. The simulation results show that this algorithm performs better than the basic algorithms.

    • Open Access Article

      10 - Pathology of software export development in Iran
      Maryam Saleh Gorgory, Mohsen Gerami, Vahid Yazdanian
      Iss. 51 , Vol. 14 , Spring 2022
      Today, new developments in the world economy, including severe fluctuations in the price of raw materials, increases in labor wages, and increases in the cost of transportation, storage and other production factors, have led many developing countries to consider entering the production and trade of goods that have the least dependence on high-risk economic components. The software industry is one of these industries. In addition to having high added value, this industry has the least need for raw materials and other cost-generating components. In fact, the software industry is a pure knowledge industry and is based on research and development. Our country can also take steps to export software in order to benefit from this industry. Accordingly, this research addresses the pathology of software export development. In this research, whose statistical population consists of software companies that are members of the software exporters' union, demand, national vision and strategy, international trust and communication, characteristics of the software industry, and infrastructure and internal factors are investigated. The results show that the main problems of software export are the lack of demand and weaknesses in software infrastructure and internal factors.

    • Open Access Article

      11 - Two Fuzzy Virtual Force Algorithms to Improve Sensor Deployment in Wireless Sensor Networks
      Vahid Kiani
      Iss. 51 , Vol. 14 , Spring 2022
      Maximizing area coverage is an important issue in the placement of wireless network sensors, and achieving it helps to improve the network's monitoring power. In many applications, the sensors are first randomly distributed in the sensing field and then their placement is modified. The virtual force algorithm (VFA) tries to achieve a more desirable deployment from an initial deployment by considering repulsive and attractive forces between the sensors. In this paper, the combination of a Takagi-Sugeno fuzzy system with VFA is used to achieve a better redeployment of the sensors. To adaptively adjust the optimal distance value between the sensors, two fuzzy methods are proposed and their role in improving the performance of the virtual force algorithm is analyzed. Comparison of the performance of the proposed methods with the state of the art reveals that intelligent and adaptive adjustment of the optimal distance using a fuzzy system leads to a higher final coverage ratio than the traditional virtual force algorithm (VFA), the improved virtual force algorithm (IVFA), the fuzzy redeployment algorithm (FRED), and the two metaheuristics GA and PSO. On the other hand, the proposed VF-based methods require much less time to solve the problem than the GA and PSO metaheuristics.
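
      For intuition, the sketch below implements one plain virtual-force iteration in Python with NumPy: each sensor is attracted to neighbours farther than an optimal distance d_opt and repelled by closer ones, then moves along the net force. The fuzzy, adaptive tuning of d_opt that the paper proposes is not shown; the force constants, field size and node count are arbitrary assumptions.

```python
import numpy as np

def virtual_force_step(nodes, d_opt, k_att=0.1, k_rep=0.1, max_move=0.5):
    """One VFA iteration: pairs farther apart than d_opt attract, closer pairs repel;
    every sensor then moves along its (clipped) net force."""
    forces = np.zeros_like(nodes)
    for i in range(len(nodes)):
        for j in range(len(nodes)):
            if i == j:
                continue
            diff = nodes[j] - nodes[i]
            dist = np.linalg.norm(diff) + 1e-9
            unit = diff / dist
            if dist > d_opt:
                forces[i] += k_att * (dist - d_opt) * unit    # attraction toward far neighbours
            else:
                forces[i] -= k_rep * (d_opt - dist) * unit    # repulsion from near neighbours
    return nodes + np.clip(forces, -max_move, max_move)

rng = np.random.default_rng(0)
nodes = rng.uniform(0, 50, (30, 2))        # 30 sensors dropped at random in a 50 x 50 field
for _ in range(100):
    nodes = np.clip(virtual_force_step(nodes, d_opt=10.0), 0, 50)
print(nodes.min(axis=0).round(1), nodes.max(axis=0).round(1))
```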

    • Open Access Article

      12 - Open Data Access Policymaking in Iran with Respect to Preserving Privacy and Personal Data Ownership
      Behrooz Eliasi, Masoumeh Sadeghi, Nasrin Dastranj, Mehdi Hosseinpour, Tahereh Mirsaeedghazi
      Iss. 51 , Vol. 14 , Spring 2022
      Enhancing access to open data promotes research, innovation, and the development of solutions to complex social challenges in our country. The policies offered by the OECD and other scientific associations emphasize this strategy. Certainly, implementing the strategy requires establishing governance systems, clarifying processes, and guaranteeing trust for the research and business sectors. The main part of valuable data resources is personal in nature, and gathering, storing, and processing such data in cyberspace is an enormous source of revenue for data-driven businesses. Among the main challenges of building this trust are decisions about privacy policies and ownership. In this paper, considering the complexity of the ownership concept in the personal data ecosystem, the challenges of proposed policies such as the OECD reports are discussed with the aim of enhancing open data. Shortcomings in the e-commerce and cybercrime laws of our country are also briefly debated. Then, aiming to suggest an access policy for open data and considering public sensitivity to personal data, the detailed conclusions of a field study, including identifying the criteria of the goal and the possible policy options, are extracted by the Delphi method. This work shows that public awareness of this subject, even in an elite target community, is not at a desirable level. Moreover, trust in the privacy of personal data, and in the effective enforcement of the law against violations, is not satisfactory. Finally, through a field evaluation by the FAHP method, the policy options are measured and analyzed, and strategic requirements for implementing the selected policy are suggested.

    • Open Access Article

      13 - Introducing a new energy-optimal method for target tracking in wireless sensor networks using a hunting search algorithm
      Shayesteh Tabatabaei, Hassan Nosrati Nahook
      Iss. 51 , Vol. 14 , Spring 2022
      In this paper, in order to increase the accuracy of target tracking, we try to reduce the energy consumption of the sensors with a new algorithm for tracking distributed targets called the hunting search algorithm. The proposed method is compared with the DCRRP and NODIC protocols, using version 11.5 of the OPNET simulator to test the performance of these algorithms. The simulation results show that the proposed algorithm performs better than the other two protocols in terms of energy consumption, successful delivery rate and throughput.

    • Open Access Article

      14 - A Collaborative Filtering Movie Recommendation System Based on User Correlation and Weighted K-Means with High Accuracy
      Nozar Ebrahimi Lame, Fatemeh Saghafi, Majid Gholipour
      Iss. 51 , Vol. 14 , Spring 2022
      A recommendation system is a BI tool that uses data mining methods to guide and help the user select the best items based on his or her preferences in the shortest time. Despite more than two decades of academic research on recommendation systems, this topic is still one of the most up-to-date research challenges. By personalizing the recommendation of goods or services to site users, recommendation systems save users time, increase their satisfaction and their loyalty to sales sites, and contribute to the development of e-commerce. Nowadays, recommendation systems have many applications in various sectors of e-commerce, especially for media products such as books, movies, and music. Famous e-commerce sites such as eBay, Amazon, and Netflix, and domestic sites such as Digikala, Divar, and Filimo, widely use recommendation systems. These systems use a variety of big data filtering methods to provide appropriate recommendations. The most important and widely used filtering method is collaborative filtering (CF). In this paper, we implement three CF recommender systems based on the correlation coefficient between users, selecting the optimal number of neighbors and calculating weighted scores for unwatched movies. The method with the least error is selected as the final model. We use the MovieLens ml-latest-small 100k research dataset, with 9742 movies and 610 users, as input. The results show 3.29% lower RMSE compared with the latest research that used the correlation method.
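
      The core of such a user-based CF system is a Pearson similarity between users followed by a weighted average of neighbours' ratings for unwatched items. The following is a small self-contained sketch of that idea in Python with NumPy on a toy rating matrix; the paper's weighted K-means step, neighbour-count selection and the MovieLens pipeline are not reproduced.

```python
import numpy as np

def predict_ratings(R, k=2):
    """User-based CF: Pearson correlation between users, then a weighted average of
    the k most similar users' mean-centred ratings for each unwatched item (0 = unwatched)."""
    n_users, n_items = R.shape
    means = np.array([row[row > 0].mean() if (row > 0).any() else 0.0 for row in R])
    sim = np.zeros((n_users, n_users))
    for u in range(n_users):
        for v in range(n_users):
            both = (R[u] > 0) & (R[v] > 0)                # items rated by both users
            if u != v and both.sum() >= 2:
                sim[u, v] = np.corrcoef(R[u, both], R[v, both])[0, 1]
    sim = np.nan_to_num(sim)
    pred = np.tile(means[:, None], (1, n_items))
    pred[R > 0] = R[R > 0]                                # keep observed ratings as-is
    for u in range(n_users):
        neighbours = np.argsort(-sim[u])[:k]              # k most correlated users
        for i in np.where(R[u] == 0)[0]:
            rated = [v for v in neighbours if R[v, i] > 0 and sim[u, v] > 0]
            if rated:
                w = sim[u, rated]
                pred[u, i] = means[u] + w @ (R[rated, i] - means[rated]) / w.sum()
    return pred

R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [1, 0, 4, 4]], dtype=float)
print(predict_ratings(R).round(2))
```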

    • Open Access Article

      15 - Architectural tactics identification in source code based on a semantic approach
      Ehsan Sharifi, Ahmad Abdollahzadeh Barforoush
      Iss. 51 , Vol. 14 , Spring 2022
      Software systems are alive as long as they can be changed and used. Changing the source code without considering the consequences can lead to the architectural erosion of software systems. Architectural erosion gradually makes the system unchangeable and moves it towards deterioration. Architectural decisions in the source code are usually realized through architectural tactics. These tactics are fine-grained decisions that are made to achieve a certain quality attribute. Recognizing these tactics in the source code allows developers to change the code while knowing where these decisions are implemented. This slows the architectural erosion process and delays the system's movement towards deterioration. Thus, this paper introduces a method based on the semantic web for recognizing the architectural tactics present in source code. Based on this approach, the new concept of the microtactic is introduced, which increases the possibility of recognizing architectural tactics using a semantic web and ontological approach. The evaluation results show that this method recognizes tactics with higher precision and quality than other similar methods.
    Most Viewed Articles

    • Open Access Article

      1 - Determining the factors affecting the crowdfunding of knowledge-based IT companies
      Ali Haji Gholam Saryazdi, Ali Rajabzadeh Ghotri, Alinaghi Mashayekhi, Alireza Hassanzade
      Iss. 37 , Vol. 10 , Autumn_Winter 2019
      Crowdfunding has expanded rapidly around the world due to the need for financing in the early stages of start-up businesses as well as advances in information technology. In Iran, several financing platforms have been established so far, some of which have been successful and some unsuccessful. Therefore, it is necessary to help the development of this method by examining the factors affecting it. Since crowdfunding is a new phenomenon, it is necessary to raise awareness of it in society in an appropriate way while determining the factors affecting this method. The collective modeling method is based on social networks and Web 2.0 and aims at understanding new phenomena. Therefore, in this article, using collective modeling, the factors affecting crowdfunding in Iran to support start-up companies in the field of IT are discussed.

    • Open Access Article

      2 - Image Processing of Steel Sheets for Defect Detection Using Gabor Wavelets
      Masoud Shafiee, Mostafa Sadeghi
      Iss. 13 , Vol. 4 , Spring 2012
      In different steps of steel production, various defects appear on the surface of the sheet. Setting aside the causes of the defects, precise identification of their kinds helps classify steel sheets correctly and therefore accounts for a high percentage of the quality control process. Quality control of steel sheets is of great importance for improving product quality and maintaining a competitive market. In this paper, in addition to a quick review of the image processing techniques used, a fast and precise solution based on Gabor wavelets is presented for detecting texture defects in steel sheets. In the first step, the approach extracts rich texture features from the image using Gabor wavelets at different orientations and frequencies. Then, using statistical methods, the images with more obvious defects are selected and the location of the defects is determined. Experimental samples demonstrate the accuracy and speed of the method.
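
      To illustrate the feature-extraction step, the sketch below builds a small Gabor filter bank over several frequencies and orientations with scikit-image in Python and flags pixels whose filter-bank energy deviates strongly from the image statistics. It uses a stock texture image as a stand-in for a steel-sheet scan, and the simple 3-sigma threshold is only a placeholder for the statistical defect-selection step the paper describes.

```python
import numpy as np
from skimage.data import brick            # stock texture standing in for a steel-sheet scan
from skimage.filters import gabor

def gabor_energy_maps(image, frequencies=(0.1, 0.2, 0.4), n_orientations=4):
    """Filter the image with a small Gabor bank (frequencies x orientations) and
    return one local-energy map per filter channel."""
    maps = []
    for f in frequencies:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            real, imag = gabor(image, frequency=f, theta=theta)
            maps.append(real ** 2 + imag ** 2)            # local energy of this channel
    return np.stack(maps)

image = brick().astype(float) / 255.0
energy = gabor_energy_maps(image)
total = energy.sum(axis=0)
# crude placeholder for the statistical defect-selection step: flag strong outliers
suspect = np.abs(total - total.mean()) > 3 * total.std()
print(energy.shape, "suspect pixels:", int(suspect.sum()))
```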

    • Open Access Article

      3 -
      Mostafa Sadeghi, Masoud Shafiee
      Iss. 14 , Vol. 4 , Autumn_Winter 2012

    • Open Access Article

      4 - Designing the First Database of Handwritten Kurdish Words for Visual Word Recognition Systems
      Fatemeh Daneshfar, Basir Alagheband, Vahid Sharafi
      Iss. 17 , Vol. 5 , Spring_Summer 2014
      Abstract: One of the fundamental components of visual word recognition systems is the database. Any system designed in this field must necessarily use some kind of database. Obviously, since the subject of study in these systems is the written form of different languages, a specific database is required for each particular language. The language on which this article focuses is Kurdish, and the article describes the various stages of designing the first handwritten database for the Kurdish language. Since no database dedicated to visual word recognition has been designed for the Kurdish language so far, this is a fresh and promising area for research. Furthermore, given that the Kurdish language has two different scripts, Latin and Aramaic, this article deals exclusively with the Aramaic script, specifically in its handwritten form.

    • Open Access Article

      5 - An Investigation of the Faculty Members' Competency Model in Virtual Environments from the Viewpoint of Faculty Members and Students (Based on Ormaner's Model)
      Maghsod Farasatkhah, Maryam Farhangi
      Iss. 21 , Vol. 6 , Spring 2015
      The virtual university, and offering an appropriate model for its creation and the realization of e-learning, is a significant issue that should be considered seriously by managers and higher education administrators. Through the development of virtual universities, the higher education system could not only enhance access for people interested in learning regardless of time and place constraints, but also pursue goals such as developing new learning strategies, presenting distinguished course materials, employing outstanding faculty members, teaching and learning based on individual abilities, increasing effectiveness, increasing individual responsibility in learning (student-centeredness), realizing learner communities, and establishing research associations. This study investigates faculty members' competencies in the virtual environment. Using a descriptive correlational design and a conceptual model, four research questions were formulated to assess the model; Mehralborz University was chosen for the field study and for evaluating the practical results and their implications. A questionnaire was used for data gathering, and its validity and reliability were checked and verified; the results were assessed in two stages. In the first stage, using checklists, the competencies most consistent with the model were determined and, based on three underlying factors, categorized into three groups. In the second stage, considering the competency distinctions, the questionnaire was distributed among the faculty members (all of whom work in the field of e-learning) and collected for analysis after completion. The results show that there is a significant correlation between the faculty members' competency factors in the virtual environment. Furthermore, the dimensions and competency factors, as well as the students' and teachers' perspectives, were examined, and these two perspectives were prioritized.

    • Open Access Article

      6 - Cyber Threat Foresight Against Iran Based on Attack Vectors
      Mahdi Omrani, Masoud Shafiee, Siavash Khorsandi
      Iss. 33 , Vol. 9 , Autumn_Winter 2017
      Cyber threats have increased extraordinarily in recent years. Cyber attackers, including government agencies and hackers, have made significant advances in the use of various tools for attacking target systems in some countries, particularly the Islamic Republic of Iran. The complexity of cyber threats and their devastating effects on critical systems highlight the necessity of cyber threat foresight. This research can prepare the country for countering cyber threats based on existing and potential attack vectors. First, 18 major cyber threat drivers based on attack vectors were identified by reviewing the literature and interviewing seven experts. The cross-impact analysis futures-studies method, supported by MICMAC software, was used to identify the main drivers of future cyber threats, such as social engineering, denial of service, ransomware, spoofing and fraud, and non-state actors. Finally, future cyber threat scenarios were identified using a scenario-based approach with Scenario Wizard software. The results include two strong scenarios and 18 possible scenarios; based on the strongest scenario, ransomware, spoofing, fraud, social engineering and denial of service carried out by non-state actors at a limited level are the most likely cyber threats.

    • Open Access Article

      7 - An Investigation of the Effect of a Multifactor Model on Improving Critical Thinking in E-Learning Environments
      Mohammadreza Nili, Jamshid Heydari, Hossein Moradi
      Iss. 21 , Vol. 6 , Spring 2015
      In the third millennium, people deal with multiple, diverse, and complicated problems, as they cannot possess full control over the information that is constantly produced and accumulated. A high level of critical thinking skill for assessing different issues and making evidence-based decisions about them is therefore an unavoidable necessity. The researchers of this work proposed a model with seven factors (components) for critical thinking in e-learning environments. The statistical population of this work consists of the M.Sc. medical education students of Islamic Azad University e-learning programs and students of the same field in the Islamic Azad University traditional education system studying during 2011-2012. From this population, 47 members were selected by simple random sampling and divided into a trial group (with 23 members) and a reference group (with 42 members). To train the trial group, the seven-factor critical thinking training scale was used in the e-learning environment in 15 sessions of an empirical sciences course. In the reference group, the same seven-factor critical thinking training scale was used in the classroom environment, through lecturing, in 15 sessions of the same course. The model factors and components are challenge, representation, creation of opportunity, creation of motivation, logical analysis, encouragement, and responsibility and commitment. Both groups went through pretest and posttest steps and served as references for each other. Both groups responded to the Watson-Glaser Critical Thinking Appraisal in the pretest and posttest steps, and analysis of covariance was used to analyze the results. The results indicate a significant difference between the scores of the trial and reference groups in improving the students' critical thinking in terms of the inference, assumption detection, deduction, interpretation, and logical reasoning evaluation components (p=0.001). According to the results, in terms of improving critical thinking, the trial group trained in the e-learning environment shows higher scores than the group trained in the traditional classroom environment.

    • Open Access Article

      8 - A Satellite Control Method Using Laguerre Model Predictive Control Approach
      Shekoofeh Jafari Fesharaki, Farzad Tihidkhah, Heydarali Talebi
      Iss. 15 , Vol. 5 , Summer 2013
      In this paper, a model-predictive-based controller is proposed to control a satellite. Model Predictive Control (MPC) is well known as a practical control method for various systems in industry. A problem with this method is its computational effort and time consumption. To reduce the computational load, Laguerre functions have been proposed in the literature. Simulation results are given to show the feasibility and validity of the design, and the computation times with and without the Laguerre functions are compared.

    • Open Access Article

      9 -
      Roohollah Barzamini, Masoud Shafiee
      Iss. 6 , Vol. 2 , Autumn_Winter 2010

    • Open Access Article

      10 - Using web analytics in forecasting the stock price of chemical products group in the stock exchange
      Amir Daee, Omid Mahdi Ebadati E., Keyvan Borna
      Iss. 39 , Vol. 11 , Spring_Summer 2019
      Forecasting markets, including stocks, has been attractive to researchers and investors due to the high volume of transactions and liquidity. The ability to predict prices enables higher returns by reducing risk and avoiding financial losses. News plays an important role in the process of assessing current stock prices. The development of data mining methods, computational intelligence and machine learning algorithms has led to the creation of new prediction models. The purpose of this study is to store news agencies' news and use text mining methods and the support vector machine algorithm to predict the next day's stock price. For this purpose, the news published by 17 news agencies has been stored and thematically categorized. Then, using text mining methods and the support vector machine algorithm with different kernels, the stock price of the chemical products group in the stock exchange is predicted. In this study, 300,000 news items in the political and economic categories and the stock prices of 25 selected companies over 122 trading days in the period from November to March 1997 have been used. The results show that with the linear-kernel support vector machine model, prices can be predicted with an average accuracy of 83%. Using nonlinear kernels and the quadratic kernel of the support vector machine, the prediction accuracy increases to an average of 85%, while the other kernels show poorer results.
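
      A minimal version of the news-to-price-direction pipeline can be sketched as TF-IDF features feeding a support vector machine, as below in Python with scikit-learn. The six "daily news" strings and their up/down labels are invented for illustration; the paper's corpus, Persian-language preprocessing and kernel comparison are not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# invented mini-corpus: each string is one day's concatenated news headlines, labelled
# with whether the chemical-group index rose (1) or fell (0) on the next trading day
daily_news = [
    "petrochemical exports rise as feedstock prices fall",
    "currency volatility hits manufacturers and sanctions tighten",
    "new urea plant comes online with strong quarterly earnings",
    "energy subsidies cut and production costs climb sharply",
    "demand for polymers grows in regional markets",
    "strike halts output at a major petrochemical complex",
]
next_day_up = [1, 0, 1, 0, 1, 0]

# TF-IDF features feeding an SVM; the kernel is swappable (linear vs. rbf/poly)
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), SVC(kernel="linear"))
model.fit(daily_news, next_day_up)
print(model.predict(["polymer demand grows as exports rise"]))
```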
    Upcoming Articles

    • Open Access Article

      1 - The effect of emotional intelligence of project managers on the effectiveness of team communication in Iranian research institutes (Case study: Information and Communication Technology Research Institute)
      Mansoureh Mohammadnezhad Fadard, Ehram Safari
      Management is getting things done with and through others. However, in managing research projects, technical capability is considered the main criterion, and less attention is paid to behavioral aspects of project managers such as emotional intelligence, an important component of communicating with people. The results of several studies have shown that neglecting this issue reduces effective communication in project teams and ultimately leads to the failure of projects. The present study was conducted to assess the effect of different dimensions of the project manager's emotional intelligence on the effectiveness of team communication. The method of this research is descriptive-analytical and correlational, and the statistical population consists of managers and members of project teams at the Information and Communication Technology Research Institute. The statistical sample includes 19 project teams selected by the census method. To collect data, the Bar-On emotional intelligence questionnaire and the senior interpersonal relationships questionnaire were used. Pearson correlation coefficient, multivariate regression, dummy-variable regression and the dependent t-test were used to analyze the data. The results show that the project manager's emotional intelligence affects effective interpersonal relationships in the project team. However, only intrapersonal skills, interpersonal skills, and adaptability can predict effective interpersonal relationships in the team, while the general mood and stress management dimensions do not affect these relationships.

    • Open Access Article

      2 - Improving Opinion Aspect Extraction Using Domain Knowledge and Term Graph
      Mohammadreza Shams, Ahmad Baraani, Mahdi Hashemi
      With the advancement of technology, analyzing and assessing user opinions, as well as determining the user's attitude toward various aspects, have become a challenging and crucial issue. Opinion mining is the process of recognizing people's attitudes from textual comments at three different levels: document level, sentence level, and aspect level. Aspect-based opinion mining analyzes people's viewpoints on various aspects of a subject. The most important subtask of aspect-based opinion mining is aspect extraction, which is addressed in this paper. Most previous methods require labeled data or extensive language resources to extract aspects from the corpus, which can be time-consuming and costly to prepare. In this paper, we propose an unsupervised approach for aspect extraction that uses topic modeling and the Word2vec technique to integrate semantic information and domain knowledge based on a term graph. The evaluation results show that the proposed method not only outperforms previous methods in terms of aspect extraction accuracy, but also automates all steps and thus eliminates the need for user intervention. Furthermore, because it does not rely on language resources, it can be used for a wide range of languages.
  • Affiliated to
    Iranian Information and Communication Technology Association
    Director-in-Charge
    Masoud Shafiee (Amirkabir University of Technology)
    Editor-in-Chief
    Mohammad-Shahram Moin (Research Institute of Communication and Information Technology)
    Executive Editor
    Mohammad-Shahram Moin (Research Institute of Communication and Information Technology)
    Editorial Board
    H. Nezamabadi-pour, Ramezan Ali Sadeghzadeh, Hassan Rashidi, Alireza Behrad, Mir Mohsen Pedram, Fatemeh Saghafi (Associate Prof., University of Tehran), Mirhadi Seyed Arabi, Mohammad Hesam Tadayon, Masoud Shafiee (Amirkabir University of Technology), Mohamadreza Aref (Sharif University of Technology), Ahmad Motamedi (Amirkabir University of Technology), Ahmad-Reza Sherafat, Reza FarajiDana, Mohammad Teshnelab (Khajeh Nasir al-Din Tusi University of Technology), Mohammad-Shahram Moin (Research Institute of Communication and Information Technology)
    Print ISSN: 2717-0411
    Online ISSN: 2783-2783
    Email
    jouraict@gmail.com
    Address
    Room 612, 6th Floor, Aboureyhan Bldg., AmirKabir University of Technology, Hafez Avenue, Tehran, Iran
    Phone
    021-66495433


    Statistics

    Number of Volumes 13
    Number of Issues 25
    Printed Articles 228
    Number of Authors 1249
    Article Views 561513
    Article Downloads 108106
    Number of Submitted Articles 856
    Number of Rejected Articles 484
    Number of Accepted Articles 203
    Acceptance Rate 23%
    Admission Time (Days) 184
    Reviewer Count 360
    Last Update 10/4/2022