Print ISSN: 2476-3047
:: Search published articles ::

, , ,
Volume 4, Issue 2 (3-2016)
Abstract

With the emergence of the Internet, the way we communicate with each other has fundamentally changed. The second wave of Internet development will not be about people, but about smart connected devices. Although more than a decade has passed since the concept of the "Internet of Things" was proposed, its deployment has progressed slowly for various reasons, such as the immaturity of the required technologies and unresolved security challenges. When we speak about smarter environments and technologies such as the IoT, we must spend more time understanding the security challenges and the available solutions. In this paper, we analyze existing threats and vulnerabilities in the area of IoT security and privacy using a systematic approach, while presenting a survey of the solutions proposed in the literature. Finally, research opportunities in this area are discussed.


, ,
Volume 4, Issue 2 (3-2016)
Abstract

As social networking services become popular, the need to recognize reliable people has become a main concern, so trust plays an important role in social networks as a means of identifying trustworthy people. The purpose of this paper is to propose an approach that recognizes trustworthy users and protects users from being misused by untrustworthy ones. We suggest a method for measuring trust values based on call-log histories and QoS requirements, and we then calculate error-hit, precision, and recall. The trust issue in social networks is discussed, and a new approach to evaluating trust based on call-log histories and QoS requirements is proposed. The results indicate that the proposed approach achieves better error-hit, precision, and recall than four other models (FIFO, combined, QoS-based, and call-log-based).
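To make the evaluation criteria concrete, the following minimal Python sketch shows how precision and recall could be computed for a trust classifier. It is purely illustrative: the user IDs are made up, and the paper's own "error-hit" metric is defined in the paper itself and is not reproduced here.

```python
# Illustrative sketch (not the paper's code): precision and recall for a
# classifier that labels users as trustworthy or untrustworthy.

def precision_recall(predicted, actual):
    """predicted, actual: sets of user IDs labeled trustworthy."""
    true_pos = len(predicted & actual)
    precision = true_pos / len(predicted) if predicted else 0.0
    recall = true_pos / len(actual) if actual else 0.0
    return precision, recall

# Example: users the model trusts vs. users who are actually trustworthy.
pred = {"alice", "bob", "carol"}
truth = {"alice", "carol", "dave"}
print(precision_recall(pred, truth))   # (0.666..., 0.666...)
```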


Mansour Esmaeilpour, Mina Feili,
Volume 4, Issue 2 (3-2016)
Abstract

Standards in the field of IT security are relatively new because the field itself is young, but the long history of standardization processes has led to mature and efficient development of standards in this area. Extensive research in information security shows both the breadth and the complexity of the field, and several standards have been developed for it. Ignoring information security means openly embracing risk in any activity that depends on it, while information security itself plays an important role in protecting an organization's assets. Since no formula can guarantee complete security, a set of criteria and standards is needed so that resources can be used effectively and an appropriate level of security can be achieved. Each standard covers a specific aspect of security, and sometimes a whole set of standards covers only one aspect. When adopting an information security standard, conformity with the original standard must be emphasized, and it should be noted that localized adaptations may create problems of their own. The present research introduces how information security standards are dealt with worldwide, discusses in detail how views of information security have changed, and introduces a variety of tools and solutions.


Mr Mohsen Rezaei, Dr Reza Ebrahimi Atani,
Volume 4, Issue 2 (3-2016)
Abstract

Authenticated encryption is a block cipher mode of operation that simultaneously provides confidentiality, integrity, and authenticity assurances for transmitted data. In this regard, the CAESAR competition started in 2014 with the aim of finding authenticated encryption schemes that offer advantages over AES-GCM and are suitable for widespread adoption. This paper provides an easy-to-grasp overview of the functional aspects, security parameters, and robustness offerings of the CAESAR candidates, clustered by their underlying designs (block-cipher-based, stream-cipher-based, permutation/sponge-based, compression-function-based, and dedicated), and compares the encryption/decryption speed of all CAESAR candidates implemented on three processors of three different architectures: AMD64, armeabi, and mips o32.


Mahtab Roozbahani, Meysam Moradi, Parvaneh Mansoori,
Volume 5, Issue 1 (9-2016)
Abstract

In mathematics and computer science, an optimization problem is the problem of finding the best solution among all possible solutions. Given the importance of the knapsack problem in computer science, different algorithms are used to solve it. The knapsack problem is a combinatorial selection problem whose goal is to obtain the most benefit while respecting the knapsack's capacity, so it is a constrained maximization problem. In this study, a mathematical model in the form of an unconstrained minimization function is designed for it; this model is then implemented with Particle Swarm Optimization, the Firefly Algorithm, and the Artificial Bee Colony algorithm in the MATLAB environment. The results show that the Artificial Bee Colony algorithm performs better on this model than the other two algorithms. The advantage of the model lies in its objective function: because it is an unconstrained minimization, it can be implemented with many bio-inspired algorithms.
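As an illustration of the general idea (not the paper's exact MATLAB model), the following Python sketch recasts a toy 0/1 knapsack instance as an unconstrained minimization by penalizing capacity violations, which is the kind of objective that PSO-, Firefly-, and ABC-style optimizers can handle directly. The item values, weights, and penalty constant are assumptions.

```python
# Hedged sketch: 0/1 knapsack as an unconstrained minimization with a penalty term.
values   = [60, 100, 120]      # hypothetical item values
weights  = [10, 20, 30]        # hypothetical item weights
capacity = 50
PENALTY  = 1000                # large constant that dominates any feasible gain

def objective(x):
    """x: list of 0/1 decisions. Lower is better (minimization)."""
    total_value  = sum(v for v, xi in zip(values, x) if xi)
    total_weight = sum(w for w, xi in zip(weights, x) if xi)
    overflow = max(0, total_weight - capacity)
    # Negate the value so maximizing profit becomes minimizing the objective,
    # and penalize capacity violations to keep the search unconstrained.
    return -total_value + PENALTY * overflow

print(objective([1, 1, 0]))   # feasible solution: -160
print(objective([1, 1, 1]))   # infeasible solution: -280 + 1000*10 = 9720
```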


Dr Mahmood Deypir, Mozhgan Ghasabi,
Volume 5, Issue 1 (9-2016)
Abstract

Recently, software defined networks (SDNs) have been introduced to bring innovation and flexibility to computer networks, and they are widely used in infrastructure networks and data centers. These networks offer advantages such as scalability, efficient bandwidth usage, reduced control traffic, and better traffic engineering, which stem mainly from their programmability. There are also security challenges that often arise from the same property, and these challenges reduce the reliability of software defined networks compared to traditional networks. Therefore, if software defined networks are not designed on the basis of a security architecture, they will be vulnerable to known cyber-attacks such as DDoS, spoofing, and information disclosure. In this paper, SDN security challenges and the corresponding solutions are reviewed. Moreover, some applications of software defined networks for security, including network traffic separation, network flow access control, and secure routing, are described. To support security testing and the evaluation of the relevant security solutions, we also explain how these networks are simulated.


Mr Saeid Rezaei, Mr Mohammad Ali Doostari, Mr Majid Bayat,
Volume 5, Issue 1 (9-2016)
Abstract

Cloud environments have been regarded as a revolution in the IT industry in the recent decade, and many organizations use cloud services for data processing and storage. Despite fast growth and numerous advantages, some organizations still do not use these services because of security problems and privacy issues related to storing sensitive data on untrusted cloud servers. Access control management using encryption techniques is one of the most common methods of addressing these problems. Attribute-based encryption is a new cryptographic model that uses descriptive attributes and access structures to manage access control. This article discusses the most recent methods of access control in cloud environments using attribute-based encryption. We classify these protocols with respect to their efficiency and security features. Finally, the strengths and weaknesses of the reviewed articles are discussed and a comprehensive comparison of security and practicality is presented.


Sayed Mohamamd Tabatabaei Parsa, Hassan Shakeri,
Volume 5, Issue 1 (9-2016)
Abstract

Wireless Sensor Networks (WSNs) are an ideal solution for miscellaneous surveillance and control applications, such as traffic control, environmental monitoring, and battlefield surveillance. Wireless sensor nodes have limited memory and processing capability. The Sybil attack is a serious threat in which a malicious node creates multiple fake identities in order to mislead the other sensor nodes. This attack can influence routing protocols and operations such as voting and data aggregation. In this paper, we present a dynamic and lightweight algorithm with a confidence-aware trust approach. The algorithm uses the trust value of each sensor node to reduce false alarm rates and to detect indirect Sybil attacks in WSNs. The simulation results show average detection and false detection rates of 92% and 0.08%, respectively.
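The following Python fragment is a purely illustrative sketch, not the authors' algorithm: it flags identities whose confidence-weighted trust falls below a threshold, to show how per-node trust values can feed a Sybil-detection decision. All node IDs, values, and the threshold are invented.

```python
# Illustrative only: flag suspected Sybil identities by thresholding
# confidence-weighted trust values (all numbers are made up).
trust      = {"n1": 0.82, "n2": 0.15, "n3": 0.74, "n4": 0.09}
confidence = {"n1": 0.9,  "n2": 0.8,  "n3": 0.6,  "n4": 0.7}
THRESHOLD = 0.3

suspected_sybil = [node for node in trust
                   if trust[node] * confidence[node] < THRESHOLD]
print(suspected_sybil)   # ['n2', 'n4']
```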


Seyedeh Zeinab Mohammadi, Dr Nima Jafari Navimipour,
Volume 5, Issue 1 (9-2016)
Abstract


Ali Hadipour, Dr Seyed Mahdi Sajadieh, Raheleh Moradafifi,
Volume 5, Issue 1 (9-2016)
Abstract

Stream ciphers are a family of symmetric cipher algorithms that process the secret message as a sequence of bits and produce the ciphertext by XOR-combining that sequence with a keystream generated by a complex function of the key and IV. One of the goals in the design of stream ciphers is to obtain a large period, for example by using single-cycle (primitive) T-functions. Likewise, using jump indices in the design of LFSRs increases the complexity of analyzing LFSR-based stream ciphers. In this paper, using the concepts of T-functions and jump indices, a novel method is presented for designing primitive functions with a large period.
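For readers unfamiliar with T-functions, the sketch below shows a classical single-cycle T-function in the style of Klimov and Shamir, x -> x + (x^2 OR 5) mod 2^n, and verifies its full period for a small word size. It is background illustration only and not the construction proposed in the paper.

```python
# Minimal sketch of a classical single-cycle T-function (Klimov-Shamir style).
N = 8                      # word size in bits (small so the full cycle is checkable)
MASK = (1 << N) - 1

def t_function(x):
    # x -> x + (x*x | 5) mod 2^N updates bit i using only bits 0..i (the
    # T-function property) and is known to traverse all 2^N states in one cycle.
    return (x + ((x * x) | 5)) & MASK

# Verify the period: iterating from 0 should visit every value exactly once.
seen, x = set(), 0
while x not in seen:
    seen.add(x)
    x = t_function(x)
print(len(seen) == 2 ** N)   # True
```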


Mr. Mehdi Sadeghpour, Dr. Reza Ebrahimi Atani,
Volume 5, Issue 2 (3-2017)
Abstract

Data collection and storage have been facilitated by the growth of electronic services, leading to the recording of vast amounts of personal information in the databases of public and private organizations. These records often include sensitive personal information (such as income and diseases) and must be protected from unauthorized access. In some cases, however, mining the data and extracting knowledge from these valuable sources creates the need to share them with other organizations, which raises security challenges for users' privacy. "Privacy preserving data publishing" is a solution for ensuring the secrecy of sensitive information in a data set after it is published in a hostile environment. The process aims to hide sensitive information while keeping the published data suitable for knowledge discovery techniques. Grouping data set records is a broad approach to data anonymization: it prevents access to the sensitive attributes of a specific record by eliminating the distinction between a number of records. In this paper, an overview of privacy preserving data publishing techniques is presented.
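As a hedged illustration of the grouping idea (a k-anonymity-style generalization, not a method taken from any specific surveyed paper), the following Python sketch checks whether every quasi-identifier group in a toy data set contains at least k records.

```python
# Illustrative k-anonymity check on made-up records:
# (age range, ZIP prefix, sensitive attribute).
from collections import Counter

records = [
    ("20-30", "441**", "flu"),
    ("20-30", "441**", "diabetes"),
    ("30-40", "442**", "flu"),
    ("30-40", "442**", "cancer"),
]

def is_k_anonymous(rows, k):
    """True if every quasi-identifier group (all columns except the last,
    sensitive one) contains at least k records."""
    groups = Counter(row[:-1] for row in rows)
    return all(count >= k for count in groups.values())

print(is_k_anonymous(records, 2))   # True
print(is_k_anonymous(records, 3))   # False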


Dr Masoumeh Safkhani, Mr Mohamadamin Arghavani,
Volume 5, Issue 2 (3-2017)
Abstract

In recent years the security of SHA-3 [1] has been one of the most interesting topics in cryptography. Hash functions are used in many different ways in cryptography, so their security against various attacks is of vital importance. Several attacks and many analyses have been applied to SHA-3 so far, but none of them has broken it theoretically or practically. Keccak relies on a sponge architecture.
In this paper, we focus on differential fault analysis and review the latest attacks on SHA-3. Specifically, we describe the cube attack and differential fault analysis, and we also describe the zero-sum distinguisher attack and a pre-image attack using linear structures.
 
[1] Secure Hash Algorithm 3 (SHA-3)

Dr Mansoor Fateh, Samira Rajabloo, Elahe Alipour,
Volume 5, Issue 2 (3-2017)
Abstract

In this paper, image steganography based on LSB embedding and pixel classification is reviewed, and a method for hiding information in an image is presented. The method is based on LSB embedding, and our goal is to minimize the changes made to the cover image. First, the pixels of the image are selected for hiding the message; second, the complemented message is hidden in the LSBs of the selected pixels. To overcome some problems of the plain LSB method and to minimize the changes, pixels are categorized based on the values of their second, third, and fourth bits. In each category, the ratio of changed to unchanged pixels is calculated; if this ratio is greater than one, the LSBs of that category are inverted so that the changes are kept to a minimum. Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR) are the two criteria used to evaluate stego-image quality. Compared with the simple LSB method, the PSNR of the proposed method grows by 0.13 percent and its MSE is reduced by 0.19 percent.
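The two quality criteria mentioned above have standard definitions; the following Python sketch (an illustration, not the paper's implementation) computes MSE and PSNR between a cover image and a stego image produced by flipping a few LSBs.

```python
# Standard MSE and PSNR between a cover image and a stego image.
import numpy as np

def mse(cover, stego):
    return np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)

def psnr(cover, stego, max_value=255.0):
    m = mse(cover, stego)
    return float("inf") if m == 0 else 10 * np.log10(max_value ** 2 / m)

# Toy example: flipping the LSB of a few pixels barely changes MSE/PSNR.
cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
stego = cover.copy()
stego[:8, :8] ^= 1          # embed by flipping LSBs in an 8x8 block
print(mse(cover, stego), psnr(cover, stego))
```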


Shadi Azizi, Maede Ashouri-Talouki, Hamid Mala,
Volume 5, Issue 2 (3-2017)
Abstract

Location-based services (LBSs) provide appropriate information based on users' locations. These services can be invoked by an individual user or by a group of users. Using them requires users to reveal their locations, so protecting users' location privacy during the use of these services is an important issue, and many works address it. In this paper, we review the related work on providing location privacy for a group of users during the use of LBSs. We classify the works into two categories: the first consists of solutions that protect an individual user's location privacy through group formation, while the second contains solutions designed specifically for group location privacy. We then analyze and compare the performance and security properties of the related works, and identify the open issues and future directions in this field.


Mohammad Reza Goharei,
Volume 5, Issue 2 (3-2017)
Abstract

In this paper, one of the most important challenges in energy smart grids, denial of service against time-critical messages, is addressed. To explain and solve this issue, we describe the different communication structures for the messages conveyed in energy smart grids, the various proposed communication technologies, the different types of jamming, and the variety of messages conveyed in the grid. Depending on the communication technology and the jamming in use, different jamming situations are explored. The jamming problem is modeled using the theory of the gambler's ruin problem. To eliminate this vulnerability, the lowest tolerable rate of invalid messages is calculated for the worst case, and the TACT technique with the lowest camouflage traffic load in the grid is used to map this problem to the real world.
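For background on the modeling tool mentioned above, the sketch below computes the classical gambler's-ruin probability; the mapping of rounds, "capital", and targets onto jamming and message rates is only schematic and is not taken from the paper.

```python
# Classical gambler's ruin: probability of hitting 0 before n, starting from i,
# when each round is won with probability p. Used here purely as background.
def ruin_probability(p, i, n):
    q = 1.0 - p
    if abs(p - q) < 1e-12:              # fair case p = q = 0.5
        return 1.0 - i / n
    r = q / p
    return (r ** i - r ** n) / (1.0 - r ** n)

print(ruin_probability(0.5, 3, 10))     # 0.7
print(ruin_probability(0.6, 3, 10))     # lower ruin probability when p > 0.5
```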
, ,
Volume 5, Issue 2 (3-2017)
Abstract

An Identity Management System (IDM) is a set of policies, rules, procedures, and systems that manage authentication, access control, and audit activity based on digital identity. In an identity management system, because users' identity attributes are stored at the identity provider, users lose their physical, traditional control over their personal identity; at the same time, user attributes are exposed to more attacks from both inside the system and external attackers. Ensuring the user's control over privacy therefore requires keeping the user aware of the release of his or her identity information, so providing security mechanisms for privacy is essential. Current mechanisms have failed to give users confidence that the security and privacy of their identity information are maintained. The designs proposed in this area increase user intervention and involvement during system interactions, forcing the user to act as an intermediary, and they impose a high computational burden on parts of the system. In this article, ways of improving privacy control and user awareness in order to solve these problems are investigated.
Zahra Zolfaghari, Nasour Bagheri,
Volume 6, Issue 1 (9-2017)
Abstract

In this article, we introduce the time-memory trade-off (TMTO) attack and a method for finding near-collisions in a hash function. By counting hash computations, it is easy to compute a lower bound on the complexity of near-collision algorithms and to construct a matching algorithm. However, this algorithm needs a large amount of memory and many memory accesses. Recently, algorithms have been proposed that do not require this amount of memory; they need more hash evaluations, but the attack is actually more practical. These algorithms can be divided into two main groups: the first group is based on truncation and the second is based on covering codes. In this paper, we consider the first group, which is based on truncation. For practical implementations it can be assumed that some memory is available, and Leurent [10] showed that this memory can be used to reduce the complexity significantly. In the next step, Sasaki et al. [9] improved the most popular time-memory trade-off for the k-tree algorithm by using multi-collisions based on Hellman's tables, obtaining a new trade-off curve, including the case k = 4. In this article, we first explain TMTO methods and then the method of finding near-collisions using TMTO.
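To illustrate the truncation idea only (not Leurent's or Sasaki et al.'s algorithms), the following Python sketch performs a birthday-style search for two messages whose SHA-256 digests agree on their first T bits, i.e., a collision of the truncated hash, which is the basic building block that truncation-based near-collision algorithms refine. The value of T is an assumption chosen so the example runs quickly.

```python
# Hedged sketch: birthday search for a collision of a T-bit truncation of SHA-256.
import hashlib
import itertools

T = 20                      # number of leading digest bits that must match (assumption)

def truncated(msg):
    digest = hashlib.sha256(msg).digest()
    return int.from_bytes(digest, "big") >> (256 - T)

seen = {}
for counter in itertools.count():
    msg = str(counter).encode()
    tag = truncated(msg)
    if tag in seen and seen[tag] != msg:
        # Two distinct messages agreeing on the first T digest bits.
        print("truncated-hash collision:", seen[tag], msg)
        break
    seen[tag] = msg
```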
Engineer Nasrin Taj Neyshabouri, Engineer Shaghayegh Naderi, Engineer Mahsa Omidvar Sarkandi, Engineer Hassan Koushkaki,
Volume 6, Issue 1 (9-2017)
Abstract

Validation among users and stakeholders, and the delivery of important and varied services with high availability, are among the important dimensions of local search engines. This paper presents comprehensive research results on combining security controls with the main components of local search engines, such as the crawler, ranker, and indexer, and on considering security requirements in all phases of the software development life cycle. We organize the existing research on securing local search engines by mapping security standards relevant to the software development life cycle onto the key components of local search engines, in order to help developers and software project managers implement security controls and requirements properly.
 


, , , ,
Volume 6, Issue 1 (9-2017)
Abstract

Nowadays, with the rapid growth of computer networks and content on the Internet, the demand for finding and viewing content such as videos, music, files, and documents is high, and search engines are one of the tools that respond to users' demands and help them reach their goals faster. At the same time, mistakes in the design and configuration of search engine components (the crawler, indexer, and ranker) can lead to the disclosure of confidential information about users and websites, to irrelevant or polluted results, and to an insecure search engine architecture. Therefore, considering the security problems and prospects of the components in the architecture of search engines, including local search engines, is essential. The main parts whose security should be guaranteed in the architecture of local search engines are the crawler (crawl policies, design factors, code injection attacks, crawler availability, and the crawler database), the indexer (the search module, the indexer database, indexing mechanisms, and design factors), and the ranker (ranking policy and negative Search Engine Optimization (SEO)). This article first studies the threats and vulnerabilities of the main components of local search engines and then provides the requirements and security policies for the three major components, leading to local search engines with a secure architecture.
, ,
Volume 6, Issue 1 (9-2017)
Abstract

The use of Wireless Sensor Networks (WSNs) is growing rapidly in research, application, operation, and commerce. These networks are used for monitoring a desired region of an environment, and their many capabilities, combined with their low cost, have made them applicable in various areas. WSNs are designed at the scale of hundreds to thousands of nodes, and this great scale is technologically the most challenging issue. One of the most basic and challenging problems is coverage; security is another important issue. Coverage is the paramount goal of creating and deploying WSNs, because it is directly related to the quality, method, and durability with which the network recognizes the parameters and defined aims of the monitored regions, as well as to the implementation cost. In this paper, methods of improving the security of public places by increasing the coverage of the regions monitored by sensor networks are investigated. The results indicate that by choosing an appropriate and optimal coverage, it is possible not only to cover the entire region with the minimum number of sensors, but also to increase the security of the monitored places with fewer nodes.

Page 2 of 8

Biannual Scientific Journal Monadi for Cyberspace Security (AFTA)