:: Search published articles ::

Sara Moqimi, Mohammad Ali Hadavi,
Volume 11, Issue 2 (3-2023)
Abstract

How vulnerabilities are exploited and how much damage they can cause depend largely on the capability of the attacker: the more powerful the attacker, the greater the risk posed by threats and vulnerabilities. Therefore, the security analysis of a web application and the choice of risk-mitigation countermeasures depend on the ability of the attackers who threaten the application. Focusing on SQL injection attacks, this paper aims to model the attacker's capability so that it can be used for security evaluation and for choosing cost-effective security controls. We model the attacker's capability with the triple ⟨Type, Technique, Entry_Point⟩. The value of each component of the triple is obtained from the payloads through which the attacker tries to exploit injection vulnerabilities. Type represents the injection type, drawn from a known set of injection attack types such as Error_based, Union_based, and Boolean_based_Blind. Technique represents the techniques used by the attacker during the attack, e.g. using special characters, UNION, complex queries, or encoding. Finally, Entry_Point represents the set of known injection entry points, including GET/POST, Http_Variables, and so on. This model is used for leveling and comparing attackers' capabilities, as well as for leveling the security of a web application according to the level of the attacker who is able to compromise it. The results of the experimental evaluation show that the proposed model can be used to determine an attacker's capability level. The model can easily be extended to cover attacks on other security vulnerabilities.
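A minimal sketch of the capability triple, assuming illustrative vocabularies and a toy leveling rule; the paper's actual leveling procedure and weights are not reproduced here:

```python
from dataclasses import dataclass

# Illustrative vocabularies drawn from the abstract (not exhaustive)
TYPES = ["Error_based", "Union_based", "Boolean_based_Blind", "Time_based_Blind"]
TECHNIQUES = ["Special_Character", "UNION", "Complex_Query", "Encoding"]
ENTRY_POINTS = ["GET/POST", "Http_Variables", "Cookies"]

@dataclass(frozen=True)
class Capability:
    type: str          # injection type observed in the payload
    technique: str     # technique used by the attacker
    entry_point: str   # where the payload was injected

    def __post_init__(self):
        assert self.type in TYPES and self.technique in TECHNIQUES \
            and self.entry_point in ENTRY_POINTS

def capability_level(observed: set[Capability]) -> int:
    """Toy rule: the more distinct types, techniques, and entry points an
    attacker demonstrates in its payloads, the higher its level."""
    types = {c.type for c in observed}
    techs = {c.technique for c in observed}
    entries = {c.entry_point for c in observed}
    return len(types) + len(techs) + len(entries)

payloads = {
    Capability("Union_based", "UNION", "GET/POST"),
    Capability("Boolean_based_Blind", "Special_Character", "GET/POST"),
}
print(capability_level(payloads))  # 5
```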

Vajiheh Sabeti, Masomeh Sobhani,
Volume 11, Issue 2 (3-2023)
Abstract

Steganography is the science and art of hiding the existence of communication: by hiding information in a digital medium, the very existence of the communication remains hidden from an adversary's view. The idea of spatial-domain adaptive methods is to embed more data in the edge areas of the image; regions of the image with more variation are prioritized for embedding. Wavelet-based methods, on the other hand, embed in the high-frequency subbands to match the human visual system. The idea proposed in this paper is to embed with higher priority in those areas of the high-frequency subbands of the wavelet transform that exhibit many changes. First, a threshold value is determined according to the length of the data; based on this threshold, suitable embedding areas are identified in each high-frequency subband, and the embedding is then performed in them. The process is designed so that the receiver can extract the data completely by repeating it. The implementation results show that in the proposed method the Integer Wavelet Transform (IWT) is more successful than the Discrete Wavelet Transform (DWT): the resulting stego image has higher quality and better security than with other wavelet-based methods.
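As an illustration of threshold-selected embedding in high-frequency subbands, here is a hedged sketch using a Haar DWT from PyWavelets; the paper's IWT pipeline, threshold rule, and extraction procedure are not reproduced, so every parameter below is an assumption:

```python
import numpy as np
import pywt

def embed(cover: np.ndarray, bits: list[int], threshold: float) -> np.ndarray:
    cA, (cH, cV, cD) = pywt.dwt2(cover.astype(float), "haar")
    it = iter(bits)
    for band in (cH, cV, cD):            # high-frequency subbands only
        flat = band.ravel()              # view: writes modify the band
        for i, c in enumerate(flat):
            if abs(c) > threshold:       # "busy" coefficient: safer to modify
                b = next(it, None)
                if b is None:
                    break
                # force the parity of the rounded coefficient to the data bit
                q = int(round(c))
                flat[i] = q if q % 2 == b else q + 1
    return pywt.idwt2((cA, (cH, cV, cD)), "haar")

stego = embed(np.random.randint(0, 256, (64, 64)), [1, 0, 1, 1], threshold=8.0)
```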

, ,
Volume 12, Issue 1 (9-2023)
Abstract

The development of information and communication technology has led to the increasing production of new products in this field. One of the critical products for protecting information assets at various levels of security is the cryptographic module. The security of a cryptographic module must be examined as a whole if it is to provide a practical degree of protection against attacks. Therefore, the security evaluation of a cryptographic module requires a strong awareness of the potential weaknesses that could become security flaws, and careful attention to security during all aspects of the evaluation process. In this paper, we present a comprehensive picture of the security evaluation criteria for cryptographic modules in accordance with existing international standards (e.g. FIPS 140-2/3, ISO 15408, and PKCS#11), and we propose a model based on a fuzzy weighted linear combination for correctly measuring compliance with these criteria. Any evaluation of this kind also requires considerable cost and time, which depend on the one hand on the policies and requirements of the country and on the other hand on the available facilities and experts. Finally, we present some of the challenges of security evaluation in our country, together with solutions that help address them, which confirms the importance of study and research in this area.
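A fuzzy weighted linear combination can be sketched in a few lines; the criteria, weights, and membership degrees below are invented for illustration and are not the paper's or the standards' values:

```python
criteria = {
    # criterion: (expert weight, fuzzy compliance degree in [0, 1])
    "physical_security":    (0.30, 0.8),
    "key_management":       (0.40, 0.6),
    "self_tests":           (0.15, 0.9),
    "role_authentication":  (0.15, 0.7),
}

def compliance_score(criteria: dict[str, tuple[float, float]]) -> float:
    """Weighted average of fuzzy membership degrees across all criteria."""
    total_w = sum(w for w, _ in criteria.values())
    return sum(w * mu for w, mu in criteria.values()) / total_w

print(f"overall compliance: {compliance_score(criteria):.2f}")  # 0.72
```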
Dr Somayeh Dolatnezhad Samarin, Dr Morteza Amini,
Volume 12, Issue 1 (9-2023)
Abstract

In recent years, one of the main topics of interest in the security of outsourced computation has been checking the integrity of the results received from outsourced computations. Outsourced computations may run on data received from a single data source or from multiple sources, and only a few methods have been proposed for system models with distributed data sources. The main solutions in this area for verifying the correct execution of arbitrary or special-purpose functions (such as linear, polynomial, or aggregate functions) fall into three categories: (1) verifiable computation, (2) homomorphic authenticators, and (3) methods proposed for specific applications such as outsourced databases, wireless sensor networks, and data stream management systems. In this paper, these methods, especially those proposed for outsourced computations in data stream management systems, are reviewed and compared.
Amin Hosseingholizadeh, Farhad Rahmati, Mohammad Ali,
Volume 12, Issue 1 (9-2023)
Abstract

With the emergence of new phenomena in telecommunications and information technology, such as cloud computing and smart networks, we face new challenges in these areas. One of the most significant is the privacy of outsourced data. Due to the limited processing power of new intelligent devices such as tablets and mobile phones, outsourcing computations from such devices has gained increasing attention from users. In addition to data privacy, the security of the algorithms used in online software is also of great importance: software providers may be concerned about the disclosure of their algorithms after outsourcing them to cloud environments. Existing homomorphic encryption systems can provide privacy for data that needs to be processed online; however, they do not simultaneously protect the privacy of the algorithms. To address this, we introduce a scheme for simultaneous homomorphic encryption of data and functions, called SHDF. The system can homomorphically encrypt all the algorithms used in the software as well as the data they process, enabling the necessary computations to be performed on an untrusted server. Furthermore, we show that the proposed system is provably secure. Our implementation results indicate that it is usable in cloud environments with the desired efficiency.
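SHDF itself is not specified in the abstract, so as a stand-in the following toy shows the basic idea of computing on ciphertexts, using textbook RSA's well-known multiplicative homomorphism (toy parameters, and textbook RSA is not semantically secure):

```python
# E(a) * E(b) mod n == E(a * b mod n): the server multiplies ciphertexts
# without ever seeing the plaintexts.
p, q, e = 61, 53, 17
n = p * q                      # 3233
d = pow(e, -1, (p - 1) * (q - 1))

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 7, 12
c = (enc(a) * enc(b)) % n      # computed on the untrusted server
assert dec(c) == (a * b) % n   # decrypts to the product of the plaintexts
```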
Iman Mirzaali Mazandarani, Dr Nasour Bagheri, Dr Sadegh Sadeghi,
Volume 12, Issue 1 (9-2023)
Abstract

With the increasing and widespread application of deep learning and neural networks across various scientific domains, and the notable successes achieved, deep neural networks were first employed for differential cryptanalysis in 2019, initiating growing interest in this research domain. While most existing works focus on enhancing and deploying neural distinguishers, few studies have delved into the intrinsic principles and learned characteristics of these distinguishers. In this study, we analyze block ciphers such as Speck, Simon, and Simeck using deep learning, and we explore and compare the factors and components that contribute to better performance. Additionally, by detailing attacks and comparing results, we address the question of whether neural networks and deep learning can effectively serve as tools for block cipher cryptanalysis.
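For concreteness, a minimal sketch in the style of Gohr's 2019 work: label-1 samples are Speck32/64 ciphertext pairs produced from a fixed input difference, label-0 samples come from random pairs; the independent random round keys below stand in for the real key schedule:

```python
import numpy as np

MASK = 0xFFFF  # Speck32/64: 16-bit words
rol = lambda x, r: ((x << r) | (x >> (16 - r))) & MASK
ror = lambda x, r: ((x >> r) | (x << (16 - r))) & MASK

def speck_round(x, y, k):
    x = (ror(x, 7) + y) & MASK ^ k
    y = rol(y, 2) ^ x
    return x, y

def encrypt(x, y, round_keys):
    for k in round_keys:
        x, y = speck_round(x, y, k)
    return x, y

rng = np.random.default_rng(0)
rounds, n = 5, 10**4
keys = [rng.integers(0, 1 << 16) for _ in range(rounds)]  # toy key schedule
DELTA = (0x0040, 0x0000)  # Gohr's input difference for Speck32/64

data, labels = [], []
for _ in range(n):
    x0, y0 = rng.integers(0, 1 << 16, 2)
    if rng.random() < 0.5:   # real pair: fixed input difference
        x1, y1 = x0 ^ DELTA[0], y0 ^ DELTA[1]
        labels.append(1)
    else:                    # random pair
        x1, y1 = rng.integers(0, 1 << 16, 2)
        labels.append(0)
    data.append(encrypt(x0, y0, keys) + encrypt(x1, y1, keys))
# `data`/`labels` would then train a distinguisher network.
```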
Mr. Nasser Zarbi, Dr Ali Zaeembashi, Dr Nasour Bagheri,
Volume 12, Issue 1 (9-2023)
Abstract

Leakage-resilient cryptography aims to design key exchange protocols that withstand leakage attacks. Such protocols are examined using a leakage-resilient security model to determine whether they possess the claimed security properties. The analysis focuses on how leakage-resilient security models have evolved to meet increasing security requirements and to cover a broader range of attacks. By studying the security properties offered by these models, potential vulnerabilities in protocol design can be addressed effectively. This article surveys various leakage-resilient security models built on the CK and eCK models and gives examples of secure key exchange protocols defined within them. It also explores the relationship between the adversary's capabilities in these models and different real-world attack schemes. By offering insights into leakage-resilient security models, leakage attacks, and the development of secure protocols, it contributes to advancing knowledge in this field.
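As a baseline for the protocols these models reason about, here is an unauthenticated Diffie-Hellman-style exchange using X25519 from the `cryptography` package; CK/eCK-secure protocols add authentication and careful session-state handling on top of such a primitive:

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

alice_priv = X25519PrivateKey.generate()   # ephemeral secret of party A
bob_priv = X25519PrivateKey.generate()     # ephemeral secret of party B

# Each party sends only its public key; both derive the same shared secret.
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())
assert alice_shared == bob_shared
```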
Mohammad Dakhilalian, Masomeh Safkhani, Fatemeh Pirmoradian,
Volume 12, Issue 1 (9-2023)
Abstract

Providing any remote service requires mutual authentication of the participating parties, and the framework through which this authentication is performed is called an authentication protocol. In other words, a cryptographic (security) protocol is a distributed cryptographic algorithm that establishes an interaction between at least two hosts for a specific purpose. These protocols assume both secure and insecure channels for communication between the participating parties: usually, secure channels are used for registration and insecure channels for mutual authentication. After registering with the server and having its identity verified, the user can benefit from the services the server provides. Many authentication protocols have been proposed in fields such as e-health, the Internet of Things, and cloud computing. The privacy and anonymity of users in these schemes is the biggest challenge in implementing a platform for remote services: because user authentication takes place over the insecure Internet, it can be vulnerable to all existing Internet attacks. In general, there are two approaches to analyzing and proving the security of authentication protocols: formal and informal. The informal approach, based on intuitive arguments, the analyst's creativity, and mathematical concepts, tries to find errors and establish security. The formal approach, carried out either manually or automatically, uses a variety of mathematical logics and automated security analysis tools: manual analysis uses mathematical models such as Real-Or-Random and logics such as BAN and GNY, while automated analysis uses tools such as AVISPA, Scyther, ProVerif, and TAMARIN. Methods for proving and analyzing the security of security protocols thus fall into two general categories, theorem proving and model checking, and this article explains the details of each, first the methods based on model checking and then those based on theorem proving. It should be noted that most security protocol verification tools are based on model checking.
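A minimal sketch of the kind of protocol these methods analyze: challenge-response mutual authentication over a pre-shared key. It is illustrative only and does not correspond to any specific protocol from the literature:

```python
import hmac, hashlib, os

K = os.urandom(32)  # pre-shared key, distributed over the secure channel

def tag(*parts: bytes) -> bytes:
    return hmac.new(K, b"|".join(parts), hashlib.sha256).digest()

# Over the insecure channel:
n_user = os.urandom(16)                    # user -> server: fresh challenge
n_srv = os.urandom(16)                     # server -> user: challenge + proof
srv_proof = tag(b"server", n_user, n_srv)

# User verifies the server's proof, then proves its own identity.
assert hmac.compare_digest(srv_proof, tag(b"server", n_user, n_srv))
user_proof = tag(b"user", n_srv, n_user)   # user -> server: proof
assert hmac.compare_digest(user_proof, tag(b"user", n_srv, n_user))
```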
 
Mrs. Narges Mokhtari, Mr. Amirhossein Safari, Dr Sadegh Sadeghi,
Volume 12, Issue 1 (9-2023)
Abstract

Biometric systems are an important technique for user identification in today's world and have been welcomed for their non-invasive nature and high resistance to forgery and fraud. Physiological and behavioral biomarkers are the two main types of biometric identifiers. Behavioral identifiers, such as voice recognition, are based on human (or even animal) actions. Physiological biometrics, such as fingerprints and facial recognition, which have entered our daily lives in recent years, are based on the physical characteristics of the human body. One biometric investigated in studies in this field is the heart signal, which has been used successfully in authentication and identification systems because its acquisition is simple compared with biomarkers such as brain signals. In addition, there are reliable public databases of heart signal data that researchers in this field use to evaluate their systems. This study reviews and compares different authentication methods that use heart signal biometrics, and then examines the advantages and disadvantages of the deep learning methods and models proposed in this field. In the final part, we first discuss our implementation of the method presented in the work of Fuster and Lopez; we then present evaluation experiments designed around the network created in that study, and conclude based on the results.
Ahmad Rahdari, Mohammad Hesam Tadayon,
Volume 12, Issue 2 (2-2024)
Abstract

Cybersecurity education in Iran is not aligned with global standards and approaches, and the three parties involved (the educational sector, training applicants and stakeholders, and companies) lack proper knowledge of the required specializations and work roles. The specializations found in Iranian cybersecurity workplaces do not match the pieces of the international standard puzzle, and this has created security holes in the country's cyber ecosystem. People working in cyberspace need a combination of domain-specific knowledge, skills, abilities, and other expertise to be as reliable and resilient as the technologies they work with.

At the international level, several frameworks have been designed and implemented for training and employing cybersecurity professionals, the most important of which are the US National Initiative for Cybersecurity Education (NICE) Cybersecurity Workforce Framework, the European Cybersecurity Skills Framework (ECSF), and the Australian Signals Directorate (ASD) Cyber Skills Framework. In this paper, each of these frameworks is briefly introduced, and their key features, including purpose, structure, and components, are reviewed. Their effectiveness in handling the challenges global organizations face in creating and developing expert cybersecurity human resources is then evaluated critically. This review highlights the strengths and weaknesses of each framework, shows which framework is closest to Iran's educational and labor markets, and provides recommendations for designing a national framework for training and employing cybersecurity professionals, so that those in charge can take the necessary measures as soon as possible.
 
Zahra Jafari, Sahar Palimi, Mohamadamin Sabaei, Rahman Hajian, Hossein Erfani,
Volume 12, Issue 2 (2-2024)
Abstract

In the Internet of Things (IoT) environment, security and privacy are paramount concerns for critical applications. The LoRa protocol efficiently enables long-range communication for resource-constrained end devices in LoRaWAN networks. To foster technology adoption and user trust, safeguarding the data collected by end devices is essential, and authentication and key agreement protocols play a pivotal role in achieving this goal. Here, we introduce a novel scheme for authentication and key exchange in LoRaWAN that enables mutual authentication among the participants and allows users/end devices and network servers to establish secure end-to-end session keys without unconditional trust. We assess the scheme's security informally and verify it formally using the AVISPA tool and BAN logic. Furthermore, we compare it with existing authentication schemes, demonstrating its efficiency in terms of computational and communication overhead.
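The abstract does not give the message flow, so the following generic sketch only illustrates the end goal: deriving a fresh end-to-end session key from an authenticated shared secret with HKDF. The salt and info choices are assumptions:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

shared_secret = os.urandom(32)   # e.g. output of the authenticated exchange
nonce_dev, nonce_ns = os.urandom(16), os.urandom(16)  # freshness, both sides

session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=nonce_dev + nonce_ns,    # binds the key to this session's nonces
    info=b"lorawan-e2e-session",  # context label (assumed)
).derive(shared_secret)
```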
Hadi Norouzi Cholcheh, Salman Niksefat,
Volume 12, Issue 2 (2-2024)
Abstract

Financial transactions in Bitcoin are stored in a distributed database called the blockchain. All transactions are publicly available to all network nodes, for the sake of transparency and the ability to verify their correctness. But this transparency, when exploited by transaction-analysis techniques, can lead to the violation of users' privacy and the disclosure of their identities. Researchers have proposed various techniques, such as transaction mixing and fair exchange, to improve privacy in Bitcoin transactions. In this paper, we present a new mixing scheme that overcomes some of the weaknesses of previous schemes. In the proposed scheme, users can mix different amounts of Bitcoin in each round of the protocol, which leads to results in a shorter time and at a lower cost. The scheme is also more resistant to denial-of-service attacks by malicious users.
Sajjad Maleki Lonbar, Akram Beigi, Nasour Bagheri,
Volume 12, Issue 2 (2-2024)
Abstract

In the world of digital communication, authentication is an important concern, and the need for safe and secure systems increases the need for well-designed authentication systems. Biometric approaches to authentication are of great interest because they are tied to a living person and resistant to forgery. In this study, an authentication system based on the heart signal is designed. Because of the way heart signals are acquired, the data usually contain a great deal of noise. To prepare the data in the proposed system, the heart signals are first denoised and then transferred to the frequency domain for feature extraction. They are also converted into images by applying the Wigner-Ville distribution, so that each image contains the heart signal information of one person and is unique. In the proposed authentication system, these images are used to train and evaluate a deep convolutional neural network whose output enables the identification of individuals. The data for this study are taken from the NSRDB and MITDB databases, and significant results have been obtained compared with previous studies.
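A minimal numpy sketch of the Wigner-Ville step that turns a one-dimensional heartbeat segment into a time-frequency image; the normalization and segment length are assumptions, not the paper's settings:

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x: np.ndarray) -> np.ndarray:
    z = hilbert(x)                       # analytic signal
    n = len(z)
    W = np.zeros((n, n))
    for t in range(n):
        taumax = min(t, n - 1 - t)
        tau = np.arange(-taumax, taumax + 1)
        acf = np.zeros(n, dtype=complex) # instantaneous autocorrelation
        acf[tau % n] = z[t + tau] * np.conj(z[t - tau])
        W[:, t] = np.fft.fft(acf).real   # frequency along rows, time along cols
    return W

segment = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 128))  # stand-in heartbeat
image = wigner_ville(segment)            # fed to the CNN as one input image
```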
Vajiheh Sabeti, Mahdiyeh Samiei,
Volume 12, Issue 2 (2-2024)
Abstract

Steganalysis is the art of detecting the existence of hidden data. Recent research has revealed that convolutional neural networks (CNNs) can detect hidden data through automatic feature extraction, and several studies have investigated the performance of existing models on a limited number of spatial steganography methods. This study proposes a CNN and comprehensively investigates its efficiency in detecting different spatial methods. The proposed model comprises three modules: preprocessing, convolutional (five blocks), and classifier (three fully connected layers). The test results for the least-significant-bit (LSB) and pixel-value-differencing (PVD) based methods indicate that the proposed method can detect even very short hidden messages with high accuracy and low error. The proposed method also detects complexity-based LSB-M (CBL), an adaptive approach; lower embedding rates make this success even more impressive. Manual feature extraction achieves much lower success rates than the proposed model at low embedding rates, because the statistical features vary only slightly there.
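A compact PyTorch sketch of the described three-module design: a fixed high-pass preprocessing filter (here the KV kernel common in steganalysis), five convolutional blocks, and a three-layer classifier. Channel widths and kernel sizes are illustrative assumptions:

```python
import torch
import torch.nn as nn

KV = torch.tensor([[-1,  2,  -2,  2, -1],
                   [ 2, -6,   8, -6,  2],
                   [-2,  8, -12,  8, -2],
                   [ 2, -6,   8, -6,  2],
                   [-1,  2,  -2,  2, -1]], dtype=torch.float32) / 12.0

class StegoCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.pre = nn.Conv2d(1, 1, 5, padding=2, bias=False)
        self.pre.weight.data = KV.view(1, 1, 5, 5)
        self.pre.weight.requires_grad = False      # fixed residual filter
        blocks, ch_in = [], 1
        for ch_out in (8, 16, 32, 64, 128):        # five conv blocks
            blocks += [nn.Conv2d(ch_in, ch_out, 3, padding=1),
                       nn.BatchNorm2d(ch_out), nn.ReLU(), nn.AvgPool2d(2)]
            ch_in = ch_out
        self.features = nn.Sequential(*blocks)
        self.classifier = nn.Sequential(           # three FC layers
            nn.Flatten(), nn.Linear(128 * 8 * 8, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, x):
        return self.classifier(self.features(self.pre(x)))

logits = StegoCNN()(torch.randn(4, 1, 256, 256))   # cover vs. stego logits
```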
Javad Alizadeh, Seyyed Hadi Noorani Asl,
Volume 12, Issue 2 (2-2024)
Abstract

The Internet of Drones (IoD) refers to the use of unmanned aerial vehicles (UAVs) connected to the Internet, a specific application of the IoT. The IoD offers opportunities, but it also poses security vulnerabilities, and authentication and key agreement protocols are necessary in drone communications to prevent them. In 2020, Alladi et al. presented an authentication and key agreement protocol based on physical unclonable functions, called SecAuthUAV, and analyzed its security through both formal and informal methods. In this paper, we demonstrate the vulnerability of the SecAuthUAV protocol to a key recovery attack: an adversary can obtain a session key between a drone and a ground station by intercepting and analyzing the session data. In addition, we present a secret-value recovery attack whose complexity is lower than that of a brute-force attack; an adversary could spoof and track the drone using these values. To improve the security and efficiency of SecAuthUAV, we present a new version of the protocol and compare it with the original. We analyze the security of the new protocol both informally and formally using ProVerif, and to compare the efficiency of the new protocol with SecAuthUAV we count the operators and functions used by each. The new protocol is both more secure and more efficient than SecAuthUAV.
Seyed Hossein Tahami, Hamid Mala,
Volume 12, Issue 2 (2-2024)
Abstract

In a verifiable database (VDB) scheme, a client with limited storage resources securely outsources its very large and dynamic database to an untrusted server, such that any attempt to tamper with the data, or even any unintentional change to it, can be detected by the client with high probability. The latest work in this area has added secure single-keyword and multi-keyword search. In this paper, we add range queries to the features of such databases. The scheme presented here provides the requirements of secure search, namely completeness of the search result, proof of an empty search result, absence of additional information leakage, and freshness of the search results, together with public verifiability. In the proposed scheme, the computational complexity of the client does not change significantly compared with the previous scheme, while the computational and storage complexity of the server increases, which is justifiable given its rich resources.
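A Merkle-tree membership proof is the standard building block behind this kind of public verifiability; the sketch below shows only that primitive, not the paper's keyword or range-query proofs:

```python
import hashlib

h = lambda b: hashlib.sha256(b).digest()

def build(leaves: list[bytes]) -> list[list[bytes]]:
    levels = [[h(x) for x in leaves]]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        if len(lvl) % 2:                  # duplicate last node if odd
            lvl = lvl + [lvl[-1]]
        levels.append([h(lvl[i] + lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels

def prove(levels, i):                     # sibling path for leaf i
    path = []
    for lvl in levels[:-1]:
        sib = i ^ 1
        path.append(lvl[sib if sib < len(lvl) else i])
        i //= 2
    return path

def verify(root, leaf, i, path):          # client-side check against the root
    node = h(leaf)
    for sib in path:
        node = h(node + sib) if i % 2 == 0 else h(sib + node)
        i //= 2
    return node == root

records = [b"r0", b"r1", b"r2", b"r3"]
tree = build(records)
assert verify(tree[-1][0], b"r2", 2, prove(tree, 2))
```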
Reza Rashidian, Raziyeh Salarifard, Ali Jahanian,
Volume 12, Issue 2 (2-2024)
Abstract

The adoption of post-quantum encryption algorithms to replace older asymmetric algorithms is of paramount importance. Diverse categories of post-quantum encryption, including lattice-based and code-based cryptography, are currently in the final stages of NIST's standardization competition, with the aim of providing security against quantum computers. Among the lattice-based key encapsulation mechanisms (KEMs) attracting attention in this competition, the NTRU Prime algorithm stands out. The primary challenge in implementing such algorithms is executing resource-intensive polynomial multiplications in a ring. Leveraging the Number Theoretic Transform (NTT) allows polynomial multiplication to be performed with near-linear complexity, O(n log n), and butterfly structures are frequently employed in NTT multipliers to enhance hardware efficiency. Our work redesigns and modifies the data preprocessing methods and storage structures, and compares the result with the best polynomial-multiplication implementations used in NTRU Prime on FPGA to date, achieving an increase in frequency and a reduction in LUT usage.
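A minimal sketch of NTT-based polynomial multiplication; the prime 998244353 with primitive root 3 is a textbook NTT-friendly choice, not the NTRU Prime ring (which is deliberately chosen without NTT-friendly structure, one reason implementations need the preprocessing discussed above):

```python
P, G = 998244353, 3  # P - 1 = 2^23 * 119, so power-of-two NTT sizes work

def ntt(a, invert=False):
    n = len(a)
    if n == 1:
        return a
    w = pow(G, (P - 1) // n, P)          # principal n-th root of unity
    if invert:
        w = pow(w, P - 2, P)
    even, odd = ntt(a[0::2], invert), ntt(a[1::2], invert)
    out, x = [0] * n, 1
    for k in range(n // 2):              # butterfly combining step
        t = x * odd[k] % P
        out[k] = (even[k] + t) % P
        out[k + n // 2] = (even[k] - t) % P
        x = x * w % P
    return out

def polymul(f, g):
    n = 1
    while n < len(f) + len(g) - 1:
        n *= 2
    F = ntt(f + [0] * (n - len(f)))
    Gh = ntt(g + [0] * (n - len(g)))
    c = ntt([x * y % P for x, y in zip(F, Gh)], invert=True)
    inv_n = pow(n, P - 2, P)
    return [x * inv_n % P for x in c][: len(f) + len(g) - 1]

print(polymul([1, 2], [3, 4]))  # (1 + 2x)(3 + 4x) -> [3, 10, 8]
```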
 
Dr Saeed Banaeian Far, Dr Maryam Rajabzadeh Asaar,
Volume 13, Issue 1 (8-2024)
Abstract

Outsourcing data to reliable centers for maintenance, protection, and accessibility is simple and low-cost and does not require physical infrastructure, hardware, software, or human resources. However, real-world events and recent research have shown that even supposedly reliable centers can abuse users' trust: they may (1) modify the data they hold, (2) delete it, or (3) make it temporarily or permanently unavailable. Data auditing methods assure data owners that the data recorded in the database is the same as the data the user sent, and they reveal any changes made to it; however, they only solve the first problem. In 2008, the introduction of a technology called blockchain, with attractive features such as transparency, immutability, and autonomy, solved the problems of many systems that needed those features. In this article, after reviewing several blockchain-based data auditing architectures and protocols, we review and analyze their general framework. Finally, we compare the reviewed works and identify some future directions for this field.
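A toy hash chain illustrating the immutability property these auditing schemes build on: altering any recorded block invalidates every later link.

```python
import hashlib, json

def link(prev_hash: str, payload: dict) -> dict:
    body = json.dumps(payload, sort_keys=True)
    return {"prev": prev_hash, "data": payload,
            "hash": hashlib.sha256((prev_hash + body).encode()).hexdigest()}

chain = [link("0" * 64, {"op": "store", "record": "r1"})]
chain.append(link(chain[-1]["hash"], {"op": "store", "record": "r2"}))

chain[0]["data"]["record"] = "tampered"   # a "reliable" center edits history
ok = all(link(b["prev"], b["data"])["hash"] == b["hash"] for b in chain)
print(ok)  # False: the audit detects the modification
```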

Mr Arash Khalvan, Mr Amirhossein Zali, Dr Mahmoud Ahmadian Attari,
Volume 13, Issue 1 (8-2024)
Abstract

With the advent of quantum computers and quantum algorithms, the security of current public-key cryptography systems faces serious challenges. Breaking the current cryptographic structures would require multi-million-qubit quantum computers, which have not yet been built; however, given the significant advances in quantum technology by the leading companies in this field and the resulting concern within the cryptography community, the need to provide countermeasures quickly has been clear. In 2016, the National Institute of Standards and Technology (NIST) sought proposals from around the world to standardize post-quantum cryptographic schemes. At that time, the McEliece code-based encryption system (and its equivalent, the Niederreiter system), despite its proven resistance to both classical and quantum algorithms, was not accepted due to its large public keys. Ultimately, the Classic McEliece, HQC, and BIKE encryption systems, which belong to code-based cryptography, advanced to the final stage of the competition, and the winners in this category are to be announced by the end of 2024. This paper reviews the developments made to optimize code-based constructions and examines the selected code-based cryptographic schemes and the latest status of Classic McEliece standardization.

Parsa Rajabi, Dr. Seyed Mohammad Razavizadeh, Dr. Mohammad Hesam Tadayon,
Volume 13, Issue 1 (8-2024)
Abstract

Authentication plays a pivotal role in ensuring communication security, and cryptographic methods are frequently employed for this purpose. Implemented at the upper network layers, these methods face challenges including complexity, power consumption, and overhead, particularly for users with limited computational power. A novel way to overcome these challenges is physical layer authentication (PLA), in which physical layer features are used to embed a tag in the transmitted signal for authentication, leveraging channel characteristics such as position, velocity, and noise. This paper reviews previous research, highlighting the differences between physical layer and upper-layer authentication. It presents the existing categorizations of PLA together with a novel classification based on covertness levels, investigates possible attacks and the corresponding countermeasures, and offers suggestions for future research in this area.
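A minimal numpy sketch of tag-based PLA: a low-power secret tag sequence is superimposed on QPSK symbols and verified by correlation at the receiver; the power split, noise level, and detection threshold are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho = 1024, 0.1                       # tag gets 10% of the transmit power

qpsk = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
tag = rng.choice([-1.0, 1.0], n)         # shared-secret tag sequence
tx = np.sqrt(1 - rho) * qpsk + np.sqrt(rho) * tag

# Channel: additive complex Gaussian noise
rx = tx + 0.3 * (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

stat = np.real(rx @ tag) / n             # correlate with the known tag
print(stat > np.sqrt(rho) / 2)           # True: tag present, sender authentic
```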


Biannual Scientific Journal Monadi for Cyberspace Security (AFTA)