The symmetric encryption algorithm that serves as the cryptographic engine for the triple data encryption algorithm (TDEA).
A U.S. Government-approved symmetric encryption algorithm, used by business and civilian government agencies, that serves as the cryptographic engine for the triple data encryption algorithm (TDEA). The advanced encryption standard (AES) was designed to replace DES. The original “single” DES algorithm is no longer secure because it is now possible to try every possible key with special-purpose equipment or a high-performance cluster. Triple DES, however, is still considered secure.
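The TDEA construction applies the underlying cipher three times in encrypt-decrypt-encrypt (EDE) order with three keys. A minimal sketch of that structure, using a toy XOR “cipher” as a stand-in for DES so the example is self-contained (real TDEA uses actual DES with three 56-bit keys):

```python
# Toy stand-in for the DES block cipher (XOR is its own inverse).
def toy_encrypt(block: int, key: int) -> int:
    return block ^ key

def toy_decrypt(block: int, key: int) -> int:
    return block ^ key

def tdea_encrypt(block: int, k1: int, k2: int, k3: int) -> int:
    """EDE: encrypt with K1, decrypt with K2, encrypt with K3."""
    return toy_encrypt(toy_decrypt(toy_encrypt(block, k1), k2), k3)

def tdea_decrypt(block: int, k1: int, k2: int, k3: int) -> int:
    """Inverse order: decrypt with K3, encrypt with K2, decrypt with K1."""
    return toy_decrypt(toy_encrypt(toy_decrypt(block, k3), k2), k1)

plaintext = 0x1234
ct = tdea_encrypt(plaintext, 0xAAAA, 0x5555, 0x0F0F)
assert tdea_decrypt(ct, 0xAAAA, 0x5555, 0x0F0F) == plaintext

# Setting K1 == K2 makes the first two stages cancel, so TDEA degrades to a
# single encryption with K3 -- the backward-compatibility mode of the standard.
assert tdea_encrypt(plaintext, 0xAAAA, 0xAAAA, 0x0F0F) == toy_encrypt(plaintext, 0x0F0F)
```

The decrypt-in-the-middle stage is what allows this backward compatibility with single DES.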
Deals with the way data records are accessed and retrieved from computer files. When designing a file, security factors, access methods, and storage media costs should be considered. Record keys, pointers, or indexes are needed to read and write data to or from a file. A record key is part of a logical record used for identification and reference. A primary key is the main code used to store and locate records within a file. Records can be sorted, and temporary files created, using codes other than their primary keys. Secondary keys are used for alternative purposes, including inverted files. A given data record may have more than one secondary key. Pointers show the physical location of records in a data file. Addresses are generated using indexing, base registers, segment registers, and other methods.
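The distinction between a primary key and a secondary key can be sketched with a small in-memory “file” of records (the field names below are illustrative assumptions, not from the source):

```python
# A small "file" of logical records.
records = [
    {"emp_id": 101, "name": "Ada", "dept": "ENG"},
    {"emp_id": 102, "name": "Ben", "dept": "HR"},
    {"emp_id": 103, "name": "Cy",  "dept": "ENG"},
]

# Primary-key index: each key identifies exactly one record and is the
# main code used to store and locate records within the file.
by_emp_id = {r["emp_id"]: r for r in records}

# Secondary-key (inverted) index: a single key may reference many records.
by_dept = {}
for r in records:
    by_dept.setdefault(r["dept"], []).append(r["emp_id"])

assert by_emp_id[102]["name"] == "Ben"
assert by_dept["ENG"] == [101, 103]   # one secondary key, multiple records
```

The inverted `by_dept` structure is the essence of an inverted file: it maps a secondary-key value back to all records carrying that value.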
A graphic model of the logical relationships within a computer-based application system. DFD is an input to the structure chart.
Data/information hiding is closely tied to modularity, abstraction, and maintainability. Data hiding means data and procedures in a module are hidden from other parts of the software. Errors from one module are not passed to other modules; instead, they are contained within that module. Abstraction helps to define the procedures, while data hiding defines access restrictions to procedures and local data structures. The concept of data hiding is useful during program testing and software maintenance. Note that layering, abstraction, and data hiding are protection mechanisms in security design architecture.
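A minimal sketch of data hiding in Python, where name-mangled attributes hide a module's local data structure and internal procedure so other code can reach only the public interface (the `Counter` class is an illustrative assumption):

```python
class Counter:
    """Only increment() and value() are visible to other modules."""

    def __init__(self):
        self.__count = 0            # hidden local data structure

    def __validate(self, n):        # hidden internal procedure
        if n < 0:
            raise ValueError("negative increment")
        return n

    def increment(self, n=1):       # public interface
        self.__count += self.__validate(n)

    def value(self):                # public interface
        return self.__count

c = Counter()
c.increment(2)
assert c.value() == 2
# The internal state is not reachable under its declared name from outside,
# because Python mangles __count to _Counter__count.
assert not hasattr(c, "__count")
```

Because callers depend only on the public interface, the hidden state and validation logic can change during maintenance without rippling errors into other modules.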
Data in transit (data on the wire) deals with protecting the integrity and confidentiality of information transmitted across internal and external networks. Line encryption protects data while it is in transit.
A property whereby data has not been altered in an unauthorized manner since it was created, transmitted, or stored. Data integrity covers data in storage, during data processing, and while in transit.
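A common way to detect unauthorized alteration is to store a cryptographic digest alongside the data and recompute it before use. A minimal sketch using Python's standard-library `hashlib`:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest of the data, stored alongside it as an integrity check."""
    return hashlib.sha256(data).hexdigest()

stored = b"amount=100"
checksum = digest(stored)

# Later, before trusting the data, recompute and compare.
assert digest(b"amount=100") == checksum       # unchanged -> check passes
assert digest(b"amount=900") != checksum       # altered   -> alteration detected
```

A bare hash only detects accidental or unauthorized change when the checksum itself is protected; where an attacker could replace both data and checksum, a keyed construction such as an HMAC is used instead.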
A cryptographic key used to cryptographically process data (e.g., encrypt, decrypt, and authenticate).
A measure of the currency of security-related data or information. Data latency refers to the time between when information is collected and when it is used. It allows an organization to respond to “where the threat or vulnerability is and where it is headed,” instead of “where it was.” When responding to threats and/or vulnerabilities, this is an important data point that shortens a risk decision cycle.
Three levels of data are possible: Level 1 is classified data. Level 2 is unclassified data requiring special protection, for example, Privacy Act, for Official Use Only, Technical Documents Restricted to Limited Distribution. Level 3 is all other unclassified data.
Provides security for the layer that handles communications on the physical network components of the ISO/OSI reference model.
Data link layer protocols provide (1) error control, to retransmit damaged or lost frames, and (2) flow control, to prevent a fast sender from overpowering a slow receiver. The sliding window mechanism is used to integrate error control and flow control. In the data link layer, various framing methods are used, including character count, byte stuffing, and bit stuffing. Examples of data link layer protocols and sliding window protocols include bit-oriented protocols such as SDLC, HDLC, ADCCP, or LAPB (Tanenbaum).
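Byte stuffing, one of the framing methods named above, can be sketched as follows: a FLAG byte delimits each frame, and an ESC byte escapes any FLAG or ESC occurring inside the payload so the receiver can locate true frame boundaries (the 0x7E/0x7D values mirror HDLC-style framing; the helper names are illustrative):

```python
FLAG, ESC = 0x7E, 0x7D

def stuff(payload: bytes) -> bytes:
    """Wrap payload in FLAG delimiters, escaping FLAG/ESC bytes inside it."""
    out = bytearray([FLAG])
    for b in payload:
        if b in (FLAG, ESC):
            out += bytes([ESC, b ^ 0x20])   # escape, then transform the byte
        else:
            out.append(b)
    out.append(FLAG)
    return bytes(out)

def unstuff(frame: bytes) -> bytes:
    """Strip the FLAG delimiters and undo the escaping."""
    body, out, i = frame[1:-1], bytearray(), 0
    while i < len(body):
        if body[i] == ESC:
            out.append(body[i + 1] ^ 0x20)
            i += 2
        else:
            out.append(body[i])
            i += 1
    return bytes(out)

msg = bytes([0x01, FLAG, 0x02, ESC, 0x03])
assert unstuff(stuff(msg)) == msg   # round-trips even with FLAG/ESC in the data
```

Bit stuffing works the same way at the bit level (inserting a 0 after five consecutive 1s), which is why bit-oriented protocols such as HDLC can carry arbitrary payloads.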
Providing or controlling access to data stored in a computer and the use of input/output devices.
A data mart is a subset of a data warehouse; its goal is to make data available to more decision makers.
A generalization of the principle of variable minimization, in which the standardized parts of a message or data are replaced by a much shorter code, thereby reducing the risk of erroneous actions or improper use.
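The substitution of standardized message parts by short codes can be sketched with a simple code table (the messages and codes below are hypothetical examples, not from any real code book):

```python
# Hypothetical code table mapping standardized message parts to short codes.
CODES = {
    "CONFIRM RECEIPT OF SHIPMENT": "CRS",
    "REQUEST IMMEDIATE RESUPPLY": "RIR",
}
DECODE = {code: text for text, code in CODES.items()}

def encode(msg: str) -> str:
    """Replace a standardized message part with its short code, if one exists."""
    return CODES.get(msg, msg)

def decode(msg: str) -> str:
    """Expand a short code back to the standardized message part."""
    return DECODE.get(msg, msg)

assert encode("REQUEST IMMEDIATE RESUPPLY") == "RIR"
assert decode(encode("CONFIRM RECEIPT OF SHIPMENT")) == "CONFIRM RECEIPT OF SHIPMENT"
```

Because only agreed-upon codes appear on the wire, there is less free-form text to mistype or misuse, which is the risk-reduction point of the definition.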