Cyber Security versus Information Security

The key difference, as commonly framed, is that information security is mainly concerned with protecting information itself (often personal data), while cyber security is broader in scope, taking in other concerns such as our national infrastructure.


My feeling, though, is that information security is actually a super-set of cyber security, since anything in the cyber realm would involve information or information systems. As usual, here is my pseudo-Venn diagram to enjoy.

Then we have the official NIST definitions from IR 7298 Revision 2. They define cyber security and information security as follows (note there are two definitions for information security).
Cybersecurity: The ability to protect or defend the use of cyberspace from cyber attacks.
Information Security (1): The protection of information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide confidentiality, integrity, and availability.
Information Security (2): Protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide —
1) integrity, which means guarding against improper information modification or destruction, and includes ensuring information non-repudiation and authenticity;
2) confidentiality, which means preserving authorized restrictions on access and disclosure, including means for protecting personal privacy and proprietary information; and
3) availability, which means ensuring timely and reliable access to and use of information.
Based on the definitions above, the way I look at it is this: cyber security involves anything security-related in the cyber realm (cyberspace), while information security covers the security of information and information systems regardless of the realm in which the risk occurs (e.g., the risk of exposure in the physical world, such as printed records left unsecured). Since anything that occurs in the cyber realm involves the protection of information and information systems in some way, it follows that information security is a super-set of cyber security.
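The super-set argument above is really a claim about sets: every cyber security concern is also an information security concern, but not vice versa. A minimal sketch using Python sets makes the relation concrete; the example labels below are hypothetical and chosen purely for illustration, not drawn from NIST.

```python
# Toy illustration of the super-set claim using Python sets.
# The concern labels are hypothetical examples, not an official taxonomy.

# Security concerns that live in cyberspace (networks, systems, online services).
cyber_security = {
    "network intrusion",
    "malware on servers",
    "ddos against infrastructure",
}

# Information security covers everything cyber security covers, plus the
# protection of information outside the cyber realm (the physical world).
information_security = cyber_security | {
    "paper records left unshredded",
    "unlocked filing cabinets",
}

# The claim: information security is a super-set of cyber security.
print(information_security.issuperset(cyber_security))  # True

# But the converse does not hold: some concerns are purely physical.
print(cyber_security.issuperset(information_security))  # False
```

If the two circles in the Venn diagram were equal, both checks would return True; the asymmetry is exactly what makes one a proper super-set of the other.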
