Data in motion


Data in motion, also referred to as data in transit or data in flight, is digital information that is in the process of being transported between locations either within or between computer systems. The term can also be used to describe data within a computer's random access memory (RAM) that is ready to be read, accessed, updated or processed.

Data in motion includes the following scenarios: data moving from an Internet-capable endpoint device to a web-facing service in the cloud; data moving between virtual machines, both within and between cloud services; and data traversing trusted private networks and untrusted networks such as the Internet. Once the data arrives at its final destination, it becomes data at rest.
Because data in motion is vulnerable to man-in-the-middle (MitM) attacks, it is often encrypted to prevent interception. For example, the iSCSI transport layer can incorporate IPsec security, which encrypts data as it is transferred between two devices so that an attacker with a packet sniffer cannot read its contents. IPsec has been used extensively as a transit encryption protocol for virtual private network (VPN) tunnels; it makes use of cryptographic algorithms such as Triple DES (3DES) and the Advanced Encryption Standard (AES). Encryption platform software can also be integrated with existing enterprise resource planning (ERP) systems to keep data in motion secure.
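
To illustrate the symmetric ciphers mentioned above, the sketch below uses AES-GCM to encrypt a payload before it crosses a network and to decrypt it on arrival. It is a minimal sketch only, and it assumes the third-party Python "cryptography" package is available; it is not a substitute for a full protocol such as IPsec or TLS.

```python
# Minimal sketch: authenticated symmetric encryption of a payload in transit.
# Assumes the third-party "cryptography" package is installed (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Both endpoints must already share this key (negotiated out of band or via a key exchange).
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

def encrypt_for_transit(plaintext: bytes) -> bytes:
    """Encrypt and authenticate a payload before sending it over the wire."""
    nonce = os.urandom(12)                # unique per message; never reused with the same key
    ciphertext = aesgcm.encrypt(nonce, plaintext, None)
    return nonce + ciphertext             # prepend the nonce so the receiver can decrypt

def decrypt_on_arrival(wire_bytes: bytes) -> bytes:
    """Verify and decrypt a payload received from the network."""
    nonce, ciphertext = wire_bytes[:12], wire_bytes[12:]
    return aesgcm.decrypt(nonce, ciphertext, None)

packet = encrypt_for_transit(b"confidential record in motion")
print(decrypt_on_arrival(packet))         # b'confidential record in motion'
```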

Encrypting data in motion

Perhaps the best-known use of cryptography for the data in transit scenario is Secure Sockets Layer (SSL) and its successor, Transport Layer Security (TLS). TLS provides an encrypted transport-layer tunnel between email servers or message transfer agents (MTAs), while SSL/TLS certificates allow private communications over the Internet to be encrypted using public and private key pairs. The ongoing management and responsibility for data in transit lies in the correct application of security controls, including the relevant cryptographic processes for encryption key management.
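
A hedged sketch of how a client establishes such a TLS tunnel is shown below, using Python's standard ssl module; the host name and port are illustrative placeholders rather than values from this article.

```python
# Minimal sketch: opening a TLS-protected connection with Python's standard library.
# "example.com" and port 443 are illustrative placeholders.
import socket
import ssl

hostname = "example.com"
context = ssl.create_default_context()    # loads system CA certificates, enables hostname checks

with socket.create_connection((hostname, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
        print("Negotiated protocol:", tls_sock.version())   # e.g. 'TLSv1.3'
        print("Cipher suite:", tls_sock.cipher())
        # Anything written to tls_sock is encrypted while in motion.
        tls_sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(tls_sock.recv(1024))
```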

Cryptographic protocols have been in use for many years in the form of Hypertext Transfer Protocol Secure (HTTPS), typically to provide communication security over the Internet, and HTTPS has now become the standard encryption approach for browser-to-web host and host-to-host communications in both cloud and non-cloud environments.
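
At the application layer, the same protection is usually obtained simply by requesting an https:// URL. The short sketch below uses Python's standard urllib (the URL is a placeholder) and relies on the library to negotiate TLS and validate the server certificate.

```python
# Minimal sketch: HTTPS at the application layer with Python's standard library.
# The URL is an illustrative placeholder.
from urllib.request import urlopen

# urlopen negotiates TLS and validates the server certificate against system CAs by default.
with urlopen("https://example.com/") as response:
    print(response.status)        # 200 if the request succeeded
    print(response.read(200))     # first bytes of the (decrypted) response body
```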

A growing number of cloud-based providers now apply multiple layers of encryption and give users the ability to encrypt their own data at rest within the cloud environment. The use of asymmetric cryptography for key exchange, followed by symmetric encryption for content confidentiality, is also increasing. This hybrid approach bolsters standard encryption levels and strengthens the encryption in use.
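
The sketch below illustrates that hybrid pattern: a fresh symmetric content key encrypts the data, and an asymmetric (RSA-OAEP) operation wraps that key for the recipient. It is a minimal sketch assuming the third-party Python "cryptography" package, not a production key-exchange design.

```python
# Minimal sketch: hybrid encryption, i.e. asymmetric key wrapping plus symmetric content encryption.
# Assumes the third-party "cryptography" package is installed.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient's key pair; in practice only the public key is shared with the sender.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: encrypt the content with a fresh AES key, then wrap that key with RSA-OAEP.
content_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(content_key).encrypt(nonce, b"data in motion", None)
wrapped_key = public_key.encrypt(
    content_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

# Recipient: unwrap the AES key with the private key, then decrypt the content.
recovered_key = private_key.decrypt(
    wrapped_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
print(AESGCM(recovered_key).decrypt(nonce, ciphertext, None))   # b'data in motion'
```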

