
Showing posts from March, 2019

Computational storage

Computational storage is an information technology (IT) architecture in which data is processed at the storage device level to reduce the amount of data that has to move between the storage plane and the compute plane. Minimizing that movement facilitates real-time data analysis and improves performance by reducing input/output bottlenecks. In many respects, a computational storage device may look just like any other solid-state drive (SSD). Some products have a large number of NAND flash memory devices that actually store the data, a controller that manages writing the data to the flash devices, and random access memory (RAM) that provides a read/write buffer. What is unique about computational storage devices is the inclusion of one or more multi-core processors. These processors can be used to perform many functions, from indexing data as it enters the storage device to searching the contents for specific entries to providing support for sophisticated arti...
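To make the push-down idea concrete, here is a minimal sketch in Python of a search offloaded to the drive versus one run on the host. The CSDevice class and its offloaded_search method are hypothetical stand-ins, not any vendor's actual API.

# A minimal sketch: filtering data on the storage device so that only
# matching records cross the I/O bus. CSDevice and offloaded_search are
# hypothetical names, not a real product interface.

class CSDevice:
    """Models an SSD whose on-board multi-core processor can scan its own NAND."""

    def __init__(self, blocks):
        self.blocks = blocks  # data as stored on the device

    def offloaded_search(self, needle):
        # Runs on the drive's embedded cores; only matches move to the host.
        return [b for b in self.blocks if needle in b]


def host_side_search(device, needle):
    # Conventional path: every block travels to the compute plane first.
    all_blocks = list(device.blocks)  # full transfer across the I/O bus
    return [b for b in all_blocks if needle in b]


drive = CSDevice(["error: disk full", "ok: write complete", "error: timeout"])
print(drive.offloaded_search("error"))  # only the two matching records move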

Content Services Platform

A content services platform (CSP) is software that enables users to create, share, collaborate on and store text, audio and video content. The CSP serves as a repository of record and a single source of truth. Content services platform is a relatively new term that is gaining acceptance as a successor to enterprise content management (ECM) software. Content services platforms for intelligent content can be installed either as a product suite or as separate applications with common application programming interfaces (APIs) and data repositories. Common capabilities, illustrated in the sketch below, include:

- The ability to store data a single time and use it for multiple purposes.
- The ability to manage and store metadata for digitized content.
- Automatic versioning that allows documents and other data to display the most recent version by default, with access to previous versions.

Many of the vendors with products in the CSP category have simply expanded their ECM products and spun them into se...
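As a rough illustration of the store-once and versioning capabilities listed above, here is a minimal Python sketch; the ContentRepository class and its methods are illustrative assumptions, not any CSP vendor's API.

# A minimal sketch of two CSP capabilities: single-instance storage with
# metadata, and automatic versioning that serves the latest version by
# default. All names here are illustrative, not a vendor API.

class ContentRepository:
    def __init__(self):
        self._versions = {}  # doc_id -> list of saved versions, latest last
        self._metadata = {}  # doc_id -> metadata for the digitized content

    def save(self, doc_id, content, **metadata):
        self._versions.setdefault(doc_id, []).append(content)
        self._metadata.setdefault(doc_id, {}).update(metadata)

    def get(self, doc_id, version=None):
        # Most recent version by default, with access to previous versions.
        history = self._versions[doc_id]
        return history[-1] if version is None else history[version]


repo = ContentRepository()
repo.save("contract-7", "Draft terms...", author="legal", format="docx")
repo.save("contract-7", "Final terms...")
print(repo.get("contract-7"))     # "Final terms..." -- latest by default
print(repo.get("contract-7", 0))  # "Draft terms..." -- a previous version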

Graphics Processing Unit (GPU)

A graphics processing unit (GPU) is a computer chip that performs rapid mathematical calculations, primarily for the purpose of rendering images. A GPU may be found integrated with a central processing unit (CPU) on the same circuit, on a graphics card or in the motherboard of a personal computer or server. In the early days of computing, the CPU performed these calculations. As more graphics-intensive applications such as AutoCAD were developed, however, their demands put a strain on the CPU and degraded performance. GPUs came about as a way to offload those tasks from CPUs, freeing up their processing power. NVIDIA, AMD, Intel and ARM are some of the major players in the GPU market.

GPU vs. CPU

A graphics processing unit is able to render images more quickly than a central processing unit because of its parallel processing architecture, which allows it to perform multiple calculations at the same time. A single CPU does not have this capability, although multi...
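The performance difference comes down to data parallelism. The Python sketch below is only an analogy -- NumPy runs on the CPU -- but it contrasts the one-element-at-a-time style of a serial processor with the bulk, same-operation-on-many-elements style that a GPU executes across thousands of cores.

# Serial vs. data-parallel mindset, illustrated with a gamma adjustment
# applied to every pixel of an HD frame. NumPy evaluates the bulk form on
# the CPU, but the access pattern is the one a GPU parallelizes natively.
import numpy as np

pixels = np.random.rand(1920 * 1080)  # brightness values for one HD frame

# One calculation after another, the way a single core works.
gamma_serial = [p ** 2.2 for p in pixels]

# One operation over the whole array at once, the way a GPU spreads the
# same calculation across its many cores.
gamma_parallel = pixels ** 2.2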

Black swan

A black swan event is an incident that occurs randomly and unexpectedly and has widespread ramifications. The event is usually followed by reflection and a flawed rationalization that it was inevitable. The phrase illustrates the frailty of inductive reasoning and the danger of making sweeping generalizations from limited observations. The term came from the idea that if a man saw a thousand swans and they were all white, he might logically conclude that all swans are white. The flaw in his logic is that even when the premises are true, the conclusion can still be false. In other words, just because the man has never seen a black swan, it does not mean they do not exist. As Dutch explorers discovered in 1697, black swans are simply outliers -- rare birds, unknown to Europeans until Willem de Vlamingh and his crew visited Australia. Statistician Nassim Nicholas Taleb uses the phrase black swan as a metaphor for how humans deal with unpredictable events in his 2007...

Data lake

A data lake is a storage repository that holds a vast amount of raw data in its native format until it is needed. While a hierarchical data warehouse stores data in files or folders, a data lake uses a flat architecture to store data. Each data element in a lake is assigned a unique identifier and tagged with a set of extended metadata. When a business question arises, the data lake can be queried for relevant data, and that smaller set of data can then be analyzed to help answer the question. The term data lake is often associated with Hadoop-oriented object storage. In such a scenario, an organization's data is first loaded into the Hadoop platform, and then business analytics and data mining tools are applied to the data where it resides on Hadoop's cluster nodes of commodity computers. Like big data, the term data lake is sometimes disparaged as being simply a marketing label for a product that supports Hadoop. Increasingly, however, the term is being accepted ...
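A minimal sketch of the pattern described above, assuming nothing beyond the Python standard library: raw objects kept in a flat namespace, each under a unique identifier with extended metadata, queried only when a business question arises. The function names are illustrative.

# Flat storage of raw data under unique identifiers, with metadata tags
# used to pull back a smaller, relevant subset on demand.
import uuid

lake = {}  # flat architecture: identifier -> (raw bytes, metadata tags)

def ingest(raw_data, **tags):
    object_id = str(uuid.uuid4())       # unique identifier per data element
    lake[object_id] = (raw_data, tags)  # stored as-is, in its native format
    return object_id

def query(**wanted):
    # Answering a business question: fetch only the elements whose tags match.
    return [data for data, tags in lake.values()
            if all(tags.get(k) == v for k, v in wanted.items())]

ingest(b"<clickstream log bytes>", source="web", year=2019)
ingest(b"<sensor readings csv>", source="iot", year=2019)
print(query(source="iot"))  # the smaller set that gets analyzed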

Nearables

Nearables are low-power transmitters that activate in the presence of a Bluetooth-enabled or near-field communication (NFC)-enabled computing device. The purpose of nearable technology is to provide indoor geolocation services and facilitate short-range communication between active sensors and compatible software applications. The term nearables was introduced by Estimote, a vendor of Bluetooth-enabled beacons and beacon stickers. Beacons are transmitters that have a tiny CPU, memory and battery. Depending upon their purpose, they may also be equipped with accelerometers, thermometers or light and humidity sensors. Beacons typically have a signal radius of about 250 feet and are so small that they can be placed almost anywhere. Estimote's beacon stickers are even smaller and thinner than previous types of beacons, but the trade-off is that they have less memory, less power and a transmission range of about 45 feet. In order for a nearable to be useful, the receiver must all...
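As a hedged sketch of how an application might react to a nearable, the Python below scans for beacon advertisements and triggers an action only within a short range; scan_for_beacons is a hypothetical stand-in for a platform's Bluetooth LE scanning API, not a real library call.

# Hypothetical proximity trigger: react to a beacon only when the device
# is close enough. scan_for_beacons stands in for a real BLE scanning API.
TRIGGER_RANGE_FT = 10  # act only within roughly 10 feet

def scan_for_beacons():
    # Placeholder results; a real app would read live advertisements here.
    return [{"id": "shelf-42", "distance_ft": 6.5},
            {"id": "entrance", "distance_ft": 180.0}]

for beacon in scan_for_beacons():
    if beacon["distance_ft"] <= TRIGGER_RANGE_FT:
        print(f"Near {beacon['id']}: show contextual content")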

Data Management as a Service

Data management as a service (DMaaS) is a type of cloud service that provides enterprises with centralized storage for disparate data sources. The "as a service" label refers to a pay-per-use business model that does not require the customer to purchase or manage infrastructure for data management. In this business model, the customer backs up data to the DMaaS service provider. This is typically done by installing agents on the data sources being backed up, although in the case of cloud data sources, a simple authentication process may be the first step. DMaaS is typically an operating expense that rises and falls with how much service the customer consumes. It is technically possible to provide DMaaS using on-premises infrastructure or a private cloud offered by the DMaaS vendor, but all infrastructure must be provided and managed by the DMaaS vendor to be considered a service. Although it may be possible to do DMaaS this way, it is prohibitive to do so for lo...
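To illustrate the consumption-based model described above, here is a minimal sketch; the per-gigabyte rate and the class and method names are assumptions for illustration only, not any provider's actual pricing or API.

# Pay-per-use in miniature: agents push backups to the provider, and the
# monthly charge tracks how much the customer actually stores.
PRICE_PER_GB_MONTH = 0.08  # hypothetical rate

class DMaaSProvider:
    def __init__(self):
        self.stored_bytes = 0

    def receive_backup(self, source, data: bytes):
        # In practice, an agent installed on `source` sends this over the network.
        self.stored_bytes += len(data)

    def monthly_bill(self):
        gigabytes = self.stored_bytes / 1e9
        return gigabytes * PRICE_PER_GB_MONTH  # opex rises and falls with usage

provider = DMaaSProvider()
provider.receive_backup("crm-db", b"x" * 50_000_000)  # ~50 MB backed up
print(f"${provider.monthly_bill():.2f}")  # no customer-managed infrastructure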

Data streaming

Data streaming is the continuous transfer of data at a steady, high-speed rate. Although the concept of data streaming is not new, its practical applications are a relatively recent development. This is because in the early years of the world wide web, internet connectivity was not always reliable and bandwidth limitations often prevented streaming data from arriving at its destination in an unbroken sequence. Developers created buffers to allow data streams to catch up, but the resulting jitter made the user experience so poor that most consumers preferred to download content rather than stream it. Today, with the advent of broadband internet, cloud computing and the internet of things (IoT), there is increased interest in analyzing data from streaming sources to make data-driven decisions in real time. To meet the need for real-time information from disparate data sources, many companies have replaced traditional batch processing with streaming data archit...
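The contrast with batch processing can be shown in a few lines of Python; the simulated sensor feed below stands in for a real streaming source such as an IoT device.

# Streaming vs. batch: process each record the moment it arrives instead of
# waiting to collect a full batch first. The sensor feed is simulated.
import random
import time

def sensor_stream(n):
    for _ in range(n):
        time.sleep(0.01)  # records arrive over time, not all at once
        yield random.uniform(18.0, 25.0)

count, total = 0, 0.0
for reading in sensor_stream(100):
    count += 1
    total += reading
    running_avg = total / count  # an up-to-date answer after every record
# A batch job would wait for all 100 readings before computing anything.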

Nintex Sign

Nintex Sign is a native electronic signature capability that is powered by Adobe Sign. Public and private sector organizations can use the Nintex Sign platform to streamline or automate paper-based workflows that require signatures, including legal documents, invoices and sales paperwork. The Nintex Sign platform is designed to meet security, privacy and regulatory compliance requirements across multiple industries. Once Nintex Sign is integrated into the Nintex Platform, users can create signature-based workflows that trigger automation before, during or after a signature event. For example, e-signatures can be added to any workflow that requires a sign-off for contract approval. Once the electronic signature request has been added, users can monitor the document status during all stages or updates. When complete, the signed document can be archived automatically to multiple endpoints.

Benefits of Nintex Sign

About a quarter of business workflows require a signature, maki...

Data deduplication

Data deduplication -- often called intelligent compression or single-instance storage -- is a process that eliminates redundant copies of data and reduces storage overhead. Data deduplication techniques ensure that only one unique instance of data is retained on storage media, such as disk, flash or tape. Redundant data blocks are replaced with a pointer to the unique data copy. In that way, data deduplication closely aligns with incremental backup, which copies only the data that has changed since the previous backup. For example, a typical email system might contain 100 instances of the same 1 megabyte (MB) file attachment. If the email platform is backed up or archived, all 100 instances are saved, requiring 100 MB of storage space. With data deduplication, only one instance of the attachment is stored; each subsequent instance is referenced back to the one saved copy. In this example, a 100 MB storage demand drops to 1 MB.

Target vs. source deduplication

Data deduplica...
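Returning to the email-attachment example, here is a minimal sketch of hash-based, block-level deduplication; the chunk size and data structures are illustrative, not any product's implementation.

# Each block is fingerprinted; only the first copy is stored, and every
# repeat is recorded as a pointer to that single saved instance.
import hashlib

CHUNK = 4096
store = {}  # fingerprint -> the one stored copy of that block

def dedupe(data: bytes):
    pointers = []
    for i in range(0, len(data), CHUNK):
        block = data[i:i + CHUNK]
        fingerprint = hashlib.sha256(block).hexdigest()
        if fingerprint not in store:
            store[fingerprint] = block  # unique instance, stored once
        pointers.append(fingerprint)    # duplicates cost only a pointer
    return pointers

attachment = b"the same 1 MB attachment " * 40_000  # ~1 MB of data
for _ in range(100):                                 # 100 email copies
    dedupe(attachment)
print(len(store))  # unique blocks for one copy, not one hundred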