Cognitive bias


Cognitive bias is a limitation in objective thinking caused by the human brain's tendency to perceive information through a filter of personal experience and preferences. This filtering relies on heuristics, mental shortcuts that let the brain prioritize and process the vast amount of input it receives each second. While the mechanism is generally effective, its shortcuts can lead to systematic errors in judgment that can be exploited.

It may not be possible to eliminate the brain's predisposition to take shortcuts entirely, but understanding that bias exists can be useful when making decisions. A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology and behavioral economics. Well-known examples include confirmation bias, anchoring bias and the availability heuristic.

Cognitive bias and its impact on data analytics

Being aware of how human bias can cloud data analysis is an important first step toward preventing it. While data analytics tools can help business executives make data-driven decisions, it is still up to humans to select which data should be analyzed. Business managers therefore need to understand that cognitive bias in data selection can cause the digital tools used in predictive analytics and prescriptive analytics to generate misleading results.
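
To make that risk concrete, here is a minimal sketch of how a biased data-selection step can skew what an analysis reports. It uses Python with NumPy and entirely synthetic data; the "customer satisfaction" scenario, the response-probability curve and the sample sizes are hypothetical and exist only for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Synthetic "population": customer satisfaction scores on a 0-100 scale.
population = rng.normal(loc=60, scale=15, size=100_000).clip(0, 100)

# Unbiased selection: a simple random sample of the population.
random_sample = rng.choice(population, size=1_000, replace=False)

# Biased selection: the analyst (perhaps unconsciously) only pulls records
# from customers who answered a follow-up survey -- and happier customers
# are assumed to be more likely to respond (a form of selection bias).
response_prob = 0.05 + 0.9 * (population / 100) ** 2
responded = rng.random(population.size) < response_prob
biased_sample = rng.choice(population[responded], size=1_000, replace=False)

print(f"True mean satisfaction:      {population.mean():.1f}")
print(f"Estimate from random sample: {random_sample.mean():.1f}")
print(f"Estimate from biased sample: {biased_sample.mean():.1f}")
```

Both samples come from the same underlying data, yet the biased selection step alone is enough to shift the reported figure upward; any predictive or prescriptive model trained on that sample would inherit the same distortion.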

Analysts have repeatedly learned the hard way about the pitfalls of deploying predictive models without examining the data selected for analysis for cognitive bias. For example, pollsters and election forecasters predicted large margins of victory for Hillary Clinton in the 2016 United States presidential election. Several types of bias compounded in those forecasts: reliance on weak polling data and flawed predictive models produced an outcome the models had not anticipated.

Popular posts from this blog

Black swan

A black swan event is an incident that occurs randomly and unexpectedly and has widespread ramifications. The event is typically followed by reflection and a flawed rationalization that it was inevitable. The phrase illustrates the frailty of inductive reasoning and the danger of making sweeping generalizations from limited observations. The term came from the idea that if a man saw a thousand swans and they were all white, he might logically conclude that all swans are white. The flaw in his logic is that even when the premises are true, the conclusion can still be false. In other words, just because the man has never seen a black swan does not mean they do not exist. As Dutch explorers discovered in 1697, black swans are simply outliers -- rare birds, unknown to Europeans until Willem de Vlamingh and his crew visited Australia. Statistician Nassim Nicholas Taleb uses the phrase black swan as a metaphor for how humans deal with unpredictable events in his 2007...

A Graphics Processing Unit (GPU)

A graphics processing unit (GPU) is a computer chip that performs rapid mathematical calculations, primarily for the purpose of rendering images. A GPU may be found integrated with a central processing unit (CPU) on the same circuit, on a graphics card or in the motherboard of a personal computer or server. In the early days of computing, the CPU performed these calculations. As more graphics-intensive applications such as AutoCAD were developed, however, their demands put a strain on the CPU and degraded performance. GPUs came about as a way to offload those tasks from CPUs, freeing up their processing power. NVIDIA, AMD, Intel and ARM are some of the major players in the GPU market. GPU vs. CPU: A graphics processing unit is able to render images more quickly than a central processing unit because of its parallel processing architecture, which allows it to perform multiple calculations at the same time. A single CPU does not have this capability, although multi...
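
The benefit of that parallel, many-core style of execution can be sketched with a small example. The snippet below (Python with NumPy, using a hypothetical "brighten the pixels" workload) contrasts an element-by-element loop with a single bulk operation over the whole array; GPU-backed array libraries such as CuPy expose the same NumPy-like bulk operations and dispatch them across thousands of GPU cores.

```python
import time
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical image-like workload: brighten 2 million pixel values.
pixels = rng.random(2_000_000).astype(np.float32)

# Serial style: one value at a time, as a single loop on one core.
start = time.perf_counter()
out_loop = np.empty_like(pixels)
for i in range(pixels.size):
    out_loop[i] = pixels[i] * 1.2 + 0.05
loop_time = time.perf_counter() - start

# Data-parallel style: one bulk operation over the whole array.
# On a GPU (e.g. via CuPy's NumPy-like API) this single call is spread
# across many cores, each handling a slice of the data.
start = time.perf_counter()
out_vec = pixels * 1.2 + 0.05
vec_time = time.perf_counter() - start

print(f"Element-by-element loop: {loop_time:.2f} s")
print(f"Bulk (data-parallel) op: {vec_time:.4f} s")
```

The absolute timings are machine-dependent; the point is that a workload expressed as one operation over many independent data elements is exactly the shape of problem a GPU's parallel architecture accelerates.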

6G (sixth-generation wireless)

6G (sixth-generation wireless) is the successor to 5G cellular technology. 6G networks will be able to use higher frequencies than 5G networks and provide substantially higher capacity and much lower latency. One of the goals of 6G is to support one-microsecond latency communications -- 1,000 times lower than, or 1/1,000th of, one-millisecond latency. The 6G technology market is expected to facilitate large improvements in the areas of imaging, presence technology and location awareness. Working in conjunction with AI, the computational infrastructure of 6G will be able to autonomously determine the best location for computing to occur; this includes decisions about data storage, processing and sharing.  Advantages of 6G over 5G: 6G is expected to support 1 terabit per second (Tbps) speeds. This level of capacity and latency will be unprecedented and wi...