Graphics processing unit (GPU)

A graphics processing unit (GPU) is a computer chip that performs rapid mathematical calculations, primarily for the purpose of rendering images. In the early days of computing, the central processing unit (CPU) performed these calculations. As more graphics-intensive applications such as AutoCAD were developed, however, their demands put strain on the CPU and degraded performance. GPUs came about as a way to offload those tasks from CPUs and free up processing power.

Today, graphics chips are being adapted to share the work of CPUs and train deep neural networks for AI applications. A GPU may be found integrated with a CPU on the same circuit, on a graphics card or in the motherboard of a personal computer or server. NVIDIA, AMD, Intel and ARM are some of the major players in the GPU market.  

GPU vs. CPU

A GPU can render images more quickly than a CPU because of its parallel processing architecture, which allows it to perform many calculations at the same time. A single CPU core does not have this capability, although multicore processors can perform calculations in parallel by combining more than one core onto the same chip.
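Rendering suits parallel hardware because each pixel's result depends only on that pixel's input. A minimal Python sketch of this independence (the `shade` function and the sample pixel values are hypothetical, for illustration only):

```python
def shade(pixel):
    # Per-pixel work: each result depends only on its own input,
    # so every pixel could be computed simultaneously on a GPU.
    r, g, b = pixel
    return (r // 2, g // 2, b // 2)  # darken each channel by half

image = [(200, 100, 50), (10, 20, 30), (254, 254, 254)]

# A CPU core walks this loop one pixel at a time; a GPU would launch
# one lightweight thread per pixel and run them all in parallel.
darkened = [shade(p) for p in image]
print(darkened)  # [(100, 50, 25), (5, 10, 15), (127, 127, 127)]
```

Because no pixel depends on any other, the loop can be split across thousands of GPU threads without coordination.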

A CPU also has a higher clock speed, meaning it can perform an individual calculation faster than a GPU, so it is often better equipped to handle basic, largely serial computing tasks.

In general, a GPU is designed for data parallelism: applying the same operation to many data items at once (single instruction, multiple data, or SIMD). A CPU is designed for task parallelism: carrying out different operations concurrently.
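The two styles can be contrasted in a short Python sketch using the standard library's thread pool (the sample data and operations are illustrative assumptions, not part of any real GPU API):

```python
from concurrent.futures import ThreadPoolExecutor

data = [1, 2, 3, 4]

# Data parallelism (GPU-style, SIMD): the SAME operation is mapped
# over every element of the data.
with ThreadPoolExecutor() as pool:
    doubled = list(pool.map(lambda x: x * 2, data))

# Task parallelism (CPU-style): DIFFERENT operations run side by side,
# here a sum and a max over the same data.
with ThreadPoolExecutor() as pool:
    total_future = pool.submit(sum, data)
    largest_future = pool.submit(max, data)
    results = (total_future.result(), largest_future.result())

print(doubled)   # [2, 4, 6, 8]
print(results)   # (10, 4)
```

A GPU hard-wires the first pattern into its architecture; a CPU's independent cores are built for the second.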

How a GPU works

CPU and GPU architectures are also differentiated by the number of cores. A core is essentially a processor within the processor. Most CPUs have between four and eight cores, though some have up to 32. Each core can process its own tasks, or threads. Because some processors support multithreading -- in which a core is divided virtually, allowing a single core to process two threads -- the number of threads can be much higher than the number of cores, which is useful in workloads such as video editing and transcoding. CPUs typically run two threads (independent instruction streams) per core, while GPU cores can each handle four to 10 threads.
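The core and thread counts above multiply out to very different totals. A quick sketch (the specific core counts are illustrative assumptions in line with the ranges mentioned, not measurements of any particular chip):

```python
def hardware_threads(cores, threads_per_core):
    # Total concurrent hardware threads = cores x threads per core.
    return cores * threads_per_core

# An 8-core CPU with 2 threads per core (multithreading enabled).
cpu_threads = hardware_threads(cores=8, threads_per_core=2)

# A hypothetical GPU with 2,048 small cores at 4 threads per core.
gpu_threads = hardware_threads(cores=2048, threads_per_core=4)

print(cpu_threads)  # 16
print(gpu_threads)  # 8192
```

Even with modest per-core multithreading, the sheer number of GPU cores yields thousands of concurrent threads versus a CPU's dozens.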

 
