
.NET Framework


.NET Framework is a managed execution environment for Windows that lets software developers write an app in one programming language with the assurance that it can interoperate with code written in other languages. The framework is designed to accommodate object code no matter where it is stored or executed.

The .NET Framework is the predominant implementation of Microsoft's .NET technologies. The framework features a common language runtime (CLR) and a class library. The CLR is Microsoft's implementation of the common language infrastructure (CLI), a standard for helping different programming languages and libraries work together. The CLR manages system services such as memory, thread execution, code execution, code safety verification and compilation. The class library contains tested, reusable code that developers can call from their own apps to provide functionality for such things as file input/output, parsing XML and working with Windows Forms. 
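The class-library point can be illustrated with a short C# sketch that uses two of the areas mentioned above, file input/output and XML parsing (the file name and XML content are invented for the example):

```csharp
using System;
using System.IO;
using System.Xml;

class ClassLibraryDemo
{
    static void Main()
    {
        // File I/O via the System.IO namespace of the class library.
        string path = Path.Combine(Path.GetTempPath(), "demo.xml");
        File.WriteAllText(path, "<config><setting name=\"mode\">debug</setting></config>");

        // XML parsing via the System.Xml namespace.
        var doc = new XmlDocument();
        doc.Load(path);
        XmlNode node = doc.SelectSingleNode("/config/setting");
        Console.WriteLine(node.Attributes["name"].Value); // prints "mode"

        File.Delete(path);
    }
}
```

Every call here (File.WriteAllText, XmlDocument.Load, SelectSingleNode) is tested, reusable class-library code; the developer writes only the glue.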

Microsoft's development tool for designing and developing .NET apps is called Visual Studio, and apps are typically written in Visual Basic (VB), C# or F#. Microsoft's unit testing framework, MSTest, can be used to provide quality assurance (QA) for .NET applications.
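As a minimal sketch of what an MSTest unit test looks like (the class, method and the arithmetic under test are invented for illustration):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CalculatorTests
{
    [TestMethod]
    public void Add_TwoNumbers_ReturnsSum()
    {
        // Stand-in for a call into the code under test.
        int result = 2 + 3;
        Assert.AreEqual(5, result);
    }
}
```

The [TestClass] and [TestMethod] attributes mark the code for the test runner, which Visual Studio discovers and executes automatically.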

How the .NET Framework works

Source code written in one language is compiled into an intermediate language (IL), which is stored on disk in an executable file called an assembly. The assembly contains a manifest that describes the assembly's types, version and security requirements. Once the assembly is loaded into the CLR and validated, the IL code is translated into native machine instructions.
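The manifest metadata is visible from managed code itself. A short C# sketch, using the System.Reflection API, reads the name, version and defined types of the running assembly:

```csharp
using System;
using System.Reflection;

class ManifestDemo
{
    static void Main()
    {
        // Load the currently executing assembly and read its manifest metadata.
        Assembly asm = Assembly.GetExecutingAssembly();
        AssemblyName name = asm.GetName();
        Console.WriteLine($"Name:    {name.Name}");
        Console.WriteLine($"Version: {name.Version}");

        // Every type defined in the assembly is described in its metadata.
        foreach (Type t in asm.GetTypes())
            Console.WriteLine($"Type:    {t.FullName}");
    }
}
```

The same metadata can also be inspected from outside with tools such as ildasm.exe, which ships with the .NET Framework SDK.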

Language compilers for the .NET Framework emit Common Intermediate Language (CIL), the standardized form of this intermediate code, which the common language runtime compiles to native code at runtime via its just-in-time (JIT) compiler. The .NET Framework also helps resolve version conflicts because it allows multiple versions of the CLR to exist side by side on the same computer.
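An app declares which runtime versions it supports in its configuration file. A sketch of such an app.config fragment (the version and sku values shown are just examples):

```xml
<configuration>
  <startup>
    <!-- Preferred runtime: CLR 4, .NET Framework 4.8 -->
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.8" />
    <!-- Fallback: CLR 2 if CLR 4 is not installed -->
    <supportedRuntime version="v2.0.50727" />
  </startup>
</configuration>
```

At load time, the runtime host picks the first listed CLR version that is installed, which is what allows differently targeted apps to coexist on one machine.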
