
Service-oriented Architecture (SOA)


Service-oriented architecture (SOA) is a software development model for distributed application components that incorporates discovery, access control, data mapping and security features. SOA has three major objectives, each of which focuses on a different part of the application lifecycle.

SOA is a broad architectural model that defines the goals of an application, as well as the approaches that will be used to meet those goals. Developers must then work out the specific implementation details, usually tied to the formal Web Services Description Language (WSDL) and Simple Object Access Protocol (SOAP) specifications.

The emergence of SOA

For decades, software development has relied on modular functional elements that perform a specific job in multiple places within an application. As application integration and component-sharing operations became linked to pools of hosting resources and distributed databases, enterprises needed a way to adapt their procedure-based development model to the use of remote, distributed components. Simple models like the remote procedure call (RPC) were a step in the right direction, but RPC lacked the security and data-independence features needed for truly open and distributed operations.

The solution to this problem was to redefine the old operation model into a broader and more clearly architected collection of services that could be provided to an application using fully distributed software components. The architecture that wrapped these services in mechanisms to support open use under full security and governance was called the service-oriented architecture, or SOA. SOA was introduced in the late 1990s as a set of principles or requirements; within a decade, there were several suitable implementations.

WS and REST models

Initially, SOA implementations were based on the RPC and object-broker technologies available around 2000. But SOA quickly split into two camps. The first is the web services (WS) camp, which represents highly architected and formalized management of remote procedures and components. The second is the representational state transfer (REST) camp, which represents the use of internet technology to access remotely hosted components of applications.

The WS model of SOA uses WSDL to connect interfaces with services and SOAP to define procedure or component APIs. WS principles were used to link applications via an enterprise service bus (ESB), which helped businesses integrate their applications, ensure efficiency and improve data governance.
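To make the mechanics concrete, the sketch below shows roughly what a WS-style call looks like on the wire. The endpoint, XML namespace and GetLastTradePrice operation are hypothetical stand-ins for whatever a real service's WSDL would declare; in practice, a toolkit such as Python's zeep library generates the envelope and client stubs directly from the WSDL.

# A minimal sketch of a raw SOAP request, assuming a hypothetical
# stock-quote service; real envelopes are generated from the WSDL.
import requests

SOAP_ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetLastTradePrice xmlns="http://example.com/stockquote">
      <tickerSymbol>IBM</tickerSymbol>
    </GetLastTradePrice>
  </soap:Body>
</soap:Envelope>"""

response = requests.post(
    "http://example.com/soap",              # service endpoint named in the WSDL
    data=SOAP_ENVELOPE.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://example.com/stockquote/GetLastTradePrice",
    },
)
print(response.text)                        # the reply arrives as another XML envelope

Even this trivial call carries an XML envelope, a namespace declaration and a separate SOAPAction header, which hints at the overhead discussed below.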

A whole series of WS standards were developed and promoted by industry giants, such as IBM and Microsoft. These standards offered a secure and flexible way to divide software into a series of distributed pieces. However, the model was difficult to use and often introduced considerable overhead into the workflows that passed between components of an application.

The WS model of SOA never reached the adoption levels that advocates had predicted; in fact, it collided with another model of remote components based on the internet: REST. RESTful application programming interfaces (APIs) offered low overhead and were easy to understand. As the internet integrated more with applications, RESTful APIs were seen as the future.
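The difference in overhead is easiest to see side by side. Here is a minimal sketch of the same lookup as a RESTful call, assuming a hypothetical /api/quotes resource that returns JSON; the URL and field names are illustrative, not from any real service.

# A minimal sketch of the equivalent RESTful call.
import requests

response = requests.get("https://example.com/api/quotes/IBM")
quote = response.json()                     # plain JSON, no envelope to unwrap
print(quote.get("lastTradePrice"))

The entire interaction is a single HTTP GET against a resource URL; there is no envelope, no WSDL and no separate action header to manage.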

SOA and microservices

The tension between SOA as a set of principles and SOA as a specific software implementation came to a head with the rise of virtualization and cloud computing. The combination of virtualization and cloud encourages software developers to build applications from smaller functional components. Microservices, one of the most important current software trends, represent the culmination of that development model. Because more components mean more interfaces and more complicated software design, the trend exposed the complexity and performance faults of most SOA implementations.

Microservice-based software architectures are, in effect, modernized implementations of the SOA model. The software components are developed as services to be exposed via APIs, just as SOA requires. An API broker mediates access to the components and ensures that security and governance practices are followed. It also provides the mechanisms needed to match the diverse I/O formats of microservices to the applications that use them.
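As an illustration, the sketch below shows one such component, assuming Flask as the web framework and an in-memory table standing in for a real data store; in a deployed system, an API gateway or broker would sit in front of it to enforce the security and governance checks described above.

# A minimal sketch of a microservice exposing one capability as an API.
# Flask and the in-memory QUOTES table are assumptions for illustration.
from flask import Flask, jsonify

app = Flask(__name__)

QUOTES = {"IBM": 142.50, "MSFT": 410.20}    # stand-in for the service's own database

@app.route("/quotes/<symbol>")
def get_quote(symbol):
    price = QUOTES.get(symbol.upper())
    if price is None:
        return jsonify({"error": "unknown symbol"}), 404
    return jsonify({"symbol": symbol.upper(), "lastTradePrice": price})

if __name__ == "__main__":
    app.run(port=5000)                      # the API broker or gateway routes client calls here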

But SOA is as valid today as it was when it was first conceived. SOA principles have taken us to the cloud and are supporting the most advanced cloud software development techniques in use today.

