Posts

Showing posts from January, 2021

Process Mining Software

  Process mining software analyses the data in enterprise application event logs to learn how business processes are actually working. The goal of process mining software is to identify bottlenecks and other areas of inefficiency so they can be improved. Process mining software is especially useful for optimizing workflow in process-oriented disciplines such as business process reengineering (BPR) and business process management (BPM). Vendors of process mining software claim the technology can analyse millions of transaction records and spot deviations from normal workflows that might indicate increased risk. If the software is used to analyse the transaction logs of an ERP or CRM system or the audit logs of a workflow management system, for example, data visualization components in the software can show users what processes are running at any given time. Some process mining software a...
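To make the core idea concrete, here is a minimal sketch of the kind of event-log analysis described above. It is a hypothetical example, not any vendor's product: it assumes a CSV event log with case_id, activity and timestamp columns, and flags the activities with the longest average wait as candidate bottlenecks.

```python
# Hypothetical event-log analysis: rank activities by the average time
# that elapses before the next event in the same case.
import pandas as pd

def find_bottlenecks(log_path: str, top_n: int = 5) -> pd.DataFrame:
    log = pd.read_csv(log_path, parse_dates=["timestamp"])
    log = log.sort_values(["case_id", "timestamp"])

    # Duration of each step = time until the next event in the same case.
    log["next_ts"] = log.groupby("case_id")["timestamp"].shift(-1)
    log["duration"] = (log["next_ts"] - log["timestamp"]).dt.total_seconds()

    # Activities with the longest average duration are bottleneck candidates.
    stats = (log.dropna(subset=["duration"])
                .groupby("activity")["duration"]
                .agg(["mean", "count"])
                .sort_values("mean", ascending=False))
    return stats.head(top_n)

if __name__ == "__main__":
    print(find_bottlenecks("event_log.csv"))
```

Commercial tools go much further (process discovery, conformance checking, visualization), but the starting point is the same: timestamps grouped by case.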

Memory Leak

  A memory leak is the gradual loss of available computer memory when a program (an application or part of the operating system) repeatedly fails to return memory that it has obtained for temporary use. As a result, the available memory for that application or that part of the operating system becomes exhausted and the program can no longer function. For a program that is frequently opened or called or that runs continuously, even a very small memory leak can eventually cause the program or the system to terminate. A memory leak is the result of a program bug. Some operating systems provide memory leak detection so that a problem can be detected before an application or the operating system crashes. Some program development tools also provide automatic "housekeeping" for the developer. It is always best programming practice to release memory and delete any temporary files once the program no longer needs them.
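As a simplified illustration, the sketch below shows a common leak pattern in a garbage-collected language such as Python: nothing fails to call free(), but a module-level cache keeps every object reachable forever, so the memory can never be reclaimed. The cache and handler names are hypothetical.

```python
# Leaky pattern: a module-level cache that only ever grows. In a
# long-running process, every payload stays reachable for the life of
# the program, so memory use climbs until the process is terminated.
_results_cache = {}

def handle_request(request_id: str, payload: bytes) -> int:
    _results_cache[request_id] = payload   # keeps every payload alive
    return len(payload)                    # stand-in for real work

# One fix: bound the cache so old entries become unreachable and the
# garbage collector can reclaim them.
from collections import OrderedDict

_bounded_cache = OrderedDict()
_MAX_ENTRIES = 1000

def handle_request_fixed(request_id: str, payload: bytes) -> int:
    _bounded_cache[request_id] = payload
    if len(_bounded_cache) > _MAX_ENTRIES:
        _bounded_cache.popitem(last=False)  # evict the oldest entry
    return len(payload)
```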

Cloud Infrastructure

  Cloud infrastructure refers to the hardware and software components -- such as servers, storage, a network, virtualization software, services and management tools -- that support the computing requirements of a cloud computing model. Cloud infrastructure also includes an abstraction layer that virtualizes resources and logically presents them to users through application programming interfaces and API-enabled command-line or graphical interfaces. In cloud computing, these virtualized resources are hosted by a service provider or IT department and are delivered to users over a network or the internet. These resources include virtual machines and components such as servers, memory, network switches, firewalls, load balancers and storage.

Cloud infrastructure components

In a cloud computing architecture, cloud infrastructure refers to the back-end hardware elements found within most enterprise data centers, but on a much greater scale. These include ...
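As a small illustration of that API-driven delivery, the sketch below requests a virtual machine from a provider's API -- here AWS EC2 via boto3, purely as one example. The AMI ID and region are placeholders, and credentials are assumed to be configured in the environment.

```python
# Minimal sketch: provisioning a virtual machine through a cloud
# provider's API (AWS EC2 via boto3). Placeholder AMI ID and region.
import boto3

def provision_vm(ami_id: str = "ami-00000000000000000",
                 instance_type: str = "t3.micro") -> str:
    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=1,
        MaxCount=1,
    )
    # The provider hands back metadata about the virtualized resources
    # it allocated on its back-end hardware.
    return response["Instances"][0]["InstanceId"]
```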

Hybrid Application (hybrid app)

  A hybrid application (hybrid app) is a software application that combines elements of both native and web applications. Native apps are developed for specific mobile platforms and devices. They must be downloaded from an app store and installed locally before they can be used. A disadvantage of native apps is that they require developers to write multiple versions of the same app in order to accommodate each platform. An advantage of native apps is that because they are installed locally, they can take advantage of whatever capabilities the mobile platform provides -- including access to the mobile device's camera, GPS or accelerometer. In contrast, web applications are simply websites that have been optimized for mobile device use. Web apps are accessed through a browser instead of being downloaded and installed locally. An advantage of web apps is that they are platform agnostic. A disadvantage is that a web app is restricted to whatever capabilities the ...

Shared Responsibility Model

  A shared responsibility model is a cloud security framework that dictates the security obligations of a cloud computing provider and its users to ensure accountability. When an enterprise runs and manages its own IT infrastructure on premises, within its own data center, it is responsible for the security of that infrastructure, as well as the applications and data that run on it. When an organization moves to a public cloud computing model, it hands off some, but not all, of these IT security responsibilities to its cloud provider. Each party -- the cloud provider and cloud user -- is accountable for different aspects of security and must work together to ensure full coverage. The type of cloud service model -- infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS) -- dictates who is responsible for which security tasks. According to the Cloud Standards Customer Council (CSCC), an advocac...

Data Science

  Data science is the field of applying advanced analytics techniques and scientific principles to extract valuable information from data for business decision-making, strategic planning, and other uses. It's increasingly critical to businesses: The insights that data science generates help organizations increase operational efficiency, identify new business opportunities, and improve marketing and sales programs, among other benefits. Ultimately, they can lead to competitive advantages over business rivals. Data science incorporates various disciplines -- for example, data engineering, data preparation, data mining, predictive analytics, machine learning and data visualization, as well as statistics, mathematics, and software programming. It's primarily done by skilled data scientists, although lower-level data analysts may also be involved. In addition, many organizations now rely partly on citizen data scientists, a group that can include business intell...
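To ground the predictive analytics and machine learning pieces mentioned above, here is a tiny end-to-end sketch: prepare data, fit a model, evaluate it. It uses scikit-learn's bundled iris dataset purely as a stand-in for real business data.

```python
# A miniature "predict and evaluate" loop: the machine learning slice
# of a data science workflow, using a toy dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Checking accuracy on held-out data stands in for validating that an
# extracted insight generalizes before it informs a business decision.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```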

Penetration Testing

Penetration testing, also called pen testing or ethical hacking, is the practice of testing a computer system, network or web application to find security vulnerabilities that an attacker could exploit. Penetration testing can be automated with software applications or performed manually. Either way, the process involves gathering information about the target before the test, identifying possible entry points, attempting to break in -- either virtually or for real -- and reporting back the findings.  The main objective of penetration testing is to identify security weaknesses. Penetration testing can also be used to test an organization's security policy, its adherence to compliance requirements, its employees' security awareness and the organization's ability to identify and respond to security incidents. Typically, the information about security weaknesses that are identified or exploited through pen testing is aggregated and provided to the organization's IT a...
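As a very small taste of the "identifying possible entry points" step, the sketch below probes a handful of common TCP ports on a target host using only the Python standard library. It is a fragment, not a full pen test, and should only ever be run against systems you are explicitly authorized to test.

```python
# Probe a few common ports to spot possible entry points.
import socket

COMMON_PORTS = {22: "ssh", 80: "http", 443: "https", 3389: "rdp"}

def scan(host: str, timeout: float = 0.5) -> dict:
    open_ports = {}
    for port in COMMON_PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the TCP connection succeeds.
            open_ports[port] = sock.connect_ex((host, port)) == 0
    return open_ports

if __name__ == "__main__":
    for port, is_open in scan("127.0.0.1").items():
        state = "open" if is_open else "closed/filtered"
        print(f"{port}/{COMMON_PORTS[port]}: {state}")
```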