- IT Organizational Structure
- Evaluating Hardware Acquisition, Installation, and Maintenance
- Evaluating Systems Software Development, Acquisition, Implementation, and Maintenance
- Evaluating Network Infrastructure Acquisition, Installation, and Maintenance
- The TCP/IP Protocol Suite
- Routers
- Internet, Intranet, and Extranet
- Evaluating IS Operational Practices
- Evaluating the Use of System Performance and Monitoring Processes, Tools, and Techniques
- Exam Prep Questions
Evaluating Systems Software Development, Acquisition, Implementation, and Maintenance
The development, acquisition, implementation, and maintenance of software and hardware support key business practices. The IS auditor must ensure that the organization has controls in place to manage these assets effectively and efficiently. A clear understanding of the technology and its operational characteristics is critical to the IS auditor's review.
Understanding Systems Software and Utilities Functionality
We have discussed both hardware and network devices and their important role in today's IT infrastructure. The hardware can be considered a means to an end; the software is responsible for delivering services and information. A majority of organizations today use client/server software, which enables applications and data to be spread across a number of systems and to serve a variety of operating systems. The advantage of client/server computing is that clients can request data, processing, and services from servers both internal and external to the organization. The servers do the bulk of the work. This technology supports a variety of clients, from workstations to personal digital assistants (PDAs). Client/server computing also enables centralized control of the organization's resources, from access control to continuity.
One level above the hardware we have already discussed is firmware. This type of "software" is generally contained on a chip within a hardware component (motherboard, video card, modem, and so on). The operating system runs at the next level, on top of the hardware and firmware, and is the nucleus of the IT infrastructure. The operating system contains programs that interface among the user, processor, and application software. Operating systems can be considered the "heart" of the software system. They allow the sharing of CPU processing, memory, application access, data access, data storage, and data processing, and they ensure the integrity of the system. The software that is developed for the computer must be compatible with the operating system.
Within the operating system are functions, utilities, and services that control the use and sharing of computer resources. These are the basic functions for the operating system:
- Defining user interfaces
- Enabling user access to hardware, data, file systems, and applications
- Managing the scheduling of resources among users
Along with the never-ending demand for scalability, interoperability, and performance, most operating systems today have parameters that can be customized to fit the organization. These parameters enable administrators to align many different types of functional systems with the performance and security needs of the organization. The selection of parameter settings should be aligned with the organization's workload and control structure. After the parameters are configured, they should be continually monitored to ensure that they do not result in errors, data corruption, unauthorized access, or a degradation of service.
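To make parameter monitoring concrete, the following is a minimal sketch, assuming a Linux host whose kernel parameters are exposed through the sysctl utility; the baseline values shown are illustrative examples, not recommended settings:

```python
# Minimal sketch: audit Linux kernel parameters against an expected baseline.
# Assumes a Linux host with the sysctl utility; the baseline values below are
# illustrative, not recommendations.
import subprocess

BASELINE = {
    "net.ipv4.ip_forward": "0",          # hypothetical expected value
    "kernel.randomize_va_space": "2",    # hypothetical expected value
}

def current_value(param: str) -> str:
    """Read the live value of a kernel parameter via sysctl."""
    return subprocess.run(
        ["sysctl", "-n", param], capture_output=True, text=True, check=True
    ).stdout.strip()

for param, expected in BASELINE.items():
    actual = current_value(param)
    status = "OK" if actual == expected else "DEVIATION"
    print(f"{param}: expected={expected} actual={actual} [{status}]")
```

In practice, the baseline would be derived from the organization's documented workload and control structure and reviewed whenever that structure changes.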
In a client/server model, the server handles the processing of data, and security functions are shared by the workstation and the server. The server responds to requests from other computers that are running independently on the network. An example of client/server computing is accessing the Internet via a web browser. An independent machine (workstation) requests data (web pages) from a web server; the web server processes the request, correlates the requested data, and returns it to the requester. The web server might contain static pages (HTML documents), or it might build pages from dynamic data held in a database-management system (DBMS). In this scenario, the server processes multiple requests, manages the processing, allocates memory, and handles the authentication and authorization activities associated with each request. The client/server model enables the integration of applications and data resources.
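The client side of this exchange can be sketched in a few lines of Python; the URL is a placeholder:

```python
# Minimal sketch of the client side of the web example above: a workstation
# (client) requests a page from a web server and receives the response.
from urllib.request import urlopen

with urlopen("http://example.com/") as response:  # client sends the request
    status = response.status                      # server's response code
    page = response.read()                        # requested data (HTML)

print(f"Server answered with status {status} and {len(page)} bytes of HTML")
```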
Client/server architectures differ depending on the needs of the organization. An additional component of client/server computing is middleware. Middleware provides integration between otherwise distinct applications. As an example, IT organizations that have legacy applications (mainframe, non–client/server, and so on) can implement web-based front ends that incorporate the application and business logic in a central access point. The web server and its applications (Java servlets, VBScript, and so on) incorporate the business logic and create requests to the legacy systems for the requested data. In this scenario, the web "front end" acts as middleware between the users and the legacy systems. This type of implementation is useful when multiple legacy systems contain data that is not integrated. The middleware can respond to requests, correlate the data from multiple legacy applications (accounting, sales, and so on), and present it to the client.
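The following is a minimal Python sketch of this correlation pattern; the legacy lookups are hypothetical stand-ins for real mainframe or non–client/server interfaces:

```python
# Minimal sketch of middleware correlating data from two legacy systems.
# The lookup functions are hypothetical stand-ins for real legacy interfaces.
def accounting_lookup(customer_id: str) -> dict:
    """Stand-in for a legacy accounting system query."""
    return {"customer_id": customer_id, "balance": 1250.00}

def sales_lookup(customer_id: str) -> dict:
    """Stand-in for a legacy sales system query."""
    return {"customer_id": customer_id, "last_order": "2004-11-02"}

def middleware_customer_view(customer_id: str) -> dict:
    """Correlate data from both legacy sources into one response for the client."""
    record = accounting_lookup(customer_id)
    record.update(sales_lookup(customer_id))
    return record

print(middleware_customer_view("C-1001"))
```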
Middleware is commonly used to provide the following functionality:
- Transaction-processing (TP) monitors: Applications or programs that monitor and process database transactions.
- Remote procedure calls (RPC): A function call in client/server computing that enables a client to request that a particular function or set of functions be performed on a remote computer (see the sketch following this list).
- Messaging services: User requests (messages) can be prioritized, queued, and processed on remote servers.
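As an illustration of the RPC pattern, the following minimal sketch uses Python's standard-library XML-RPC modules; the server runs in a background thread only so that the example is self-contained, whereas in practice it would be a separate machine:

```python
# Minimal RPC sketch: the client asks a "remote" server to execute a function
# and receives the result. Standard-library XML-RPC is used for illustration.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(lambda a, b: a + b, "add")  # function clients may call
threading.Thread(target=server.serve_forever, daemon=True).start()

client = ServerProxy("http://localhost:8000")
print(client.add(2, 3))  # executed on the server; result returned to the client
```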
As an IS auditor, you should assess the use of controls that ensure the confidentiality, integrity, and availability of client/server networks. The development and implementation of client/server applications and middleware should include proper change control and testing of modifications and should ensure that version control is maintained. A lack of proper controls over authentication, authorization, and data across multiple platforms could result in the loss of data or program integrity.
Organizations use more data today in decision making, customer support, sales, and account management. Data is the lifeblood of any organization. With the high volume of change, transaction processing, and access, it is important to maintain the confidentiality, availability, and integrity of data according to the organization's business requirements. A DBMS is used to store and maintain data and enforce data integrity, as well as provide the capability to convert data into information through the use of relationships and high-availability access. The primary functions of the DBMS are to reduce data redundancy, decrease access time, and provide security over sensitive data (records, fields, and transactions). A DBMS can be pictured as a container that stores data; within it are multiple smaller containers of logically related data. Figure 3.1 shows a DBMS relational structure for an asset-management system.
Figure 3.1 Relational database structure.
In Figure 3.1, the location (LocationID) of the asset is related to both the point of contact (POC) and the asset itself. Relational databases use rows (tuples, equivalent to records) and columns (domains or attributes, which correspond to fields).
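A minimal sketch of this structure in SQLite follows; field names beyond LocationID and POC are illustrative assumptions, since they are not spelled out here:

```python
# Minimal sketch of the Figure 3.1 structure: the location is related to both
# the point of contact and the asset. Field names beyond LocationID and POC
# are illustrative assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE Location (LocationID INTEGER PRIMARY KEY, Building TEXT);
    CREATE TABLE POC (
        POCID INTEGER PRIMARY KEY,
        Name TEXT,
        LocationID INTEGER REFERENCES Location(LocationID)
    );
    CREATE TABLE Asset (
        AssetID INTEGER PRIMARY KEY,
        Description TEXT,
        LocationID INTEGER REFERENCES Location(LocationID)
    );
""")
db.execute("INSERT INTO Location VALUES (1, 'HQ')")
db.execute("INSERT INTO POC VALUES (1, 'J. Smith', 1)")
db.execute("INSERT INTO Asset VALUES (1, 'Laptop', 1)")

# Each result row is a tuple (record) of related attributes (fields):
for row in db.execute("""
    SELECT Asset.Description, POC.Name, Location.Building
    FROM Asset
    JOIN Location ON Asset.LocationID = Location.LocationID
    JOIN POC ON POC.LocationID = Location.LocationID
"""):
    print(row)  # ('Laptop', 'J. Smith', 'HQ')
```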
A DBMS should include a data dictionary that identifies the data elements (fields), their characteristics, and their use. Data dictionaries are used to identify all fields and field types in the DBMS to assist with application development and processing. The data dictionary should contain an index and description of all the items stored in the database.
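A data-dictionary-style listing can often be generated from the DBMS's own metadata. The following minimal sketch does this in SQLite; the sample table is illustrative:

```python
# Minimal sketch of a data-dictionary listing: for each table, report its
# fields and field types using the DBMS's system metadata.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE Asset (AssetID INTEGER PRIMARY KEY, Description TEXT)")

tables = [r[0] for r in db.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
for table in tables:
    for cid, name, col_type, notnull, default, pk in db.execute(
            f"PRAGMA table_info({table})"):
        print(f"{table}.{name}: type={col_type} primary_key={bool(pk)}")
```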
Three basic database models exist: hierarchical, network, and relational. A hierarchical database model establishes a parent-child relationship between tables (entities). It is difficult to manage relationships in this model when children need to relate to more than one parent; this can lead to data redundancy. In the network database model, children can relate to more than one parent. This can lead to complexity in relationships, making the model difficult to understand, modify, and recover in the event of a failure. The relational database model separates the data from the database structure, allowing for flexibility in implementing, understanding, and modifying the database. The relational structure enables new relationships to be built based on business needs.
The key feature of relational databases is normalization, which structures data to minimize duplication and inconsistencies. Normalization rules include these:
- Each field in a table should represent unique information.
- Each table should have a primary key.
- You must be able to make changes to the data (other than the primary key) without affecting other fields.
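The following minimal sketch contrasts a denormalized table with the normalized split these rules imply; table and field names are illustrative:

```python
# Minimal sketch of normalization: vendor details repeated on every asset row
# (duplication) versus a normalized split where each fact is stored once.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    -- Unnormalized: vendor phone is duplicated on every asset row from that
    -- vendor, so a phone change must touch many rows.
    CREATE TABLE AssetFlat (AssetID INTEGER PRIMARY KEY,
                            VendorName TEXT, VendorPhone TEXT);

    -- Normalized: each table has a primary key, each field holds unique
    -- information, and vendor data can change without touching Asset rows.
    CREATE TABLE Vendor (VendorID INTEGER PRIMARY KEY,
                         Name TEXT, Phone TEXT);
    CREATE TABLE Asset (AssetID INTEGER PRIMARY KEY,
                        VendorID INTEGER REFERENCES Vendor(VendorID));
""")
db.execute("INSERT INTO Vendor VALUES (1, 'Acme', '555-0100')")
db.execute("INSERT INTO Asset VALUES (1, 1)")
db.execute("UPDATE Vendor SET Phone = '555-0199' WHERE VendorID = 1")  # one row
```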
Users access databases through a directory system that describes the location of data and the access method. This directory system uses the data dictionary, which, as noted earlier, contains an index and description of all the items stored in the database.
In a transaction-processing database, all data transactions, including updates, creations, and deletions, are logged to a transaction log. When users update the database, the data contained in the update is written first to the transaction log and then to the database. The purpose of the transaction log is to hold transactions for a short period of time, until the database software is ready to commit them to the database; this ensures that the database is ready to accept the entire transaction before it is applied. In environments with high volumes of transactions, records are locked while transactions are committed (concurrency control) to enable the transactions to complete. Concurrency controls prevent integrity problems when two processes attempt to update the same data at the same time. The database software checks the log periodically and then commits all transactions contained in the log since the last commit. Atomicity is the property that ensures data integrity by requiring that a transaction be completed in its entirety or not at all.
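A minimal sketch of atomicity, using SQLite's transaction support (the table and values are illustrative):

```python
# Minimal sketch of atomicity: both updates inside the transaction are
# committed together, or neither is applied.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE Account (AccountID INTEGER PRIMARY KEY, Balance REAL)")
db.executemany("INSERT INTO Account VALUES (?, ?)", [(1, 100.0), (2, 0.0)])
db.commit()

def transfer(amount: float, fail: bool = False) -> None:
    """Move funds between accounts as one atomic transaction."""
    with db:  # commits on success, rolls back automatically on any exception
        db.execute("UPDATE Account SET Balance = Balance - ? WHERE AccountID = 1",
                   (amount,))
        if fail:
            raise RuntimeError("simulated failure mid-transaction")
        db.execute("UPDATE Account SET Balance = Balance + ? WHERE AccountID = 2",
                   (amount,))

try:
    transfer(50.0, fail=True)  # fails after the debit, before the credit
except RuntimeError:
    pass

print(list(db.execute("SELECT * FROM Account")))  # unchanged: rolled back
```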
Risks and Controls Related to System Software and Utilities
The software (operating systems, applications, database-management systems, and utilities) must meet the needs of the organization. The challenge facing a majority of IT organizations today is the wide variety of software products and their acquisition, implementation, maintenance, and integration. Organizational software is used to maintain and process the corporate data and enable its availability and integrity. The IT organization is responsible for keeping abreast of new software capabilities to improve business processes and expand services. In addition, the IT organization needs to monitor and maintain existing applications to ensure that they are properly updated, licensed, and supported. A capacity-management plan ensures that software expansion or reduction of resources takes place in parallel with overall business growth or reduction. The IT organization must solicit input from both users and senior management during the development and implementation of the capacity plan, to achieve the business goals in the most efficient and effective manner.
Change Control and Configuration Management Principles for System Software
Whether purchased or developed, all software must follow a formal change-control process. This process ensures that software meets the business needs and internal compatibility standards. The existence of a change control process minimizes the possibility that the production network will be disrupted and ensures that appropriate recovery and back-out procedures are in place.
When internal applications are developed and implemented, the IT organization is responsible for maintaining separate development, test, and production libraries. These libraries facilitate effective and efficient management and control of the software inventory, and incorporate security and control procedures for version control and release of software. The library function should be consistent with proper segregation of duties. For example, the system developer may create and alter software logic, but should not be allowed access to information processing or production applications. Source code comparison is an effective method for tracing changes to programs.
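A minimal sketch of source code comparison, using Python's standard-library difflib module (the two program versions and file paths are illustrative):

```python
# Minimal sketch of source code comparison for tracing changes to programs:
# diff the approved version of a program against the version in production.
import difflib

approved = ["def discount(price):", "    return price * 0.95", ""]
production = ["def discount(price):", "    return price * 0.80", ""]

for line in difflib.unified_diff(approved, production,
                                 fromfile="approved/discount.py",
                                 tofile="production/discount.py",
                                 lineterm=""):
    print(line)  # flags the changed discount rate for follow-up
```

Any difference surfaced this way should be traceable to an approved change request; an unexplained deviation is an audit finding.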