The Meaning of the Data Processing Cycle – Information Needs

In a simple sense, information is data that has been structured, processed and organised. It gives context to data and allows users to make informed decisions. For instance, a single customer's sale in a restaurant is a piece of financial data; it becomes information once the company can identify which dishes are the most and least popular. If the restaurant had shared this information with the customer beforehand, the choice of dish would have been an informed decision rather than a spur-of-the-moment purchase.
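As a minimal sketch of that step from data to information, the snippet below aggregates hypothetical per-sale records (the dish names are invented for illustration) into a popularity summary:

```python
from collections import Counter

# Hypothetical raw sales records: one entry per customer purchase (raw data).
sales = ["pasta", "burger", "pasta", "salad", "pasta", "burger"]

# Aggregating the records turns data into information: dish popularity.
popularity = Counter(sales)
most_popular, _ = popularity.most_common(1)[0]
least_popular, _ = popularity.most_common()[-1]

print(most_popular)   # pasta
print(least_popular)  # salad
```

Each individual entry in `sales` is data; the ranked counts are the information a restaurant could act on.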

Ordinal data sets consist of categories that have a natural order but no fixed numeric distance between them. A survey response of poor, fair, good or excellent is ordinal: the ranking is meaningful, but the gap between "fair" and "good" is not a measurable quantity. Because the intervals are not uniform, summary statistics that rely on arithmetic, such as the mean, are not strictly meaningful for ordinal data; rank-based measures such as the median or the mode are the appropriate summaries. A common example of such a set is an analyst's stock rating (sell, hold, buy), which orders a company's prospects without assigning equal-sized steps between the levels.
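The following sketch illustrates the point about rank-based summaries, using invented satisfaction ratings: the categories are mapped to ranks only so that an order-based statistic (the median) can be taken, not so that means can be computed on them.

```python
# Hypothetical ordinal data: satisfaction ratings with a natural order
# but no fixed numeric distance between the categories.
LEVELS = ["poor", "fair", "good", "excellent"]
rank = {level: i for i, level in enumerate(LEVELS)}

ratings = ["good", "fair", "excellent", "good", "poor", "good"]

# Because the intervals between ordinal categories are not uniform, the
# median (a rank-based statistic) is more defensible than the mean.
ranks = sorted(rank[r] for r in ratings)
median_rank = ranks[len(ranks) // 2]
print(LEVELS[median_rank])  # good
```

The mapping to integers is a convenience for ordering; averaging those integers would silently assume equal spacing between the levels, which ordinal data does not guarantee.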

The second step in the information processing cycle is to extract meaning from the data. This is done through processes such as word grouping, fuzzy matching, extraction, transformation and neural network algorithms; much of this is what machine learning automates. A common classifier for this purpose is logistic regression, which weighs features such as the date and time of acquisition, the language of the record (what the customer said) and its source (who the customer was) to estimate the probability of an outcome. A neural network, by contrast, is an artificial system trained on data from many sources so that it learns to detect patterns, and it is typically used when the relationships between the variables are too complex for a simple linear classifier.
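To make the logistic-regression step concrete, here is a minimal pure-Python sketch with a single feature, trained by stochastic gradient descent. The data set and its interpretation (order total as the feature, reorder-within-a-month as the label) are invented for illustration, not taken from the text:

```python
import math

# Invented toy data: feature = order total, label = 1 if the customer
# reordered within a month. Chosen to be linearly separable.
xs = [5.0, 8.0, 12.0, 20.0, 25.0, 30.0]
ys = [0,   0,   0,    1,    1,    1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.05
for _ in range(2000):
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)
        # Gradient of the log-loss with respect to w and b.
        w -= lr * (p - y) * x
        b -= lr * (p - y)

def predict(x):
    """Classify as 1 when the estimated probability reaches 0.5."""
    return sigmoid(w * x + b) >= 0.5

print(predict(6.0), predict(28.0))
```

In practice one would use many features at once (time, text-derived signals, location) and a library implementation, but the mechanics are the same: the model learns weights that turn raw feature values into a probability.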