Issue No. 05 - October (1996 vol. 8)
ISSN: 1041-4347
pp: 758-772
<p><b>Abstract</b>—In highly heterogeneous environments, autonomous distributed systems may not readily cooperate. To alleviate this problem, we have developed an environment that preserves the autonomy of the local systems while enabling distributed processing. This is achieved by 1) modeling the different application systems into a central knowledge base (called a Metadatabase), 2) providing each application system with a local knowledge processor, and 3) distributing the knowledge among these local shells. This paper describes the knowledge decomposition process used for this distribution. The decomposition minimizes the cooperation needed among the local knowledge processors by "serializing" the rule execution process: a rule is decomposed into an ordered set of subrules, each located at a specific local knowledge processor and executed in sequence. The goals of the decomposition algorithm are to minimize the number of subrules produced, thereby reducing the time spent in communication, and to ensure that the sequential execution of the subrules is "equivalent" to the execution of the original rule.</p>
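The serialization idea in the abstract can be illustrated with a minimal sketch: a rule's clauses, each tagged with the site whose data it references, are grouped into an ordered chain of subrules, one per run of same-site clauses, so that each subrule can be evaluated entirely by one local knowledge processor. The data model, site names, and grouping heuristic here are illustrative assumptions, not the paper's actual decomposition algorithm.

```python
# Hypothetical sketch of "serializing" a global rule into site-local
# subrules. A rule is a list of (site, clause) pairs; the decomposition
# groups maximal runs of clauses on the same site into one subrule each,
# yielding an ordered chain executed in sequence.

from itertools import groupby

def decompose(rule):
    """Return an ordered list of (site, clauses) subrules, one per
    maximal run of consecutive clauses referencing the same site."""
    subrules = []
    for site, clauses in groupby(rule, key=lambda c: c[0]):
        subrules.append((site, [text for _, text in clauses]))
    return subrules

# A global rule whose clauses touch data at three sites (names assumed).
rule = [
    ("A", "order.qty > 0"),
    ("A", "order.status == 'open'"),
    ("B", "stock.level >= order.qty"),
    ("C", "update shipment"),
]

# Three subrules result: one per site, executed in sequence. Each handoff
# between consecutive subrules is one inter-site communication step, so
# fewer subrules means less communication.
chain = decompose(rule)
print(chain)
```

Ordering clauses so that same-site clauses are adjacent before grouping is what keeps the subrule count, and hence the communication cost, low.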
Heterogeneous distributed database management systems, production systems, autonomous systems, distributed knowledge processing, knowledge distribution, Metadatabase.
Gilbert Babin, Cheng Hsu, "Decomposition of Knowledge for Concurrent Processing", IEEE Transactions on Knowledge & Data Engineering, vol. 8, no. , pp. 758-772, October 1996, doi:10.1109/69.542028