Categorification of Negative Information using Enrichment
In many applications of category theory it is useful to reason about "negative information". For example, in planning problems, providing an optimal solution is the same as giving a feasible solution (the "positive" information) together with a proof of the fact that there cannot be feasible solutions better than the one given (the "negative" information). We model negative information by introducing the concept of "norphisms", as opposed to the positive information of morphisms. A "nategory" is a category that has "nom"-sets in addition to hom-sets, and specifies the compatibility rules between norphisms and morphisms. With this setup we can choose to work in "coherent" subnategories: subcategories that describe a potential instantiation of the world in which all morphisms and norphisms are compatible. We derive the composition rules for norphisms in a coherent subnategory; we show that norphisms do not compose by themselves, but rather they need to use morphisms as catalysts. We have two distinct rules of the type morphism + norphism → norphism. We then show that those rules for norphism inference are actually as natural as the ones for morphisms, from the perspective of enriched category theory. Every small category is enriched over P = ⟨Set, ×, 1⟩. We show that we can derive the machinery of norphisms by considering an enrichment over a certain monoidal category called PN (for "positive"/"negative"). In summary, we show that an alternative to considering negative information using logic on top of the categorical formalization is to "categorify" the negative information, obtaining negative arrows that live at the same level as the positive arrows, and suggest that the new inference rules are born of the same substance from the perspective of enriched category theory.
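To make the shape of the "morphism + norphism → norphism" rules concrete, here is a hedged sketch written as inference rules in LaTeX. The norphism names f ▷ n and n ◁ g are illustrative choices, not notation taken from the paper, and the reading assumes the simplest coherence condition, namely that a norphism n from X to Z is incompatible with every morphism X → Z.

\[
\frac{n \in \mathrm{Nom}(X;Z) \qquad f \in \mathrm{Hom}(X;Y)}
     {f \triangleright n \in \mathrm{Nom}(Y;Z)}
\qquad\qquad
\frac{n \in \mathrm{Nom}(X;Z) \qquad g \in \mathrm{Hom}(Y;Z)}
     {n \triangleleft g \in \mathrm{Nom}(X;Y)}
\]

Under that reading, the first rule is a contrapositive of composition: if some g : Y → Z were present alongside f : X → Y, then g ∘ f : X → Z would be incompatible with n, so f converts n into negative information about Hom(Y;Z); the second rule is symmetric. The sketch also makes the "catalyst" phrasing visible: each rule consumes exactly one morphism and one norphism, never two norphisms. For comparison, composition in a category enriched over P = ⟨Set, ×, 1⟩ is a family of maps Hom(X;Y) × Hom(Y;Z) → Hom(X;Z); the abstract's claim is that enriching over PN instead yields the norphism machinery, presumably including inference maps of roughly the shape sketched above.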