

The accuracy of inferences in such models depends on the quality of the network parameters.
There is no single, stable name for this theory: some people use "theory of imprecise probabilities"; others say "theory of credal sets", "Quasi-Bayesian theory", "theory of lower expectations", or ... I am a founding member of the Society for Imprecise Probability Theory and Applications; I have also helped organize several editions of the International Symposium on Imprecise Probabilities and Their Applications (ISIPTA) and edited some of its proceedings.
I have developed graph-based models that represent sets of probability measures over sets of variables; these are often called credal networks. Still on concepts of independence, I have considered such concepts in the realm of full conditional measures (that is, measures that extend standard probability by adopting conditional probability as the primary object of interest, and hence allowing conditioning on events of probability zero). I have also looked at sequential decision making (that is, planning) under uncertainty.
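As a toy illustration of the kind of object these models manipulate, consider a single binary variable whose credal set is given by an interval constraint on one probability value. Since expectation is linear in the probabilities, lower and upper expectations are attained at extreme points of the credal set, so it suffices to check those. This is a minimal sketch; the variable, the interval, and all numbers are illustrative only, not taken from any of my models:

```python
# Credal set for a binary variable X: all distributions p with
# 0.2 <= p(X=1) <= 0.4 (an interval constraint; numbers are illustrative).
# A linear functional attains its optimum at an extreme point of this
# convex set, so checking the two interval endpoints is enough.
extreme_points = [{0: 0.8, 1: 0.2}, {0: 0.6, 1: 0.4}]

def expectation(p, f):
    """Expectation of f under a single probability mass function p."""
    return sum(p[x] * f(x) for x in p)

def lower_upper(f):
    """Lower and upper expectations of f over the credal set."""
    values = [expectation(p, f) for p in extreme_points]
    return min(values), max(values)

f = lambda x: 10.0 if x == 1 else 0.0
low, up = lower_upper(f)
# lower expectation is 2.0, upper expectation is 4.0
```

With a single variable the enumeration is trivial; the interesting (and computationally hard) case arises when a graph combines local credal sets over many variables.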
Learning reliable parameters of Bayesian networks often requires a large amount of training data, which may be hard to acquire and may contain missing values.
On the other hand, qualitative knowledge is available in many computer vision applications, and incorporating such knowledge can improve the accuracy of parameter learning.