Sunday, November 28, 2010

i was thinking about the "(weighted) diagram scheme" approach to (enriched) colimits vs the "presheaf" approach... and it seemed to me that the contrast here is somewhat analogous to the contrast between "x-valued random variable" and "probability measure on x"... or something like that... here i'm thinking of an "x-valued random variable" as something like a space y (analog of the diagram scheme) equipped with both a probability measure and a map to x...

???hmm, maybe this analogy really is closer than i'd realized... the "weight" on the diagram scheme being precisely the analog of the probability measure on the domain of the random variable...

(originally i was going to specialize to the case where the colimits are set-based and so the domain of the random variable has some sort of canonical probability measure such as the equi-probability measure on a finite set, but then the obvious connection between "weight" and "measure" struck me and it seemed clear that the analogy goes deeper.)

maybe make a bit of a dictionary here...

diagram = measurable function
weighted diagram = random variable
diagram scheme = measurable space
weighted diagram scheme = probability measure space
weight on diagram scheme = probability measure on measurable space
colimit of weighted diagram = expectation of random variable
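(as a sketch of how the last entry of the dictionary lines up, in the set-enriched case; the notation here is my own gloss, not from any particular source: the weighted colimit is itself given by a coend, and the formula matches the integral formula for expectation term by term...)

```latex
% weighted colimit of a diagram D : J -> C with weight W : J^op -> Set,
% written as a coend; the "\cdot" is the copower (a set's worth of
% copies of D(j)):
\operatorname{colim}^{W} D \;=\; \int^{j \in J} W(j) \cdot D(j)

% expectation of a real-valued random variable f on a probability
% space (Y, \mu):
\mathbb{E}[f] \;=\; \int_{Y} f \, d\mu
```

...so the weight W sits in exactly the spot where the "dμ" sits, and the copower plays the role of "value scaled by measure"...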

??maybe this analogy could help me understand certain ideas about "calculus of co-ends" (or something like that) ... i think i remember todd for example doing co-end calculations in some graphical way apparently strongly resembling the ordinary calculus of integrals, complete with long italianate s "integral signs"... maybe i didn't get it not only because of not getting co-ends but also because of not getting ordinary integrals... at least, i remember an early stage in my math education where i was happy with stuff like category theory and point set topology but loathed anything with an integral sign... i still loathe integral signs because my opinion on newton vs leibniz is approximately the reverse of the conventional one; i think that leibniz had a deeper understanding of what was going on but that his notation was extremely bad... anyway, i'm now half-way imagining that the main point of the "graphical calculus of co-ends" is to signify a weight on a diagram scheme by a little "ds" thing, exactly the aspect of leibniz's notation that i find the most obscurantist and annoying... (except that if that were the case then shouldn't it be called "calculus of weighted colimits" instead of "calculus of co-ends" which should have to do with some kind of "non-commutative integration" (involving traces of operators) if i'm not too badly confused??)

anyway, i was originally going to try to relate the analogy here to the semi-philosophical question as to whether "x-valued random variable" is just an awkward conceptual substitute for "probability measure on x", on the grounds that the main thing that you do with an x-valued random variable is to push forward the probability measure on its domain to x, and so why bother with the random variable in the first place when instead you could have just dealt directly with a probability measure on x? eliminate the middleman ... (middleman here = domain of x-valued random variable) ...
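(a minimal finite sketch of the "eliminate the middleman" point: for a single random variable, the expectation computed on the domain agrees with the expectation computed from the pushforward measure alone, so the domain really is eliminable here. all the names and the example data below are made up for illustration:)

```python
def pushforward(mu, f):
    """push a finite probability measure mu (a dict on the domain Y)
    forward along a map f : Y -> X, giving a measure on X."""
    nu = {}
    for y, p in mu.items():
        x = f(y)
        nu[x] = nu.get(x, 0) + p
    return nu

def expectation_via_domain(mu, f):
    """expectation of the random variable f, computed on its domain Y."""
    return sum(p * f(y) for y, p in mu.items())

def expectation_via_pushforward(nu):
    """expectation computed directly from a measure on the reals,
    with no reference to any domain."""
    return sum(p * x for x, p in nu.items())

# a three-point domain with a non-uniform measure, and f : Y -> R
mu = {"a": 0.5, "b": 0.25, "c": 0.25}
f = {"a": 1, "b": 2, "c": 2}.get

nu = pushforward(mu, f)
print(nu)                               # {1: 0.5, 2: 0.5}
print(expectation_via_domain(mu, f))    # 1.5
print(expectation_via_pushforward(nu))  # 1.5
```

(note that the pushforward has even collapsed the points "b" and "c" that f failed to distinguish; nothing about the expectation remembers them.)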

i remember rota taking the side that the random variable approach is the clearly superior approach because of stuff like how it psychologically promotes thinking about the correlation between a parallel pair of (for example real-valued) random variables in terms of the commutative algebra that they generate. but besides finding rota enjoyable to read i've gotten used to him presenting indefensible pronouncements as the outcome of settled arguments so i generally don't find his pronouncements very definitive.
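(rota's point does have a concrete finite illustration, whatever one thinks of how he argued it: the pushforward measures of a parallel pair of random variables, taken separately, forget all correlation, which lives on the shared domain. the example below is my own, with made-up names:)

```python
def expectation(mu, f):
    """expectation of f on a finite probability space (mu is a dict)."""
    return sum(p * f(y) for y, p in mu.items())

def covariance(mu, f, g):
    """covariance of a parallel pair of real-valued random variables,
    computed on their common domain."""
    ef, eg = expectation(mu, f), expectation(mu, g)
    return expectation(mu, lambda y: (f(y) - ef) * (g(y) - eg))

# domain: two independent fair coin flips
mu = {(i, j): 0.25 for i in (0, 1) for j in (0, 1)}
first = lambda y: y[0]
second = lambda y: y[1]

# the pairs (first, second) and (first, first) have identical marginal
# pushforwards on the reals (a fair coin each), but:
print(covariance(mu, first, second))  # 0.0  -> independent
print(covariance(mu, first, first))   # 0.25 -> perfectly correlated
```

(so if one really did discard the domain and keep only a measure on x for each variable, the two cases would be indistinguishable; this is roughly the "commutative algebra generated by the pair" point in finite clothing.)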

anyway, as usual it might be interesting to work the analogy here both ways, trying to transport insight in both directions.
