    • CommentRowNumber1.
    • CommentAuthorIan_Durham
    • CommentTimeMay 31st 2010

    A friend of mine just posted the following quote to his Facebook page, and it brought up the question in my mind of whether there is a categorical way to describe entropy, since it seems like there could be (and it might offer some improvements on existing descriptions, which can be rather disparate). Anyone know if such a thing exists?

    The quote: “A mathematician is a device for turning coffee into theorems” - Alfréd Rényi

    • CommentRowNumber2.
    • CommentAuthorEric
    • CommentTimeMay 31st 2010

    It should be possible to define the entropy of a category (in a way similar to cardinality). So then, the trick, as always, is to start with an interesting category. Since entropy is, in a way, a counting procedure, it is probably related to decategorification somehow.

    • CommentRowNumber3.
    • CommentAuthorDavidRoberts
    • CommentTimeMay 31st 2010

    Considering the links with partition functions/generating functions, there may be some sort of way to approach it via species (I know that is just a pointer to elsewhere, but perhaps someone will fill it in). This is just a random guess, so it may not be appropriate, but learning about species surely does one’s category-theoretic muscles good :-)

    • CommentRowNumber4.
    • CommentAuthorHarry Gindi
    • CommentTimeMay 31st 2010

    @Eric: That’s a huge stretch. The cardinality of a category is a generalization of the cardinality of a set. Sets in general don’t have a notion of entropy, so I’m not sure how your reasoning applies.
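
    For reference, one precise notion behind this remark is groupoid cardinality (in the Baez–Dolan sense), which weights each isomorphism class of objects by the reciprocal of the order of its automorphism group and recovers ordinary set cardinality on discrete groupoids:

    ```latex
    % Groupoid cardinality: sum over isomorphism classes of objects.
    % For a finite set X viewed as a discrete groupoid, |Aut(x)| = 1
    % for every x, so this recovers the ordinary cardinality |X|.
    |\mathcal{G}| \;=\; \sum_{[x]\,\in\,\pi_0(\mathcal{G})} \frac{1}{|\mathrm{Aut}(x)|}
    ```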

    • CommentRowNumber5.
    • CommentAuthorEric
    • CommentTimeMay 31st 2010

    Did you ever hear of brainstorming?

    • CommentRowNumber6.
    • CommentAuthorEric
    • CommentTimeMay 31st 2010

    Ian. This is probably a good place to start:

    • CommentRowNumber7.
    • CommentAuthorHarry Gindi
    • CommentTimeMay 31st 2010
    • (edited May 31st 2010)

    I understand that you were brainstorming, but I was explaining why it didn’t make sense to me. It’s not like I said “wow, you’re a big idiot!”

    • CommentRowNumber8.
    • CommentAuthorIan_Durham
    • CommentTimeMay 31st 2010

    Sets in general don’t have a notion of entropy

    They do if they have some kind of structure on them. So, for example, the sets that underlie permutation groups can be assigned an entropy, since, in one of its most general forms (as envisaged by Shannon), entropy is another way to count the number of configurations that something can have (there’s a widely believed myth that it’s all based on probabilities, but even Shannon admitted it didn’t have to be).
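
    Concretely, Shannon’s formula reduces to a pure count of configurations in the uniform case, recovering Boltzmann’s S = log W (in units where k = 1):

    ```latex
    H(p) \;=\; -\sum_{i=1}^{W} p_i \log p_i
    \;\;\xrightarrow{\;p_i \,=\, 1/W\;}\;\;
    H \;=\; -\,W \cdot \frac{1}{W}\,\log\frac{1}{W} \;=\; \log W
    ```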

    • CommentRowNumber9.
    • CommentAuthorIan_Durham
    • CommentTimeMay 31st 2010
    • (edited May 31st 2010)

    @Eric: Cool! Thanks for the references.

    Edit: Just glanced at the first and I think this runs along the same lines I was thinking (i.e. in keeping with Shannon’s very general notion).

    • CommentRowNumber10.
    • CommentAuthorDavid_Corfield
    • CommentTimeMay 31st 2010
    • CommentRowNumber11.
    • CommentAuthorIan_Durham
    • CommentTimeMay 31st 2010

    Thanks David!

    • CommentRowNumber12.
    • CommentAuthorexpixpi
    • CommentTimeJan 30th 2014
    • (edited Jan 30th 2014)
    What is "the scale" of a categorical object X? It is defined as a Log-wise categorical construction. Namely, we have to fix a base object "e" of the category and, if exists (by construction or by universal property) is an object S(X) of the same category such that Hom(S,e) is isomorphic to X. Please, note some crutial log-wise properties:

    1) Given objects X and Y of the category, by universal properties, S(X × Y) ≅ S(X) + S(Y), where × and + denote the product and coproduct in that category.
    2) Also, given an object X, S(Hom(X, X)) ≅ X × S(X).

    Then the entropy of a categorical object X may be defined as the scale of the object d(X) = Hom(X, X), within any category. Indeed, d is a functor with differential properties, since d(X × Y) ≅ X × dY + dX × Y (cartesian product).
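
    Reading S as a categorical logarithm, i.e. an object with e^{S(X)} ≅ X, both stated properties are formal bookkeeping with the exponential laws of a cartesian closed category; a sketch, assuming the relevant scales exist:

    ```latex
    e^{S(X)+S(Y)} \;\cong\; e^{S(X)} \times e^{S(Y)} \;\cong\; X \times Y
    \;\;\Longrightarrow\;\; S(X \times Y) \;\cong\; S(X) + S(Y)

    e^{X \times S(X)} \;\cong\; \bigl(e^{S(X)}\bigr)^{X} \;\cong\; X^{X} \;=\; \mathrm{Hom}(X,X)
    \;\;\Longrightarrow\;\; S(\mathrm{Hom}(X,X)) \;\cong\; X \times S(X)
    ```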

    a) The Shannon entropy can be derived in the category Set with e = {0,1}. The scale is then its cardinality, and absolute and relative entropy can be easily derived.
    b) Using the Sierpiński object in Top, and using properties regarding fiber bundles, Boltzmann entropy can be easily derived for configuration topological spaces.
    c) Regarding categories of Hilbert spaces, the categorical entropy can be realized in quantum field entropies, but I am still working on this.
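
    A quick finite sanity check of case (a) in Set with e = {0,1}: Hom(S, e) has 2^{|S|} elements, so the scale of X is log2 |X| when it exists. The names maps, e, X, S below are illustrative, not from the post:

    ```python
    from itertools import product
    from math import log2

    def maps(dom, cod):
        """All functions dom -> cod, encoded as tuples of outputs."""
        return list(product(cod, repeat=len(dom)))

    e = (0, 1)                               # the base object e = {0, 1} in Set
    X = tuple(range(8))                      # an 8-element set
    S = tuple(range(int(log2(len(X)))))      # candidate scale object, |S| = log2 |X| = 3

    # Hom(S, e) should be isomorphic to X, i.e. have the same cardinality.
    assert len(maps(S, e)) == len(X)         # 2**3 == 8
    print(f"|Hom(S, e)| = {len(maps(S, e))} = |X| = {len(X)}")
    ```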
    • CommentRowNumber13.
    • CommentAuthorUrs
    • CommentTimeJan 30th 2014

    Hi expixpi,

    it seems that you want to communicate some original thoughts or ongoing research. Therefore I suppose it would help if you could point to a document where the ideas you are sketching are laid out. Then people could have a look and could react, if reaction is what you are after.

    • CommentRowNumber14.
    • CommentAuthorJohn Baez
    • CommentTimeJan 31st 2014

    For an answer to the original question, “Is there a categorical way to describe entropy”, try:

    It’s a description of Shannon entropy as the unique functor with certain properties.

    This thread caught my attention because Tobias Fritz and I are busy finishing up a similar (but, it turned out, harder to prove) characterization of relative entropy. Right now this is a draft, but in a few weeks I hope it’ll be done!
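
    For reference, the relative entropy being characterized is the Kullback–Leibler divergence between finite probability distributions p and q:

    ```latex
    D(p \,\|\, q) \;=\; \sum_{i} p_i \log \frac{p_i}{q_i}
    ```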

    Now, back to what I was going to do…

    • CommentRowNumber15.
    • CommentAuthorUrs
    • CommentTimeJan 31st 2014

    By the way, the original question at the top of this thread is from May 2010, almost four years ago, and the OP is not around here anymore. If you do want to inform him of your articles, then you may have to contact him by email.

    • CommentRowNumber16.
    • CommentAuthorDavid_Corfield
    • CommentTimeJan 31st 2014

    John, re #14, I keep meaning to get back to my Bayesian roots. Your morphisms in FinStat seem to be equivalent to Čencov’s Markov morphisms; see, e.g., section 4 of Lebanon’s paper.

    • CommentRowNumber17.
    • CommentAuthorDavid_Corfield
    • CommentTimeJan 31st 2014
    • (edited Jan 31st 2014)

    Hmm, looking back, I guess there are some differences. Markov embeddings, I think, are just your (f, s), but between sets X and Y, without mention of distributions p and q. On each X there sits the space of distributions, and the only metric on this space preserved by all Markov embeddings is the Fisher metric. There’s a family of distances/divergences compatible with the metric, so you must be invoking some stronger constraint, Zhu 1997 (pdf).
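
    For reference, the Fisher metric on a smooth family of distributions p(x; θ), the metric that Čencov’s theorem singles out as invariant under Markov morphisms:

    ```latex
    g_{ij}(\theta) \;=\; \mathbb{E}_{p(\cdot;\theta)}\!\left[\,
      \partial_{\theta^i} \log p(x;\theta)\;
      \partial_{\theta^j} \log p(x;\theta)
    \,\right]
    ```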

    • CommentRowNumber18.
    • CommentAuthortonyjones
    • CommentTimeFeb 1st 2014
    • (edited Feb 1st 2014)
    Don't know how relevant this is, but the title sounds interesting: “Entropy in a Category”
    http://arxiv.org/abs/1006.5122

    Also came across this by Blute et al.: “Entropic Hopf algebras and models of non-commutative logic”
    http://www.emis.ams.org/journals/TAC/volumes/10/17/10-17.pdf
    • CommentRowNumber19.
    • CommentAuthorUrs
    • CommentTimeFeb 2nd 2014

    TonyJones,

    thanks for the links. I just looked at them briefly. In both cases it seems to me that they use the term “entropy” mainly because they need some word.

    • CommentRowNumber20.
    • CommentAuthorUrs
    • CommentTimeFeb 2nd 2014
    • (edited Feb 2nd 2014)

    Regarding “entropy” and “category”: I am never quite sure what people are after who wish to combine these two words. For instance, the original question in this thread could equally have been “Is there a categorical way to describe chocolate?”, and I would likewise be hoping that the question were made more specific. In the end, whenever you come up with a decent definition of anything, category theory will likely help you think about it. That’s the whole point of category theory.

    Regarding abstract characterizations of entropy: from the plethora of articles that discuss this, two are now listed at entropy – References – Axiomatic characterizations.

    The reference

    • Bernhard Baumgartner, Characterizing Entropy in Statistical Physics and in Quantum Information Theory (arXiv:1206.5727)

    there gives a characterization that is dead simple: it observes that to characterize the von Neumann entropy of density matrices, it is sufficient to assume that the functional takes larger values on larger systems, and specifically takes n times its value on n copies of a single system.
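
    That extensivity axiom is easy to check numerically for the von Neumann entropy S(ρ) = -Tr(ρ log ρ); a minimal sketch (mine, not from the paper) for two copies of a qubit:

    ```python
    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho log rho), computed from the eigenvalues."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]        # 0 log 0 = 0 by convention
        return -np.sum(evals * np.log(evals))

    # A qubit density matrix: diagonal, eigenvalues 0.7 and 0.3.
    rho = np.diag([0.7, 0.3])

    # Two independent copies form the tensor (Kronecker) product rho ⊗ rho.
    rho2 = np.kron(rho, rho)

    # Extensivity: S(rho ⊗ rho) = 2 S(rho).
    assert np.isclose(von_neumann_entropy(rho2), 2 * von_neumann_entropy(rho))
    print(von_neumann_entropy(rho), von_neumann_entropy(rho2))
    ```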

    It would seem that if you have an ambient category with a minimum of structure that allows one to speak of probability distributions/density matrices in the first place, then likely these simplistic axioms may also be formulated in that category.

    For instance, density matrices may be neatly formalized in suitable dagger-monoidal categories, in a flavor of linear logic. To state these simple axioms there, for an entropy function on the space of density matrices, it seems all that one needs to require in addition is some object that plays the role of the real line in which the entropy is supposed to take values. One needs just that it has a linear order and contains the natural numbers.

    • CommentRowNumber21.
    • CommentAuthorexpixpi
    • CommentTimeFeb 21st 2014
    Urs, thank you for your comments. I am now preparing a document along these lines for discussion and reaction (which is absolutely wanted).

    The main idea is that in the literature on formal descriptions of anything (groups, modules, topological spaces, sets, configuration spaces, ...), there are several parallel concepts of "entropy" (that is, of complexity and disorder). It feels like there should be a more general (categorical) description of that concept.

    And it seems that the point is to describe, for any object X, how large the set of self-maps (endomorphisms) Hom(X, X) is. I hope to be able to write this up in a paper for discussion here.
    • CommentRowNumber22.
    • CommentAuthorexpixpi
    • CommentTimeMay 3rd 2014
    • (edited May 3rd 2014)
    Thank you all. After careful evaluation, this subject is being considered for a PhD thesis, so I appreciate very much all your contributions, and I will come back with some mathematical, physical, and information-theoretic results in some months (or maybe years)...

    The final approach is not to consider "entropy" directly but rather the "complexity" of a certain object X of the category. As suggested, that complexity is taken to be realized by self-similarity, via X^X. Restricting to small categories with internal homs (for instance, suitable objects of Top), X^X is another object that can be managed. For instance, the complexity of the empty object is itself; the complexity of an initial object is the same initial object.

    Normally these objects are too big to be managed properly, so we take the logarithmic scale: an object S such that 2^S ≅ X^X for a suitably tiny base object 2. We will see what happens with concrete categories like Diff or Hilb.
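
    In Set this logarithmic scale is explicit: an n-element set X has |X^X| = n^n, so

    ```latex
    2^{|S|} \;=\; |X^X| \;=\; n^n
    \;\;\Longrightarrow\;\;
    |S| \;=\; \log_2\!\bigl(n^n\bigr) \;=\; n \log_2 n
    ```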
    • CommentRowNumber23.
    • CommentAuthorexpixpi
    • CommentTimeMay 3rd 2014
    For instance the original question in this thread could also be “Is there a categorical way to describe chocolate?” ...

    The categorical approach provides a transversal structure that helps to
    1) give a common origin to questions that arise in different fields, and
    2) transfer concrete, structural results from one field to another that may not be directly connected.

    So, to “Is there a categorical way to describe chocolate?” the answer is YES ... the essence of chocolate ...

    That essence lets us recognize a kind of pie, a color, a flavor, some cookies, some extraordinarily creative desserts, ...

    I am looking for the very essence of complexity. I do not study categories, I use them.
    • CommentRowNumber24.
    • CommentAuthorUrs
    • CommentTimeMay 3rd 2014

    Maybe you are looking for synthetic complexity theory.

    But, as I said by email, that will need a bit more than just the observation that exponential objects are a categorification of, yes, exponentials, and hence that there are “logarithm objects”. That’s clear. Now one might ask what a useful formulation, in this sense, of an expression for the entropy in the form $\int_x \rho(x) \ln \rho(x)$ might be. Sure. But you still need to do that, I think.

    • CommentRowNumber25.
    • CommentAuthorexpixpi
    • CommentTimeAug 28th 2014
    • (edited Aug 28th 2014)
    Thank you for all your comments and references, especially to Urs. Some time and work have passed since the last post, and some advances have been made on this issue.

    1. Certainly, we (the doctoral team) prefer complexity to entropy. Entropy tends to be a mere materialization or quantification of complexity. For instance, in the category Prob, Shannon entropy is the measure of the object that carries the entropy of the original probability space.

    2. We have followed the strategy proposed by Urs and have first taken the concrete, internal categories Prob and Top. Given an object X, we try to find (hopefully, though not always possible) a sub-object A such that X^X is (weakly or strongly) isomorphic to e^A. This is not obviously a functor, since it is not always well defined, or even defined. In effect, e is a simple but non-trivial object of the category, and it is not uniquely determined (you can change the base if useful). We tend to use for e "germs" like the Sierpiński space in Top and Be(0.5) (the fair coin) in probability spaces.

    3. The simplest object is universally the empty object, which yields the empty object as its complexity.

    4. The Shannon entropy arises naturally as the measure (probability) of the complexity sub-object.

    5. The sense of this is that X^X always has at least one element (the identity). We are measuring how many endomorphisms there are and how different they are from that identity; the larger that collection, the more complex the object (a toy illustration in Set follows this list). We use many categorical resources, like equalizers, coequalizers, limits, ...

    6. The results tend to be transversal. We are now treating Bayes, and hopefully some results will be general for concrete internal categories.
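
    The toy illustration promised in point 5, a sketch in Set only (the Prob and Top constructions above are not reproduced here): enumerate the endomorphisms of a small set, confirm the identity is among them, and take the logarithmic scale.

    ```python
    from itertools import product
    from math import log2

    X = (0, 1, 2)                            # a 3-element set

    # All endomorphisms X -> X, encoded as tuples of outputs; |X^X| = 3**3 = 27.
    endos = list(product(X, repeat=len(X)))
    assert len(endos) == len(X) ** len(X)

    # The identity map (output i at position i) is always among them,
    # so X^X is never empty.
    assert tuple(X) in endos

    # Logarithmic scale: log2 |X^X| = n log2 n.
    print(log2(len(endos)))                  # 3 * log2(3) ≈ 4.755
    ```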

    I will post here further results. Thank you for your patience.