Math Notes

All systems are the same: The Principle of Computational Equivalence [1, 2], due to S. Wolfram, is the heuristic statement that almost all processes (involving classical computations) that are not obviously simple are of equivalent sophistication. My corollary: simple processes can be considered degenerate, can be catalogued, and so on. The classification of these degenerate processes is the flip side of the universal nondegenerate system.

- Mathematical structures are all degenerate systems (as with symmetries) which do not express the complexity of a full-fledged system, which most systems are.

Binomial theorem. Derivative. There is no volume, just the faces, as with the cross-polytopes. Also, the derivative has the boundary conditions as in homology.
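To spell out the connection (my own gloss): expanding {$(x+h)^n$} by the binomial theorem gives {$x^n + nx^{n-1}h + \binom{n}{2}x^{n-2}h^2 + \cdots$}, and the derivative keeps only the linear term, so {$\frac{d}{dx}x^n = nx^{n-1}$}. The "volume" {$x^n$} cancels and what remains counts the {$n$} "faces" {$x^{n-1}$}.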

Study Tai-Danae Bradley's thesis to learn about the transition between classical and quantum probabilities, between sets ({$\mathbb{F}_1$}) and vector spaces ({$\mathbb{F}_q$}), and look for a connection with {$N$} and {$N^2$} as per the Yoneda lemma.

David Corfield: Spatial notions of cohesion as the basis for geometry. A fourfold adjunction: components {$\dashv$} discrete {$\dashv$} points {$\dashv$} codiscrete, and a threefold adjunction of modalities based on that, originally due to Lawvere.

Anyons are composite particles in two dimensions whose statistics lie in between fermion (object) and boson (arrow) statistics. How can they be understood in terms of category theory?

Pushdown automata have a stack of priorities. In general, automata deal with concerns (Lithuanian: rūpesčiai).

Automata are important for the dynamics of the three languages.

Study categories with a single initial state and a single final state. What does Yoneda Lemma mean for them?

Yoneda Lemma. The set function {$\theta \mapsto \alpha\,\theta\,\textrm{Hom}(f,\_)$} is a rule for a pushdown automaton. The {$\alpha$} comes from the finite automaton (the input) and the {$\textrm{Hom}(f,\_)$} should describe the stack of memory. All of this, on the left-hand side, is compared to a finite automaton on the right-hand side.

Looking at the Standard Model: 12 fermions are the 12 topologies. 4 bosons are the 4 representations of the nullsome. Higgs boson is the nullsome.

Mass is an indicator of subsystems.

How does the position-momentum pairing relate symplectic geometry and the entropy of phase space? Does entropy make sense in a phase space without a notion of momentum?

John Baez on toposes and Lawvere:

- "In algebraic geometry we are often interested not just in whether or not something is true, but in where it is true." Relate this to scopes: truths about everything, anything, something, nothing.
- "Grothendieck thought about this very hard and invented his concept of topos, which is roughly a category that serves as a place in which one can do mathematics." A place for figuring things out? What would that mean? Ways of extending the mind by leveraging basic ways of figuring things out and organizing them around a particular observer?

Is there a kind of mathematics that is behind every science, every house of knowledge, every person?

Is every computation system of sufficient sophistication equivalent to a house of knowledge? And are lesser systems equivalent to a part of a house of knowledge?

The Logical Foundations of Cognition, John Macnamara and Gonzalo E. Reyes (eds.), 1994.

A functor is a contextualization. Is {$\mathbb{Z}$} a group or an abelian group? "Form follows function."

Relate content and context.

- Problem -> Contentualize -> Contextualize -> Reformulate problem -> Find relevant categorical requirements -> Solve problem

Functors preserve all paths.

With a formula, we have variables of certain types. If we have another formula, we may take all of its variables to be different, but we may allow some of the variables to be the same (refer to the same value) if they have the same type. In the case of metaphor, we see what happens if we make the identification without respecting the type. How much logic can carry over? We can test the boundaries and explore.

Can a category be simply considered as an algebra of paths? Which is to say, rather than think in terms of objects and arrows, simply think in terms of paths and the conditions on them: identity paths and composition of paths. Relate these paths to a matrix and to symmetric functions on the eigenvalues of a matrix.
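A minimal Python sketch of this idea (the names are my own, purely illustrative): a free category on a graph, kept entirely in terms of paths, empty identity paths, and composition when endpoints match.

```python
# A category as an algebra of paths: no separate objects and arrows,
# only paths, identity paths, and composition.

class Path:
    def __init__(self, source, target, edges=()):
        self.source, self.target, self.edges = source, target, tuple(edges)

    def compose(self, other):
        # compose self followed by other; endpoints must match
        assert self.target == other.source
        return Path(self.source, other.target, self.edges + other.edges)

def identity(obj):
    # the identity path at obj is the empty path
    return Path(obj, obj)

f = Path("A", "B", ["f"])
g = Path("B", "C", ["g"])
h = f.compose(g)                                  # a path from A to C
assert identity("A").compose(f).edges == f.edges  # left identity law
assert f.compose(identity("B")).edges == f.edges  # right identity law
```

Associativity holds automatically because tuple concatenation is associative, which is one attraction of the paths-first view.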

Standard Model: gauge bosons mediate forces

- photon - electromagnetic force
- 8 gluons - strong force
- Z boson - weak force
- W{$^{\pm}$} bosons - weak force

Also:

- 6 quarks and 6 antiquarks
- 6 leptons
- 3 or 6 neutrinos
- 1 Higgs

Thus 34 or 37 particles.
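The tally above can be checked by a quick computation (counting neutrinos as 3, or as 6 with antineutrinos):

```python
# Particle count for the lists above.
bosons = 1 + 8 + 1 + 2        # photon, gluons, Z, W+ and W-
quarks = 6 + 6                # quarks and antiquarks
leptons = 6                   # charged leptons and antileptons
higgs = 1

total_3 = bosons + quarks + leptons + 3 + higgs   # 34
total_6 = bosons + quarks + leptons + 6 + higgs   # 37
```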

When you have an object, you ignore the other objects, the "non-objects"; thus in the binomial theorem you don't have to deal with their relationships.

Yoneda Lemma:

- Validating a single step, in a generic way, is equivalent to validating everything.
- The theta was validated by somebody else, and we just validate the extension.
- The extension happens on the inside and the outside, the validations balance each other, like left and right parentheses.
- {$\alpha(f) \leftrightarrow \textrm{Hom}(\textrm{Hom}(f,\_),\alpha)$}
- the elements in F(A) are just dummies, they don't matter - they don't have meaning - double check, what could they possibly mean?

Entropy

- Should distinguish between interpretations of apparent entropy based on probability (as in physics) and actual entropy based on irreversibility (as in automata theory).
- Apparent entropy depends on having a particular partitioning that depends on the observer.
- The observer is defined by interpretation of the choice frameworks.
- Actual entropy does not require any partitioning.

Yoneda Lemma

- Splits in two the validation, "division into two". The subroutine validates the changes in the internal step and the external step, but assumes that the incoming natural transformation has already been validated.

Algebraic thinking is step-by-step, as with a PDA, state-by-state.

Yoneda Lemma: Covariant version and contravariant version are two PDAs that come together to form a Turing machine.

What is the relationship between circumstances (12 topologies) and the meaning of context in automata theory?

House of knowledge: Pushdown automata: Every question has an answer. Their two wings, entering the game and leaving the game, are linked by the three-cycle.

Could particles be extremal black holes?

The twosome is an expression of the nature of entropy, the second law of thermodynamics. The mind shifts from a state of greater ambiguity (where opposites coexist) to a state of lesser ambiguity (where all is the same). Correspondingly, the mind shifts from a state of higher energy to a state of lower energy.

The onesome defines order, whereas the twosome defines entropy, and so the two are related in that way.

The reversal of entropy is the reversal of the twosome.

Tobler's First Law of Geography: Everything is related to everything else, but near things are more related than distant things. Relate to the Yoneda lemma.

In logic, the difference between a statement (such as "you owe me money") being true and its being true and nonempty is the difference between the natural numbers and the whole numbers. It is likewise the difference between weakly increasing and strictly increasing, less-than-or-equal and less-than. Weak increase is fundamental for the Yoneda Lemma, so that a relation is reflexively satisfied by an object.
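A small Python illustration of the distinction (my own example): weak increase admits equality, which is what lets each element, or each object, be related to itself.

```python
def weakly_increasing(xs):
    # "<=" admits equality: the reflexive relation
    return all(a <= b for a, b in zip(xs, xs[1:]))

def strictly_increasing(xs):
    # "<" excludes equality: "true and nonempty"
    return all(a < b for a, b in zip(xs, xs[1:]))

assert weakly_increasing([1, 2, 2, 3])        # equality is allowed
assert not strictly_increasing([1, 2, 2, 3])  # equality breaks strictness
```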

https://bartoszmilewski.com/2020/05/22/on-composability/

J. S. Bell's essay "Six possible worlds of quantum mechanics"; it is in his anthology "Speakable and Unspeakable in Quantum Mechanics".

When are two types the same?

When are two identity morphisms the same?

How does the associativity of composition relate to factoring out an identity morphism?

In type theory, implication is understood as a function (by the Curry-Howard correspondence). This is what I want to say about natural transformations - they set up a function that is a validation of a computation, an implication.

Finite automata - truth tables. Pushdown automata - type theory. Truth tables don't scale; consider negation.

Type theory: Every answer has an amount and a unit. The unit arises from measurement.

Simply typed lambda calculus. Can think of the base types as the atomic units "m", "sec", and think of functions sec->m as ratios "m/sec", which can yield very complicated ratios.
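A Python sketch of this view of units (the Unit class is my own illustration, not an implementation of the simply typed lambda calculus): base units as atoms, and a function sec {$\rightarrow$} m carrying the ratio unit m/sec.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Unit:
    m: int = 0    # exponent of meters
    sec: int = 0  # exponent of seconds

    def __mul__(self, other):
        return Unit(self.m + other.m, self.sec + other.sec)

    def __truediv__(self, other):
        return Unit(self.m - other.m, self.sec - other.sec)

METER = Unit(m=1)
SECOND = Unit(sec=1)

# a function sec -> m carries the ratio unit m/sec (velocity)
velocity = METER / SECOND
# iterating yields ever more complicated ratios, e.g. m/sec^2
acceleration = velocity / SECOND
```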

Computational definition of function (lambda calculus) = functional definition of computation (Turing machine)

Computerphile: Graham Hutton's observation: the Y combinator is the expression of recursion in the lambda calculus. It has a dual structure which he thinks is related to the double-strandedness of DNA.
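The Y combinator can be sketched in Python, with the caveat that Python evaluates eagerly, so one uses the strict variant Z (the combinators are standard; this particular code is my own sketch):

```python
# Z combinator: the strict-evaluation variant of the Y combinator.
# Y = λf.(λx.f (x x))(λx.f (x x)) would loop forever under Python's
# eager evaluation, so the self-application is eta-expanded to λv.x(x)(v).
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# recursion without any named self-reference:
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
assert fact(5) == 120
```

Note the visible dual structure: the same term {$\lambda x.f(\lambda v.x(x)(v))$} appears twice, applied to itself.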

Noether's theorems: symmetry yields a conservation law. Feynman's argument for that, as related by Sean Carroll: at an extremum of action, perturbations have no impact, to first order. This brings to mind fixed points.

Voevodsky: substitutional (definitional) equality, and transportational (propositional) equality, witnessed by elements of the Martin-Löf identity types.

Vladimir Voevodsky. 2016. Multiple Concepts of Equality in the New Foundations of Mathematics (slides)

Intuitionistic type theory has 3 finite types:

- type 0 has no terms and means "false"
- type 1 has 1 canonical term and means "true"
- type 2 has 2 terms which are 2 choices
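A rough Python analogy for the three finite types (an illustration only; Python's types are not intuitionistic types):

```python
from typing import NoReturn

class Empty:                    # type 0: no terms can be constructed -- "false"
    def __init__(self) -> NoReturn:
        raise TypeError("type 0 has no terms")

def ex_falso(e: Empty) -> int:  # from a term of type 0, anything would follow
    raise AssertionError("unreachable: Empty has no terms")

unit = None                     # type 1: the single canonical term -- "true"
choices = (False, True)         # type 2: two terms -- two choices
```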

Randomness is related to symmetry breaking.

Zero curvature relates two different ways of doing nothing. You go around a loop and you consider what happens under transport (by a connection) of a vector (whether it changes or not). Nonzero curvature is indicated if something happened.

Information integration theory. Nonparallelism means that there is a dependency as with weighted averaging. Parallelism indicates nondependency. Thus the conscious mind works to distinguish parameters so that they are nondependent, additive. Then each separate parameter can be adjusted multiplicatively.

Recursion: loop = loop. This is like X = X.

Random walks on trees may express how to love and foster consciousness. The tree is the underlying unfolding of everything. The walk is what develops from nothing.

Hughes, Mia. Octonions and supergravity

John Baez. Triality of vector spaces forces their dimension to be 1, 2, 4 or 8.

Multiplication is nonidentity - the factors are treated as having noncomparable elements. Whereas duality inverts, bijects and identifies matching elements as the same.

Gauge field = finite automaton. Particle = pushdown automaton.

In physics, global tension leads to local release. This is the direction of entropy.

The breaking of duality in topology (in the definition of open sets) relates the discrete and the continuous.

It is known that polynomial time is a subset of polynomial space (memory), but it is not known whether polynomial space is a subset of polynomial time.

A cone models a perspective on structure (of perspectives). An object is a perspective.

An object splits a morphism into two. A morphism is expression (of content) and as such splits contradiction into two.

Entropy is related to information and knowledge. How is it related to the foursome? Study Sean Carroll's four different descriptions for entropy. Can I relate them to the Yoneda lemma? The four kinds of entropy could express the four meanings of equality in X=X, with reference to nothing, something, anything, everything.

- Entropy: measure of uncertainty before the flip; the least number of yes-or-no questions that you need to ask.
- Information: knowledge you have to gain after the flip.

A system has two directions in time: weakly increasing and weakly decreasing as regards entropy. Or the entropy may stay the same. But suppose that whenever two systems come together, say a system and a subsystem, they do so in such a way that their entropy directions are compatible. In that way, a global constraint would arise from constraints on the subsystems. Time would be constructed from causal constraints.

Downloaded from http://www.ms.lt/sodas/Book/MathNotes

Page last modified August 5, 2020, at 22:09