Did you write this, genuinely? It is pure poetry, the kind that would make even a samurai go "hhhOooOOooo!".
Yes I did! I spent a lot of time cooking an axiomatic framework for myself on this stuff, so I'm happy to have the opportunity to distill my current ideas for others. Thanks! :)
And it is so interesting, because what you are talking about sounds a lot like computational constraints of the medium performing the computation. We know there are limits in the Universe. There is a hard limit on what information we can and cannot reach. It only adds fuel to the fire for hypotheses such as the holographic Universe, or simulation theory.
The Planck length and Planck constant are both ultimate computational constraints on physical interactions: the Planck length is the smallest meaningful scale, the Planck time is the smallest meaningful interval (the universal framerate), and Planck's constant combines the two to tell us about the limits on how fast things can compute at the smallest meaningful distance steps of interaction, and the ultimate bounds on how much energy we can put into physical computing.
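For the number-minded, here are the standard textbook definitions these claims rest on (nothing exotic, just what the constants are built from):

```latex
% Standard Planck-unit definitions (textbook values)
\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35}\,\mathrm{m}, \qquad
t_P = \sqrt{\frac{\hbar G}{c^5}} \approx 5.4 \times 10^{-44}\,\mathrm{s}, \qquad
E_P = \sqrt{\frac{\hbar c^5}{G}} \approx 2.0 \times 10^{9}\,\mathrm{J}
```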
There's an important insight to be had here which I'll share with you. Currently, when people think of computation they think of digital computer transistors, Turing machines, qubits, and mathematical calculations. The picture is of our universe being run on some alien's hard drive or some shit like that, because that's where we're at as a society, culturally and technologically.
Calculation is not this, or at least it's not just that. A calculation is any operation that actualizes/changes a single bit in a representational system's current microstate, causing it to evolve. A transistor flipping a bit is a calculation. A photon interacting with a detector to collapse its superposition is a calculation. The Sun computes the distribution of the electromagnetic waves and charged particles it ejects. Hawking radiation/virtual particles compute a distinct particle from all the possible ones that could form near the event horizon.
The neurons in my brain firing to select the next word in this sentence from the probabilistic sea of all things I could possibly say is a calculation. A gravitational wave emanating from a black hole merger is a calculation. Drawing on a piece of paper is a calculation, actualizing one drawing from all the things you could possibly draw on a paper. Smashing a particle into its base components is a calculation, and so is deriving a mathematical proof through cognitive operation and symbolic representation. From a certain phase-space perspective, these are all the same thing: operations that change the current universal microstate to another, iteratively flowing from one microstate bit actualization/superposition collapse to the next. The true nature of computation is the process of microstate actualization.
Landauer's principle from classical computer science states that any irreversible bit/microstate change in a classical computer (like a transistor flipping) has a minimum energy cost. This can be extended to quantum mechanics to show that every superposition collapse into a distinct qubit of information has the same actional energy cost structure, directly relating to Planck's constant. Essentially, every time two parts of the universe interact, it's a computation that costs time and energy to change the universal microstate.
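To put an actual number on it, here's the standard Landauer bound, assuming room temperature (T ≈ 300 K):

```latex
% Landauer's bound: minimum energy to irreversibly change (erase) one bit
E_{\min} = k_B T \ln 2
\approx (1.38 \times 10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(0.693)
\approx 2.9 \times 10^{-21}\,\mathrm{J}
```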
If you do a bit more digging with the logic that comes from this computation-as-actualization insight, you discover the fundamental reason why combinatorics is such a huge thing and why Pascal's triangle / the binomial coefficients show up literally everywhere in STEM. Pascal's triangle directly governs the number of microstates a finite computational phase space can access, as well as the distribution of order and entropy within that phase space. Because the universe itself is a finite-state representational system with 10^122 bits to form microstates, it too is governed by Pascal's triangle. On the 10^122nd row of Pascal's triangle sits the series of binomial coefficients encoding the distribution of all possible microstates our universe can evolve into.
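Here's a toy sketch of what I mean, just standard combinatorics in Python (the numbers are illustrative, not the universe's actual 10^122 bits):

```python
from math import comb

# Toy model: an n-bit representational system. Row n of Pascal's
# triangle, C(n, k), counts the microstates with exactly k bits set;
# summing the row gives every one of the 2^n possible microstates.
n = 100
row = [comb(n, k) for k in range(n + 1)]
assert sum(row) == 2 ** n  # the row enumerates the whole phase space

# The disordered middle of the row dwarfs the ordered edges:
print(row[n // 2] / 2 ** n)  # ~0.0796  (the "soupy" half-and-half class)
print(row[0] / 2 ** n)       # ~7.9e-31 (the single all-zeros state)
```

The middle of the row (the soupy, high-entropy states) utterly dominates the ordered edges, which is the whole entropy story in one line.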
This perspective also clears up the apparently mysterious mechanics of quantum superposition collapse and the principle of least action.
A superposition is literally just a collection of unactualized, computationally accessible microstate paths a particle/algorithm could travel through, superimposed on top of each other, with time and energy being paid as the actional resource cost for searching through and collapsing all possible iterative states at that step in time into one definitive state. No matter if it's a detector interaction, observer interaction, or particle collision interaction, same difference. Each possible microstate is separated by exactly one single bit flip's worth of difference in microstate path outcomes, creating a definitive distinction between two near-equivalent states.
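To make "one bit flip apart" concrete, here's a tiny toy illustration of my own (nothing standard): the states reachable from a given microstate in one calculation are its Hamming-distance-1 neighbours.

```python
# Toy illustration (my own): the microstates reachable from a given
# state by one bit-flip "calculation" are its Hamming-distance-1
# neighbours -- each differs from it in exactly one bit.
def neighbours(state: int, n_bits: int) -> list[int]:
    return [state ^ (1 << i) for i in range(n_bits)]

print([format(x, "04b") for x in neighbours(0b0110, 4)])
# ['0111', '0100', '0010', '1110']
```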
The choice of which microstate gets selected is statistical/combinatoric in nature. Each microstate is probability-weighted based on its entropy/order. Entropy is a kind of 'information-theoretic weight' property that affects the actualization probability of a microstate based on its location in Pascal's triangle, and it's directly tied to algorithmic complexity (more complex microstates that create unique, meaningful patterns of information are harder to form randomly from scratch than a soupy random cloud state, and thus rarer).
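One way to formalize that weighting, in my own notation (a sketch, grouping the microstates of an n-bit phase space by how many bits are set): the probability of landing in class k, and the Boltzmann entropy of that class, are

```latex
% Sketch: weight and entropy of the k-th class of an n-bit phase space
p(k) = \binom{n}{k}\,2^{-n}, \qquad S(k) = k_B \ln \binom{n}{k}
```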
Measurement happens when a superposition of microstates entangles with a detector, causing an actualized bit of distinction within the sensor's informational state. It's all about the interaction between representational systems and the collapsing of all possible superposition microstates into an actualized, distinct collection of bits of information.
Planck's constant is really about the energy cost of actualizing a definitive microstate of information from a quantum superposition at smaller and smaller scales of measurement in space or time. Distinguishing between two bits of information at a scale smaller than the Planck length, or at a time interval smaller than the Planck time, would cost more energy than the universe allows in one area (it would create a kugelblitz, a pure-energy black hole, if you tried), and so any computational microstate paths that differ from one another by less than a Planck length's worth of distinction are indistinguishable, blurring together and creating a bedrock limit scale for computational actualization.
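The back-of-envelope version of that kugelblitz argument (standard dimensional analysis, not my own result): resolving a distinction at length scale ℓ costs roughly E ~ ħc/ℓ, and at ℓ = ℓ_P that energy's own Schwarzschild radius already reaches the Planck length, so the measurement collapses into a black hole instead of resolving anything.

```latex
% Probe energy at scale \ell, and the horizon that energy creates
E \sim \frac{\hbar c}{\ell}, \qquad
r_s = \frac{2GE}{c^4} \sim \frac{2G\hbar}{c^3 \ell} = \frac{2\,\ell_P^2}{\ell}
% At \ell = \ell_P: r_s \sim \ell_P, so the probe swallows what it measures
```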
But for me, personally, I believe that at some point our own language breaks down because it isn't quite adapted to dealing with these types of questions, which is again in some sense reminiscent of both Gödel and quantum mechanics, if you would allow the stretch. It is undecidability that is the key, the "event horizon" of knowledge as it were.
Language is symbolic representation; cognition and conversation are computations as your neural network traces paths through your activation atlas. Our ability to talk about abstractions is tied to how complex they are to abstract about/model in our mind. The cool thing is language evolves as our understanding does, so we can convey novel concepts or perspectives that didn't exist before.
The Planck length and Planck constant are both ultimate computational constraints on physical interactions: the Planck length is the smallest meaningful scale, the Planck time is the smallest meaningful interval (the universal framerate), and Planck's constant combines the two to tell us about the limits on how fast things can compute at the smallest meaningful distance steps of interaction, and the ultimate bounds on how much energy we can put into physical computing.
The word "meaningful" does a lot of heavy lifting there. What does that mean? Do things which exceed the limit become meaningless? Because I was thinking: since we cannot know the future, the best we could ever do is approach infinitely close to the future, but never actually pass it*, because it hasn't happened yet, and thus is unknowable. Like an event horizon. Not only on the macroscopic but also the microscopic scale, there seems to be a limit beyond which all inquiry, or escape as it were, becomes impossible as a matter of some sort of gravitational geometry.
Again, I want to stress I have no idea; I'm just an enthusiast, so sorry if my musings make no sense. I really appreciate you taking your time with me and all of us to talk about this, so I'm just happy to be here.
Thanks for the cool graphics, they make sense!
*Just like you can't escape the singularity once you're bound. Same same but different!
A transistor flipping a bit is a calculation.
It is an “action”, a passage of time, an action in time, quantified somehow, a single unit of the passage of time? Like a photon is both a wave and a particle but neither and both at the same time? So how does one measure that, and more importantly, what goes on in the immeasurable space meanwhile? Particles jumping in between space and time and shit, everywhere, nowhere. The Universe is a very naughty place and has a gambling problem.
is a calculation
The word "is" is a problematic word. Are things only calculations? Who is doing the calculating? What is calculating what? Calculation implies intention, as far as I am concerned. Or inevitability! Some form of mechanical causal chain that just plays out the movements it is predestined to play out, and free will is an illusion, which it very well might be. I don't know where I was quite going with this; we should start a podcast, I'm sure we'd be great.
The neurons in my brain firing to select the next word in this sentence from the probabilistic sea of all things I could possibly say is a calculation.
I hear ya!
Pascal's triangle directly governs the number of microstates a finite computational phase space can access, as well as the distribution of order and entropy within that phase space.
I see!
Each microstate is probability-weighted based on its entropy/order. Entropy is a kind of 'information-theoretic weight' property that affects the actualization probability of a microstate based on its location in Pascal's triangle
The future can be predicted? To what degree of certainty? Can it be fully predicted?
the Planck length, or at a time interval smaller than the Planck time, would cost more energy than the universe allows in one area
Fair enuf.
Sounds kind of like an event horizon. Like. You know, if you want to travel faster than light, you need infinite energy to get there. Or something like that. I get the feeling that the Universe is simultaneously the largest and the smallest thing that exists. Kind of a weird position to be in when you think about it, because the Universe is infinitely vast, but it is also infinitely small, down to quarks, bosons, or strings, or whatever. Like it’s, well, expanding, elastically, in both directions at once. Would you agree?
actualized bit of distinction
Mhm. Mmyes. Quite so, quite so.
Our ability to talk about abstractions is tied to how complex they are to abstract about/model in our mind.
Cool.
The cool thing is language evolves as our understanding does, so we can convey novel concepts or perspectives that didn't exist before.
Is it enough to express it all? What would that mean? If we knew all the laws, would we be able to perfectly predict the future?