Part of what’s going on here is that math notation is … not good. Not for understanding, readability, or explanation. Add in the prestige that surrounds being “good at math” and being able to “read that stuff”, and you get an unhealthy amount of gatekeeping.
Whenever I’ve found someone break down a set of equations into computer code, it has been a wonderfully clarifying experience. And I think it goes beyond just being better at code or something. Computer code, more often than not, is less forgiving about what exactly is going on in the system. Maths, IME, often leaves some ambiguity or makes some presumption in the style of “oh, of course you’d need to do that”, whereas if you’re going to write a program, it all needs to be there, explicitly.
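For example (a toy Python sketch; the list and values are my own placeholders): the notation Σ a_i quietly assumes the total starts at zero and that an empty sum is 0, while a program has to spell that out:

a = [3, 1, 4, 1, 5]   # stands in for a_1, ..., a_n
total = 0             # the step the notation leaves implicit (an empty sum is 0)
for a_i in a:
    total += a_i      # total ends up as 14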
I recommend Bret Victor’s stuff on this: Kill Math
It’s absolutely the opposite for me. Mathematical language is extremely good at summarizing extremely complex logic in a few lines. We have huge ML projects with a looot of logic that can be summarized in either 10 lines of math or 100 lines of English with overwhelming cognitive complexity.
Math is the best language we have for logic.
This meme is the proof: the representation on the left is more concise and clearer than the for loop, and therefore makes it easy to represent much more complex logic, while for loops quickly become unreadable (map and reduce, for instance, are more readable).
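(For instance, in Python, a sketch with a hypothetical list xs:)

from functools import reduce
from operator import add

xs = [1, 2, 3, 4]
total = reduce(add, xs)   # fold the list with +; same result as sum(xs), no index bookkeeping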
Except you know what the left side is saying already. That’s a language, and a phrase in that language, that you know well. What about people who don’t?
And what about the people who don’t understand a for loop, who are the strong majority? You are underestimating the complexity of language-specific programming syntax. What are for, i++, {}, and *= to the uninitiated?
Code is something you can play with in a variety of environments and get an intuition for through experimentation. It’s a system you can inspect, manipulate, and explore yourself. It’s also a language you can use yourself, creating your own abstractions incrementally if helpful.
Ideally, better syntax would be used for learning, a Python-style rather than a C-style for loop for example, in which case you’re starting to get toward natural language.
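For example, a sketch of a Python-style loop (numbers is a placeholder of mine):

numbers = [2, 4, 6]
total = 0
for x in numbers:   # reads almost as English: “for each x in numbers”
    total += x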
In the end, though, I’m not putting programming languages on a pedestal here. I’m saying mathematical notation ought not be thought of as the only way to express and understand mathematical ideas, especially for the purposes of learning, on which some of the points I made just above are relevant, as is the link I provided further up. Whatever efficiency it brings, it also brings opacity, and it’s inevitably useful to be prepared to provide other expressions of the equations or systems.
The difference is, a for loop is one of the first things any new programmer learns. Anybody with any programming experience can understand the examples on the right, as they follow the same C-like syntax used by a majority of modern programming languages. Kids used to figure this stuff out just learning to customize their MySpace pages.
Few people learn what the symbols on the left mean before they start taking higher math courses in high school, nearly 10 years into their formal math education.
This isn’t to say one way is better than the other; both examples are built for different use-cases. But I wouldn’t be surprised if, in 2023, there are more people alive who understand for loops than sigma notation.
Few people? They are high school level where I grew up. Literally everyone with a high school diploma must understand at least the sum. In many high schools the final math exam must include at least one integral, which is the infinitesimal sum.
Programming, on the other hand, isn’t taught in most schools.
Programming is taught in cheap educational microcontroller kits aimed at 12-year-olds. You can find them in the STEM section of just about any toy store. The idea that few people ever learn to code before calculus seems crazy to me; most of my peers were at least writing simple scripts by middle school. That’s because programming is much more easily self-taught than other STEM subjects, and can be applied to the more immediate, everyday problems that kids who grew up with computers might want to solve.
I’m not saying everyone learns to code before they learn higher math. I am saying that you shouldn’t be surprised that the comparisons in the OP have proven popular and insightful for a lot of people, because there are a lot of us who learned to code first.
“They are high school level where I grew up. Literally everyone with a high school diploma must understand at least the sum.”
My school district in Utah did not require any math credits beyond Algebra 2 at the time I graduated. Trig and calculus were classes I took because I wanted to. But Utah’s STEM requirements are woefully inadequate in my book.
It would really, really suck if we had to do math with for loops instead of sigma notation. It’s extremely common. It’s also just not that hard to learn. For example, you can look at this meme and pretty much know how to read it.
Concise, yes. Clearer, definitely not.
It’s funny: with the increase in use of numerical models, so much math has been turned into computer code. Derivatives and integrals, too, are defined by finite-difference formulas that serve as the basis for the notation. The point of the notation isn’t to explain; it’s to simplify writing and reading. I agree it can be a bit obtuse, but if you had to write out a for loop to solve a math equation every time, it would take forever lol
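For instance, the limit definition f'(x) = lim as h→0 of (f(x+h) − f(x))/h becomes a one-liner once you fix a small finite h (a sketch; the helper name is mine):

def derivative(f, x, h=1e-6):
    # forward difference: the limit definition of f'(x), with h small but finite
    return (f(x + h) - f(x)) / h

# e.g. derivative(lambda t: t * t, 3.0) gives roughly 6.0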
Well this is where the computing perspective comes in.
Programming culture has generally learnt over time that the ability to read code is important and that the speed/convenience of writing ought to be traded off, to some extent, for readability. Opinions will vary from programmer to programmer and paradigm/language etc. But the idea is still there, even for a system whose purpose is to run on a computer and work.
In the case of mathematical notation, how much is maths read for the purposes of learning and understanding? Quite a lot, I’d say. So why not write it out as a for loop in a text/book/paper that is going to be read by many people, potentially many times?!
If mathematicians etc. need a quick shorthand, I think human history has shown that shorthands are easily invented when needed, and that we ought not worry about such a thing … it will come when needed.
Actually, programs are much less readable than the corresponding math representation, even in an example as simple as a for loop. Code is known to add cognitive complexity quickly, while mathematical language manages to keep complexity understandable.
Have you tried reading how a matrix-matrix multiplication is implemented with for loops? Compare it with the mathematical representation to see what I mean.
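To illustrate the comparison (a minimal sketch in plain Python; the function name and toy layout are mine):

def matmul(A, B):
    # C[i][j] = sum over k of A[i][k] * B[k][j], i.e. the whole math definition
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):            # rows of A
        for j in range(p):        # columns of B
            for k in range(m):    # the summation index the sigma hides
                C[i][j] += A[i][k] * B[k][j]
    return C

versus the one-line definition c_ij = Σ_k a_ik b_kj.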
The success of Fortran, Mathematica, R, numpy, pandas, and even functional programming comes from their being built to bring programming closer to the simplicity of math.
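For example, with numpy the triple loop sketched above collapses back to something that reads like the math:

import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
C = A @ B   # matrix product: reads like the textbook C = AB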
Well, I think there’s a danger here of conflating abstraction with mathematical notation. Code, whether Fortran, C, or numpy, is capable of abstraction just as mathematics is. Abstraction can help bring complexity under control. But what happens when you need to understand that complexity because you haven’t learnt it yet?
Now sure, writing a program that will actually work and perform well adds an extra cognitive load. But I’m talking more about procedural pseudocode being written for the purpose of explaining to those who don’t already understand.
Math is the language developed exactly for that, to be an unambiguous, standard way to represent extremely complex, abstract concepts.
In the example above, both the summation and the for loop are simply
a_1 + a_2 + ... + a_n
Math is the language to explain; programming languages are for implementing it in a way that computers can execute. In a real-world scenario it’s more often
sum(x)
or
x.sum()
as a for loop is less readable (and often unoptimized).
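Concretely (a Python sketch, assuming numpy; x here is just an example array):

import numpy as np

x = np.arange(1, 101)   # 1, 2, ..., 100
total = x.sum()         # or sum(x); reads like the sigma
# the loop version: more lines, more mutable state
total = 0
for value in x:
    total += value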
If someone doesn’t know math, they can do the same as those who don’t know programming: learn it.
The learning barrier of math is actually lower than that of programming.
Using for loops instead of sigma notation would be almost universally awful for readability.
I agree. Mathematical notation is often terribly opaque. And sometimes outright broken. Why the hell is it sin²(x)? Any reasonable programmer will tell you that this syntax will only lead to trouble. ;)
What’s wrong with sin^2(x)?
Putting an exponent on a function symbol like that usually means an ordinary power, except when it’s -1, in which case it’s a functional inverse. sin^(-1)(x) is the functional inverse of sin(x), which is not the same as the reciprocal (sin(x))^(-1). Some people even use sin^(a)(x), where a is an integer, to denote functional composition, so sin^(2)(x) = sin(sin(x)).
Besides that pretty major issue, nothing.
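To spell out that ambiguity in code (a quick Python sketch of the competing readings):

import math

x = 0.5
math.sin(x) ** 2        # sin^2(x): the square, the common convention
math.asin(x)            # sin^-1(x): the inverse function, arcsin
1 / math.sin(x)         # (sin(x))^-1: the reciprocal, i.e. csc(x), something else entirely
math.sin(math.sin(x))   # sin^2(x) under the composition convention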