I wish there were a better explanation of the exact problem they were trying to solve. I couldn't understand the problem; if I had, I would have proposed my own solution and then compared it against the proposed thinking process to see whether that process would have worked better for me. But I can't be bothered to follow a thinking process expressed in symbols like this without even knowing what we are solving for.
Was it about how to design a profitable algorithm? Was it about how to design the bot? Was it about understanding whether the results from the bot were beneficial?
I think the author makes a good point about understanding structure over symbol manipulation, but there's a slippery slope here that bothers me.
In practice, I find it much more productive to start with a computational solution - write the algorithm, make it work, understand the procedure. Then, if there's elegant mathematical structure hiding in there, it reveals itself naturally. You optimize where it matters.
The problem is math purists will look at this approach and dismiss it as "inelegant" or "brute force" thinking. But that's backwards. A closed-form solution you've memorized but don't deeply understand is worse than an iterative algorithm you've built from scratch and can reason about clearly.
Most real problems have perfectly good computational solutions. The computational perspective often forces you to think through edge cases, termination conditions, and the actual mechanics of what's happening - which builds genuine intuition. The "elegant" closed-form solution often obscures that structure.
I'm not against finding mathematical elegance. I'm against the cultural bias that treats computation as second-class thinking. Start with what works. Optimize when the structure becomes obvious. That's how you actually solve problems.
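To make that concrete, here is a toy sketch of what I mean (my own example, not from the article; the dice problem is just a stand-in): brute-force the answer computationally first, and the closed form becomes a cross-check rather than the starting point.

    import random

    def expected_rolls_simulated(trials=100_000, seed=0):
        # Computational approach: simulate rolling a die until the first six,
        # then average the number of rolls over many trials.
        rng = random.Random(seed)
        total = 0
        for _ in range(trials):
            rolls = 0
            while rng.randint(1, 6) != 6:
                rolls += 1
            total += rolls + 1  # count the final, successful roll too
        return total / trials

    def expected_rolls_closed_form(p=1 / 6):
        # The "elegant" answer: the mean of a geometric distribution is 1/p.
        return 1 / p

    print(expected_rolls_simulated())    # ~6.0, converges with more trials
    print(expected_rolls_closed_form())  # 6.0 exactly

The simulation forces you to spell out the stopping condition and the averaging; once that's written down, spotting the geometric distribution (and its 1/p mean) is the optimization step, not the price of admission.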
Some people like Peter Norvig prefer top-down; hackers like you and me prefer bottom-up. Many problems can be solved either way, but for some problems, if you use the wrong approach, you're gonna have a bad time. See Ron Jeffries' attempt to solve Sudoku.
The top-down (mathematical) approach can also fail, in cases where there's no existing mathematical solution, or where a perfectly spherical cow isn't an adequate representation of reality. See Minix vs Linux, or OSI vs TCP/IP.
Fair point about problem-fit - some problems do naturally lend themselves to one approach over the other.
But I think the Sudoku example is less about top-down vs bottom-up and more about dogmatic adherence to abstractions (OOP in that case). Jeffries wasn't just using a 'hacker' approach - he was forcing everything through an OOP lens that fundamentally didn't fit the problem structure.
But yes, same issue can happen with the 'mathematical' approach - forcing "elegant" closed-form thinking onto problems that are inherently messy or iterative.
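For anyone who hasn't followed that saga: the underlying structure is plain constraint-satisfaction search. Here is a bare-bones backtracking sketch (mine, for illustration; it's neither Jeffries' OOP attempt nor Norvig's constraint-propagation solver) showing how directly the problem maps to "try a legal digit, recurse, undo on failure":

    def solve(grid):
        # grid: 9x9 list of lists of ints, 0 marks an empty cell; solves in place.
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0:
                    for digit in range(1, 10):
                        if is_legal(grid, r, c, digit):
                            grid[r][c] = digit
                            if solve(grid):
                                return True
                            grid[r][c] = 0  # undo and try the next digit
                    return False  # no digit fits this cell: backtrack
        return True  # no empty cells left: solved

    def is_legal(grid, r, c, digit):
        # Row, column, and 3x3 box constraints.
        if digit in grid[r]:
            return False
        if any(grid[i][c] == digit for i in range(9)):
            return False
        br, bc = 3 * (r // 3), 3 * (c // 3)
        return all(grid[br + i][bc + j] != digit
                   for i in range(3) for j in range(3))

No class hierarchy needed; whether you then add constraint propagation (Norvig's route) or stop here is a separate optimization question.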
I really enjoyed the book Mathematica by David Bessis, who writes about his creative process as a mathematician. He makes a case that formal math is usually the last step to refine/optimize an idea, not the starting point as is often assumed. His point is to push against the cultural idea that math == symbols. Sounds similar to some of what you're describing.
I have math papers in top journals, and that's exactly how I did math:
Just get a proof of the open problem, no matter how sketchy. Then iterate and refine.
But people love to reinvent the wheel without caring about abstractions, resulting in languages like Python becoming the de facto standard for machine learning.
I agree with the thrust of the article but my conclusion is slightly different.
In my experience the issue is sometimes that Step 1 doesn't even take place in a clear cut way. A lot of what I see is:
1. Design algorithms and data structures
2. Implement and test them
Or even:
1. Program algorithms and data structures
2. Implement and test them
Or even:
1. Implement
2. Test
Or even:
1. Test
2. Implement
:-(
IMO, this last popular approach gets things completely backwards. It assumes there is no need to think about the problem beforehand, to identify it, to spend any amount of time thinking about what needs to happen on a computer for that problem to be solved... you just write down some observable behaviors and begin reactively trying to implement them. Huge waste of time.
The point also about "C-style languages being more appealing" is well taken. It's not so much about the language in particular. If you are able to sit down and clearly articulate what you're trying to do, understand the design tradeoffs, which algorithms and data structures are available, which need to be invented... you could do it in assembly if it was necessary, it's just a matter of how much time and energy you're willing to spend. The goal becomes clear and you just go there.
I have an extensive mathematical background and find this training invaluable. On the other hand, I rarely need to go so far as carefully putting down theorems and definitions to understand what I'm doing. Most of this happens subliminally somewhere in my mind during the design phase. But there's no doubt that without this training I'd be much worse at my job.
Reminds me of the attempt to TDD one's way to a Sudoku solver. Agreed that it is a bit of a crazy path.
Not that Implement/Test can't work. As frustrating as it is, "just do something" works far better than many alternatives. In particular, with enough places doing it, somebody may succeed.
> Or even: 1. Test 2. Implement
> IMO, this last popular approach gets things completely backwards. It assumes there is no need to think about the problem beforehand
I think you misunderstand this approach.
The point of writing the tests first is to think about the desired behaviour of the system/module you are implementing, before your mind gets lost in all the complexities that inevitably arise during the implementation.
When you write code and hit a wall, it's super easy to get hyper-focused on solving that one problem and, while doing so, lose the big picture.
Writing tests first can be a way to avoid this, by thinking of the tests as a specification you think you should adhere to later, without having to worry about how you get there.
For some problems, this works really well. For others, it might not. Just don’t dismiss the idea completely :)
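As a minimal sketch of what "tests as a specification" can look like (a made-up example; the function name, its behaviour, and the use of pytest are all just assumptions for illustration):

    import pytest

    # Specification first: these tests pin down what a hypothetical
    # normalize_username() should do, before any implementation exists.
    def test_lowercases_input():
        assert normalize_username("Alice") == "alice"

    def test_strips_surrounding_whitespace():
        assert normalize_username("  bob  ") == "bob"

    def test_rejects_blank_names():
        with pytest.raises(ValueError):
            normalize_username("   ")

    # Only now the implementation, written to satisfy the spec above.
    def normalize_username(raw: str) -> str:
        name = raw.strip().lower()
        if not name:
            raise ValueError("username must not be blank")
        return name

The tests read as a statement of intent; how normalize_username gets there is deliberately left until afterwards.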
> Programming languages are implementation tools for instructing machines, not thinking tools for expressing ideas.
I completely disagree with that assumption.
Any function call that captures logic - e.g. data from real-life systems, drones, or robots in logistics - will often proceed in a logic chain. Sometimes they use a DSL, be it in Rails, but also older DSLs such as the Sierra game logic and other DSLs.
If you have a good programming language, it is basically like "thinking" in that language too. You can also see this in languages such as C, and in the creation of git. Now, I don't think C is a particularly great language for higher abstractions, but the assumption that "only math is valid and any other instruction to a machine is pointless" is simply flat-out wrong. Both are perfectly valid and fine; they just usually operate on different levels. My brain is more comfortable with Ruby than with C, for instance. I'd rather have languages that are like Ruby AND fast than have to adjust down towards C or assembly.
Also the author neglects that you can bootstrap in language xyz to see if a specific idea is feasible. That's what happened in many languages.
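A toy Python sketch of the kind of "logic chain" I read this as describing (all of the robot commands here are made up for illustration; nothing is taken from the article):

    # Toy internal DSL: warehouse-robot commands that read as a chain of intent.
    class Robot:
        def __init__(self, name):
            self.name = name
            self.log = []

        def move_to(self, location):
            self.log.append(f"move to {location}")
            return self  # returning self is what lets the calls chain

        def pick(self, item):
            self.log.append(f"pick {item}")
            return self

        def drop_at(self, location):
            self.log.append(f"drop at {location}")
            return self

    # The call site is close to how you'd phrase the task in your head:
    Robot("r2").move_to("aisle 7").pick("crate 42").drop_at("loading dock")

That's the sense in which a language, or a DSL embedded in it, becomes a thinking tool rather than just a machine-instruction tool.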
That's still a chaotic composition of thoughts, not driven by any identified structure or symmetry of the situation.
Why is a program needed? What constraints led to the existence of that need? Why didn't human interactions need a program, or thinking in math? Why do computers use 0s and 1s? You need to start there and systematically derive the other concepts, which are tightly linked and have a purpose driven by the pre-existing context.
Effort was made to write this article. Deep insight in several statements.
what compels software people to write opinion pieces. like you don't see bakers, mechanics, dentists, accountants writing blog posts like this...
Edit: to everyone responding that there are trade mags - yes SWE has those too (they're called developer conferences). In both categories, someone has to invite you to speak. I'm asking what compels Joe Shmoe SWE to pontificate on things they haven't been asked by anyone to pontificate on.
What an insane statement. What compels anyone to write an opinion piece? They have an opinion and want to share it! Why in god's name should someone have to be invited to share their opinion, on their own website no less.
Mathematicians certainly write volumes of opinion pieces. The article you are complaining about starts from the presumption that software could benefit from more mathematical thinking, even if that doesn't explain broader general trends.
(But I think it does apply more generally. We refer to it as Computer Science; it is often a branch of Mathematics both historically and today, with some universities still considering it part of their Math department. Some of the industry's biggest role models/luminaries often considered themselves mathematicians first or second, such as Turing, Church, Dijkstra, Knuth, and more.)
> Computer Science
do we really have to retread this? unless you are employed by a university to perform research (or another research organization), you are not a computer scientist or a mathematician or anything else of that sort. no more so than an accountant is an economist or a carpenter is an architect.
> The article you are complaining about starts from the presumption that software
reread my comment - at no point did i complain about the article. i'm complaining that SWEs have overinflated senses of self which compel them to write such articles.
> do we really have to retread this? unless you are employed by a university to perform research (or another research organization), you are not a computer scientist or a mathematician or anything else of that sort. no more so than an accountant is an economist or a carpenter is an architect.
Doing math or science is the criterion for being a mathematician or scientist, not who employs you or in what capacity.
Those who do not learn history are doomed to repeat it (poorly). Same for anyone doing software that entirely ignores Computer Science. You are missing core skills and reinventing well known wheels when you could be busy building smarter things.
> no more so than an accountant is an economist or a carpenter is an architect
I know many accountants who would claim you can't be a good accountant without being an economist; arguably that's most of the course load of an MBA in a nutshell. I don't know any carpenter who would claim to be an architect (carpentry usually happens after the architecture has been done), but I know plenty of carpenters who claim to be artists and/or artisans (depending on how you see the difference), who take pride in their craft and understand its aesthetic underpinnings.
> reread my comment - at no point did i complain about the article. i'm complaining that SWEs have overinflated senses of self which compel them to write such articles.
You chose which article to post your complaint to. The context of your comment is most directly complaining about this specific article. That's how HN works. If you didn't read the article and feel like just generically complaining about the "over-inflated senses of self" in the software industry, perhaps you should be reading some forum that isn't HN?
> you don't see bakers, mechanics, dentists, accountants writing things like this...
There are literally industry publications full of these.
Yes and you have to be invited to publish in a place. Meaning at least one other person has to believe your opinion is significant........
> Yes and you have to be invited to publish in a place. Meaning at least one other person has to believe your opinion is significant........
I don't think that this is true. The vast majority of technical math publications, for example, are reviewed, but not invited. And expository, and even technical, math is widely available in fora without any refereeing process (and consequent lack of guarantee of quality).
Accountants certainly do. They've had trade magazines with opinion pieces since well before the internet.
bakers and mechanics have not had their ego stroked by being overpaid for a decade.
Some do get their egos stroked on their shows.
>I'm asking what compels Joe Shmoe SWE to pontificate on things they haven't been asked by anyone to pontificate on.
The Internet is absolutely full of this. This is purely your own bias; for any of the trades you mentioned, try looking. You will find videos, podcasts, and blogs within minutes.
People love talking about their work, no matter their trade. They love giving their opinions.
If they'd had opinion pages at the time, the inventors of nixtamalization would have and should have written something like this.
LOL. Have not seen the "nixta" word since 10 years ago when i was researching how to make grits.
Bakers certainly write books and magazines[0] on baking, as well as interminable stories about their childhood. Mechanics: [1]. I could only find one obvious one for dentists: [2]. Somebody else did accountants in the thread. I think it's a human thing, to want to share our opinions, whether or not they are well supported by evidence. I suspect software people write blogs because the tech is easier for them given their day job.
[0] https://www.google.com/search?q=baking+magazine [1] https://www.google.com/search?q=mechanics+magazines [2] https://dentistry.co.uk/dentistry-magazine-january-2023-digi...
I mean, maybe if your background is mathematics this would make sense. But for a lot of us it isn't, we're more linguistically oriented and we certainly are not going to come up with some pure mathematical formula that describes a problem, but we might describe the problem and break it down into steps and then implement those steps.