The nature of algorithms and mathematics



  • [EDIT: Forked from https://forum.qt.io/topic/90275/i-have-5-years-of-active-experience-in-both-qt-and-labview-ama-featuring-a-lengthy-prologue -- JKSH]

    @kshegunov said in I have >5 years of active experience in both Qt and LabVIEW. AMA. (featuring a lengthy prologue):

    We don't think or work in anything like a way suitable for computer programs, specifications, or algorithms.

    It's strange you say that. We (people) invented the very concept of algorithm - a sequence of atomic steps to be taken to achieve a goal. How does this not translate to everyday life, or regular thinking?

    OOI, I don't think we "invented" algorithms, I think we discovered them. Cf. mathematics.

    I don't think it is anything like "everyday life, or regular thinking". Humans do not perform "a sequence of atomic steps", but computers do. We jump all over the place, we're not at all rigorous, we implicitly use common-sense & real-world knowledge all the time without even realising that. That's why computers are better than us at multiplying, while we're better (so far) at recognising people.

    The computers of tomorrow will be doing their own "programming", not us.

    The media and people constantly talk about that nonsense, but it hasn't happened yet. And to be completely blunt, it probably won't happen in the current century ...

    Well, it's true that pundits have been saying this for far too long, and it hasn't happened yet. But I sense this really is beginning to change. We have had a million years of practice, and computers have had only 50. I believe it will happen in the current century.


  • Qt Champions 2017

    @JonB said in I have >5 years of active experience in both Qt and LabVIEW. AMA. (featuring a lengthy prologue):

    OOI, I don't think we "invented" algorithms, I think we discovered them. Cf. mathematics.

    We invented the concept - the structured abstract idea of an algorithm. As for mathematics - we invented that too - an abstract and symbolic representation, a knife to dissect reality with and see what's inside.

    I don't think it is anything like "everyday life, or regular thinking". Humans do not perform "a sequence of atomic steps", but computers do.

    Oh, but we do. We just don't usually think about it. :)
    Say you want to drive to the gym. You don't just materialize there, do you? You get in the car and think about the shortest/fastest route. You think about possible obstructions (traffic jams) and then drive there. And driving itself is just an algorithm, so much so that the clutch and gear-stick handling is even done in the background; you don't really think about it.

    We jump all over the place, we're not at all rigorous, we implicitly use common-sense & real-world knowledge all the time without even realising that.

    Rigor is quite a sticky concept. What does rigor consist in? That you can put a number on the time it takes you to drive to the gym? Well, you can, based on common sense and real-world knowledge (or what's simply called "past experience"). If you think about it, you can estimate pretty much any drive path's time if you consider the factors. And what we do, that computers ordinarily don't: we use heuristics based on our past experience, which allow us to judge fairly correctly even when we have incomplete information. That's both pretty great and quite terrible at times ... ;)

    That's why computers are better than us at multiplying, while we're better (so far) at recognising people.

    Nope, that's because they were designed to multiply. You're specialized in different things. Consider how hard a computer finds image processing and feature analysis. You can do it in an instant, without even thinking about it. It just happens because of the billions of years of evolution that drove that visual cortex of yours to specialize in that task - to get the signals coming from the eyes and process them with incredible speed.

    But I sense this really is beginning to change recently. We have had a million years practising, and computers have had only 50. I believe it will happen in the current century.

    Maybe, maybe not, but it's a matter of speculation at this point.



  • @kshegunov said in I have >5 years of active experience in both Qt and LabVIEW. AMA. (featuring a lengthy prologue):

    We'll have to agree to disagree about much of this :)

    As for mathematics - we invented that too - an abstract and symbolic representation, a knife to dissect reality with and see what's inside.

    There is much difference of opinion as to whether math is "invented" or "discovered". Plato would not agree with you! To me, the area of a circle is discovered and not invented, and that's what math is.


  • Qt Champions 2017

    @JonB said in I have >5 years of active experience in both Qt and LabVIEW. AMA. (featuring a lengthy prologue):

    We'll have to agree to disagree about much of this

    I have no problem with this, but I don't mind arguing about it either. :)
    I've been told I'm an INTP personality type, and if you believe in psychology ("believe" being the key word), then I probably fall into that classification.

    @JonB said in I have >5 years of active experience in both Qt and LabVIEW. AMA. (featuring a lengthy prologue):

    There is much difference of opinion as to whether math is "invented" or "discovered". Plato would not agree with you! To me, the area of a circle is discovered and not invented, and that's what math is.

    Well, Plato would be wrong then! ;P
    The area of a circle is very real (although a circle isn't that much of a real object); however, quantifying the area through math is quite invented ... ;)
    And it gets worse the more deeply into math you get. At some point you're thinking more about abstract concepts, than real-world phenomena ...



  • @kshegunov
    I only have limited time to argue, as I really need to do my work!

    For this debate, that's why I originally wrote:

    OOI, I don't think we "invented" algorithms, I think we discovered them. Cf. mathematics.

    See the "think"? I meant, it's my opinion, in a debated area. You seem to be telling me what the case is... :)

    The area of a circle is very real (although a circle isn't that much of a real object); however, quantifying the area through math is quite invented ... ;)
    And it gets worse the more deeply into math you get. At some point you're thinking more about abstract concepts, than real-world phenomena ...

    • What do you mean by "real"?
    • To me, the symbolism used in math (like area = pi * r^2) is indeed invented, but the fact is discovered. I don't care how you might or might not represent this, it remains the case that the area is what it is. And that's what I see math as.
    • "than real world phenomena" math (Pure, at least) has never been about "real-world phenomena". As soon as that's the question, it's Physics or maybe Applied Math. There are no "real world" points, lines or circles, by definition.

    To me, when someone sat down and wrote an algorithm to sort some numbers they were discovering that that sequence of steps delivers the correct result, like pure math and circles. The implementation was an invention.
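    (A concrete illustration of that last point, as a purely hypothetical Python sketch: the insertion-sort recipe below is a human-written artifact, while the property that it always returns a sorted copy of its input is something one proves — i.e. discovers — about it.)

    ```python
    def insertion_sort(items):
        """A minimal insertion sort. The *steps* are a human-written recipe,
        but the fact that they always yield a sorted list is a provable
        (discoverable) property, independent of any particular implementation."""
        result = list(items)
        for i in range(1, len(result)):
            key = result[i]
            j = i - 1
            while j >= 0 and result[j] > key:
                result[j + 1] = result[j]  # shift larger elements one slot right
                j -= 1
            result[j + 1] = key
        return result

    print(insertion_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
    ```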


  • Qt Champions 2017

    @JonB said in I have >5 years of active experience in both Qt and LabVIEW. AMA. (featuring a lengthy prologue):

    @kshegunov
    I only have limited time to argue, as I really need to do my work!

    Indeed.

    See the "think"? I meant, it's my opinion, in a debated area. You seem to be telling me what the case is... :)

    True, but exploring that peculiarity will spur yet another argument, and I think I already hijacked JKSH's thread ...

    • What do you mean by "real"?

    Tangible. Something that can be measured, observed or otherwise quantified physically.

    • To me, the symbolism used in math (like area = pi * r^2) is indeed invented, but the fact is discovered. I don't care how you might or might not represent this, it remains the case that the area is what it is. And that's what I see math as.

    Math (in this case geometry) is based on axioms, so yeah, it's invented ... :)
    No one could prove that parallel lines don't cross, nor can the opposite be proven. It's necessary for those fundamental axioms not to lead to a contradiction, but this doesn't mean they form a complete set, or even that they're correct. That's one of the reasons I often state that QM is actually a mathematical discipline (it has 5 axioms too).

    As a matter of fact, there are different representations for area depending on the chosen metric, of which the most commonly known in programming is the taxicab metric. :)
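    (A rough Monte Carlo sketch of that remark, in Python, purely illustrative: the set of points within distance 1 of the origin has area pi under the Euclidean metric, but area 2 under the taxicab metric.)

    ```python
    import random

    def disc_area(radius, metric, trials=200_000, seed=42):
        """Monte Carlo estimate of the area of {p : metric(p) <= radius},
        sampling points uniformly in the bounding square."""
        rng = random.Random(seed)
        side = 2 * radius
        hits = sum(
            1
            for _ in range(trials)
            if metric(rng.uniform(-radius, radius),
                      rng.uniform(-radius, radius)) <= radius
        )
        return side * side * hits / trials

    def euclidean(x, y):
        return (x * x + y * y) ** 0.5

    def taxicab(x, y):
        return abs(x) + abs(y)

    print(disc_area(1.0, euclidean))  # roughly pi (about 3.14)
    print(disc_area(1.0, taxicab))    # roughly 2: the taxicab "circle" is a square of area 2r^2
    ```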

    • "than real world phenomena" math (Pure, at least) has never been about "real-world phenomena". As soon as that's the question, it's Physics or maybe Applied Math. There are no "real world" points, lines or circles, by definition.

    Before people even considered treating physics as a separate subject, it was so intertwined with math that they were practically the same thing (and to an extent that's still the case). Math as a subject evolved from the need to describe the mentioned phenomena in some abstract, symbolic way.

    To me, when someone sat down and wrote an algorithm to sort some numbers they were discovering that that sequence of steps delivers the correct result, like pure math and circles. The implementation was an invention.

    Okay, I'll let that particular point simmer a bit.



  • @kshegunov said in I have >5 years of active experience in both Qt and LabVIEW. AMA. (featuring a lengthy prologue):

    Math (in this case geometry) is based on axioms, so yeah, it's invented ... :)

    No, it just isn't, IMHO! A light bulb is an invention; the area of a circle/triangle is a discovery. A light bulb wouldn't exist if someone hadn't sat down and built one, but a circle's area would still be what it is whether people or axioms did or did not exist. I really don't think we're going to agree here :)


  • Moderators

    @kshegunov said in The nature of algorithms and mathematics:

    As for mathematics - we invented that too - an abstract and symbolic representation, a knife to dissect reality with and see what's inside.

    @JonB said in The nature of algorithms and mathematics:

    the fact is discovered. I don't care how you might or might not represent this, it remains the case that the area is what it is. And that's what I see math as.

    IMHO, much of this debate/argument here stems from the fact that you both define mathematics somewhat differently from each other. @kshegunov is talking about the broad mathematical tools (which are overwhelmingly diverse), while @JonB is talking about specific theorems/facts that were derived through mathematical studies.

    I agree with @kshegunov that humans invented ("designed", "developed", "formalized") a multitude of mathematical tools. For example, the concept and notation of powers and exponents as "repeated multiplication" is a human creation.

    I also agree with @JonB that humans discovered ("derived", "proved") a multitude of mathematical theorems. For example, after formalizing power functions and exponential functions, humans put these tools to use -- for example, in making money via Compound Interest. We experimented with these tools and consequently discovered the value of Euler's Number.
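    (The compound-interest route to Euler's number can be sketched in a few lines of Python, purely illustrative: compounding $1 at 100% nominal interest ever more frequently doesn't grow without bound — it converges to e.)

    ```python
    import math

    def compound(principal, rate, periods):
        """Value of `principal` after one year at nominal annual `rate`,
        compounded `periods` times."""
        return principal * (1 + rate / periods) ** periods

    # Compounding more and more often approaches Euler's number.
    for n in (1, 12, 365, 1_000_000):
        print(n, compound(1.0, 1.0, n))
    print(math.e)  # 2.718281828...
    ```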

    I see Mathematics as an ongoing dance between these two: Inventions facilitate new discoveries; discoveries facilitate new inventions. Do you guys find this agreeable?

    • Some examples of mathematical inventions: Algebra, the factorial, calculus
    • Some examples of mathematical discoveries: The value of pi, the relationship between angles in a polygon

    To me, when someone sat down and wrote an algorithm to sort some numbers they were discovering that that sequence of steps delivers the correct result, like pure math and circles. The implementation was an invention.

    Do you think the SHA-3 algorithm (a.k.a. a special case of the Keccak algorithm) was invented or discovered? In this case, there is no "correct result" that the developers were trying to find.
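    (For the curious, Python's standard `hashlib` module exposes SHA-3, which makes the point concrete: the digest is simply whatever the designed permutation produces, not a pre-existing "correct" value waiting to be found.)

    ```python
    import hashlib

    # SHA-3 deterministically scrambles input into a fixed-size digest; there is
    # no pre-existing "correct answer" the output must match, only design
    # properties (avalanche effect, collision resistance) chosen by its authors.
    a = hashlib.sha3_256(b"invented").hexdigest()
    b = hashlib.sha3_256(b"invented!").hexdigest()
    print(a)
    print(b)  # one extra byte of input yields a completely different digest
    ```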

    @JonB said in The nature of algorithms and mathematics:

    I don't think it is anything like "everyday life, or regular thinking". Humans do not perform "a sequence of atomic steps", but computers do. We jump all over the place, we're not at all rigorous, we implicitly use common-sense & real-world knowledge all the time without even realising that.

    Yes, computer scientists have realized the benefits of the human approach. Hence the development of neural networks.

    That's why computers are better than us at multiplying, while we're better (so far) at recognising people.

    Computers are catching up fast! (Some would even say they've surpassed us.) A human couldn't realistically have done this: http://www.bbc.com/news/world-asia-china-43751276

    @kshegunov said in The nature of algorithms and mathematics:

    @JonB said in I have >5 years of active experience in both Qt and LabVIEW. AMA. (featuring a lengthy prologue):

    See the "think"? I meant, it's my opinion, in a debated area. You seem to be telling me what the case is... :)

    True, but exploring that peculiarity will spur yet another argument, and I think I already hijacked JKSH's thread ...

    Eh, don't let me stop you! In any case, this is a standalone thread now :)


  • Moderators

    I find math such an abstract thing in the first place.

    But I would represent the opinion, that nearly everything in mathematics is invented. Based on the following:

    We humans decided to define, what is

    • Addition
    • Subtraction
    • Multiplication
    • Division
    • the inverse operation
    • the Identity element
    • etc

    These are the basics that define math as we know it, and those are definitely inventions by humans.
    Also, every constant and relation is highly dependent on the numeral system we use, which happens to be an abomination of base 12 (time, dates, angles ...) and base 10.

    In my opinion, discoveries made within an invented system of relations are basically still inventions 😊



  • @J.Hilk
    I seem to be a lone voice, so I'll keep it very brief, but I cannot help but comment...

    I so strongly do not agree that, e.g., "addition" is:

    definitely inventions by humans

    ... Of course the symbols or the numeric system (like base 10/12), or the way we measure/express angles (degrees, radians), are indeed human inventions. But a triangle has 3 angles whose magnitudes add up to the same magnitude as the angle of a straight line, whether we care to express it, or even notice it, or not. The Sun has a volume which depends only on its radius, whether humans sit on Earth and contemplate it or humans never evolved. When you start with one apple, and another apple falls from a tree, there are more apples than there were to start with, even if the only life on Earth turns out to be fruit. In short, there is no invention here, only discovery...
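    (The triangle claim is easy to check numerically. The Python sketch below, purely illustrative, computes the interior angles of an arbitrary triangle via the law of cosines and finds they always sum to a straight angle, i.e. pi radians.)

    ```python
    import math
    import random

    def angle_sum(pts):
        """Sum of the interior angles of the triangle with vertices `pts`,
        computed from the side lengths via the law of cosines."""
        p, q, r = pts
        a = math.dist(q, r)  # side opposite vertex p
        b = math.dist(p, r)  # side opposite vertex q
        c = math.dist(p, q)  # side opposite vertex r
        A = math.acos((b * b + c * c - a * a) / (2 * b * c))
        B = math.acos((a * a + c * c - b * b) / (2 * a * c))
        C = math.acos((a * a + b * b - c * c) / (2 * a * b))
        return A + B + C

    rng = random.Random(0)
    tri = [(rng.uniform(-10, 10), rng.uniform(-10, 10)) for _ in range(3)]
    print(angle_sum(tri), math.pi)  # equal, up to floating-point error
    ```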

    I guess I'll have to plough my lonely furrow in my understanding/beliefs compared to you Qt guys... :)

