
"Real" code vs "Fake" code



  • Anybody else have a strange sense of what is "real" code vs what is "fake" code? I am not even sure how to describe the "feeling". For instance, I see C++ programming as real coding, but at the same time feel like QML is less real. When I was programming Python a lot I still felt like Python was less real than C++. Even when I did Python JIT programming there was this sensation of it not being real. So I don't know if it's just a sensation around the syntax or if it's something else. Like compiled vs interpreted. Yet, even when I "compiled" my scripting code it felt less "real".

    Maybe if it doesn't have the potential to produce a segfault it's not "real" coding? Also, I don't think I am equating "real" with some kind of machismo thing. It's different than that. Perhaps "closer to the metal" is what I am feeling.


  • Lifetime Qt Champion

    Hi,

    Sure you can make Python segfault; the reference implementation is written in C ^^

    For QML, I think your feeling comes from the fact that the language is JavaScript based, which for a long time was only used to do funny things on websites but has since evolved.

    The age of a language might also come into play when thinking about "serious" languages, as well as where they are used.



  • @SGaist said in "Real" code vs "Fake" code:

    Sure you can make Python segfault

    The only time I could get it to segfault was when I was writing modules for Python. Generally I could only get exceptions to trigger. Those are much kinder to the operation of the program.


  • Moderators

    You're not the only one that has that feeling and I think it is indeed about being close to the metal. The thing is that when you write in those fancy weby languages you think about a very abstract machine. You deal with objects, properties, bindings, closures and all those things the hardware doesn't know and couldn't care less about. When you write C or the lower level aspects of C++ you are a lot closer to registers, offsets, alignments and manual memory management (no fancy pants garbage collection). When you fire up a C++ debugger you can basically dig into the raw hardware memory and registers. When you debug, say, JavaScript you're presented with some high level wrappers.
    Not to say one is better than the other in general. They are just better at what they are suited for. All software development these days is just layers upon layers upon layers - OSes, APIs, frameworks, runtimes... C++ is just lower level with all good and bad that this entails.
    Look at this from another angle - to someone who writes assembly for living something like

    template <typename T, typename = std::enable_if_t<std::is_integral<T>::value>>
    

    is just a fake gibberish nonsense having nothing to do with "real programming" :)
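
    For readers who do want to decode the "gibberish": that constraint just removes a function from overload resolution unless the type is integral (SFINAE). A minimal self-contained sketch, with an illustrative function name not taken from the thread:

    ```cpp
    #include <cassert>
    #include <type_traits>

    // Only participates in overload resolution when T is an integral type.
    template <typename T, typename = std::enable_if_t<std::is_integral<T>::value>>
    T twice(T value)
    {
        return value * 2;
    }

    int main()
    {
        assert(twice(21) == 42);   // int is integral: compiles and runs
        // twice(1.5);             // double is not integral: would not compile
        return 0;
    }
    ```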



  • @Chris-Kawa said in "Real" code vs "Fake" code:

    is just a fake gibberish nonsense having nothing to do with "real programming" :)

    I am a C++ programmer and I still say "WTF is that?" when seeing templates. Maybe that is because I have programmed microcontrollers and written assembly in the past...

    I am learning templates though. It just doesn't look like C++ to me yet.


  • Moderators

    I am learning templates though. It just doesn't look like C++ to me yet.

    While not there from the beginning, templates have been part of C++ since something like the late eighties. They've been part of all the ISO standardized versions and they are at the very core of the language. Heck, the entire standard library is mostly templates. At this point if you're not familiar with them you're basically using C with classes. Nothing wrong with that, but C++ has a lot more to offer (with very low to no cost mind you). Yes, they take you away from that sweet hardware metal into the compile-time land, but they also give you all sorts of benefits and if you're avoiding them on purpose you're missing out on some cool stuff IMO.
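
    A tiny illustration of the "C with classes" point: without templates you would rewrite the same function once per type, while one template definition serves every type that supports the needed operation. A hedged sketch (the function name is illustrative):

    ```cpp
    #include <cassert>
    #include <string>

    // One definition covers int, double, std::string and anything else with operator<.
    template <typename T>
    const T& maxOf(const T& a, const T& b)
    {
        return (a < b) ? b : a;
    }

    int main()
    {
        assert(maxOf(3, 7) == 7);
        assert(maxOf(2.5, 1.5) == 2.5);
        assert(maxOf(std::string("abc"), std::string("abd")) == "abd");
        return 0;
    }
    ```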



  • @fcarney

    For instance, I see C++ programming as real coding, but at the same time feel like QML is less real. When I was programming Python a lot I still felt like Python was less real than C++. Even when I did python JIT programming there was this sensation of it not being real.

    No offence, but I think you need to speak to a shrink ;)


  • Qt Champions 2017

    @Chris-Kawa said in "Real" code vs "Fake" code:

    At this point if you're not familiar with them you're basically using C with classes.

    Polymorphism is dead? I've heard that before, but there's much more nuance to it than the simple statement suggests. Not that you can't do dynamic dispatch in C, but it's (more) work and is prone to some nasty mishaps.

    Yes, they take you away from that sweet hardware metal into the compile-time land, but they also give you all sorts of benefits and if you're avoiding them on purpose you're missing out on some cool stuff IMO.

    I'd say the opposite is also true - that is - seeking to templatize everything you see. They have merit as much as a hammer in a toolbox - no intrinsic value, but as a part of your general range of tools. Purposely substituting a hammer for a screwdriver and vice versa isn't a productive way of approaching things.



  • @Chris-Kawa said in "Real" code vs "Fake" code:

    missing out on some cool stuff

    I agree. My first C++ compiler was Turbo C++ 2.0. Not sure what version of C++ it was based upon. It was definitely before templates, or at least I didn't know about them. That was in the mid nineties. That is why I am learning them.

    It is interesting though, that quite a few programmers have worked on the codebases at work. Yet, I don't think I have found any code outside the Qt libs that has templates in it. I have written some simple code using templates to reduce code duplication. No idea what the other programmers I work with will make of that. I rarely use macros either.
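
    The "templates instead of macros" instinct is a sound one; here is a hedged sketch (not from the thread) of the classic reason why. A function-style macro evaluates its argument textually, so a side-effecting argument runs more than once, while the equivalent template evaluates it exactly once and is type-checked:

    ```cpp
    #include <cassert>

    // The macro version pastes its argument in twice.
    #define SQUARE_MACRO(x) ((x) * (x))

    // The template version evaluates its argument exactly once.
    template <typename T>
    T square(T x)
    {
        return x * x;
    }

    int calls = 0;
    int next() { ++calls; return calls; }

    int main()
    {
        calls = 0;
        (void)SQUARE_MACRO(next()); // next() is called twice
        assert(calls == 2);

        calls = 0;
        assert(square(next()) == 1); // next() is called once and returns 1
        assert(calls == 1);
        return 0;
    }
    ```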


  • Moderators

    @kshegunov said:

    Polymorphism is dead?

    Obviously not ;) Sure, there's more to C++ over C than templates, but I'm writing a forum post not a book :)

    They have merit as much as a hammer in a toolbox (...) Purposely substituting a hammer for a screwdriver and vice versa isn't a productive way of approaching things.

    I 100% agree and was not suggesting anything like that. Just pointing out that they elegantly solve some specific problems that would be ugly or cumbersome otherwise and it's good to have that hammer in your toolbox for cases where you need to hit a nail.

    @fcarney said:

    Yet, I don't think I have found any code outside the Qt libs, that have templates in them

    Wow, really? Not even a string or a vector? So you're still exclusively using raw arrays and char* ? That's brutal :)
    I don't think programming microcontrollers inherently requires you to drop to the lower level C++. Sure you can kill performance or skyrocket memory footprint easily if you don't have a head on your neck, but that happens only if you blindly use high level features without knowing how they work.
    I mean you can program stuff like the Commodore 64 in C++17 these days and it produces pretty sweet assembly, so there's no real excuse not to.


  • Qt Champions 2017

    @fcarney said in "Real" code vs "Fake" code:

    Not sure what version of C++ it was based upon.

    No such thing existed before C++98, so: the original version.


  • Qt Champions 2017

    @Chris-Kawa said in "Real" code vs "Fake" code:

    I 100% agree and was not suggesting anything like that.

    "A scientist's aim in a discussion with his colleagues is not to persuade, but to clarify." ;)



  • @Chris-Kawa said in "Real" code vs "Fake" code:

    Not even a string or a vector

    Doh! Okay, I use those all the time. I was thinking of someone writing their own template. We use templates all the time, but we haven't written very many of them.


  • Qt Champions 2017

    @fcarney said in "Real" code vs "Fake" code:

    We use templates all the time, but we haven't written very many of them.

    Just wrote one today. A simple one, admittedly, but they're useful in the right places. Here's the "glory":

    template <typename T>
    typename std::enable_if<std::is_floating_point<T>::value, bool>::type equal(const T & lhs, const T & rhs)
    {
        return std::abs(lhs - rhs) <= std::numeric_limits<T>::epsilon() * std::min(std::abs(lhs), std::abs(rhs));
    }
    

    I still cringe at the thought of the verbose semi-magic that is meta-programming, and we certainly need a better way.
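
    For anyone who wants to try the comparator above, here it is restated self-contained with the headers it needs, plus a quick usage check (the test values are illustrative):

    ```cpp
    #include <algorithm>
    #include <cassert>
    #include <cmath>
    #include <limits>
    #include <type_traits>

    // Relative floating-point comparison, enabled only for floating-point T.
    template <typename T>
    typename std::enable_if<std::is_floating_point<T>::value, bool>::type
    equal(const T & lhs, const T & rhs)
    {
        return std::abs(lhs - rhs) <= std::numeric_limits<T>::epsilon() * std::min(std::abs(lhs), std::abs(rhs));
    }

    int main()
    {
        assert(equal(0.1 + 0.2, 0.3));  // true despite binary rounding error
        assert(!equal(0.3, 0.30001));   // a genuinely different value is rejected
        return 0;
    }
    ```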


  • Moderators

    @fcarney said:

    We use templates all the time, but we haven't written very many of them.

    I think that's actually pretty normal and a good model. They are pretty useful for stable and tested library code intended to be reused, but they do make code harder to read and debug, so, like I mentioned, I don't suggest using them everywhere just because you can. Just when it makes sense.

    @kshegunov said:

    "A scientist's aim in a discussion with his colleagues is not to persuade, but to clarify." ;)

    Narrator: A scientist walks into a bar...
    Scientist: Weeeell, not so much a bar as a low class drinking establishment and it wasn't really a walk, more of an accidental sway

    and we certainly need a better way.

    Rejoice, concepts are on their way ;)
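
    For the curious, C++20 concepts let the earlier enable_if gymnastics read almost like plain prose. A hedged sketch (requires a C++20 compiler; the function name and tolerance parameter are illustrative, not from the thread):

    ```cpp
    #include <cassert>
    #include <concepts>

    // "Only for floating-point types", stated directly: no enable_if needed,
    // and a constraint failure produces a readable compiler error.
    template <std::floating_point T>
    bool nearlyEqual(T lhs, T rhs, T tolerance)
    {
        return (lhs < rhs ? rhs - lhs : lhs - rhs) <= tolerance;
    }

    int main()
    {
        assert(nearlyEqual(1.0, 1.0000001, 1e-6));
        assert(!nearlyEqual(1.0f, 2.0f, 0.5f));
        // nearlyEqual(1, 2, 1); // int fails the constraint: clear diagnostic
        return 0;
    }
    ```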


  • Moderators

    @fcarney said in "Real" code vs "Fake" code:

    Maybe if it doesn't have the potential to produce a segfault its not "real" coding? Also, I don't think I am equating "real" with some kind of machismo thing. Its different than that. Perhaps "closer to the metal" is what I am feeling.

    I believe the terms you're after are not "real"/"fake", but "low-level"/"high-level".

    The way I see it, "real" code is anything you write that makes a machine do something you want. So, high-level Python and QML can be just as "real" as low-level C++ and Assembly.



  • @JKSH said in "Real" code vs "Fake" code:

    I believe the terms you're after are not "real"/"fake", but "low-level"/"high-level".

    I think that may be more accurate. When I first started programming Python it felt like playing with Legos. While programming in C++ felt like using wrenches, hacksaws, and welders. I remember also feeling like I had all the power I needed with Python as I could always "write a module in C" if I needed to.


  • Moderators

    @fcarney said in "Real" code vs "Fake" code:

    I think that may be more accurate. When I first started programming Python it felt like playing with Legos. While programming in C++ felt like using wrenches, hacksaws, and welders. I remember also feeling like I had all the power I needed with Python as I could always "write a module in C" if I needed to.

    The appeal of high-level languages: they have built-in protections to stop us from shooting ourselves in the foot.

    The appeal of low-level languages: they allow us to shoot through the gap between our toes to achieve our goal the way we want.


  • Qt Champions 2017

    @JKSH said in "Real" code vs "Fake" code:

    The appeal of high-level languages: It has built-in protections to stop us from shooting ourselves in the foot.
    The appeal of low-level languages: It allows us to shoot through the gap between our toes to achieve our goal the way we want.

    Somewhat correct. Even with low-level languages we are quite protected most of the time (e.g. ASLR).
    In fact one should realize that the software we write is an onion (cf. the Swiss cheese model) - there are layers on top of layers on top of layers. Peeling them off may lead to tears too.

    Basically you have the hardware, on top of which you have the firmware (microcode for CPUs and such), on top of which you have the BIOS (which may include the firmware, or at least the tools to manipulate it), on top of which you have the drivers and the OS, on top of which you have services (or daemons), and in parallel, on top of the OS you have low-level libraries (e.g. libGL), on top of which you have other libraries (like Qt), on top of which you possibly have even higher level libraries (think your own), on top of which you have applications, on top of which you have (business, manufacturing etc.) processes, which is the ultimate goal. So if any of those layers rots, pretty soon the whole thing starts to smell. On the other hand the abstractions prevent you from just breaking the whole system because you made a mistake, which you inevitably are going to make.

    The big difference between a high-level and a low-level language is not so much whether you can write high or low level software, as how much the language exposes of the underlying architecture. That is to say - whether a language allows you to peel off some of the onion layers to get to the lower levels. Note that there's no language that allows you full control all the way; you can't use C/C++ (or assembly for that matter) to manipulate the way the prefetcher in the CPU works - your control stops at the binary.
    In contrast, JS, as an example, ain't caring for no stinking memory addressing, it hides that in a VM. C (and C++) on the other hand expose it and actually make you deal with that. This is both a blessing and a curse, because in the lower-level language you actually have to deal with some implementation details that aren't necessarily relevant to the task at hand - the high-level languages abstract that away for you. And vice versa - when you actually need to dig in, then the higher level languages are of no use, just because you're behind that impenetrable wall which hides all the assumedly irrelevant details.
    It's always a trade-off.

    And to conclude that somewhat longish missive:
    The appeal of Qt (and by extension C++), for me personally, is that I can if needed really strip away all the crap in between me and the CPU if I need the performance, but it also allows me to be somewhat oblivious about the irrelevant details 95% of the time.



  • There was one moment when programming Python that I had wished I had a "real" programming language. It was the moment I discovered the GIL, and the repercussions for threaded programming. It was such a jarring limitation in my mind that a "real" language wouldn't have. However, this is a different definition of "real" in this case. As in "toy" vs "tool". I do reach for Python for quite a few tasks, but not for any application that needs serious threading. Yes, there is the multiprocessing library, but if I need workarounds or need to write a module I might as well start with C++.
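
    For contrast, a minimal C++ sketch of the kind of unconstrained threading the GIL rules out for CPU-bound Python code: two threads genuinely working in parallel on the two halves of a range, with no interpreter lock serializing them (example is illustrative, not from the thread):

    ```cpp
    #include <cassert>
    #include <cstdint>
    #include <numeric>
    #include <thread>
    #include <vector>

    // Sum the two halves of a vector on separate OS threads.
    int64_t parallelSum(const std::vector<int>& data)
    {
        int64_t left = 0, right = 0;
        const auto mid = data.begin() + data.size() / 2;
        std::thread t1([&] { left  = std::accumulate(data.begin(), mid, int64_t{0}); });
        std::thread t2([&] { right = std::accumulate(mid, data.end(), int64_t{0}); });
        t1.join();
        t2.join();
        return left + right;
    }

    int main()
    {
        std::vector<int> data(1000);
        std::iota(data.begin(), data.end(), 1); // 1..1000
        assert(parallelSum(data) == 500500);
        return 0;
    }
    ```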


  • Moderators

    @fcarney said in "Real" code vs "Fake" code:

    this is a different definition of "real" in this case. As in "toy" vs "tool". I do reach for Python for quite a few tasks, but not for any application that needs serious threading.

    This makes me think of LabVIEW, which is the primary language at my workplace. People often look down on it and call it a "toy" language simply because coding involves drawing colourful diagrams, and because it has "lab" in its name. However, it trumps all other languages in its ease of creating multi-threaded code.

