"Real" code vs "Fake" code
-
@kshegunov said:
Polymorphism is dead?
Obviously not ;) Sure, there's more that C++ offers over C than just templates, but I'm writing a forum post, not a book :)
They have merit as much as a hammer in a toolbox (...) Purposely substituting a hammer for a screwdriver and vice versa isn't a productive way of approaching things.
I 100% agree and was not suggesting anything like that. Just pointing out that they elegantly solve some specific problems that would be ugly or cumbersome otherwise and it's good to have that hammer in your toolbox for cases where you need to hit a nail.
@fcarney said:
Yet, I don't think I have found any code outside the Qt libs, that have templates in them
Wow, really? Not even a string or a vector? So you're still exclusively using raw arrays and char* ? That's brutal :)
I don't think programming microcontrollers inherently requires you to drop down to lower-level C++. Sure, you can kill performance or skyrocket the memory footprint easily if you don't have a head on your shoulders, but that only happens if you blindly use high-level features without knowing how they work.
I mean, you can program stuff like the Commodore 64 in C++17 these days and it produces pretty sweet assembly, so there's no real excuse not to.
-
@fcarney said in "Real" code vs "Fake" code:
Not sure what version of C++ it was based upon.
No such thing existed before C++98, so - the original version.
-
@Chris-Kawa said in "Real" code vs "Fake" code:
I 100% agree and was not suggesting anything like that.
"A scientist's aim in a discussion with his colleagues is not to persuade, but to clarify." ;)
-
@Chris-Kawa said in "Real" code vs "Fake" code:
Not even a string or a vector
Doh! Okay, I use those all the time. I was thinking of someone writing their own template. We use templates all the time, but we haven't written very many of them.
-
@fcarney said in "Real" code vs "Fake" code:
We use templates all the time, but we haven't written very many of them.
Just wrote one today. A simple one, admittedly, but they're useful in the right places. Here's the "glory":
template <typename T>
typename std::enable_if<std::is_floating_point<T>::value, bool>::type
equal(const T &lhs, const T &rhs)
{
    return std::abs(lhs - rhs)
        <= std::numeric_limits<T>::epsilon() * std::min(std::abs(lhs), std::abs(rhs));
}
I still cringe at the thought of the verbose semi-magic that is meta-programming, and we certainly need a better way.
-
@fcarney said:
We use templates all the time, but we haven't written very many of them.
I think that's actually pretty normal and a good model. They are pretty useful for stable, tested library code intended to be reused, but they do make code harder to read and debug, so, like I mentioned, I don't suggest using them everywhere just because you can. Only where it makes sense.
@kshegunov said:
"A scientist's aim in a discussion with his colleagues is not to persuade, but to clarify." ;)
Narrator: A scientist walks into a bar...
Scientist: Weeeell, not so much a bar as a low class drinking establishment, and it wasn't really a walk, more of an accidental sway...
@kshegunov said:
we certainly need a better way
Rejoice, concepts are on their way ;)
-
@fcarney said in "Real" code vs "Fake" code:
Maybe if it doesn't have the potential to produce a segfault it's not "real" coding? Also, I don't think I am equating "real" with some kind of machismo thing. It's different than that. Perhaps "closer to the metal" is what I am feeling.
I believe the terms you're after are not "real"/"fake", but "low-level"/"high-level".
The way I see it, "real" code is anything you write that makes a machine do something you want. So, high-level Python and QML can be just as "real" as low-level C++ and Assembly.
-
@JKSH said in "Real" code vs "Fake" code:
I believe the terms you're after are not "real"/"fake", but "low-level"/"high-level".
I think that may be more accurate. When I first started programming Python it felt like playing with Legos, while programming in C++ felt like using wrenches, hacksaws, and welders. I remember also feeling like I had all the power I needed with Python, as I could always "write a module in C" if I needed to.
-
@fcarney said in "Real" code vs "Fake" code:
I think that may be more accurate. When I first started programming Python it felt like playing with Legos, while programming in C++ felt like using wrenches, hacksaws, and welders. I remember also feeling like I had all the power I needed with Python, as I could always "write a module in C" if I needed to.
The appeal of high-level languages: It has built-in protections to stop us from shooting ourselves in the foot.
The appeal of low-level languages: It allows us to shoot through the gap between our toes to achieve our goal the way we want.
-
@JKSH said in "Real" code vs "Fake" code:
The appeal of high-level languages: It has built-in protections to stop us from shooting ourselves in the foot.
The appeal of low-level languages: It allows us to shoot through the gap between our toes to achieve our goal the way we want.
Somewhat correct. Even with low-level languages we are quite protected most of the time (e.g. ASLR).
In fact one should realize that the software we write is an onion (cf. the swiss cheese model) - there are layers on top of layers on top of layers. Peeling them off may lead to tears too.
Basically you have the hardware, on top of which you have the firmware (microcode for CPUs and such), on top of which you have the BIOS (which may include the firmware, or at least the tools to manipulate it), on top of which you have the drivers and the OS, on top of which you have services (or daemons), and in parallel, on top of the OS, you have low-level libraries (e.g. libgl), on top of which you have other libraries (like Qt), on top of which you possibly have even higher-level libraries (think your own), on top of which you have applications, on top of which you have (business, manufacturing, etc.) processes, which is the ultimate goal. So if any of those layers rots, pretty soon the whole thing starts to smell. On the other hand, the abstractions prevent you from breaking the whole system just because you made a mistake, which you inevitably are going to make.
The big difference between a high-level and a low-level language is not so much whether you can write high- or low-level software as how much the language exposes of the underlying architecture. That is to say, whether a language allows you to peel off some of those onion layers to get to the lower levels. Note that no language allows you full control all the way down; you can't use C/C++ (or assembly, for that matter) to manipulate the way the prefetcher in the CPU works - your control stops at the binary.
In contrast, JS, as an example, ain't caring for no stinking memory addressing, it hides that in a VM. C (and C++) on the other hand expose it and actually make you deal with that. This is both a blessing and a curse, because in the lower-level language you actually have to deal with some implementation details that aren't necessarily relevant to the task at hand - the high-level languages abstract that away for you. And vice versa - when you actually need to dig in, then the higher level languages are of no use, just because you're behind that impenetrable wall which hides all the assumedly irrelevant details.
It's always a trade-off.
And to conclude this somewhat longish missive:
The appeal of Qt (and by extension C++), for me personally, is that I can really strip away all the crap between me and the CPU if I need the performance, but it also allows me to be somewhat oblivious to the irrelevant details 95% of the time.
-
There was one moment when programming Python that I wished I had a "real" programming language. It was the moment I discovered the GIL and its repercussions for threaded programming. It was such a jarring limitation in my mind, one that a "real" language wouldn't have. However, this is a different definition of "real". As in "toy" vs "tool". I do reach for Python for quite a few tasks, but not for any application that needs serious threading. Yes, there is the multiprocessing library, but if I need workarounds or need to write a module, I might as well start with C++.
-
@fcarney said in "Real" code vs "Fake" code:
this is a different definition of "real" in this case. As in "toy" vs "tool". I do reach for Python for quite a few tasks, but not for any application that needs serious threading.
This makes me think of LabVIEW, which is the primary language at my workplace. People often look down on it and call it a "toy" language simply because coding involves drawing colourful diagrams, and because it has "lab" in its name. However, it trumps all other languages in its ease of creating multi-threaded code.