Optimization isn't always about multi-threading and maximizing hardware utilization. In fact, most performance work is about simply doing less. Tantan's video...
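A minimal Python sketch (mine, not from the video, with made-up data and sizes) of what "doing less" usually looks like in practice: the faster version below isn't cleverer, it just stops repeating work it has already done.

```python
import random

ids = [random.randrange(1_000_000) for _ in range(10_000)]
queries = [random.randrange(1_000_000) for _ in range(10_000)]

# Naive: scans the whole list for every query, O(n) per lookup.
hits_slow = sum(1 for q in queries if q in ids)

# "Doing less": build a set once, then each lookup is O(1) on average.
id_set = set(ids)
hits_fast = sum(1 for q in queries if q in id_set)

assert hits_slow == hits_fast
```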
Now, I am just a modder and not a full-blown dev or anything, but I’ve always questioned others who criticized my scripts and suggested much more complicated ways of doing the same thing. Like I can do exactly what I wanted with 1 line of code, and someone would come in and say “do it this way for better results” and it’s 6 lines of crap that ends up working exactly the same. Why?! Especially when this was for a game that has a notoriously slow script engine, meaning more lines of code = slower, no matter what you were doing.
Generally the performance difference will be minimal, but the benefit to others (and your future self) of keeping the code’s functionality clear and readable is much more important, especially in a professional setting.
A lot of programmers do have this ‘code golf’ mentality that fewer lines == efficient, but unless it’s a bottleneck and you’ve benchmarked it to be significantly faster, code readability should always trump performance.
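To make that concrete, here’s a hypothetical Python example (the names and data are invented): a golfed one-liner and a longer version that produce the same result, where the longer one is the one you’d actually want to read six months from now.

```python
orders = [("alice", 30.0), ("bob", 12.5), ("alice", 7.5), ("carol", 20.0)]

# One-liner: short, but you have to unpack it mentally.
totals_golfed = {c: sum(a for cc, a in orders if cc == c) for c, _ in orders}

# Longer version: more lines, same result, and the intent reads at a glance.
totals = {}
for customer, amount in orders:
    totals[customer] = totals.get(customer, 0.0) + amount

assert totals == totals_golfed
```

Incidentally, the one-liner also rescans the whole list once per customer, so the shorter version does more work on top of being harder to read.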
Less code is not a positive metric to measure your implementation by, and “it’s shorter” is not a justification on its own. Increased complexity (again, LOC is not an indicator of complexity), worse performance, and a harder debugging experience are common results of that mentality: things that make software worse.
Not all one-liners are bad ofc, that’s not the argument I’m making. It’s about the mentality that less code is more good, where poor decisions are made on a flawed premise.
Fewer lines doesn’t automatically mean the CPU does less. I can write a one-liner in Ruby that will munch through a thread. But in general you’re correct imo
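The comment above says Ruby; here’s the same point as a rough Python sketch, a single statement that will happily keep a core busy:

```python
# One line, roughly 25 million inner iterations in pure Python.
# Make the ranges bigger and it will munch a thread for as long as you like.
total = sum(a * b for a in range(5_000) for b in range(5_000))
print(total)
```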
My comment though was more that performance isn’t just about CPU time. Development time is another performance metric, as is maintenance; overly complicated abstractions can earn you big-brain points but rarely anything else
If you solve only the problem you actually have, with less code (some sort of example-driven development may be of service here), then you’re on the road to better everything
That single line of code may be using a slow abstraction, may not cover edge cases, may not cache reused values, may not optimize for the common path, or may have any number of other issues. The result is code that’s slower, more fragile, or sometimes not even solving the problem it’s meant to solve.
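A hypothetical sketch of that in Python (all names invented): the one-liner rescans the data on every call and crashes on a missing id, while the longer version caches an index up front, takes a cheap common path, and handles the edge case.

```python
users = [{"id": i, "name": f"user{i}"} for i in range(50_000)]

# One-liner: rescans the whole list on every lookup, IndexError on a missing id.
def find_name_short(user_id):
    return [u["name"] for u in users if u["id"] == user_id][0]

# Longer version: index built once and reused, missing ids handled explicitly.
_by_id = {u["id"]: u for u in users}      # cached reused value

def find_name(user_id):
    user = _by_id.get(user_id)            # O(1) common path
    if user is None:                      # edge case the one-liner crashes on
        return None
    return user["name"]

print(find_name(42), find_name(-1))       # user42 None
```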
More often than not, performance and robustness come at a significant increase in the amount of code you have to write in high-level languages… performance optimizations especially.
A high-performance parser I was involved in writing was nearly 60x the amount of code (~12k LOC) of the lowest-LOC solution you could make (~200 LOC), but also several orders of magnitude faster. It also covered more edge cases and could short-circuit to more optimal paths during parsing, increasing performance for the common use cases, which had optimized code written just for them.
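Not that parser, obviously, but a small Python sketch of the “short-circuit to a more optimal path” idea: handle the overwhelmingly common input shape with a cheap specialized branch and fall back to the general code for everything else.

```python
def parse_number(text):
    text = text.strip()
    # Fast path: plain non-negative integers ("0", "42", "1337") are assumed
    # to be the common case and get a single cheap check.
    if text.isdigit():
        return int(text)
    # General path: signs, hex, underscores, floats, exponents...
    try:
        return int(text, 0)        # handles "0x1f", "-42", "1_000"
    except ValueError:
        return float(text)         # handles "3.14", "-2e-5", "inf"

assert parse_number("42") == 42
assert parse_number("0x1f") == 31
assert parse_number("-2e-5") == -2e-5
```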
“More lines of code = slower”
It doesn’t. This is a fundamental misunderstanding of software engineering, flawed in almost every way, to the point of being an armchair statement. Often that’s even objectively provable…
Hard agree.