Given modern programming languages and processors are smart enough to figure out the best way to do things efficiently, do you think it's still necessary for average coders to learn algorithms?
I've been a coder for over 10 years and I've never thought about quick sort/bubble sort, just use common sense and don't loop on double.max

Attached: 41SNoh5ZhOL._AC_SY580_.jpg (442x500, 22.84K)

Fucking idiot

I couldn't get imagemagick to make scan lines and along comes this guy and just tells me to do sin x+1 or something
I hate it, wtf do triangles have to do with scan lines? How?

Being a coder of 10 years, you'd know that knowing by heart most of the algos presented in this book is practically useless, because most of them have already been implemented in pretty much any modern language, either as syntax constructs or as ready-to-use types in their standard libraries.

That book is from the early '90s, ffs.

>modern programming languages and processors
Holy fucking retard. No. You show me how branch prediction and out-of-order execution (processors) and the associated compiler optimizations (programming languages) will always and sanely provide the best results given any computational problem.
Give me proofs or mathematical rigor. Do not link some rhetorical article you think is an authoritative source. Tell me, in your own words, why you think "modern programming languages and processors" provide sufficient facilities for such a broad-stroke claim about all of computer science.
I'm willing to bet $100 you are making a baseless claim, or you are a coder who fancies himself a programmer and sees no difference between the two.

>ready to use types as part of their standard libraries
Then yes, you would need to know what it is you are using.

The things you've mentioned, I don't know any of them and I'm still making applications.
I don't think about compiler optimisations when I work, I think about making network requests, responding to user interactions, etc.

Maybe it's just me, but I doubt other coders would find them interesting.

Just ignore the pompous CS retard. He fails to realize the difference between applied and theoretical knowledge.

This is perhaps not a problem faced by "average coders" but I personally greatly benefited from learning algorithms for numerical computations in the context of scientific computing.
In my case, knowing that xᵀV⁻¹x (x = vector, V = covariance matrix) can instead be computed via a Cholesky decomposition and solving a system of linear equations provided a large speedup.
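A minimal pure-Python sketch of the idea (toy 2×2 example of my own, not the actual scientific code): factor V = LLᵀ, solve Ly = x by forward substitution, and then xᵀV⁻¹x is just ‖y‖², since V⁻¹ = L⁻ᵀL⁻¹. No explicit inverse needed.

```python
import math

def cholesky(V):
    """Lower-triangular L with L Lᵀ = V (V symmetric positive definite)."""
    n = len(V)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(V[i][i] - s) if i == j else (V[i][j] - s) / L[j][j]
    return L

def forward_sub(L, b):
    """Solve L y = b for lower-triangular L."""
    y = [0.0] * len(b)
    for i in range(len(b)):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    return y

V = [[4.0, 2.0], [2.0, 3.0]]
x = [1.0, 2.0]

# xᵀ V⁻¹ x = ‖y‖² where L y = x, because V⁻¹ = L⁻ᵀ L⁻¹
y = forward_sub(cholesky(V), x)
quad = sum(v * v for v in y)

# sanity check against the explicit 2x2 inverse: V⁻¹ = adj(V) / det(V)
det = V[0][0] * V[1][1] - V[0][1] * V[1][0]
Vinv = [[V[1][1] / det, -V[0][1] / det], [-V[1][0] / det, V[0][0] / det]]
direct = sum(x[i] * Vinv[i][j] * x[j] for i in range(2) for j in range(2))
print(quad, direct)  # both 1.375
```

For big matrices the Cholesky route is both faster and numerically better behaved than forming V⁻¹ explicitly.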

Thank you. Computers are supposed to be fast

Called it. You're doing coding and not programming. Every time a smartphone, website, or business computer lags, I'm gonna think about you and get angry. Those damn spinning circles...

Fuck you in advance, my damn nigger

What programming do you even do lol? Are you a webshitter? If you are it's understandable. I write software for scientific instrumentation in C/C++/C#/LabVIEW/IEC-61131 and sometimes the webshit trio. This shit matters.

I-is this bait?

Not bait, would appreciate some resources about APPLIED trig that are not about getting the side of a triangle... which is what I was taught, ffs

Average coder doesn't even need to code because of all the nocode tools.

OP here, forgot to mention I make smartphone apps. Guess having only 1 user to worry about is much more forgiving.
The backend team often complains that we're making too many requests, though.

Sine is an oscillating function, on off on off forever; no matter what the input, you will always get an output between 2 values.

To add to this: you can apply transformations to sine to easily make something that gives exactly 0 or 1 at every integer:
.5*sin(pi*x-.5*pi)+.5
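You can check that formula in a few lines of Python (my own quick sketch): at x = 0 the argument is -π/2, so sin gives -1 and the whole thing is 0; each step of 1 in x advances the phase by π, flipping it to 1, then back.

```python
import math

def gate(x):
    # 0.5*sin(pi*x - 0.5*pi) + 0.5: scaled/shifted sine that lands on
    # exactly 0 or 1 at integer x, oscillating smoothly in between
    return 0.5 * math.sin(math.pi * x - 0.5 * math.pi) + 0.5

vals = [round(gate(n)) for n in range(6)]
print(vals)  # alternates: [0, 1, 0, 1, 0, 1]
```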

You must be able to tell "hmmm, this code I'm writing is quadratic".
You should have a basic idea of the landscape. Nothing very fancy, just "ah, I could use a hashset here" and "maybe there's an efficient way to do this, I'll google it".
You can get handy enough to roll your own, but I've found that more useful for Advent of Code and for job interviews than for actual work.
For sorting, yeah, unless your language is stuck in the last century all you have to think about is whether it's stable or unstable.
It doesn't come down to processors and languages though, algorithmic complexity is the one thing they can barely help with. It's mainly thanks to libraries.
Learn how to profile your code.
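The "ah, I could use a hashset here" instinct in the post above can be made concrete with a toy duplicate check (my own illustration, not from the thread): the nested-loop version compares every pair, the set version does one pass.

```python
def has_duplicate_quadratic(items):
    # O(n^2): compares every pair of elements
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_hashset(items):
    # O(n): one pass, constant-time membership checks via a set
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

data = [3, 1, 4, 1, 5]
print(has_duplicate_quadratic(data), has_duplicate_hashset(data))  # True True
```

Same answer either way; the difference only shows up when n gets large, which is exactly the "basic idea of the landscape" point.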

You don't need these algorithms unless you're doing autistic AI or graphic stuff, which most people never do

That was me a while back. Do you still have the original code snippet? It was something like:

if (sin(y+1) < 0.5) then darken.

From there you can stretch the wave to the number of pixels you want the width of the scan line to be.
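Turning that pseudocode into something runnable (my own sketch; the period, threshold, and darkening factor are arbitrary choices, not from the original snippet): scale y so one full sine cycle spans the desired scan-line period, and halve the brightness of rows where the wave dips below 0.5.

```python
import math

def scanline_factor(y, period=4):
    # stretch the wave so one cycle covers `period` rows of pixels,
    # then darken rows where it dips below 0.5 (cf. sin(y+1) < 0.5)
    return 0.5 if math.sin(2 * math.pi * y / period + 1) < 0.5 else 1.0

# apply to a tiny 8-row grayscale "image" (pixel values 0..255)
image = [[200] * 4 for _ in range(8)]
shaded = [[int(p * scanline_factor(y)) for p in row]
          for y, row in enumerate(image)]
for row in shaded:
    print(row)
```

With period=4 you get two bright rows then two darkened rows, repeating; the same expression should drop straight into imagemagick's -fx syntax with j as the row coordinate.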