The Lambda Calculus is Horrifying

How do people come up with shit like this? Like I understand what this is and how it works but how does someone actively create ideas like this??? Let alone figure out how to implement them and expand upon them?

Am I just retarded? This shit is so simple but I could never come up with something like this or understand how to build a programming language with it.



Read the original.
jstor.org/stable/1968337

It's pretty overrated really, see how useless its practical implementation (LISP) is

With some mathematical maturity it's actually not particularly difficult to understand the motivation behind the calculus. It's just an abstraction describing how functions work, what the variables do, and what you can do with a function (i.e. evaluate it)
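To make "evaluate it" concrete, here's a minimal sketch in Python (my own illustration, not anything canonical): a lambda term evaluates by beta reduction, i.e. substituting the argument for the bound variable.

```python
# λx. x  (the identity) and  λx. λy. x  (the K combinator),
# written as Python lambdas:
identity = lambda x: x
const = lambda x: lambda y: x

# Evaluation is just beta reduction: substitute the argument in.
print(identity(42))       # (λx. x) 42        →  42
print(const("a")("b"))    # ((λx. λy. x) a) b →  "a"
```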

It's literally just f(g(x)) written like f*g. What's so difficult about it?

You don't understand it because you don't know what those symbols mean

no you're a f*g

ebin xDDD


IIRC the original motivation behind the lambda was to abstract over various binding forms in logic such as quantifiers and set comprehensions. Somewhere along the way, it was discovered that lambda calculus could be used to encode paradoxes of self reference and, from that, the idea that lambdas with beta reduction were a universal model of computation was discovered.

It's just math, lol. Look at the formal definition of natural numbers, it's literally the same shit.

The lambda calculus is pretty brilliant. It is an incredibly based representation which allows simple yet powerful abstractions.

But it takes a while until you really grok it. This explanation is pretty well written and helped me a lot:
personal.utdallas.edu/~gupta/courses/apl/lambda.pdf

It's literally the best idea ever in the history of mathematics; the only thing that sucks is the notation it uses.

>best idea ever in mathematics
You misspelled "Turing machine"

Fuck no.

The Turing machine is closer to what hardware actually does, but it is a hell of a clumsy motherfucker.

The Lambda calculus is so fucking easy, all you have to realize is that everything is a function. And I mean..
>numbers
are functions
>arithmetic operations
are functions
>data
is either the input or functions
>functions
..go figure..


This looks weird at first but it is really slick once you've wrapped your head around it.
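If "numbers are functions" sounds like nonsense, here's the Church encoding spelled out with Python lambdas (a sketch of the standard construction, nothing more):

```python
# Church numerals: the number n is "apply a function n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Arithmetic is also just functions:
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul = lambda m: lambda n: lambda f: m(n(f))

# Convert back to a Python int by counting applications:
to_int = lambda n: n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
three = add(one)(two)
print(to_int(three))            # 3
print(to_int(mul(two)(three)))  # 6
```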

This might look impressive to someone who cannot program in Python, but to me it sounds trivial.
What has been accomplished with the Lambda calculus, exactly?

There is one instance of these shenanigans coming to fruition: the L4 microkernel was reimplemented as seL4, with a Haskell prototype as the base. Certain versions of that kernel have been formally verified to be free of implementation bugs.

This is the wrong way to put it..
It is like asking "what has been achieved by Riemann integration?"

Understanding the Lambda calculus is more like understanding "what is computable?"

Basically there are two foundational models of computation:
the Turing machine and the Lambda calculus. Both are equally powerful; you might want to read this:
en.wikipedia.org/wiki/Church–Turing_thesis


Modern programming languages are all different, but the basic principles are mostly related to either the Turing machine (C, Java..) or the Lambda Calculus (Haskell, Lisp..). But most contemporary languages allow for both abstractions. At least to some degree..
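Since most contemporary languages allow both abstractions, here's the same computation in both flavors in Python (a rough sketch; the function names are my own):

```python
from functools import reduce

# Imperative (Turing-machine flavored): mutate state step by step.
def fact_imperative(n):
    acc = 1
    while n > 1:
        acc *= n
        n -= 1
    return acc

# Functional (lambda-calculus flavored): no mutation, just application.
fact_functional = lambda n: reduce(lambda a, b: a * b, range(1, n + 1), 1)

print(fact_imperative(5), fact_functional(5))  # 120 120
```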

is this bait or are cucks really this fucking stupid?

iirc developing it took quite some time. Even simple things like loops and stuff took Church quite a while to figure out or discover. So don't be intimidated. Especially don't be intimidated because, while fun, lambda calculus is totally irrelevant for the practicing programmer.

>What has been accomplished with Lambda calculus exactly?
Unifying logic with mathematics.
This has been the object of philosophical debate for decades

Gödel unified logic with mathematics without any use of the Lambda calculus. All he had to do was encode logical sentences as natural numbers

I don't think it's irrelevant.

If you read a short introduction like this and then unironically work through SICP you will be a better programmer.


You can give someone a really simple task like writing a recursive fibonacci function and you will immediately see whether they understand recursion.

If you are using the naive (tree-recursive) implementation you are intermediate at best.
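For reference, here's the naive tree-recursive version next to a linear one (a quick sketch):

```python
def fib_naive(n):
    # Tree recursion: recomputes the same subproblems over and over,
    # so the running time is exponential in n.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib(n):
    # Linear iteration: each value is computed exactly once.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib(i) for i in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```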

The map is not the territory.

It is piss easy to "unify" math and logic by some simple method. But Gödel's work and the Lambda Calculus have completely different goals.

Also it is not a dick contest. Gödel was brilliant, but you can't apply his work to practical computation.