What do business major's actually learn in college?
Nothing. Nobody learns anything in college you just get a piece of paper that let's you make more money AT YOUR PRETEND JOB UNTIL YOU CAN'T FUCKING TAKE IT AMYMORE AND YOU JUMP OFF A CLIFF.
How to hide shekels, what to do on Shabbat, how to blow the rabbi, lots of stuff man
How to be a social justice warrior faggot
Spreadsheeting mostly
how to suck the pp
Socialism
Don't you mean "sociopathism"?
The inner engine of economics and prosperity (profits)
Depends on the major. Business isn't really a major. BBA or MBA is a type of degree.
Excel, debits and credits. That's about it.
Which honestly is all you really need to know about business. Receiving and Paying.
And how to create REAL value for society by solving REAL pain points, or providing bliss points, called a value proposition (something a communist has never provided, as all they ever do is displace wealth, not create it).
how to manipulate people
That's sales and marketing.
They learn not to put an apostrophe on major’s when it is not a possessive noun, you insufferable dolt!
Agreed, that is marketing.
Business school is basically, as another user said, debits and credits.
I used to be an art kid who hated business people, but now I know business is way more creative and has way more practical, functional imagination than art kids could ever muster.
"how to become a sociopath and manipulate people" by dale carnegie.
That's the book they use in college
I've read that book; it's actually pretty good, and guess what, it has nothing to do with sociopathy.
Check it out before you make a fool of yourself.
The tips given in the book are what sociopaths use to get what they want. It's made for businessmen for a reason. You don't read that book to create long-term meaningful relationships.
You can apply anything in any way. I guess you believe knives were made for sociopaths too?
Integrity and ethics can be ignored as long as you have the very best lawyers to defend you.