Object-Oriented Programming

"I invented the term 'object oriented' and C++ was not what I had in mind" --Alan Kay, inventor of OOP

Object-oriented programming (OOP, also object-obsessed programming and objectfuscated programming) is a programming paradigm that tries to model reality as a collection of abstract objects that communicate with each other and obey some specific rules. While the idea itself isn't bad and can be useful in certain cases, OOP has become extremely overused, extremely badly implemented and downright forced in programming languages which apply this abstraction to every single program and concept, creating anti-patterns, unnecessary issues and of course bloat. We therefore see OOP as a cancer of software development.

Ugly examples of OOP gone bad include Java and C++ (which at least doesn't force it). Other languages such as Python and Javascript include OOP but have lightened it up a bit and at least allow you to avoid using it.

You should learn OOP but only to see why it's bad (and to actually understand 99% of code written nowadays).

Principles

Bear in mind that OOP doesn't have a single, crystal clear definition. It takes many forms and mutations depending on language and it is practically always combined with other paradigms such as the imperative paradigm, so things may be fuzzy.

Generally OOP programs solve problems by having objects that communicate with each other. Every object is specialized to do one specific thing, e.g. one handles drawing text, another one handles caching, another one handles rendering of pictures etc. Every object has its data (e.g. a human object has weight, race etc.) and methods (the object's own functions, e.g. a human may provide methods getHeight, drinkBeer or petCat). Objects may send messages to each other: e.g. a human object sends a message to another human object asking for its name (in practice this means the first object calls a method of the other object just like we call functions, e.g.: human2.getName()).
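
For illustration, objects with data and methods may look something like this (a minimal TypeScript sketch; human1, human2 and greet are just made-up example names):

  // two standalone objects; human1 "sends a message" to human2
  // simply by calling its method, i.e. human2.getName()
  const human2 = {
    name: "Bob",
    getName(): string { return this.name; }
  };

  const human1 = {
    name: "Alice",
    greet(other: { getName(): string }): string {
      return "hi " + other.getName(); // the "message" is just a method call
    }
  };

  console.log(human1.greet(human2)); // prints "hi Bob"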

Now many OO languages use so-called class OOP. In these we define object classes, similarly to defining data types. A class is a "template" for an object: it defines the methods and the types of data the object will hold. Every object we create is then based on some class (e.g. we create the objects alice and bob of the class Human, just as we normally create a variable x of type int). We say an object is an instance of its class, i.e. an object is a real manifestation of what the class describes, with specific data etc.
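
In class OOP the above may look something like this (again only a sketch; Human, getHeight etc. are example names):

  // a class is a "template": it defines the data and methods of objects
  class Human {
    constructor(private name: string, private height: number) {}

    getName(): string { return this.name; }
    getHeight(): number { return this.height; }
  }

  // alice and bob are instances of the class Human
  const alice = new Human("Alice", 175);
  const bob = new Human("Bob", 180);

  console.log(bob.getName()); // prints "Bob"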

The more "lightweight" type of OOP is called classless OOP which is usually based on having so called prototype objects instead of classes. In these languages we can simply create objects without classes and then assign them properties and methods dynamically at runtime. Here instead of creating a Human class we rather create a prototype object that serves as a template for other objects. To create specific humans we clone the prototype human and modify the clone.

OOP furthermore comes with some basic principles such as:

- encapsulation: objects hide their internal data and only expose it through methods, i.e. you are not supposed to touch an object's data directly.
- inheritance: classes can form hierarchies in which a subclass inherits the data and methods of its superclass and may extend or override them.
- polymorphism: different objects can be treated the same way as long as they offer the same interface, each reacting to the same method call in its own way (see the sketch below).
- abstraction: an object is a black box, we use it without having to know how it works inside.
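
For example inheritance and polymorphism together may look like this (a TypeScript sketch with made-up example classes):

  class Animal {
    makeSound(): string { return "..."; }
  }

  // inheritance: Cat and Dog reuse Animal and override its method
  class Cat extends Animal {
    makeSound(): string { return "meow"; }
  }

  class Dog extends Animal {
    makeSound(): string { return "woof"; }
  }

  // polymorphism: the same call behaves differently depending on the object
  const animals: Animal[] = [new Cat(), new Dog()];

  for (const a of animals)
    console.log(a.makeSound()); // prints "meow", then "woof"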

Why It's Shit

So Which Paradigm To Use Instead Of OOP?

After many people realized OOP is kind of shit, there has been a boom of "OOP alternatives" such as functional programming, traits, agent oriented programming, all kinds of "lightweight"/optional OOP etc. etc. Which one to use?

In short: NONE, by default use the imperative paradigm (also called "procedural"). Remember this isn't to say you should never apply a different paradigm, but imperative should be the default, the most prevalent paradigm and the one suitable for solving most problems. There is no need to invent anything new or to "beat" OOP.

But why imperative? Why can't we simply improve OOP or come up with something ultra genius to replace it with? Why do we say OOP is bad because it's forced and now we are forcing the imperative paradigm? The answer is that the imperative paradigm is special because it is how computers actually work; it is not made up but rather is the natural low level paradigm with minimum abstraction that reflects the underlying nature of computers. You may say this is just bullshit arbitrary rationalization but no, these properties make the procedural paradigm special among all other paradigms because:

- being closest to how the hardware actually works, it adds minimum overhead and so allows writing the most efficient programs;
- it requires minimum abstraction, i.e. minimum bloat;
- understanding it means understanding how computers actually work, which benefits every programmer;
- every other paradigm eventually gets translated to it anyway, because that is what the hardware executes.

Once computers start fundamentally working on a different paradigm, e.g. functional -- which BTW might happen with new types of computers such as quantum ones -- we may switch to that paradigm as the default, but until then imperative is the way to go.
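
For comparison, imperative style just means plain data plus plain functions operating on them, e.g. (a minimal sketch in the same TypeScript so the contrast with the OOP examples above is clear; the task and names are made up):

  // a plain record type: just data, no methods, no hidden state
  interface Human { name: string; height: number; }

  // a plain function operating on the data
  function findTallest(humans: Human[]): Human {
    let tallest = humans[0];
    for (const h of humans)
      if (h.height > tallest.height)
        tallest = h;
    return tallest;
  }

  const people: Human[] = [
    { name: "Alice", height: 175 },
    { name: "Bob", height: 180 }
  ];

  console.log(findTallest(people).name); // prints "Bob"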

History

TODO


All content available under CC0 1.0 (public domain). Send comments and corrections to drummyfish at disroot dot org.