OOP can be very good, and it can also be very bad, depending on how you use it. It's a great tool when you have datatypes that fit a noun analogy, are expected to perform actions, and you want to replicate very similar (but not identical) behavior across a lot of related types while minimizing repeated code. The analogies used to teach it to beginners, though, are usually terrible.
The perfect use for OOP is a widget toolkit for a GUI (think Qt or GTK+). There's a very good reason you'd do this with OO: you want a root container type with dozens of container types sharing similar behavior, possibly hundreds of widget types with similar behaviors, and the ability to add your own without modifying library code. These widget types will have up to 50 functions that operate on all of them with similar behavior (identical behavior, for most of the widgets and calls). OO is the best paradigm for this. Even if you do this "without" OO, you will essentially end up with a hand-rolled, hobbled object system anyway.
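Here's a minimal sketch of that shape in C++ (the names Widget, Container, and Button are made up for illustration, not any real toolkit's API):

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// Hypothetical root type: every widget shares this interface.
class Widget {
public:
    virtual ~Widget() = default;
    virtual void draw() const = 0;  // same call, per-type behavior
};

// A container is itself a widget and holds other widgets.
class Container : public Widget {
public:
    void add(std::unique_ptr<Widget> child) { children_.push_back(std::move(child)); }
    void draw() const override {
        for (const auto& c : children_) c->draw();  // one call works on every child
    }
private:
    std::vector<std::unique_ptr<Widget>> children_;
};

// Users can add their own widget types without touching library code.
class Button : public Widget {
public:
    explicit Button(std::string label) : label_(std::move(label)) {}
    void draw() const override { std::cout << "[ " << label_ << " ]\n"; }
private:
    std::string label_;
};

int main() {
    Container window;
    window.add(std::make_unique<Button>("OK"));
    window.add(std::make_unique<Button>("Cancel"));
    window.draw();  // virtual dispatch picks the right draw() for each child
}
```

Note that everything the post describes is here: one root type, containers holding arbitrary widgets, and extension from outside the library, all through a single shared interface.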
OO is horrible for very data-oriented things, like filesystem drivers. Essentially, if your data can be treated as a noun (rather than a simple pile of data) and needs specific, extensible behavior, OO is probably what you want. If the analogy is incomplete (i.e. your data is a noun, but it doesn't "do" things; you just do things to it, operate on it, or pass it to other things), don't use OO. This is why filesystem drivers half-fail at OO: a file doesn't "do" anything, it's just operated on (nonetheless, the file types in C++ are useful because they double as generic stream types).
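To make the contrast concrete, here's the non-OO shape that fits data like this: a plain struct plus free functions that operate on it. This is just a sketch; the Inode fields and function names are hypothetical, not any real driver's API:

```cpp
#include <cstdint>
#include <vector>

// The data is just a pile of fields; it doesn't "do" anything.
struct Inode {
    uint64_t size;
    uint32_t mode;
    std::vector<uint64_t> block_ptrs;
};

// Behavior lives in free functions that operate ON the data.
// (0170000 / 0040000 are the standard POSIX S_IFMT / S_IFDIR masks.)
bool is_directory(const Inode& node) {
    return (node.mode & 0170000) == 0040000;
}

uint64_t blocks_needed(const Inode& node, uint64_t block_size) {
    return (node.size + block_size - 1) / block_size;  // round up
}
```

No virtual dispatch, no hierarchy: the data stays dumb and the operations stay separate, which is exactly what you want when nothing about the data needs to vary by type.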
OO can also be useful, as previously stated, for making generic types that behave similarly for all inherited children, like C++'s container types and stream types, but you can get similar advantages plus more with metaprogramming and duck typing. In that case, you aren't actually benefiting from OO so much as from the fact that OO lets you subtype without the type system shitting on you.
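For comparison, here's the metaprogramming route: a function template duck-types any container exposing begin()/end(), with no common base class or virtual dispatch involved (a generic sketch, not tied to any library):

```cpp
#include <iostream>
#include <list>
#include <vector>

// No base class needed: any type with begin()/end() and streamable
// elements "quacks" correctly, checked at compile time.
template <typename Container>
void print_all(const Container& c) {
    for (const auto& x : c) std::cout << x << ' ';
    std::cout << '\n';
}

int main() {
    std::vector<int> v{1, 2, 3};
    std::list<double> l{4.5, 6.7};
    print_all(v);  // same generic code...
    print_all(l);  // ...over completely unrelated types
}
```

Same "write it once, works for every conforming type" payoff, but the relationship between types is structural rather than declared through inheritance.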
This is the most important lesson of programming, kid. If something looks stupid or useless, or if you're trying to decide between two paradigms, the answer is almost always "it depends what you're trying to do". There is never a one-size-fits-all solution. OOP may look horrible to you now, but eventually you'll run into a situation where avoiding an OO pattern means writing ten times the code, and the result will still run worse.