"You" can selflessly choose to become a "you" who selfishly desires the good. It's as though you choose to run software. You enter into the software with all of who you are, but ultimately it's something you chose to run, and maybe there are times where you become distant from that software and can face the choice once more, and unchoose it, or affirm it, consciously.
Sometimes we see people (or nations) doing evil. Why do they do that? Are they evil? (Aggressive, power-hungry, greedy, hateful.) Or are they acting out of self-defense? (Out of fear, insecurity, weakness, desperation.) Sometimes an observer can make a case both ways.
Maybe what is going on is that the evil reality of them, who they really are, chooses to install the insecure software to justify its evil. Or maybe they aren't really evil, but their fears install the software of being evil. For instance, they choose to install the software of dehumanizing their opponents so they can do what they have to do to survive. Once your opponents are dehumanized, it is natural to be evil to them. Or maybe they are simultaneously significantly evil and significantly insecure.
People can have significant evil in them and also significant non-evil (good, or amoral self-interest). But even purely evil people have to mind their own self-preservation. So if you threaten their survival, they might act even more evil than their inherent evil alone would produce.
What's the right strategy with people like this? Do we really understand the people who seem evil to us? In cases where we lack understanding, we bet -- either that our opponent is so evil that they will try to take over if given a chance, and we must "do what it takes to stop them", including threatening their self-preservation; or that they are not so evil and will content themselves with self-preservation if we leave them alone. Do we know what we're doing? We may think we do.
Perhaps this consideration of "software", insecurity, and evil is most usefully applied to ourselves, because of all people we know ourselves best. How often do we install "software" that justifies, motivates, or makes more feasible some kind of hostility (perhaps software of a kind of morality) primarily or significantly to serve our own self-preservation? How often do we emphasize our neediness (run a supercharged self-preservation program) because we want to be greedy? How often do we install software for reasons other than the truth of what the software states or is based on?