Ultimately, I find the question of whether I am a good person impossible to answer, and I don't understand how others can do it so easily.
morality is more of a personal intuition than some sort of "universal target"
it's funny how everyone these days is talking breathlessly about "ai alignment"
the whole time ignoring the fact that we can't even get two people to agree on which "moral axioms" humans should share
If I am able to admit that I do not know myself the way I imagine I do, that's a good start. I am coming more from the perspective of taking life in a playful manner than insisting on its seriousness. Also, I want to train myself to ask more questions instead of delivering rock-solid answers. I must say that I tend to write long texts, though. LOL!! :D
I think the reason is that I am a truly fast typist. HaHa!
the point about "asking questions" is basically a realization that "most people" won't ever accept a "new concept" unless they feel like they thought of it themselves
then it would be intelligent to give them/us the impression that they/we came up with something innovative ourselves/themselves, no? I think this can be realized through inspiration, not agitation. If I have understood something which I did not understand before, and I can express it in art or a piece of work, I think of it as "my own invention".
sometimes people need a push
sometimes they don't
What does this mean? I am not familiar with this expression.
oh man, i've seen at least 20 videos talking about "ai alignment"
it's basically the idea that we need to figure out how to get "ai" to "align" with or share or at least act like it shares "human values"
the joke of course being that humans only very roughly "share values" in the first place
Thanks for the explanation.
HaHa! Right! It's very funny to try to get a thing aligned when I am not even aligned with myself - and, as a consequence, not with others. LOL
right,
step one:
rigorously define "human values"