Wei Dai: how to change values
2000 Sep 12
Author: Wei Dai
Email: satoshinakamotonetwork@proton.me
Site: https://satoshinakamoto.network
Robin wrote earlier:

> There is less scope for being "right" in disagreements about values. Once we understand what we want, and opponents decide they don't want that, there isn't that much more to say to them.
People's values seem to be derived from some set of fundamental values together with the facts they believe to be true. So we can convince people that we are "right" about values in two ways: the first is to convince them of a new set of facts, and the second is to convince them that their derived values are not consistent with their fundamental values and their factual beliefs. As an example, if you convince a theist that there are no gods, that would probably change a number of his values.
I'm not sure whether there is a standard theory of how values change, but it isn't hard to construct one that fits nicely with the standard theory of how beliefs about facts change (i.e., the Bayesian theory of probability). Start with a utility function U that describes the fundamental values: U maps a state of the universe x to a real number U(x) representing how desirable that state is. Define

V(U, P, K) = Σ_x U(x) · P(x | K).

We can now compute someone's value on any statement s (i.e., how much he desires that statement to be true) as V(U, P, K and s), where U is his fundamental values, P is his prior probability distribution, and K is the conjunction of all his past experiences.
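A minimal sketch of this in Python, over an invented four-state universe (the states, utilities, and prior below are hypothetical illustrations, not anything from the post itself):

```python
# Sketch of V(U, P, K) = sum over all x of U(x) * P(x | K), over a
# hypothetical toy universe. States, utilities, and the prior are
# invented for illustration.

def V(U, P, K):
    """Expected utility given evidence K.

    U: dict mapping each state x to its utility U(x)
    P: dict mapping each state x to its prior probability P(x)
    K: predicate selecting the states consistent with the evidence
    """
    consistent = [x for x in P if K(x)]
    p_K = sum(P[x] for x in consistent)  # P(K)
    # P(x | K) = P(x) / P(K) for states consistent with K
    return sum(U[x] * P[x] / p_K for x in consistent)

# Each state is a pair (gods_exist, prayer_works).
P = {(True, True): 0.3, (True, False): 0.2,
     (False, True): 0.1, (False, False): 0.4}   # prior, sums to 1
U = {(True, True): 10, (True, False): 2,
     (False, True): 5, (False, False): 1}       # fundamental values

K = lambda x: True   # past experiences: no constraint, for simplicity
s = lambda x: x[1]   # the statement "prayer works"

# The agent's value on s: V(U, P, K and s)
print(V(U, P, lambda x: K(x) and s(x)))          # 8.75

# Convincing the agent of a new fact ("there are no gods") changes
# the conditioning while U stays fixed.
no_gods = lambda x: not x[0]
print(V(U, P, lambda x: K(x) and no_gods(x) and s(x)))   # 5.0
```

In this toy example, convincing the agent that there are no gods lowers his value on "prayer works" from 8.75 to 5, even though his fundamental utility function U never changed; this is the mechanism behind the theist example above.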
Discussions: http://extropians.weidai.com/extropians.3Q00/4844.html