Here's a line of thinking I'm not totally certain about, but which currently seems plausible to me:
If ASI doesn't take control of us, we will (probably) take control of it. And taking control of it will mean that particular actors, public or private, have power over it.
Research and development is difficult. I read somewhere that it now takes roughly 10x as many researchers to produce the same "amount of science" as it used to. Perhaps really good tech gains are most efficiently realized by higher-IQ beings (1,000 100-IQ people are less likely than one Einstein to come up with a theory of relativity). So maybe ASI will have a comparative advantage in research, and thus in technological development.
Currently, "anyone" can found a tech startup that could, in theory, go on to revolutionize/shake up/disrupt the world. But if ASI is far better at innovation than humans are, then humans will tend either to work with the ASI or to be unable to compete. This means that whatever actors control ASI will also be able to control technological development.
The actors that control ASI, and thereby technology, may be more, or less, accountable to things like "the truth" or "the public" than the decentralized cloud/drift version of things we have currently.