Life Consideration Series

Considering AGI for Life in the 2030s

  • Oh, I can’t imagine a future where humans are still HCI researchers in 10 years (blu3mo)

  • https://twitter.com/blu3mo/status/1715208339733303491

  • I can’t imagine a future where humans are researchers in 10-20 years

  • I’m curious about how other people who are considering similar paths perceive this

    • If you replace “researchers” with “players of a game where the evaluation doesn’t change depending on who played it ∧ humans are weaker than computers,” the claim seems to hold true

      • I thought about this quite a bit and wrote a definition
      • Evaluation doesn’t change depending on who did it
        • Winning against 100 people in shogi or running 100m in 5 seconds has value when done by humans, but not when done by computers or machines
        • Producing research results or developing software has value whether done by humans or computers
        • Here, “value” is used to encompass various important aspects of life such as personal value, value to others in one’s community, and economic value within one’s economic sphere, among other things
          • If you can win against 100 people in shogi, you’ll probably be happy, receive praise from others, and earn money
  • https://twitter.com/blu3mo/status/1716481148904210583

  • Indeed, there seem to be cases where humans are partially necessary

  • However, my feeling is that a world which recognizes the social and economic value of humans driving research as players isn’t likely to last long

https://twitter.com/blu3mo/status/1715452348414177407

I feel like the economic-value side is a bigger problem than the researchers’ own motivation (those funding research would probably choose the cheaper option if humans and computers can do the same research)