Programming AI to Program...Us? (part 4)
Nick Foster (head of design at Google “X”) envisions the
ledger as much more than a tool for self-improvement, however. He
believes that the system could “plug gaps in its knowledge and
refine its model of human behavior” not just for the individual but for
the entire human species. “By thinking of user data as
multigenerational, it becomes possible for emerging users to benefit
from the preceding generation’s behaviors and decisions.” Foster
would like to mine this database of human behavior for patterns in
order to make “increasingly accurate predictions about decisions
and future behaviours.” He believes that when enough data has been
collected, “it may be possible to develop a species-level
understanding of complex issues such as depression, health, and
poverty.” While this sounds like a noble intent, the dark side of
the equation lies in the reactions and determinations that will be
made from the collected data. Who decides and prescribes how to
treat society’s ills: Google? And what if you personally don’t
subscribe to Google’s “values as an organization”? What if
decisions are made to remove from society things that Google has
deemed a risk factor for depression, poor health, or poverty? Have you
ever seen the movie “Equilibrium”? I highly recommend it.