
complaining about _Big Data_

In the "dark side" section of Big Data, the authors propose not only the idea of predicting a crime in advance of
it being committed, but of then punishing someone for it. I assume they've decided to build a response to
that ridiculous movie I didn't see, but I didn't see it so I am not sure.

In any event, they explore some of the ramifications by looking at algorithms used by parole boards to
decide who to let out early. Let's just remind ourselves of what this means. Someone _does not have to
serve_ their full sentence, because they've been really well behaved or whatever. A crime has already
been determined to have been committed and punishment assigned. Are mistakes made? Hell yeah!
Are our laws imperfect, criminalizing things which should be legal? Absolutely! And yet we're all mostly
okay with that; there are activists, people write stuff like what I just wrote, and then we go on to worry
about other things instead.

"The fundamental trouble is that with such a system we essentially punish people [author emphasis]
before [end author emphasis] they do something bad. And by intervening before they act (for
instance by denying them parole if predictions show there is a high probability that they will
murder), we never know whether or not they would have actually committed the predicted
crime. We do not let fate play out, and yet we hold individuals responsible for what our predictions
tell us they would have done. Such predictions can never be disproven."

Okay, again, _let's remind ourselves_. These are people who have already been determined to have
committed a crime and given a sentence. We are letting them out early because they've been good
and we feel optimism that will continue. It is not an additional punishment to deny parole. _This
has been litigated._ _Parole is not a right and denying it is not a punishment._

There are at least some people who come up for parole who do not want it. They know they
will re-offend if they are released. For some of these people, being stopped from
re-offending is at least somewhat freeing: that is one less heinous thing they have to
remember themselves doing (or several, if it takes a while to track them down again).

This is not a ridiculous movie with a physically attractive action star exploring deep
philosophical and moral issues. This is about people who have been determined by our system
of justice to have committed a crime and to be deserving of some sentence. Parole is
about whether we let them off early because they were good. If we want to toss in some
extra data to improve our intuition about whether they'll continue to be good when not
constantly surveilled and controlled, that can only help.

I'm really starting to hate this book. There was an ungodly amount of garbage about big
data startups that could predict how much money a movie would make (more if you hire
a big male lead who has been up for an Oscar! We needed big data for that? Truly?), not
to mention some ridiculousness about how people who intentionally release a bunch of
data about themselves can then be identified when "anonymized" data is matched against
the already released data. I should panic? For why?

There are great things and scary things to say about Big Data. This book is not doing its
topic justice.

ETA: Later, they talk about Google wanting SAT scores and GPAs. The paragraph regarding why this
was a bad idea includes NOTHING about people with very high SAT scores and GPAs being utterly
useless as employees -- which I suspect happens as often as, if not more often than, people
with extremely low scores being stellar performers in a high-pressure, results-oriented
corporation. I don't know why the focus is always on low scores/good performance In Real Life,
rather than high scores/Total Suckitude In Real Life. You would think the latter would do a
better job of putting an end to the use of a terrible metric.