General Discussion
Predictive policing, like 'Minority Report,' comes to New York City ... for real
This is not science fiction. The NYPD and the Miami police department have now contracted with a company named HunchLab to help them institute what they call "predictive policing intelligence." On the HunchLab website, they describe their service like this:
HunchLab is a web-based predictive policing system. Advanced statistical models automatically include concepts such as aoristic temporal analysis, seasonality, risk terrain modeling, near repeats, and collective efficacy to best forecast when and where crimes are likely to emerge. This all lets you focus on one thing: responding.
In other words, using their pre-existing data on arrests and crime, the technology is going to predict new locations for crime so that police can be there to respond before it happens.
I have only one question, and of course it's rhetorical; we all know the answer: does this system account for widespread racism in policing? If the data that HunchLab receives from the NYPD and the Miami police department to predict future crimes is skewed by wrongful arrests and illegal detentions (and racism in policing has never been properly documented on any large scale), then we can reasonably expect the predictive policing technology to simply predict more racist police interventions. This is wrong and unethical on a hundred different levels.
How will it account for the reality that this NYPD detective testified under oath that he and others fabricated charges against innocent people to meet quotas? Will it account for the racist reality that in some places white people pulled over by police are more often found with drugs and contraband, yet a higher percentage of African Americans pulled over by those same police end up arrested? If the data the system uses is based on arrests, which it likely is, and not on the actual presence of drugs that should warrant an arrest, we can already determine that this system will do nothing but advance more racist policing.
http://www.dailykos.com/story/2015/07/10/1401050/-Predictive-Policing-like-Minority-Report-or-The-Avengers-comes-to-New-York-City-for-real
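Neither HunchLab nor the departments involved have published their models, so the feedback loop the post describes can only be illustrated abstractly. Here is a toy simulation (all names, rates, and counts are invented for illustration): two neighborhoods have the same true crime rate, but one starts with more recorded arrests, and patrols are then sent wherever the arrest data is highest.

```python
import random

random.seed(0)

# Two neighborhoods with the SAME underlying crime rate, but neighborhood B
# starts with more recorded arrests (e.g. from historically heavier policing).
true_crime_rate = {"A": 0.10, "B": 0.10}   # identical true rates
arrests = {"A": 50, "B": 100}              # B is over-represented in the data

for day in range(200):
    # "Predictive" allocation: patrol the area with the most recorded arrests.
    patrolled = max(arrests, key=arrests.get)
    # A patrol can only record crimes where it is actually present.
    if random.random() < true_crime_rate[patrolled]:
        arrests[patrolled] += 1

print(arrests)  # every new arrest lands in B; A's data never grows
```

Because B never loses its lead, all new arrests accrue to B, and the gap in the data widens even though the neighborhoods are identical, which is exactly the self-reinforcing skew the post warns about.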
4 replies, 714 views
Predictive policing, like 'Minority Report,' comes to New York City ... for real (Original Post)
MrScorpio, Jul 2015
djean111 (14,255 posts)
1. The police already have software that reads a scanned license plate and comes back with
name, address, as much personal info as it can get, plus it sifts through social media and online purchases, and then grades the "threat" with a color designation.
http://blogs.reuters.com/great-debate/2014/12/12/police-data-mining-looks-through-social-media-assigns-you-a-threat-level/
.......
One such application is Beware, sold to police departments since 2012 by a private company, Intrado. This mobile application crawls over billions of records in commercial and public databases for law enforcement needs. The application mines criminal records, Internet chatter and other data to churn out profiles in real time, according to one article in an Illinois newspaper.
Here's how the company describes it on their website:
Accessed through any browser (fixed or mobile) on any Internet-enabled device including tablets, smartphones, laptop and desktop computers, Beware® from Intrado searches, sorts and scores billions of commercial records in a matter of seconds, alerting responders to potentially deadly and dangerous situations while en route to, or at the location of, a call.
Crunching all the database information in a matter of seconds, the Beware algorithm then assigns a score and threat rating to a person: green, yellow or red. It sends that rating to a requesting officer.
.......
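Intrado has not disclosed how Beware actually computes its score, so purely as an illustration of the score-to-color mapping the excerpt describes, a thresholding step might look like this (the function name and both thresholds are invented):

```python
def threat_color(score, yellow_at=40, red_at=70):
    """Map a numeric threat score to the green/yellow/red rating the
    excerpt describes. Thresholds are hypothetical: Intrado does not
    publish Beware's scoring model."""
    if score >= red_at:
        return "red"
    if score >= yellow_at:
        return "yellow"
    return "green"

print(threat_color(25))  # green
print(threat_color(55))  # yellow
print(threat_color(90))  # red
```

The real criticism in the thread applies regardless of the thresholds: whatever feeds the score (arrest records, "Internet chatter") determines who lands in red, and that input is exactly what the OP argues is skewed.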
HFRN (1,469 posts)
2. increase in cutting those wooden balls could increase deforestation
and that should be a concern to everyone
jeff47 (26,549 posts)
3. Dear god I hated that monumentally stupid part of their system.
I have to imagine the thought process was "Hey, let's throw in a nature reference. Otherwise it's all too technology-ish." Fucking awful.
HFRN (1,469 posts)
4. hardly seemed like something from the future nt