Hacker News
Algorithm can predict future crime a week in advance, with 90% accuracy (psychnewsdaily.com)
16 points by geox on June 30, 2022 | 10 comments



Alas, there are some cities and neighborhoods where there will probably be several shootings. They usually come on the weekends and they usually come when the weather allows people to be outside. So it shouldn't be hard to write such an algorithm.

That being said, it's not exactly like the pre-crime work of Philip K. Dick. The cities are big enough, and the background rate of shootings is high enough, that the aggregate is pretty much predictable. But that doesn't mean we know which street or which person.


> That being said, it's not exactly like the pre-crime work of Philip K. Dick.

They’re not predicting individual crimes, they’re just predicting crime rates in a 1km area.


Friend of mine had an old truck. He got pulled over constantly, always with the excuse that a taillight was out. Whenever his sister borrowed it, she got pulled over. Same excuse.

He sold it, got a different old truck. Equally beat up looking, but different style. Problem stopped. Something about it just set police off.


So, Minority Report is getting real... Automated facial recognition was banned in some U.S. police departments not so long ago due to racial bias, so I guess this is headed the same way. There's a lot of room for bias, plus the risk of an "oracle effect" (blindly trusting whatever the system says).

Once deployed, it may also make police look less effective at fighting crime: if their anticipated presence in a high-crime neighborhood acts as a deterrent at all, it won't show up as increased arrests or prosecutions (or speeding/parking tickets, who knows?).

Pushing it further by collecting more data on citizens would take us to an Orwellian level of surveillance, with police ringing your doorbell for having done something vaguely related to something else controversial. I don't like where this is going.


I'm struck by how we've reached this point of technological advancement without addressing the social problems that make these crimes happen in the first place.


Even ignoring PKD's sci-fi dystopias, governmental adoption of predictive AI invites Kafkaesque bureaucratic dystopias, where the model becomes stale but it literally takes decades for it to be revoked, replaced, or revised.

An AI model that deduces sociological patterns is subject to the same problem as all social science: the boundaries of where a finding applies aren't clear, are sure to change, and can even be disturbed by the very act of applying the finding.


90% accuracy, presumably on their test set, seems less impressive than it may sound.


Yeah, 90% means 1 in 10 people get falsely accused of planning to commit a crime every week.

How much would it suck for 10% of the population to spend years of their life in prison because some software that sometimes can't tell if a letter in a PDF is a 5 or an S decided they were gonna rob a gas station?



If you can't see what's wrong with this, you aren't intelligent.

There aren't many ideas I'd make this claim about but this is one of the most obvious cases where it is true.



