The IRA was killing someone every other day.
Their targets were creatures of habit.
The target leaves the house at 8am every morning.
Takes the same route to work.
So easy for them to ride up on a motorcycle while the victim sits stuck in traffic.
Lather, rinse, repeat.
Algorithm researcher Mirco Musolesi and his team recently tested this in the UK.
Thoughts reminiscent of Philip K. Dick’s Minority Report, and all the moral trappings that come with it.
Spurred by a Nokia initiative, and its 6,000 Euro prize money, Musolesi began a mobility-tracking project using volunteers furnished by Nokia.
Everyday, routine mobility is quite predictable.
Where Musolesi and his team ran into trouble was with unpredictable, unscheduled movement.
Parmy Olson, of Forbes, reports, “When the algorithm was simply tracking the volunteer, it could predict their future GPS coordinates to within roughly 1,000 square meters.
When the prediction took into account additional information from a single friend, the error rate improved by several orders of magnitude.”
Indeed, with the addition of data from friends and acquaintances, Musolesi was able to predict where his targeted subject would be in 24 hours to within 20 meters.
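Olson’s numbers hint at why friend data helps: people’s movements are correlated, so a friend’s current position constrains where the subject can plausibly be. Here is a minimal sketch of that idea; the coordinates and the fixed-offset model are illustrative assumptions, not the team’s actual algorithm.

```python
import statistics

# Toy, illustrative data (not from the study): a subject's past 8am
# positions in meters, and a friend's positions at the same moments.
subject_history = [(0, 0), (5, 3), (-4, 2), (2, -5)]
friend_history = [(-10, -5), (-5, -2), (-14, -3), (-8, -10)]

def predict_alone(history):
    """Naive predictor: assume the subject returns to their average spot."""
    xs, ys = zip(*history)
    return (statistics.mean(xs), statistics.mean(ys))

def predict_with_friend(history, friend_hist, friend_now):
    """Correlated predictor: learn the subject's average offset from the
    friend over past observations, then apply it to the friend's
    position right now."""
    dx = statistics.mean(s[0] - f[0] for s, f in zip(history, friend_hist))
    dy = statistics.mean(s[1] - f[1] for s, f in zip(history, friend_hist))
    return (friend_now[0] + dx, friend_now[1] + dy)

def error_m(pred, actual):
    """Straight-line prediction error in meters."""
    return ((pred[0] - actual[0]) ** 2 + (pred[1] - actual[1]) ** 2) ** 0.5

# Today the subject breaks routine and crosses town with the friend:
# the habit-based guess fails badly, the friend-anchored guess does not.
actual_today = (800, 600)
friend_now = (790, 595)
```

The routine-only predictor misses by roughly a kilometer on the off-routine day, while anchoring to the friend’s known position collapses the error to meters, which is the shape of the improvement Olson describes.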
The idea started with Nokia’s “Dedicated Challenge,” issued in early November 2011.
The challenge called on researchers to write algorithms that would use existing information to infer certain personal details about their data users.
Ostensibly, Nokia wants this information so it can, for example, sell Facebook a better ad experience for its users.
Nokia’s “Dedicated Challenge” was just that, ‘dedicated,’ to three ends in particular:
– Semantic Place Prediction: ‘Why’ are you where you are? Is it a restaurant? A Tea Party meeting? Planning the next Zuccotti occupation?
– Next Place Prediction: Leaving that Scientology meeting to visit your male masseuse?
– Demographic Attributes: White? Wealthy? Possibly an actor?
The moral implications of algorithms that can infer your future location and its purpose, along with your age, race, gender, and your close friends and acquaintances, are quite astounding.
Police are interested in the tech for crime prevention. I wouldn’t doubt that the NSA is interested in its national-security applications, and knowing their history of loosely defining the words “threat” and “national security,” I don’t dare wonder how they’ll apply it.