A number of US banks have deployed AI-powered cameras capable of both facial recognition and general behavioral pattern analysis, hinting at a wider rollout in retail stores and elsewhere, and risking a big drop in customer trust.
This cheery approach clashes a bit with the secrecy of the banks already using such technology, which note that they were already leveraging facial recognition on mobile. So, why not leverage it in the real world?
Reuters reported that City National Bank of Florida, JPMorgan Chase, and Wells Fargo were conducting trials of AI surveillance systems, though the banks declined to say when, where, or on what basis the recording takes place.
City National specifically said it would be trialing facial recognition software at 31 sites, with the potential to “spot people on government watch lists,” a prospect that sounds like a lawsuit waiting to happen.
There are some ostensibly positive uses for the mini-cameras in banks. As the economy burrows into a black hole, it’s not unusual to spot someone snuggled up in a sleeping bag inside a bank’s ATM vestibule, where an AI camera can at least distinguish them from an inanimate object while they try to get some sleep.
Have to remove those undesirables! However, this is one of those distinctly American problems where the authorities instinctively reach for the wrong solution. What have people been turned into, some kind of job-stealing monster?
Surely, with nearly 60 empty homes for every homeless person, it makes more sense simply to match people up with surplus housing than to deploy an Orwellian network of semi-sentient cameras bent on stopping people from sleeping inside banks.
But the bogus solution has already generated multiple job opportunities by this point: one for the guy who installs the camera and handles its tech support, and another for the guy who sits in his car or in the bank waiting for an alert about suspicious activity inside the vestibule.
It’s certainly one explanation. Chase has admitted to running a behavioral testing pilot in Harlem, long a mainstay of Black New York, and while the bank hemmed and hawed about the risk of being seen as racially insensitive, it ultimately went with the location anyway for convenience’s sake.
And regarding the homeless, a security executive at a mid-sized Southern bank, interviewed by Reuters, actually gushed about innovative new measures to combat homeless people sheltering in bank vestibules. These include loitering-detection systems, sirens and strobe lights, and even outdoor-facing cameras designed to detect and deter “suspicious activity” immediately outside the bank after closing hours.
However, the banks insisted they didn’t want to stop people from seeking shelter, but ultimately, convenience won the day once again.
Overall, facial recognition continues to inhabit a legal gray area. “Smart” doorbells like Amazon’s Ring, for instance, form their own ad hoc surveillance camera network, often unbeknownst to their owners, feeding data to law enforcement without the user’s knowledge or consent and acting outside the law. In that respect they resemble the myriad cameras dotting city streets, only less obviously.
There are more ways to get on such watch lists than ever, and fewer ways to get off. And as the number of lists grows, so does the information they contain, along the lines of the disturbingly comprehensive online advertising categories Google and Amazon employ to track their customers.
While Chase, for instance, has insisted it has no plans to use “facial, race, and gender recognition” during its latest test of software aimed at identifying behavioral patterns among both customers and workers at some of its Ohio locations, the bank has not shown a strong inclination toward telling the truth.
Just ask the algorithms themselves – would they lie to you?
RT.com / ABC Flash Point Spying News 2021.