Smart assistants could soon come with a 'moral AI' to decide whether to report their owners for breaking the law. That's the suggestion of academics who say that household gadgets like the Amazon Echo and Google Home should be enhanced with ethical smart software.
Devices would then have an internal 'discussion' about suspect behaviour, weighing up conflicting demands between the law and personal freedoms, before arriving at the 'best' course of action.
And of course, the personal freedoms will end up secondary - if they are considered at all!
'If we want to avoid Orwellian outcomes it's important that all stakeholders are identified and have a say, including when machines shouldn't be able to listen in. Right now only the manufacturer decides.'
But what if they don't want to avoid Orwellian outcomes? Then only the manufacturer will decide - which, as they admit, is the situation right now.
In Technological slavery, I've written that the elites intend to use intelligent cameras to analyze everyone's movements in real time for "suspicion". Of course, this only works on the assumption that people will act "suspicious" outside. They could easily keep their illegal acts in the house, which would be safe from prying eyes. That problem is beautifully solved with these so-called smart assistants, and the house becomes enemy territory like the rest of the world already is.
Law enforcement responded to a 911 call last October from an Eden Prairie woman who had just been robbed in her home. Her husband, Oukham Oudavanh, 63, suffered a heart attack and died at the scene, and the suspects made off with $50,000 belonging to the couple's popular food markets, Shuang Hur in Minneapolis and St. Paul.
Police wanted Google to identify all mobile devices in the area of the crime scene over a 6-hour time window. They also wanted location data for every cellphone in dense, urban areas surrounding the food market businesses over a 33-hour window.
If a device's location, movement, or timing established probable cause, investigators could go back to court and get a second warrant ordering Google to reveal the name of the cellphone's owner.
When Google provides location data in response to one of these warrants, police "put that location data into the software and then map out a 'profile of life' of where they go, where they travel, and where they were the night of the crime," Bruley said.
It presented a tough probable cause analysis, and police were asking for a lot: the identity of the phone's owner, billing information, phone numbers, and two months of their web browsing history and location data. Google was also put under a non-disclosure order, restricted from telling the user that this information would be divulged for at least six months.
Asked earlier this week whether that suggests Brooklyn Park police got a data dump on the wrong person, Bruley cautioned against assuming that, saying that investigations take time. On Thursday, Bruley said police have closed the investigation, because Google ultimately could not track the data point.
So, whenever a crime happens, the police can now get location data from everyone who was near the scene. If your profile of life fits their criteria for "suspicious", they will also get massive amounts of your personal data. This is regardless of whether you are innocent (as was the person whose data they got in this case) or not. And you will never know any of this is happening. This is similar to what I've predicted in Technological slavery, but I was mostly thinking about the data from intelligent cameras, not cellphones. But that will come soon enough as well - they will use every advantage the technological slavery system provides them. Of course this will be marketed as a triumph for safety, even though "crime" is arbitrary and can change at any time at the whim of the rulers of this world.
Saini found years-old messages in a file from an archive of his data obtained through the website from accounts that were no longer on Twitter.
But, in our tests, we could recover direct messages from years ago — including old messages that had since been lost to suspended or deleted accounts.
As if we needed more confirmation - "deleting" something from social media does not mean it's actually removed. The smart thing to do is to assume that all the tech giants are keeping your data forever. Recall that, as I've written in Technological slavery, you can get arrested for apparently innocent stuff you post on social media - so it's better to avoid them.
The replacement of people by robots continues (as I've written before). The whole article goes to great lengths to deride us:
That’s largely because Bbox has eliminated one of the most expensive elements of operating a business: labor.
We're expensive and useless, see?
But there is good news for those who prefer the familiarity of a first name basis with their favorite carbon-based baristas. The robots have names. “Jarvis” is the machine that positions orders for customer pickup
But so you don't feel bad, we'll give the robots some humanity too.
“It’s more efficient than having a normal barista do it by hand,” said Scelzi. “It knows how long it takes a cup to be filled. It knows where to place it.”
And a person apparently doesn't know these things...
While the company is in beta, visitors will likely interact with at least a few organic life forms at the registers. Just don’t call them baristas.
Organic life forms - how insulting. Why not just call us people?
Clearly, we are being conditioned to think we are inferior so that the robot revolution is accepted more easily. What isn't mentioned is that the robot can only execute a specific program - don't expect anything responsive from it. But that is the way these cafes are supposed to be designed - you order "X" and get it, that's it - robbing you of the rich world of human communication. They also want to get into other markets:
“We’d like to be able to do burgers and burritos and Chinese food,” he said. “Basically anything that there is like a QSR — a quick serve restaurant — or fast food restaurant that exists today, we would like to be able to do that as well.”
And outcompete them:
Becker hopes those eventual burrito and burger machines can serve a higher quality product than places like McDonald’s or Burger King, but for the same prices, or lower.
So you can expect most food serving places to be mechanized soon. What does that entail? How about not being able to pay with cash:
Customers place their order via Bbox’s mobile web app (an iOS app is in development) or via tablet outside the café. They create a customer profile, including phone number and payment information.
We're being dehumanized and enslaved in the name of progress. How sad.
Another "privacy-based browser" bites the dust!
This afternoon, users posted to Y Combinator’s Hacker News that the protection in Brave browser does not block tracking scripts from hostnames associated with Facebook and Twitter. This is shown by the source code for the tracking_protection_service.h file that contains a comment informing that a tracking protection white_list variable was created as a “Temporary hack which matches both browser-laptop and Android code”.
The list of whitelisted hostnames is: connect.facebook.net connect.facebook.com staticxx.facebook.com www.facebook.com scontent.xx.fbcdn.net pbs.twimg.com scontent-sjc2-1.xx.fbcdn.net platform.twitter.com syndication.twitter.com cdn.syndication.twimg.com
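Based on the report, the whitelist logic presumably amounts to a simple hostname check that runs before the normal tracking-protection filter. Here is a minimal sketch of that idea - the function names and the stand-in tracker check are my own illustrations, not Brave's actual code:

```python
# Hypothetical sketch of the whitelist check described above.
# All identifiers here are illustrative, not Brave's real ones.

WHITE_LIST = {
    "connect.facebook.net", "connect.facebook.com",
    "staticxx.facebook.com", "www.facebook.com",
    "scontent.xx.fbcdn.net", "pbs.twimg.com",
    "scontent-sjc2-1.xx.fbcdn.net", "platform.twitter.com",
    "syndication.twitter.com", "cdn.syndication.twimg.com",
}

def is_known_tracker(hostname: str) -> bool:
    # Stand-in for the real tracker-database lookup.
    return hostname.endswith("facebook.com") or hostname.endswith("twitter.com")

def should_block_tracker(hostname: str) -> bool:
    """Return True if tracking protection blocks a request to hostname."""
    if hostname in WHITE_LIST:
        return False  # whitelisted hosts always load, bypassing the filter
    return is_known_tracker(hostname)

# The effect: these scripts slip through even though the same hosts
# would otherwise match the tracker list.
print(should_block_tracker("connect.facebook.net"))  # False - whitelisted
print(should_block_tracker("evil.facebook.com"))     # True - still blocked
```

The point of the sketch is that the whitelist short-circuits the tracker check entirely, so the filter never even sees requests to those hosts.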
Look how they justify themselves: (from https://brave.com/script-blocking-exceptions-update/) - (archive)
Brave aims to maintain a working Web, while reducing or eliminating the invasive tracking that has become so ubiquitous online.
So which one is the priority? Since your site claims it's privacy...
For example, Facebook and Twitter both contain widgets which web authors can integrate into their online properties. These widgets aim to make it easier for users and publishers to connect by allowing users to authenticate through Facebook or Twitter, rather than creating and maintaining an account with the publisher themselves. The exception list covered by several news outlets allows both of these widget sets to operate on a leash. They can load, but they cannot access local data on the client so as to track the user.
Who gives a shit whether they can access local data? You've now associated your browsing history with your Facebook and Twitter accounts - and you worry about some local data?! There's no worse tracking than the one attached to your name!
For many publisher implementations, blocking the script request would break Facebook-based OAUTH and Facebook likes and shares.
Yes, of course blocking Facebook tracking would mean you can't authenticate through it. And by whitelisting it, Brave chose a working web over the privacy of its users - proving they, like Mozilla, are just another malicious agent pretending otherwise.
38-year-old Kate Scottow, of Hitchin, Hertfordshire, said she was “arrested in my home by three officers, with my autistic ten-year-old daughter and breastfed 20-month-old son present”
I was then detained for seven hours in a cell with no sanitary products (which I said I needed)
So, according to the elites, posting some alleged insults on the Internet is worse than storming someone's house, kidnapping them in front of their children and throwing them into a cell unfit for an animal ("no sanitary products"). Imagine the stress on the children, especially the breastfed one, who is fully dependent on his mother after all. Not only that - she's had her DNA and fingerprints taken; say something the elites don't like? Prepare to go into their database and be "suspicious" forever! They also took away her laptop and mobile phone and didn't give them back after more than two months. But the most important thing is - as I said in Technological slavery - that this was only allowed to happen because that's the nature of the non-anonymous Internet. If this was said in real life, nothing could have happened to her. So the lesson is, don't post anything attached to your real name on the web. It might come back to bite you in the ass.
The rebranded, National Integrated Identity Management System (NIIMS) now requires all Kenyans, immigrants, and refugees to turn over their DNA, GPS coordinates of their residential address, retina scans, iris pattern, voice waves, and earlobe geometry before being issued critical identification documents.
This is a clear step towards the kind of world I've described in my Technological slavery article. Since you need ID to do pretty much anything (unless you go live in the woods), you will have to give up all the above data. Imagine the complete slavery and hopelessness that could result. Intelligent cameras will easily detect you by your earlobe geometry, and the voice data might even identify you through an anonymously uploaded YouTube video. Finally, they might simply replace IDs altogether with the iris scans and such - creating a world where anyone can be turned off at the elites' whim. For now this is just Kenya (a test run?), but they will surely bring this system to other countries eventually.
Imagine smart toilets in the future that will be analyzing human waste in real time every day. You don't need to be going to visit a physician every six months. If any sign of disease starts showing up, you'll be able to catch it much faster because of urine analysis and stool analysis.
Again, more technological slavery. Will this actually help people? I can imagine it being used to scare you into thinking you've got some imaginary disease and need some expensive drug. Even if it is not intended to be used this way, it will be easier to worry about some disease you allegedly have if you are reminded about something being wrong every time you take a shit. And stress is the biggest killer, as they say.
Early tests of the system have resulted in 76 percent accuracy, so well below where it needs to be. iBorderCtrl hopes to increase that to 85 percent.
This one means lost jobs, but more importantly, it will be more demeaning and dehumanizing (imagine having your facial expressions analyzed by a machine...). When this gets good enough, I can see it being very hard to fool.
We found that portraits provided the best way to illustrate our point, which is that algorithms are able to emulate creativity
Another supposedly creative human endeavor goes down, supporting the thesis in my AI article.