You don’t have to be a lawyer to be familiar with your Miranda rights. The phrase crops up frequently in films and TV shows, so you’ll recognize it even if you’ve never been cautioned – or “Mirandized”, as it is known. The wording used when a person is read the Miranda warning is deliberately clear and direct:
“You have the right to remain silent. Anything you say can and will be used against you in a court of law. You have the right to an attorney. If you cannot afford an attorney, one will be provided for you. Do you understand the rights I have just read to you? With these rights in mind, do you wish to speak to me?”
From the point at which you are Mirandized, you can choose to remain silent to avoid saying anything that might incriminate you. Additional protections mean that anything you say in confidence to an attorney is covered by attorney-client privilege, but anything you say to friends or family, or even a cellmate, is not.
You may, however, have fallen under such suspicion that a wiretap has been ordered against you. This means that people may have been listening to what you say even before you were Mirandized, although only from the moment a court order was obtained. A wiretap is therefore intrusive, but not retrospective. And it is limited to what you say on the phone.
Last Friday, however, a New Hampshire judge ruled that law enforcement officials can review the Amazon Echo speaker recordings of a man suspected of murdering two women. On the surface, this appears entirely reasonable, but privacy experts say that this is the “tip of the iceberg of a huge problem” when it comes to protecting our privacy.
If you fall under suspicion and have a digital assistant in your home, such as Amazon’s Echo, Google’s Home or Facebook’s Portal, then an order could be sought to review everything you have said at home. This is not only intrusive but also retrospective, potentially reaching back to whenever you installed the device, and it is not limited to what you say on the phone: it covers whatever you have ever said in the comfort of your own home.
It is not as if there’s any room for misinterpretation here, not as if we ever joke about things in our own home, or as if we could ever shout ‘kill, kill’ aloud when excitedly playing a video game.
The trust issue
I am no lawyer and am not seeking to offer legal advice in any way, but as a privacy campaigner and technology expert, I can see a real need for regulation here. This comes at a time when trust in technology and social media giants is at an all-time low.
In the past, most consumers simply trusted that technology would work and that companies would use their data responsibly. A series of high-profile incidents has shaken this trust, and it will take years to rebuild it.
For software and technology companies, the link between data privacy and corporate responsibility is relatively straightforward. For the very first time, industry analyst firm Gartner has named digital ethics and privacy as one of the top 10 strategic technology trends for 2019. The Gartner report says that “any discussion on privacy must be grounded in the broader topic of digital ethics and the trust of your customers, constituents and employees. While privacy and security are foundational components in building trust, trust is actually more than just these components. Trust is the acceptance of the truth of a statement without evidence or investigation. Ultimately an organisation’s position on privacy must be driven by its broader position on ethics and trust. Shifting from privacy to ethics moves the conversation beyond, ‘are we compliant’ toward ‘are we doing the right thing’”.
Even in non-tech industries, however, privacy has become a major issue: 80% of UK consumers surveyed by FleishmanHillard Fishburn have stopped using a company’s products and services because the company’s response to an issue did not support their personal views.
The research report from FleishmanHillard Fishburn, entitled ‘The Dying Days of Spin’, looked at the issues that were most important to consumers across all industries and sectors (not just tech). Many of the issues that it found to be of greatest concern, such as healthcare and education, were ones that consumers expected the government to act on. Interestingly, though, the main issues that consumers now expect companies to act on are security and privacy, surpassing things like diversity and sustainability that had previously topped this list.
In addition, a recent Harris Poll, conducted in partnership with Finn Partners, revealed that data privacy is now the number one issue that Americans (65%) believe companies should be addressing, followed by access to healthcare (61%), supporting veterans (59%), education (56%) and job creation (56%).
If customers want brands to take a stand on data security and privacy, seeing it as more important than either the brands’ diversity or sustainability efforts, then brands need to take it seriously too. If brands want to be in tune with their customers, they need to act on digital ethics, and to do so now.
To be on the right side of the debate here, brands need to be making cybersecurity part of their corporate culture. They also need to take a stand on privacy rather than waiting until after an incident, by which time it will be too late to salvage their brand.
One company that has definitely found itself on the wrong side of the argument is Facebook. Not only was it fined by the UK’s Information Commissioner’s Office over the Cambridge Analytica scandal, but it also suffered the largest single-day loss of market value in history ($119 billion) when its shares dropped 20% after one recent disclosure.
On top of this, the French NGO Internet Society France is pursuing a class-action suit against Facebook, saying that the social network has violated users’ privacy despite the enactment of strict new EU rules under GDPR. It hopes to get 100 million of Facebook’s 278 million users in France and across the EU to back the lawsuit.
And the recent launch of its new Portal device, Facebook’s equivalent to Amazon’s Echo Show, has received brutal reviews, with the device dubbed the “Eye of Sauron” and “the worst tech device of the year.” Not only does Portal collect data about who you call and which apps you use for targeted advertising, but whatever you say in your home could subsequently be used against you if you ever fell under suspicion of a crime.
It is no wonder that there is growing pressure for federal privacy regulation, but we need to ensure that this also includes strict regulation of the devices that are listening to us in our own homes.
The thing is, if you ask Alexa what your Miranda rights are, she will be most helpful; but if you ask her how those rights are affected by having a listening device in your home, she’ll be unable to help. We urgently need regulation here, to answer such important questions for us all.