
Opinion - Big Tech Needs to Use Hazardous Materials Warnings


steven36


The technology sector has a hazardous materials problem, beyond the mountains of electronic waste it generates. More immediately, Big Tech fails to warn users when its products and services are hazardous. Users are long overdue for a clear, concise rating system of privacy and security risks. Fortunately, tech can learn from another industry that knows how to alert consumers about the dangers of improperly storing and leaking toxic products: the chemical industry.

 

 

https://s7d7.turboimg.net/sp/926f7f9240b4e6c5e560ddb6e3d5b46e/acb0.jpg



Nearly sixty years ago, the chemical industry and its regulators realized that simple communication of hazards is critical to safety. Material Safety Data Sheets, the chemical equivalent of technology user terms and conditions, have offered descriptions of those hazards since the early 1900s. But as the industry evolved, it became clear, sometimes tragically, that end users rarely read these lengthy technical volumes. A quick reference was required.

Enter the fire diamond, the now ubiquitous, universally understood symbol of chemical safety. You’ve seen them on propane tanks, on chemical containers, and in laboratories: cartoon rhombuses divided into colored quadrants, each filled with a number between 0 and 4 indicating a substance’s toxicity (blue), flammability (red), and reactivity (yellow). Introduced in 1960 by the National Fire Protection Association, the diamond, officially called NFPA 704, is the standard for communicating the most basic and essential safety information of hazardous materials in the United States. Even if users don’t read the safety data sheet, they are greeted by this bright, unavoidable summary of material hazards every time they look at the container.

Whereas the chemical industry and its regulators have worked to ensure clearer warnings, the tech industry has worked to make it increasingly difficult for consumers to know what hazards its products pose (hello, FaceApp). As technology companies use and misuse the personal data they collect in increasingly sophisticated ways, user agreements have only become longer and more byzantine. Facebook, for example, has terms of service and related policies that stretch for over 35,000 words, about as long as The Lion, The Witch, and the Wardrobe, and as bewildering as Narnia. Buried within are clauses that have significant privacy implications, such as granting Facebook a “non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content.”

License agreements, like toxicology studies, provide valuable information, but they’re of little use when users need to quickly know what they’re getting themselves into. When emergency personnel are considering using a chemical product, they immediately need to know: Will it explode? Will it poison me? Will it burn me? Right away, the fire diamond answers. When considering a new app or service, tech users have similar questions: How much of a security risk is this? What data is collected and stored? Do I have any control? To find those answers, a user often first has to jump into the fire.

Besides the self-interest of entrenched tech industry players, there is no excuse for requiring users to read dozens of pages of dense text to learn the dangers of a product when that information can be condensed into a few numbers and color-coded blocks. If users are expected to rapidly adopt new services and technologies, and to bear responsibility for understanding the content and implications of those technologies’ license agreements, then a transparent and standardized method of hazard communication is required.

Who should administer this? It could be a mandatory regulatory framework (from the FTC or Consumer Product Safety Commission) or a voluntary independent rating system created by accreditation bodies or industry watchdogs such as the Electronic Frontier Foundation.

What should it look like? There are myriad design options, but one would be to create a tech safety diamond. Instead of physical harms, this warning system would summarize the key aspects of data collection, user control, data use, and data handling, to let users know whether a product is worth the risk (a rough sketch of how such ratings might be represented follows the list below).

Blue: For data collection, the technology equivalent of toxicity, a low rating would indicate that the service would gather only names, IP addresses, or other basic information, while a high rating would mark the hoarding of deeply personal and potentially dangerous information like voice recordings or detailed location data.

Yellow: User control, the parallel to reactivity, is perhaps the simplest to rate: once a service has my data, can it be fully deleted, and if not, to what extent will it persist?

Red: Data use, or flammability, is extremely difficult to summarize in a single number, but low ratings would correspond to in-house uses for the service’s essential functions, while high ratings would indicate aggressive third-party sharing, strong intellectual property claims on user content, or use of data to sculpt user behavior.

White: Data handling, which would range from secure storage and encryption (0) to unaccountable third parties (4).
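
To make the four-quadrant idea concrete, here is a minimal sketch of how such a rating might be represented in code. The TechSafetyDiamond class, its field names, and the example scores are illustrative assumptions only, not part of any existing standard.

```python
from dataclasses import dataclass

# Hypothetical sketch: a four-axis "tech safety diamond" in which each axis
# is scored 0 (lowest hazard) to 4 (highest hazard), mirroring NFPA 704.
@dataclass
class TechSafetyDiamond:
    data_collection: int  # blue   - how much personal data is gathered
    data_use: int         # red    - how aggressively that data is exploited
    user_control: int     # yellow - how hard it is to delete or limit data
    data_handling: int    # white  - how securely the data is stored and shared

    def __post_init__(self):
        # Enforce the 0-4 scale on every axis.
        for name, score in vars(self).items():
            if not 0 <= score <= 4:
                raise ValueError(f"{name} must be between 0 and 4, got {score}")

    def label(self) -> str:
        """Return a compact, at-a-glance summary string."""
        return (f"collection:{self.data_collection} use:{self.data_use} "
                f"control:{self.user_control} handling:{self.data_handling}")

# Example (made-up scores): an app that hoards location data, shares it with
# third parties, and offers no real deletion option.
if __name__ == "__main__":
    risky_app = TechSafetyDiamond(data_collection=4, data_use=4,
                                  user_control=3, data_handling=2)
    print(risky_app.label())
```

The point of the sketch is only that four small integers, like the fire diamond’s, can convey the essentials at a glance, however the scoring rubric itself ends up being defined.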

Clear warnings will empower users to make better-informed decisions. With them, we wouldn’t have to reconsider only after learning that the next phone company or app has sold our location data to the highest bidder, or that another insecure IoT device has let bad actors peer into our bedrooms. And perhaps companies will think twice before offering another service that would be labeled with the equivalent of a skull and crossbones.

 

Source



20 minutes ago, mkc21 said:

Yeah, but what do you do with old phones? It's hard to really wipe them.

If you can't clean up your phone, you need to put it in the garbage. There are hundreds of guides out there on how to clean malware off a phone, but if you bought one of those cheap ones that come with an image with preinstalled malware, you should have bought a better phone. That's why PCs will always be better than phones.

 

Quote

 

There's no official way to install a "pure Android" ROM: the only official ROM is that provided by the manufacturer.

If you want to start from a clean slate, you should look for an AOSP-based custom (unofficial) ROM.

 

If you're too lazy to do that, put it in the garbage and buy a better phone. Old phones are going to have shitty battery life and other problems anyway. :clap:



18 hours ago, steven36 said:

If you can't clean up your phone, you need to put it in the garbage. There are hundreds of guides out there on how to clean malware off a phone, but if you bought one of those cheap ones that come with an image with preinstalled malware, you should have bought a better phone. That's why PCs will always be better than phones.

 

If you're too lazy to do that, put it in the garbage and buy a better phone. Old phones are going to have shitty battery life and other problems anyway. :clap:

 

I mean cleaning my data off them before disposing of them.


