Ethical Data: Balancing User Privacy and Trust
Louis Rossmann
Oct 8, 2024
We get that the privacy policies of nearly every modern tech company are intentionally vague, long, and abusive to consumers; they're written to confuse rather than inform you. In the '90s, this wasn't a concern; back then, there was a clear distinction between spyware and software. Once upon a time, the two weren't one and the same.
Our users have damn good reasons to be wary. As Captain Yossarian said, “Just because you’re paranoid doesn’t mean they aren’t after you.” (Heller, Joseph. Catch-22. Simon & Schuster, 1961.) People react with anything from suspicion to hostility at any suggestion of telemetry or monitoring.
At FUTO, we’re committed to breaking this spiral. Software companies abuse you to make money because they believe you won’t pay for software; that belief leads other developers to adopt the same abusive business models. We think good software can be both good AND respectful of your privacy.
Telemetry is a dirty topic that causes intense paranoia because of how companies have continually abused it. Some examples of why we understand our users’ healthy paranoia:
In April 2024, the FCC fined AT&T, Sprint, T-Mobile, and Verizon a total of nearly $200 million for illegally sharing customers’ location data without their consent. They sold your information to third-party aggregators, who then resold it to other location-based service providers. AT&T’s fine amounted to roughly 0.02% of one year’s net profit, sending a message to every company that abusing your privacy for profit is not only acceptable but profitable, because it will go effectively unpunished.
Let’s not forget Facebook’s psychological experiments on its users. In 2014, Facebook revealed that it had conducted an experiment on almost 700,000 people. Facebook manipulated the content in people’s feeds to study “emotional contagion”: how people’s emotional state changes when reading happy or sad news. The manipulated feeds measurably influenced the emotional tone of users’ own posts.
This experiment was possible because Facebook’s privacy policies & terms allowed them to screw with their users without permission, and hell if any of it was written so that you’d understand it. This is by design; when it comes to reading terms of service, the process is the punishment. Imagine if someone was already depressed, anxious, on edge, or close to committing acts of violence. Opting such people into these experiments with no knowledge or understanding of their psychological state is something Zuckerberg had no problem with.
Recently, General Motors was caught sharing driving data with insurance companies without obtaining proper consent from its customers. GM monetized driving data collected without your consent and sold it to insurance companies, which then hiked people’s rates based on a shitty AI’s interpretation of their driving proficiency.
According to the attorney general’s investigation, GM used deceptive tactics to trick users into opting into this data sharing, implying it was necessary to take ownership of the vehicle, and financially incentivized salespeople to push customers into these abusive agreements. Sometimes GM presented the opt-in as beneficial to customers, claiming it offered discounts on insurance premiums, in spite of it doing the exact opposite!
It doesn’t have to be this way. Data collection can be done in a way that follows strict ethical principles. Users of our products have raised concerns about our commitment to protecting user privacy. Most of our apps simply don’t collect any data; a few collect a minimal amount. With regards to Grayjay, here’s how we handle data:
- Source Code Transparency: All source code is available for anyone to review. Within hours of a user raising this concern, we provided a link to the exact page where they could review the code in question themselves. You can see exactly what our software does, how it handles your data, and verify that there’s no funny business going on.
- Straightforward Privacy Policy: Our privacy policy is less than two pages long & written in plain, conversational English at a sixth-grade reading level. No legalese; it just makes sense.
- Minimal Data Collection: We collect a single packet of data when you open the app, containing only the following. You can verify this by looking at the source code itself.
  - The version of the app you’re using and the OS SDK version, to help us improve the app.
  - A randomly generated unique identifier, which doesn’t link back to your identity or specific device.
  - The model and make of your phone, so we can ensure compatibility and address issues specific to certain devices.
The data we collect helps us improve the application, so we can live up to our “sixth pillar” of the five pillars of FUTOey software (IMO, the most important one): don’t suck.
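To make the "single packet" concrete, here is a minimal sketch of what assembling it might look like. This is an illustration, not Grayjay's actual code or wire format; the field names and the `build_startup_packet` helper are hypothetical, and the real schema lives in the source code linked above.

```python
import uuid

def build_startup_packet(app_version: str, sdk_version: int,
                         device_make: str, device_model: str) -> dict:
    """Assemble the kind of single, anonymous packet described above.

    Field names are illustrative, not Grayjay's real schema --
    check the published source code for the actual format.
    """
    return {
        # App version and OS SDK level: lets bug reports be matched
        # to the releases actually affected.
        "app_version": app_version,
        "os_sdk_version": sdk_version,
        # Make and model, so device-specific issues can be reproduced.
        "device": f"{device_make} {device_model}",
        # A freshly generated random identifier: not derived from
        # hardware IDs, accounts, or anything tied to a person.
        "random_id": str(uuid.uuid4()),
    }

packet = build_startup_packet("1.0", 34, "Google", "Pixel 8")
```

Note that the identifier is generated randomly rather than derived from the device, which is what makes it useless for linking the packet back to a person.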
- Finding & Fixing Bugs: So often we hear people say “Grayjay version X doesn’t work with version Y of Android.” By knowing which Android versions & app versions are having issues, we can fix bugs early, before they become widespread problems and before new features get built on top of said bugs, making them more difficult to untangle later.
- Overcoming Underreporting: Most crashes and bugs never get reported to us, which makes troubleshooting difficult. The data we collect helps us identify problems that might otherwise go unnoticed.
- Checking App Health: Knowing about sudden spikes or drops in app usage helps us stay on top of potential issues. For instance, if a plugin stops working, or a change in a content website’s delivery methods starts affecting other apps, we can spot these trends early. If we see a doubling of users in one day, it might mean that other apps have stopped working for a certain platform. If that’s the case, we need to figure out why before the same thing happens to us!
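The "sudden doubling" check above can be sketched in a few lines. This is a hypothetical illustration of the idea, not Grayjay's actual monitoring code, and the 2x threshold is an assumption for the example, not a real setting.

```python
def usage_looks_anomalous(yesterday: int, today: int,
                          factor: float = 2.0) -> bool:
    """Flag a sudden spike or drop in daily app opens.

    A doubling might mean users are fleeing a competing app whose
    platform integration just broke; a comparable drop might mean
    ours did. The 2x threshold is illustrative only.
    """
    if yesterday == 0:
        # Going from zero opens to any opens is worth a look.
        return today > 0
    ratio = today / yesterday
    return ratio >= factor or ratio <= 1 / factor
```

Because this works on aggregate daily counts, it needs nothing about individual users, which fits the minimal-collection approach described above.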
The minimal data we collect has been designed in a straightforward, honest way, and the approach we’ve taken makes it impossible for us to profile you. We get that privacy is a giant concern, particularly with an application you use to view videos on all sorts of topics; collected unethically, that data would make it very easy to profile users.
By making source code available, writing a very basic & readable privacy policy, & only collecting a tiny amount of non-identifying data, we hope to demonstrate that respecting your privacy & delivering a good product aren’t mutually exclusive.