ETHICS WALLS for DATA MONOPOLIES?

By Guillermo Monge

An “Ethics Wall” [1], more commonly known as a “Chinese Wall” [2], is a term describing a barrier that separates information between two sub-components of the same organization, either because of conflicts of interest or to keep sensitive, confidential information compartmentalized.

Ethics walls serve purposes such as keeping the strategic, confidential corporate information held by M&A teams away from the brokers and traders on the investment side of many financial firms, or separating the advertising and editorial branches of journalism firms.

With this in mind, let us consider technology firms, especially the big corporations that hold vast amounts of user information garnered from multiple components of their business. To avoid being too vague, let me focus on Google’s case, though it is not the only multi-tentacled, data-collecting service player we deal with.

Google has an incredible ability to capture a single user’s search history, location and routes (Google Maps, Nest), email (Gmail), likes and dislikes (YouTube, Google+), and so on. This ability, paired with its processing capabilities, has helped it become one of the most active and influential players in our technological society. It is not my intention to judge Google’s ethical considerations and practical benefits; my point is that privacy should be manageable individually, not set as a fixed default.

Consider a scenario in which ethics walls are set up between different services. Each service does not have to be isolated from all the others; instead, we could define a set of categories such that services within the same category may share information, but not with services outside it. Although friendlier to privacy, and (it could be argued) at some cost to “performance”, this scenario is still a fixed imposition on users.
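As a rough illustration, such fixed walls amount to a simple same-category check. The sketch below is purely hypothetical: the service names and category assignments are mine, not any real product grouping.

```python
# A minimal sketch of fixed, category-based ethics walls between services.
# The category assignments are hypothetical, purely for illustration.

SERVICE_CATEGORY = {
    "search": "productivity",
    "gmail": "productivity",
    "maps": "location",
    "nest": "location",
    "youtube": "entertainment",
}

def may_share(source_service: str, target_service: str) -> bool:
    """Services may exchange user data only within the same category."""
    return SERVICE_CATEGORY[source_service] == SERVICE_CATEGORY[target_service]

assert may_share("search", "gmail")       # same category: allowed
assert not may_share("gmail", "youtube")  # different categories: blocked
```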

Consider, then, the following improved scenario: these categories, or more fittingly for this scenario “boxes” or “privacy environments”, are set by the user. The user therefore has complete control over which information may flow to other applications and which must remain in place. These “privacy environments”, very much like scopes in programming languages, can be defined hierarchically, allowing information to flow from container into containee, but not vice versa.
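To make the scope analogy concrete, here is a minimal sketch of hierarchical privacy environments. The names (PrivacyEnvironment, can_flow) are hypothetical; the point is only that the containment check mirrors lexical scoping, where the inner scope can see the outer one but not the reverse.

```python
# A minimal sketch of user-defined, hierarchical "privacy environments".
# Names and structure are hypothetical, not any real API.

class PrivacyEnvironment:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent  # the containing environment, if any

    def contains(self, other):
        """True if `other` is this environment or is nested inside it."""
        env = other
        while env is not None:
            if env is self:
                return True
            env = env.parent
        return False


def can_flow(source: PrivacyEnvironment, target: PrivacyEnvironment) -> bool:
    """Information may flow from a container into a containee, never outward."""
    return source.contains(target)


# Example: the user nests a "maps" environment inside a broader "everyday" one.
everyday = PrivacyEnvironment("everyday")
maps = PrivacyEnvironment("maps", parent=everyday)

assert can_flow(everyday, maps)       # container -> containee: allowed
assert not can_flow(maps, everyday)   # containee -> container: blocked
```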

Recommender systems, in which information from other users is used to find similarities and suggest options based on their actions, would need to include these filters as well. A simplistic mathematical view of such a system is a sparse matrix (users × products) in which we try to predict the scores of the missing entries. Such a matrix could include a filter on its columns, according to each user’s preferences, in order to comply with this “user-defined ethics-wall” scenario, as sketched below.
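Here is a minimal sketch of that column filter, assuming toy ratings and a hypothetical per-user mask derived from each user’s privacy environments; it does not reflect any real recommender implementation.

```python
# A minimal sketch of per-user column filtering in a user-item rating matrix.
# The ratings and mask are toy data; "allowed" stands in for the product
# columns a user's privacy settings permit the recommender to use.
import numpy as np

ratings = np.array([          # users x products, 0 = missing entry
    [5, 0, 3, 0],
    [4, 0, 0, 1],
    [0, 2, 5, 0],
])

# Hypothetical per-user filter: True means the recommender may use that
# product column when computing similarities or predictions for this user.
allowed = np.array([
    [True,  True, False, True],
    [True,  True, True,  True],
    [False, True, True,  True],
])

# Apply the filter before any similarity computation or matrix factorization.
filtered = np.where(allowed, ratings, 0)
print(filtered)
```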

In conclusion, it is my belief that privacy within any corporation’s over-reaching, data-collecting services should be openly discussed, and that more options should be available to customers beyond “using the service logged out”.

REFERENCES

[1] Chinese Wall: Wikipedia entry
[2] Ethics Wall vs. Chinese Wall etymology discussion: Legal Ethic Forum entry
