Privacy And Web 3.0: Implementing Trust and Learning from Social Networks

After having shifted from Web 1.0 to Web 2.0, scientists welcome the advent of Web 3.0, an environment where meaning is added to data. In the Semantic Web, people are no longer mere users but part of the emerging applications: producers, subjects and beneficiaries of Big Data. However, opaque processing of personal data poses tremendous risks and dangers for individuals. Given the new era of Big Data, this paper studies firms' purposes and practices to detect some emerging privacy risks. Moreover, theories that deal with social networks are examined to conclude that, even though people state that they value their privacy, they often disclose a huge volume of personal information. Taking into account that today's European concept of privacy is conceptualized in negative terms, this paper also proposes the implementation of trust and loyalty into the privacy concept through flexible fiduciary laws. Furthermore, data portability is discussed to detect its potential as a strategic feature, a key tool that will enhance trust. Finally, further scenarios and proposals are submitted in our attempt to answer the question whether the European concept of privacy could be re-shaped for the benefit of individuals.


Introduction
In 1983, Time magazine nominated the Personal Computer as "the machine of the year" to announce the entry of the Information Age into our homes 1 . In 2006, a computer was again displayed on the above magazine's cover, though this time the computer screen was a mirror reflecting the person of the year: "You", the very user, the hero of the Information Age 2 .
After having shifted from Web 1.0 (First Era) 3 to Web 2.0 (Second Era) 4 , scientists speak of the Semantic Web 5 (Web 3.0).

For instance, job seekers reveal online information that they would not disclose during an "offline interview", while minors may reveal sensitive data, like their sexual orientation or their ethnic origins, just by updating their status, posting a comment, or uploading images 13 . In fact, people can disclose their deepest secrets and, thus, some authors argue that privacy needs to be reshaped. Is it about secrecy 14 ? Is it a right to be let alone 15 ?
Today's users can access social networks' platforms and disclose their personal data 16 , such as their physical location (for instance, GPS or IP location), while they may not be aware that the very location of their mobile device is constantly being recorded, regardless of use, or non-use, of the device 17 . Moreover, people's behaviors and relationships have changed as well. Our kids may have "more friends" than us, and we are not one of them, while, at the same time, a social network's platform may be regarded as a place where minors (and people in general) exchange information and communicate with their peers; enter a parent and the party is over 18 . There is no distinction between lovers, schoolmates, and strangers; they are all, thus, sorted into the same group ("friends") 19 .
Given the above radical changes, this paper studies the processing of personal data in the age and the economy of Big Data to detect purposes, and risks, of such practices. Furthermore, the hedonic use of social networks, and several theories with regard to such networks (such as the social capital theory, altruism and reciprocity), are examined to argue that, although people may state that they value their privacy, they rarely refuse to share personal information. The current concept of privacy is brought to the discussion table to comment on its negative approach and to propose the introduction of trust and loyalty through flexible fiduciary laws. Moreover, data portability, one of the most important rights that the GDPR introduces, is studied to support its potential as a strategic element that will safeguard and enhance trust. Finally, further proposals are submitted to support that the European concept of privacy could, indeed, be re-shaped for the benefit of individuals.

Maras, Social Media Platforms: Targeting the "Found Space" of Terrorists, Journal of Internet Law, August 2017, pp. 3-9. Available at https://www.researchgate.net/publication/321549900_Social_Media_Platforms_Targeting_the_Found_Space_of_Terrorists. Social networks, however, can also be used to achieve several socially useful purposes. For instance, with regard to their significant role in disaster management, see Jooho Kim, Makarand Hastak, Social network analysis: Characteristics of online social networks after a disaster, International Journal of Information Management, February 2018, Vol. 38(1), pp. 86-96. Available at https://www.researchgate.net/publication/322175764_Social_network_analysis_Characteristics_of_online_social_networks_after_a_disaster. For their effects on academic achievement, see Robert M. Bond, Volha Chykina, Jason J. Jones, Social network effects on academic achievement, The Social Science Journal, Volume 54, Issue 4, December 2017, pp. 438-449, available at https://www.sciencedirect.com/science/article/pii/S0362331917300605.

13 See, amongst others, A. Acquisti, C. Fong, An Experiment in Hiring Discrimination Via Online Social Networks, July 17, 2015. Available at SSRN: https://ssrn.com/abstract=2031979 or http://dx.doi.org/10.2139/ssrn.2031979.

14 For instance, Strahilevitz argues that privacy is not about secrecy. Indeed, sexual intercourse needs more than one, people tell others about medical ailments to unburden, and sharing most intimate information with those who are expected to keep it secret promotes friendship and intimacy. See Strahilevitz Lior, A Social Networks Theory of Privacy, December 2004, U Chicago Law & Economics, Olin Working Paper No. 230; U of Chicago, Public Law Working Paper No. 79, at p. 5. Available at SSRN: https://ssrn.com/abstract=629283 or http://dx.doi.org/10.2139/ssrn.629283. However, as others claim, there is a relation between privacy and secrecy ("[…] right to secrecy […] to limiting the knowledge of others about oneself […]").

Cf. the definition of personal data under Article 4(1) of the GDPR: "[…] any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person […]".

Personal Data in the Age and the Economy of Big Data
Some authors, referring to the wealth of data flooding the digital environment, often described as Big Data, have defined Web 3.0 as "Semantic Web technologies integrated into, or powering, large-scale Web applications" 20 . Web 3.0 could be understood as a phenomenon in which individuals are no longer users; they are part of the applications that emerge and disappear; they are also producers, subjects and beneficiaries of Big Data 21 . In particular, Big Data refers to the exponential growth and availability of data in an environment where the three "V" characteristics are identified: the Volume of data that is collected and processed; the Velocity, meaning the speed with which data is produced and processed; and the Variety of sources 22 . This environment provides further opportunities for understanding or predicting individuals' behavior and, thus, firms can expand their knowledge about a person without her knowledge or consent 23 . Indeed, people have little or no idea about what data is collected or the ways in which it is processed, shared or exchanged 24 with third parties 25 .
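The three-V characterization can be made concrete with a toy sketch (all event data below is invented for illustration): Volume is how much data accumulates, Velocity is how fast it arrives, and Variety is how many distinct kinds of sources produce it.

```python
# Toy illustration of the three "V"s for a stream of personal-data events.
# All data here is hypothetical; real firms process such streams at vastly
# larger scale, which is precisely what makes Big Data analytics possible.
from collections import Counter

events = [
    # (timestamp in seconds, source type, payload size in bytes)
    (0.0, "gps", 120), (0.2, "app_log", 540), (0.5, "gps", 120),
    (0.9, "purchase", 310), (1.1, "social_post", 2048), (1.8, "gps", 120),
]

volume_bytes = sum(size for _, _, size in events)          # Volume
duration = events[-1][0] - events[0][0]
velocity = len(events) / duration                          # Velocity (events/s)
variety = len(Counter(src for _, src, _ in events))        # Variety (source kinds)

print(volume_bytes, round(velocity, 2), variety)
```

Even this toy stream mixes location, behavioral, and transactional sources; scaled up, the combination is what enables the profiling practices discussed below.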
In 2018, life, including social connections or even love 26 , happens online and, hence, it is difficult to name an aspect of present-day society in which online access does not play a part. Thus, during innumerable online activities, a huge volume of personal data is produced, collected 27 and processed 28 .

Some authors add more characteristics, such as "Veracity", which refers to the way data should be used in order to create necessary trust and ensure reliability. See Maria Giannakaki, The value of information in the age of 'Big Data': from Web 1.0 to Web 3.0, id, at p. 262. Others identify two additional dimensions of Big Data: variability and complexity. The former is evidenced by the fact that data flows can be highly inconsistent, with periodic peaks, while the latter is manifested in the nature of Big Data itself: it is not only structured but also unstructured and coming from multiple sources. See Richard Herschel, Virginia M. Miori, Ethics & Big Data, Technology in Society 49 (2017), pp. 31-36, at p. 31, mentioning that "[…] Big Data is all about capturing, storing, sharing, evaluating, and acting upon information that humans and devices create and distribute using computer-based technologies and networks […]" and pointing out that we are now generating 2.5 quintillion bytes of data, so much that 90% of the data in the world today has been created in the last couple of years. Available at https://www.researchgate.net/publication/314463176_Ethics_Big_Data.

23 For some ethical issues with regard to privacy, confidentiality, transparency and identity, see Jonathan H. King & Neil M. Richards, What's Up With Big Data Ethics?, 2014, Radar, O'Reilly Media, available at http://radar.oreilly.com/2014/03/whats-up-with-big-data-ethics.html.

24 A secondary market has been created with regard to personal data. Data brokers may be defined as professionals who operate on a secondary market, such as businesses that facilitate the circulation and enrichment of data. See Commission Nationale de l'Informatique et des Libertés (CNIL), 36th Activity Report, 2015, To Protect Personal Data, Support Innovation, Preserve Individual Liberties, pp. 31-32 ("Data brokers: the oil and the iceberg"). Available at https://www.cnil.fr/sites/default/files/atoms/files/cnil_rapport_2015_gb.pdf. As CNIL puts it, data brokerage aims to aggregate data, then redistribute it for a variety of purposes, which are focused, amongst others, on commercial targeting (for example, direct marketing, advertising, customer experience enhancement) or on checking people's characteristics (namely trustworthiness, creditworthiness, identity, etc.).
Does an item of information, e.g. one's sleep pattern, which may be provided while using a smartphone app, constitute personal data?
In the age of Big Data, the processing of a huge volume of data 29 not only enables firms to draw innumerable conclusions that relate to one person but also facilitates the identification of an individual. So, when the above sleep pattern relates to an individual who can be identified, it does constitute personal data 30 . This means that it is not the actual identification, but the capacity to identify a person 31 , that makes the data personal.
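The point that identifiability, rather than actual identification, makes data personal can be illustrated with a small, entirely hypothetical sketch: a sleep log released without names can still be linked back to named individuals through quasi-identifiers such as postcode and birth year (a classic linkage attack).

```python
# Hypothetical illustration (all names and records are invented): a
# "de-identified" sleep log becomes personal data when quasi-identifiers
# allow it to be linked to a natural person via auxiliary data.

sleep_log = [  # released without names, only quasi-identifiers
    {"zip": "10115", "birth_year": 1984, "avg_sleep_h": 5.2},
    {"zip": "10117", "birth_year": 1990, "avg_sleep_h": 7.9},
]

public_register = [  # auxiliary data an adversary may already hold
    {"name": "A. Example", "zip": "10115", "birth_year": 1984},
    {"name": "B. Sample", "zip": "10117", "birth_year": 1990},
]

def link(records, register):
    """Re-identify records whose quasi-identifiers match exactly one person."""
    identified = []
    for rec in records:
        matches = [p for p in register
                   if p["zip"] == rec["zip"] and p["birth_year"] == rec["birth_year"]]
        if len(matches) == 1:  # a unique match makes the record identifiable
            identified.append({**rec, "name": matches[0]["name"]})
    return identified

print(link(sleep_log, public_register))
```

Because such linkage is possible, the sleep figures relate to identifiable persons even though no name was ever stored alongside them.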
So, why do firms process our personal data?
Today, what we deal with is the monster of a "free" Internet 32 paid for by advertising targeted on the basis of an unprecedented level of surveillance of human lives 33 . Data processing enables firms not only to identify an individual or detect her activities 34 but also to profile 35 natural persons and target groups to which firms address personalized ads 36 .
For example, Google used billions of credit-card transaction records to prove that its online ads prompt people to make purchases even when those purchases happen in brick-and-mortar stores 37 . Google's program matches goods purchased in traditional stores to the "clicking" of online ads ("Bricks to Clicks"). Thus, the firm is aware of whether a consumer bought the product on whose ad she clicked 38 .

Another good example is Target, a firm which not only collected but also produced personal data, namely the information that a consumer was pregnant. This was actually true, and the very consumer had not known it herself 39 .

So, the processing of personal data aims, amongst others, at commercial targeting, including direct marketing and advertising 40 , or at checking (or predicting) 41 people's characteristics, such as trustworthiness, creditworthiness or identity. To do so, people are profiled and sorted into groups 42 and, thus, further discrimination 43 issues are raised 44 .

37 See Elizabeth Dwoskin, Craig Timberg, Google now knows when its users go to the store and buy stuff, May 23, 2017, The Washington Post, available at https://www.washingtonpost.com/news/the-switch/wp/2017/05/23/google-now-knows-when-you-are-at-a-cash-registerand-how-much-you-are-spending/?utm_term=.b97032baeb8b.

38 In 2017, the Electronic Privacy Information Center (EPIC) asked the Federal Trade Commission (FTC) to examine the lawfulness of Google's program. See Brian H. Lam and Cynthia J. Larose, United States: FTC Asked To Investigate Google's Matching Of Bricks To Clicks, September 25, 2017, Mondaq, available at http://www.mondaq.com/article.asp?articleid=630914&email_access=on&chk=2167746&q=1536832. It is worth noting that, very recently, Google reported that, in 2017, it took down more than 3.2 billion ads that violated its advertising policies. This included 79 million ads that aimed to send people to malware-laden sites, 66 million "trick-to-click" ads, and 48 million ads that attempted to get people to install unwanted software. Google also reported that it blocked 320,000 publishers and blacklisted about 90,000 websites and 700,000 mobile apps for violating Google's policies. See Scott Spencer, An advertising ecosystem that works for everyone, Google, Mar. 14, 2018, available at https://blog.google/topics/ads/advertising-ecosystem-works-everyone/.

39 "[…] the Privacy Rule also permits 'covered entities' to disclose or share patient information, also without consent, with other covered entities for treatment or reimbursement purposes, and with vendors and contractors who sign an agreement […]", available at https://law.stanford.edu/publications/clinical-genomics-big-data-and-electronic-medical-records-reconciling-patient-rights-with-research-when-privacy-and-science-collide/.

40 See, for example, recent YouTube practices with regard to advertising: Chaim Gartenberg, YouTube plans to annoy music listeners into subscribing by playing more ads, 'Frustrate and seduce' users into signing up, The Verge, March 21, 2018, available at https://www.theverge.com/platform/amp/2018/3/21/17147800/youtube-streaming-service-lyor-cohen-ads-music-industry-spotify-free.
Setting aside the above practices and risks, let us now move on to the social networks' environment to examine the extent to which people value their privacy, and to study some motivations that encourage us to share personal information.

The Hedonic Use of (and Several Theories on) Social Networks
Social networks, such as Facebook or Twitter, look for ways to collect a huge volume of data that relate to individuals' online behavior. The more accurate the data collected, the more effective the targeted advertising efforts that they fuel 45 . […] are living, encourages users to visit such websites more than ten times per day 50 . As many authors argue, privacy attitudes may quite often stand in stark contrast with privacy behaviors 51 , and this could be explained by powerful hedonic motivations 52 that encourage users to share their information.
Indeed, social networks can be understood as "hedonic information systems", meaning that their primary goal is self-fulfillment, enabling users to experience fun 53 . In this context, people use networks for hedonic purposes 54 , such as sharing information and personal data (like personal images) or playing games, watching movies, and so forth. Thus, this use may relate to gratification, meaning escapism or fantasy, social interaction, achievement or self-presentation 55 (and, thus, recognition).
Others support the social capital theory, in accordance with which mutual support, shared norms, social trust and a sense of mutual obligation could be detected in social networks 56 . Social capital could, in fact, be understood as the value that an individual may derive from belonging to a community. Moreover, some authors regard "Ubuntu", the central concept of social and political organization in African philosophy, as a way to understand users' behaviors in social networks 57 . This concept consists of the principles of sharing and caring for one another 58 : to be a human is "to affirm one's humanity by recognizing the humanity of others and, on that basis, establish humane respectful relations with them" 59 , and "to be" is "to belong" 60 . The Ubuntu worldview is a community-based mindset, opposed to Western libertarianism and individualism, and close to communitarianism 61 . It is based on values of intense humanness, caring, and associated values that ensure a happy and qualitative community life in a spirit of family. This means that privacy might be considered less important from this perspective 62 . Hence, as some authors have observed, an image that is shared in social networks and which shows, for instance, one person close to another could, rather than raise issues of privacy, prove such relationships, through which one can feel healthy and achieve self-fulfillment 63 .
Furthermore, altruism, understood as acts of caring about the well-being of other individuals without any expectations and regardless of any direct benefit to oneself 64 , could be found in people's behavior while interacting in social networks. For example, individuals may send (e.g. birthday) virtual gifts to their digital friends, or might spend some time writing a decent post on a friend's "wall", for others to see, to "honor" this person. The above actions are, of course, undertaken without the expectation of something in return; the very deed is itself the "reward", the intrinsically enjoyable act that comes from helping others 65 .
Finally, reciprocity 66 could also be regarded as an important element of people's interaction in social networks. Namely, a "friend request" could be treated as a transaction, a request to trade personal data with the expectation that the "offer" will be accepted. A "status update" could be treated as an attempt, undertaken by an individual, to catch attention. Thus, reputation could motivate people to participate in social networks 67 . One could also argue that competition 68 , the desire to win in interpersonal situations, could be detected in the above interactions, since people might see it as a pleasurable act to acquire, for example, as many friends and followers as possible, or to win in games and applications.
Although people may state that they value their privacy, they rarely refuse to share their personal data. Indeed, the above hedonic uses and theories suggest that individuals use social network systems to fulfill their need for entertainment, relationships and identity construction. To some, this overrides their privacy concerns 69 . So, could there be another way to conceptualize privacy?

Negative Conceptualization of Privacy: A Need for Trust and Loyalty
One could argue that trust, as the willingness to accept vulnerability to the actions of others 70 , is an essential element of any activity in which people are involved, such as friendship, commerce, and so forth. Trust can be detected everywhere, since people trust their lawyers or doctors, or trust that the train will arrive safely at the correct destination. Without trust, politics, commerce and many other fields of everyday life would probably fail. So, could rules be provided to safeguard trust with regard to privacy policies?
Today, privacy is conceptualized in negative terms. For example, legislators focus on potential harm or data breaches, and attention is drawn to the capacity of a person to opt out. One could argue that privacy is regarded as a "tax on profits"; yet, if trust were incorporated in privacy rules and policies, the latter could encourage information relationships.
As noted above, personal data is collected and processed in ways an individual may not understand or know. This creates confusion and further clouds the picture. Thus, it is a problem not only for consumers, but also for governments and firms: when there is no trust, people share less information; when less information is shared, it is hard to achieve economically and socially useful purposes 71 .
Under the current regime, the rules that govern personal data protect, amongst others, interests in informational self-determination 72 . As authors consistently claim, the right to the protection of personal data refers to control over the processing of personal data 73 . Therefore, the key tool for successfully and effectively exercising control is the subject's consent 74 to the above processing.
This concept of control and consent focuses on the harm that has to be avoided or the consent that has to be obtained. In this context, some regard social networks' advertising practices as something creepy 75 , or speak of "surveillance" that aims to subject individuals to criminal punishment or deny them access to health, work and other essential fields 76 . In other words, attention is mainly devoted to harm, rather than opportunities, and, hence, privacy is treated as a negative element that has to be balanced against innovation, efficiency or security. However, control and privacy self-management are in practice impossible, and the innumerable jokes on terms of use 77 can prove this. Furthermore, the "choice" to disclose personal data is an illusion; one has no choice to opt out of profiling by firms of whose existence she is unaware.
On the other hand, opting out of the alleged firms' and governments' "surveillance" 78 would mean opting out of society 79 . So, what if trust were enabled, with regard to privacy laws, to allow people to safely share their personal data for the benefit of both individuals and firms?
Trust is the key element of healthy relationships and societies 80 . It has been defined as a willingness to rely on another and, in particular, as "a psychological state comprising the intention to accept vulnerability based upon positive expectations of the intentions or behavior of another" 81 . To put it simply, trust is a state that enables an individual to be willing to make herself vulnerable to another party, to rely on another, despite the potential risk that the latter will act in a way that can harm the former 82 .
In the context of privacy, trust would mean the willingness to become vulnerable to a person or entity by sharing personal data. In this case, the party (individual) who discloses her information would be the "trustor" 83 , the act of disclosing data would be the "entrusting", and the recipient of the data would be the "entrustee" 84 . Indeed, in present-day societies, we are all "trustors", since we entrust firms when disclosing information to, for instance, a search engine or any other digital firm. Furthermore, one becomes vulnerable when she faces the risk of misuse or unauthorized disclosure of her data. Namely, vulnerability may refer to the potential risk of an employee being fired, or the risk of data being sold to third parties.
If the concept of trust were introduced into the privacy regime and laws, it could further enable honesty and loyalty. This could be achieved through flexible fiduciary 85 laws 86 , the main goal of which is to protect against the exploitation of a vulnerability created by trusting another 87 . Hence, these rules impose duties of loyalty and care. In particular, a fiduciary is a person who has a relationship of trust with a party, the beneficiary, and who is authorized to hold something valuable (for instance, the beneficiary's assets) and manage it on the beneficiary's behalf 88 . So, fiduciary laws could, indeed, apply in a flexible way 89 to protect individuals. Moreover, an affirmative obligation of honesty could be introduced, since fiduciaries have duties of disclosure, care and loyalty, while they are also obliged to keep the beneficiary informed 90 .
This way, individuals (as beneficiaries) would very likely not only know what information would be disclosed but also understand both the information and the processing techniques. Furthermore, firms (as fiduciaries) could very well be obliged to consult with individuals (their beneficiaries) and give them the opportunity to express their "best interests" (or even opinions), in accordance with which data would be shared 91 . Such rules could perhaps oblige firms to implement internal policies and other safeguards, such as employee training. Contracts to forbid, e.g., the re-identification of anonymized 92 data might also be introduced. If this were the case, trust would be enhanced and, thus, more information would be safely disclosed for the benefit of both firms and humans.
Fiduciary laws are, indeed, flexible and could thus apply to the processing of personal data. One more reason to be optimistic about the above potential to enhance trust is the novelty introduced by the GDPR: the right to data portability.

The Right to Data Portability: a Novelty and a Potential to Enhance Trust
One of the most important rights that the GDPR introduces is the right to data portability 93 . Under Article 20 of the GDPR, the data subject shall have the right to receive the personal data concerning him or her, which he or she has provided 94 to a controller, in a structured, commonly used and machine-readable format 95 , and shall have the right to transmit those data to another controller without hindrance 96 from the controller to which the personal data have been provided, where the processing is based on consent or on a contract, and where the processing is carried out by automated means 97 . While exercising their right to data portability, data subjects should also have the right to have the personal data transmitted directly from one controller to another, where technically feasible 98 .

The right to data portability 99 can be regarded as an economic right, which aims to let individuals "share wealth" created by new technologies 100 and benefit from digital services. One of its purposes is to create a competitive market environment that will enable consumers to switch providers 101 . The above right, however, aims not only to enforce competition and consumer protection but also to promote the interconnection of services and interoperability 102 . Hence, user-centric platforms could be introduced and developed for the benefit of individuals' interests 103 . So, the right to data portability is not just an economic right; it also aims to enhance individuals' rights 104 and promote transparency and the minimization of unfair and discriminatory practices 105 .

98 See Article 20(2) of the GDPR. Under Article 20(3)-(4) of the GDPR, "[…] The exercise of the right referred to in paragraph 1 of this Article shall be without prejudice to Article 17. That right shall not apply to processing necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. […] The right referred to in paragraph 1 shall not adversely affect the rights and freedoms of others […]". As has been noted, "[…] the right to data portability is also an important tool that will support the free flow of personal data in the EU and foster competition between controllers. It will facilitate switching between different service providers, and will therefore foster the development of new services in the context of the digital single market strategy […]".

103 See Article 29 Data Protection Working Party, Guidelines on the right to data portability, id, at p. 3.

104 See Recital (68) of the GDPR ("[…] To further strengthen the control over his or her own data, where the processing of personal data is carried out by automated means, the data subject should also be allowed to receive personal data concerning him or her which he or she has provided to a controller in a structured, commonly used, machine-readable and interoperable format, and to transmit it to another controller […]").

105 See Article 29 Data Protection Working Party, Opinion 03/2013 on purpose limitation, Adopted on 2 April 2013, at p. 47.
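As an illustration of what Article 20's "structured, commonly used and machine-readable format" might look like in practice (all field names and data below are invented), a controller could export the data a subject has provided as JSON, ready for transmission to another controller:

```python
# A minimal sketch, under invented assumptions, of a GDPR Article 20 export:
# the data the subject "has provided" to a controller, serialized in a
# structured, commonly used, machine-readable format (JSON).
import json

user_provided_data = {
    "subject_id": "u-1001",
    "profile": {"display_name": "example_user", "email": "user@example.org"},
    "posts": [
        {"created": "2018-03-01T10:15:00Z", "text": "Hello, world"},
    ],
    # Note: data the controller has inferred or derived (e.g. profiles it
    # built) falls outside the "provided by the data subject" scope.
}

def export_portable(data):
    """Serialize the subject's provided data for portability."""
    return json.dumps(data, indent=2, sort_keys=True)

portable = export_portable(user_provided_data)
print(portable)
```

Because the export is plain JSON, any receiving controller can parse it without proprietary tooling, which is the interoperability the right is meant to foster.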
[…] approach consent, and overcome weaknesses relating to inadequate or inaccurate information.
Similar proposals have been submitted to effectively and successfully exercise the American "right to try": a right that allows terminally ill patients to request access to early-stage experimental medical products, such as drugs or experimental treatments, directly from the producer, bypassing the approval of the Food and Drug Administration 115 . Justifications for the above "right to try" laws are based on the ethical principles of autonomy, beneficence and justice 116 .
These proposals could very well constitute a prescription for the implementation of flexible solutions and the activation of mechanisms in other fields where the consent of the individual plays an important role in examining whether specific procedures are lawful. In particular, after having studied the personal data processing practices that firms conduct, several risks and dangers were detected 117 . Moreover, users' behaviors in social networks revealed that, while individuals may believe that they value their privacy, they often disclose a huge volume of personal data on a daily basis. The current negative conceptualization of privacy calls for a positive approach: introducing the concept of trust to safeguard individuals' rights and interests. This could be achieved by applying flexible fiduciary laws, in conjunction with the promising new right to data portability.
Indeed, re-shaping the current European privacy model could be a welcome proposal; it is not only a model that has been regarded as "the triumph of individualism" 118 , where consent may abrogate the unjust nature of data processing (a processing which is by definition unlawful), but it is also a model that permits consent to be given by "a single mouse-click" on terms of use, i.e. by ticking a box on a website 119 . If trust were introduced, such contradictions would probably be avoided.
In present-day societies, the protection of privacy may not relate that much, or may not relate just, to the "right to be let alone", which Warren and Brandeis 120 defined many years ago. Maybe it relates more, or relates especially, to the trust and loyalty that shall govern the relationships between firms and individuals.
Perhaps trust should be implemented in the European model of privacy, where "consent" is any "freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her" 121 , to add values of loyalty and reciprocity to the relationships between data subjects and processors. The decision-making process could then focus on individuals and their needs for transparent processing of personal data, for the benefit of the data subjects. This way, individuals would very likely share the wealth created by Big Data, while, at the same time, opaque procedures of data processing would come to an end 122 .