Firstly, one such risk is that a user's personal information, even if not directly known, could be revealed by aggregating data collected from many different sources; such aggregation could even uncover new information, thus violating the user's privacy. A person who does not wish to be recognized online can therefore still be identified through such means. Secondly, the paper details that many applications treat "information security, privacy and data protection" only as an afterthought, rather than considering them from the beginning, at the drawing board.
This compromises the application's security and the users' privacy substantially, as the capabilities of these measures would be reduced. Thirdly, with the advent of cloud computing and social networking services, the likelihood of users becoming "locked-in" to a particular TIT service provider increases, because it becomes more difficult to import and export their information to and from other service providers. The lack of data portability here also means that users do not have control over their own data. The paper mentions that there are two general principles that should be followed in the policy making of TIT. Firstly, the TIT should not "violate human identity, human integrity, human rights, privacy or individual or public liberties". Secondly, individuals should have control of all their personal information created or processed within the TIT, unless such control would violate the first principle. With regard to this, the paper outlines four options for reaching these objectives:
Privacy, data protection and information security risk management
Privacy by design and Privacy by default
Data protection legislation: harmonization, coherent application and enhanced enforcement
Standardization
Each of the above four options addresses various challenges related to TIT. The first option is not simply a technology-focused idea, as the paper states that it would also be important to look into other measures, such as legal, regulatory, procedural and organizational ones. The main idea of this option is to avoid leaving such protection measures to the end of the development process as an afterthought; instead, they should be included at the planning stage, with adherence to best practices to avoid or reduce common risks.
The second option operates on the premise that it is often not the technology itself that puts privacy and security at risk, but the way it is designed and implemented. It states that applications should not collect data beyond what is necessary for their functions, and that users should be made aware of what information would be gathered from them, as well as what said information would be used for. Users should also be informed on how to exercise their rights, and the applications should adhere to data protection principles.
The flashlight application example mentioned before was in clear violation of this, and this real-life example further reinforces the need for such an option. From the technical standpoint, the paper states that default personal data protection settings should be defined, such as in-built privacy options and mechanisms to inform and educate users on data processing, although the challenge would be to do so while operating within the limited processing power and/or memory available to the applications.
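The data-minimization and transparency ideas above can be illustrated with a minimal sketch. This is a hypothetical example, not code from the paper: the field names, the allowlist, and the flashlight-style purpose text are all illustrative assumptions about what "collecting only what is necessary" and "informing the user" might look like in practice.

```python
# Hypothetical sketch of "privacy by default" data minimization.
# The allowlist, field names, and purposes below are illustrative assumptions.

REQUIRED_FIELDS = {
    # field name -> why the application needs it (shown to the user)
    "device_orientation": "needed to aim the flashlight beam",
}

def minimize(raw_payload: dict) -> dict:
    """Keep only the fields the application actually needs for its function.

    Anything outside the allowlist (e.g. location, contacts) is dropped
    before storage or transmission, so over-collection cannot happen by default.
    """
    return {k: v for k, v in raw_payload.items() if k in REQUIRED_FIELDS}

def disclosure_notice() -> str:
    """Tell the user what is collected and why, before any processing occurs."""
    return "\n".join(
        f"We collect '{field}' because it is {purpose}."
        for field, purpose in REQUIRED_FIELDS.items()
    )

# A flashlight-style app receives a payload but keeps only what it needs.
collected = minimize({
    "device_orientation": "portrait",
    "location": "51.5, -0.1",       # not needed by a flashlight app: dropped
    "contacts": ["alice", "bob"],   # not needed: dropped
})
print(disclosure_notice())
print(collected)  # {'device_orientation': 'portrait'}
```

The design choice here is that privacy-respecting behavior is the default path: a developer would have to deliberately extend the allowlist (and the accompanying user notice) to collect more, rather than deliberately opting in to protection.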
The third option focuses on the legal aspect of data protection, such as strengthening, clarifying and harmonizing the powers of data protection authorities in order to ensure that all legislation is enforced, rather than only selected laws being followed. The paper also states that violations should carry significant sanctions to deter developers from making applications that neglect such issues. This is to ensure the transparency of applications and to give users control over their own data.
The "concept of indirectly identifiable data" also has to be improved and elucidated to avoid uncertainty in legislation. The last option, standardization, allows ease of conformity with legal requirements and of certification due to the clarity it provides; it is thus capable of educating users on how to exercise their rights and allowing them to make informed choices. One weakness of standardization is that "standards are voluntary and non-binding", and thus it might not be very effective on its own; it would require measures which are more binding.
The impact of these options is the building of trust between consumers and the applications. Trust is important in the online environment because without it, consumers are less likely to buy and use new applications, thus slowing the invention of new technologies, hampering economic growth, and causing the public sector to take longer to benefit from digitizing its services. The paper concludes that a binding law with stronger data protection enforcement is the best option for achieving the goals of TIT, ensuring that applications are trustworthy and compliant with user rights.