MARC View
LDR00000nam u2200205 4500
001000000435096
00520200227114728
008200131s2019 ||||||||||||||||| ||eng d
020 ▼a 9781392811092
035 ▼a (MiAaPQ)AAI27627861
035 ▼a (MiAaPQ)NCState_Univ18402036938
040 ▼a MiAaPQ ▼c MiAaPQ ▼d 247004
0820 ▼a 004
1001 ▼a Andow, Benjamin Eric.
24510 ▼a Privacy Risks of Sensitive User Data Exposure in Mobile Ecosystems.
260 ▼a [S.l.]: ▼b North Carolina State University, ▼c 2019.
260 1 ▼a Ann Arbor: ▼b ProQuest Dissertations & Theses, ▼c 2019.
300 ▼a 164 p.
500 ▼a Source: Dissertations Abstracts International, Volume: 81-05, Section: B.
500 ▼a Advisor: Reaves, Bradley.
5021 ▼a Thesis (Ph.D.)--North Carolina State University, 2019.
506 ▼a This item must not be sold to any third party vendors.
520 ▼a Mobile applications frequently collect and share a wide range of privacy-sensitive user data. Such data is an extremely valuable commodity for legitimate business purposes, but also for nefarious and illicit purposes. Therefore, identifying the privacy risks of exposing privacy-sensitive user data to applications has been a topic of great research interest over the past decade. However, prior works in this domain are plagued by significant limitations that result in incomplete or imprecise approximations of the privacy risks. These limitations are mainly due to an incomplete characterization of the types of data that applications request and the context-insensitivity of their privacy policy analysis techniques, or the lack thereof. In this dissertation, we characterize and identify privacy risks resulting from disclosing privacy-sensitive user data to applications, addressing many of the limitations of prior works. In particular, we analyze how privacy-sensitive user data is obtained and used, how privacy disclosures are discussed, and whether all uses of such data are disclosed. First, we characterize the space of privacy-sensitive user data sources to understand what types of data applications are requesting from users. We design and implement UiRef, an analysis framework to resolve the semantics of user input requests, and apply it to study privacy risks associated with user input requests in 50,162 Android applications from Google Play. Our analysis uncovers several concerning developer practices, including insecure exposure of account passwords and privacy violations due to non-consensual disclosures of privacy-sensitive user input to third parties. Second, we characterize the semantics of sharing and collection statements within privacy policies to understand how privacy practices are being disclosed. We demonstrate the importance of holistic analysis of privacy policies through our identification and formalization of self-contradictory policy statements. 
We design and implement PolicyLint to extract sharing and collection statements from privacy policies and identify self-contradictory policy statements. We perform a large-scale study of the privacy policies of 11,430 Android applications using PolicyLint and find that around 14.2% have potentially deceptive and ambiguous privacy policies due to self-contradictory statements. Third, we combine insights gained from our prior studies to provide a formal specification for an entity-sensitive (e.g., first-party vs. third-party) and negation-sensitive (e.g., collect vs. not collect) flow-to-policy consistency model. We design and implement POLICHECK to perform a large-scale study of 13,796 Android applications and their corresponding privacy policies and find that up to 42.4% of applications either incorrectly disclose or omit disclosing their privacy-sensitive data flows. Our characterization of the problem space, formal specifications, analysis techniques, and insights drawn from our empirical studies lay the foundation for identifying privacy-sensitive user data, precisely and soundly reasoning over privacy policies, and evaluating flow-to-policy consistency at scale. The findings from our empirical studies highlight significant privacy risks associated with exposing privacy-sensitive user data to applications and provide concrete examples of such cases that impact tens to hundreds of millions of users. Our results show the poor state of privacy disclosures for Android applications, which demonstrates the need for additional oversight and auditing by application markets and regulatory agencies. Further, our findings identify several future areas of research in assessing and preventing privacy risks, such as analyzing consent, improving the usability aspects of creating privacy policies, and introducing stronger privacy protection mechanisms.
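520 ▼a [Editorial note] The entity- and negation-sensitive consistency model described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' PolicyLint/POLICHECK implementation; the entity names, data types, and helper functions are hypothetical examples chosen to show the idea that a statement's meaning depends on who the actor is and whether collection is affirmed or negated.

```python
# Minimal sketch (hypothetical, not the dissertation's implementation) of
# entity- and negation-sensitive policy analysis: a policy statement is a
# triple (entity, collects?, data type); contradictions and flow-to-policy
# consistency are decided per entity and per negation polarity.
from typing import NamedTuple

class Statement(NamedTuple):
    entity: str      # e.g. "first_party" or "third_party"
    collects: bool   # True = "collects/shares", False = "does not collect"
    data: str        # e.g. "email address"

def contradicts(a: Statement, b: Statement) -> bool:
    """Two statements about the same entity and data type contradict
    when one asserts collection and the other denies it."""
    return a.entity == b.entity and a.data == b.data and a.collects != b.collects

def flow_consistent(policy: list[Statement], entity: str, data: str) -> bool:
    """An observed data flow is consistent with the policy only if some
    statement discloses collection by that entity and no statement denies it."""
    disclosed = any(s.collects and s.entity == entity and s.data == data for s in policy)
    denied = any(not s.collects and s.entity == entity and s.data == data for s in policy)
    return disclosed and not denied

policy = [
    Statement("first_party", True, "email address"),
    Statement("third_party", False, "email address"),
]

# Pairwise self-contradiction check over the whole policy:
pairs = [(a, b) for i, a in enumerate(policy) for b in policy[i + 1:]]
has_contradiction = any(contradicts(a, b) for a, b in pairs)

# A flow sending the email address to a third party violates this policy,
# while the equivalent first-party flow is properly disclosed.
print(flow_consistent(policy, "third_party", "email address"))  # False
print(flow_consistent(policy, "first_party", "email address"))  # True
```

Distinguishing entity (first vs. third party) and negation (collect vs. not collect) is what lets a checker flag an app that sends data to an advertiser even though the policy truthfully says "we collect your email address" about the first party only.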
590 ▼a School code: 0155.
650 4 ▼a Computer science.
690 ▼a 0984
71020 ▼a North Carolina State University.
7730 ▼t Dissertations Abstracts International ▼g 81-05B.
790 ▼a 0155
791 ▼a Ph.D.
792 ▼a 2019
793 ▼a English
85640 ▼u http://www.riss.kr/pdu/ddodLink.do?id=T15494614 ▼n KERIS ▼z The full text of this material is provided by the Korea Education and Research Information Service (KERIS).
980 ▼a 202002 ▼f 2020
990 ▼a ***1008102
991 ▼a E-BOOK