
The Use of Data Mining by Private Health Insurance Companies and Customers’ Privacy

An Ethical Analysis

Published online by Cambridge University Press: 10 June 2015

Abstract:

This article examines privacy threats arising from the use of data mining by private Australian health insurance companies. Qualitative interviews were conducted with key experts, and Australian governmental and nongovernmental websites relevant to private health insurance were searched. The themes and considerations elicited through this empirical approach were then developed, with the aid of Rationale, a critical thinking tool, into an argument about the use of data mining by private health insurance companies. The argument is followed by an ethical analysis guided by classical philosophical theories: utilitarianism, Mill’s harm principle, Kant’s deontological theory, and Helen Nissenbaum’s contextual integrity framework. Both the argument and the ethical analysis find the use of data mining by private health insurance companies in Australia to be unethical. Although private health insurance companies in Australia cannot use data mining for risk rating to cherry-pick customers and cannot use customers’ personal information for unintended purposes, this article concludes that the secondary use of customers’ personal information and the absence of customers’ consent nonetheless suggest that the practice is wrong.

Type: Special Section: Bioethics and Information Technology
Copyright: © Cambridge University Press 2015


References

Notes

1. Tavani, HT. Ethics and technology: Controversies, questions, and strategies for ethical computing. 4th ed. Hoboken, NJ: John Wiley; 2013.

2. Borna, S, Avila, S. Genetic information: Consumers’ right to privacy versus insurance companies’ right to know: A public opinion survey. Journal of Business Ethics 1999;19:355–62.

3. Danna, A, Gandy, O. All that glitters is not gold: Digging beneath the surface of data mining. Journal of Business Ethics 2002;40:373–86.

4. Angst, CM. Protect my privacy or support the common good? Ethical questions about electronic health information exchanges. Journal of Business Ethics 2009;90:169–78.

5. Raychaudhuri, K, Ray, P. Privacy challenges in the use of eHealth systems for public health management. International Journal of E-Health and Medical Communications 2010;1(2):12–23.

6. Al-Saggaf, Y, Islam, Z. A malicious use of a clustering algorithm to threaten the privacy of a social networking site user. World Journal of Computer Application and Technology 2013;1(2):29–34.

7. Al-Saggaf, Y. The mining of data retrieved from the eHealth record system should be governed. Information Age 2012 Nov/Dec:46–7.

8. Sarathy, R, Robertson, CJ. Strategic and ethical considerations in managing digital privacy. Journal of Business Ethics 2003;46(2):111–26.

9. Van Wel, L, Royakkers, L. Ethical issues in web data mining. Ethics and Information Technology 2004;6:129–40.

10. See note 5, Raychaudhuri, Ray 2010.

11. Yoo, I, Alafaireet, P, Marinov, M, Pena-Hernandez, K, Gopidi, R, Chang, J, et al. Data mining in healthcare and biomedicine: A survey of the literature. Journal of Medical Systems 2012;36:2431–48.

12. See note 4, Angst 2009.

13. See note 2, Borna, Avila 1999.

14. See note 9, van Wel, Royakkers 2004.

15. Hildebrandt, M. Who is profiling who? Invisible visibility. In: Gutwirth, S, Poullet, Y, de Hert, P, de Terwangne, C, Nouwt, S, eds. Reinventing Data Protection? Berlin: Springer; 2009:239–52.

16. Willison, DJ, Keshavjee, K, Nair, K, Goldsmith, C, Holbrook, AM. Patients’ consent preferences for research uses of information in electronic medical records: Interview and survey data. British Medical Journal 2003;326:373.

17. See note 1, Tavani 2013.

18. http://phiac.gov.au (last accessed 3 Aug 2014).

19. http://privatehealth.gov.au (last accessed 3 Aug 2014).

20. http://www.comlaw.gov.au/Details/C2012C00590 (last accessed 3 Aug 2014).

21. Twardy, C. Argument maps improve critical thinking. Teaching Philosophy 2004;27(2):95–116.

22. Rationale. Learn. Austhink; 2012; available at http://rationale.austhink.com/learn (last accessed 14 Aug 2014).

24. It is important to note that this argument draws its information from the results of the qualitative interviews, the search of the governmental and nongovernmental websites, and the literature. It is beyond the scope of this article to offer a fully fleshed-out argument.

25. This is the main contention of the argument.

26. See note 1, Tavani 2013.

27. Al-Saggaf, Y, Islam, Z. Privacy in social network sites (SNS): The threats from data mining. Ethical Space: The International Journal of Communication Ethics 2012;9(4):32–40.

28. Moor, J. Towards a theory of privacy in the information age. Computers and Society 1997;27:27–32.

29. Rachels, J. Why privacy is important. Philosophy & Public Affairs 1975;4:323–33.

30. Mill, JS. On Liberty: Annotated Text, Sources and Background Criticism. New York: W.W. Norton; 1975 [1859].

31. Sar, RK, Al-Saggaf, Y. Contextual integrity’s decision heuristic and social network sites tracking. Ethics and Information Technology 2013;15(4):1–12.

32. See note 2, Borna, Avila 1999. See note 3, Danna, Gandy 2002. See note 4, Angst 2009.