JDR Vol.6 No.5 pp. 498-505
doi: 10.20965/jdr.2011.p0498


Internal Security Issues Related to Automatic System Malfunction and a Model to Explain Foresight of Experts and Non-Experts

Soichiro Morishita* and Hiroshi Yokoi**

*Interfaculty Initiative in Information Studies, The University of Tokyo, Hongo 7-3-1 Bunkyo-ku, Tokyo 113-0033, Japan

**Graduate School of Informatics and Engineering, The University of Electro-Communications, Chofugaoka 1-5-1, Chofu-shi, Tokyo 182-8585, Japan

Received: April 5, 2011
Accepted: August 10, 2011
Published: October 1, 2011
Keywords: internal security issue, engineering ethics, automatic machines, computer anxiety

Accidents or malfunctions in automatic systems often raise the question of whether the system’s designer could have foreseen the problem. In general, the opinions of experts are given more credence than those of non-experts: if objective evidence shows that even experts could not have foreseen a malfunction, the problem is judged to have been unforeseeable. Experts can make proper decisions based on their knowledge of what an automatic machine’s capabilities actually cover; non-experts, however, may underestimate that coverage and therefore handle the system more cautiously. When a malfunction occurs that no expert could foresee, and the outcome happens by chance to agree with a non-expert’s forecast, engineers are questioned beyond reason about their “responsibility,” a trend that is particularly marked in relation to computer systems. As described in this paper, the case in which an Okazaki City Library user was arrested is an appropriate case study of this problem. From the perspectives of automatic machine design and engineering ethics, we discuss it as an internal security issue.

Cite this article as:
S. Morishita and H. Yokoi, “Internal Security Issues Related to Automatic System Malfunction and a Model to Explain Foresight of Experts and Non-Experts,” J. Disaster Res., Vol.6, No.5, pp. 498-505, 2011.
