JACIII Vol.14 No.5 pp. 475-486
doi: 10.20965/jaciii.2010.p0475


The Search for a Search: Measuring the Information Cost of Higher Level Search

William A. Dembski* and Robert J. Marks II**

*Center for Science & Culture, Discovery Institute, Seattle, WA 98104, USA

**Dept. of Electrical & Computer Engineering, Baylor University, Waco, TX 76798, USA

Received: September 19, 2009; Accepted: April 1, 2010; Published: July 20, 2010
Keywords: No Free Lunch Theorems, active information, active entropy, assisted search, endogenous information

Needle-in-the-haystack problems look for small targets in large spaces. In such cases, blind search stands no hope of success. Conservation of information dictates that any search technique will work, on average, only as well as blind search. Success requires an assisted search. But whence the assistance required for a search to be successful? To pose the question this way suggests that successful searches do not emerge spontaneously but need themselves to be discovered via a search. The question then naturally arises whether such a higher-level “search for a search” is any easier than the original search. We prove two results: (1) The Horizontal No Free Lunch Theorem, which shows that the average relative performance of searches never exceeds that of unassisted or blind search, and (2) The Vertical No Free Lunch Theorem, which shows that the difficulty of searching for a successful search increases exponentially with respect to the minimum allowable active information being sought.
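The quantities named in the abstract can be illustrated numerically. Following the definitions in reference [1], the endogenous information I_Ω = −log₂(p) measures the difficulty of a blind search with per-query success probability p, and the active information I₊ = log₂(q/p) measures how much an assisted search with success probability q improves on blind search. The sketch below is illustrative only; the space size N and the assisted probability q are hypothetical numbers, not taken from the paper:

```python
import math

def endogenous_info(p):
    """I_Omega = -log2(p): difficulty of the unassisted (blind) search."""
    return -math.log2(p)

def active_info(p, q):
    """I_+ = log2(q/p): information an assisted search adds over blind search."""
    return math.log2(q / p)

# Toy needle-in-a-haystack: one target among N = 2**16 points.
N = 2 ** 16
p = 1 / N        # blind search succeeds with probability 1/N per query

# A hypothetical assisted search that narrows the space to 256 candidates
# (via problem-specific side information) before sampling uniformly.
q = 1 / 256

print(f"endogenous information I_Omega = {endogenous_info(p):.1f} bits")  # 16.0
print(f"active information     I_+     = {active_info(p, q):.1f} bits")   # 8.0
```

On this toy example the assistance contributes 8 of the 16 bits needed; the Vertical No Free Lunch Theorem concerns the cost of finding such an assisted search in the first place.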

Cite this article as:
W. A. Dembski and R. J. Marks II, “The Search for a Search: Measuring the Information Cost of Higher Level Search,” J. Adv. Comput. Intell. Intell. Inform., Vol.14, No.5, pp. 475-486, 2010.
References:
  [1] W. A. Dembski and R. J. Marks II, “Conservation of Information in Search: Measuring the Cost of Success,” IEEE Trans. on Systems, Man and Cybernetics A, Systems and Humans, Vol.39, No.5, pp. 1051-1061, September 2009.
  [2] C. Schaffer, “A conservation law for generalization performance,” Proc. of the Eleventh Int. Conf. on Machine Learning, W. W. Cohen and H. Hirsh (Eds.), San Francisco: Morgan Kaufmann, pp. 259-265, 1994.
  [3] T. M. English, “Some information theoretic results on evolutionary optimization,” Proc. of the 1999 Congress on Evolutionary Computation (CEC 99), Vol.1, July 6-9, 1999.
  [4] D. H. Wolpert and W. G. Macready, “No free lunch theorems for optimization,” IEEE Trans. on Evolutionary Computation, Vol.1, No.1, pp. 67-82, 1997.
  [5] M. Köppen, D. H. Wolpert, and W. G. Macready, “Remarks on a recent paper on the ‘no free lunch’ theorems,” IEEE Trans. on Evolutionary Computation, Vol.5, No.3, pp. 295-296, June 2001.
  [6] Y.-C. Ho and D. L. Pepyne, “Simple explanation of the No Free Lunch Theorem,” Proc. of the 40th IEEE Conf. on Decision and Control, Orlando, Florida, 2001.
  [7] Y.-C. Ho, Q.-C. Zhao, and D. L. Pepyne, “The No Free Lunch Theorems: Complexity and Security,” IEEE Trans. on Automatic Control, Vol.48, No.5, pp. 783-793, May 2003.
  [8] W. A. Dembski, “No Free Lunch: Why Specified Complexity Cannot Be Purchased without Intelligence,” Lanham, Md.: Rowman and Littlefield, 2002.
  [9] J. C. Culberson, “On the Futility of Blind Search: An Algorithmic View of ‘No Free Lunch’,” Evolutionary Computation, Vol.6, No.2, pp. 109-127, 1998.
  [10] W. A. Dembski and R. J. Marks II, “Conservation of Information in Search: Measuring the Cost of Success,” IEEE Trans. on Systems, Man and Cybernetics A, Systems and Humans, Vol.39, No.5, pp. 1051-1061, September 2009.
  [11] W. Ewert, W. A. Dembski, and R. J. Marks II, “Evolutionary Synthesis of Nand Logic: Dissecting a Digital Organism,” Proc. of the 2009 IEEE Int. Conf. on Systems, Man, and Cybernetics, San Antonio, TX, USA, pp. 3047-3053, October 2009.
  [12] W. Ewert, G. Montañez, W. A. Dembski, and R. J. Marks II, “Efficient Per Query Information Extraction from a Hamming Oracle,” Proc. of the 42nd Southeastern Symposium on System Theory, IEEE, University of Texas at Tyler, pp. 290-297, March 7-9, 2010.
  [13] W. A. Dembski and R. J. Marks II, “Life’s Conservation Law: Why Darwinian Evolution Cannot Create Biological Information,” in B. Gordon and W. A. Dembski (Eds.), The Nature of Nature, Wilmington, Del.: ISI Books, 2010.
  [14] W. A. Dembski and R. J. Marks II, “Bernoulli’s Principle of Insufficient Reason and Conservation of Information in Computer Search,” Proc. of the 2009 IEEE Int. Conf. on Systems, Man, and Cybernetics, San Antonio, TX, USA, pp. 2647-2652, October 2009.
  [15] M. Hutter, “A Complete Theory of Everything (will be subjective),” arXiv preprint arXiv:0912.5434, Dec. 20, 2009.
  [16] B. Weinberg and E.-G. Talbi, “NFL theorem is unusable on structured classes of problems,” Congress on Evolutionary Computation (CEC 2004), Vol.1, pp. 220-226, June 19-23, 2004.
  [17] J. Bernoulli, “Ars Conjectandi (The Art of Conjecturing),” Tractatus De Seriebus Infinitis, 1713.
  [18] A. Papoulis, “Probability, Random Variables, and Stochastic Processes,” 3rd ed., New York: McGraw-Hill, 1991.
  [19] T. M. Cover and J. A. Thomas, “Elements of Information Theory,” 2nd ed., Wiley-Interscience, 2006.
  [20] N. Dinculeanu, “Vector Integration and Stochastic Integration in Banach Spaces,” New York: Wiley, 2000.
  [21] I. M. Gelfand, “Sur un Lemme de la Théorie des Espaces Linéaires,” Comm. Inst. Sci. Math. de Kharko., Vol.13, No.4, pp. 35-40, 1936.
  [22] B. J. Pettis, “On Integration in Vector Spaces,” Trans. of the American Mathematical Society, Vol.44, pp. 277-304, 1938.
  [23] W. A. Dembski, “Uniform Probability,” J. of Theoretical Probability, Vol.3, No.4, pp. 611-626, 1990.
  [24] F. Giannessi and A. Maugeri, “Variational Analysis and Applications (Nonconvex Optimization and Its Applications),” Springer, 2005.
  [25] D. L. Cohn, “Measure Theory,” Boston: Birkhäuser, 1996.
  [26] R. M. Dudley, “Probability and Metrics,” Aarhus: Aarhus University Press, 1976.
  [27] P. de la Harpe, “Topics in Geometric Group Theory,” University of Chicago Press, 2000.
  [28] P. Billingsley, “Convergence of Probability Measures,” 2nd ed., New York: Wiley, 1999.
  [29] W. Feller, “An Introduction to Probability Theory and Its Applications,” 3rd ed., Vol.1, New York: Wiley, 1968.
  [30] R. J. Marks II, “Handbook of Fourier Analysis and Its Applications,” Oxford University Press, 2009.
  [31] M. Spivak, “Calculus,” 2nd ed., Berkeley, Calif.: Publish or Perish, 1980.

