
5 - Stochastic Local Search

Published online by Cambridge University Press:  30 April 2024

Deepak Khemani, IIT Madras, Chennai

Summary

Search spaces can be huge. The number of choices faced by a search algorithm can grow exponentially. We have named this combinatorial explosion, the principal adversary of search, CombEx. In Chapter 4 we looked at one strategy to battle CombEx, the use of knowledge in the form of heuristic functions – knowledge that would point towards the goal node. Yet, for many problems, such heuristics are hard to acquire and often inadequate, and algorithms continue to demand exponential time.

In this chapter we introduce stochastic moves to add an element of randomness to search. Exploiting the gradient deterministically has its drawbacks when heuristic functions are imperfect, as they often are: the steepest gradient can lead to the nearest optimum and end there. We add a tendency towards exploration, which can pull the search away from the path to a local optimum.
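
As a concrete illustration, here is a minimal sketch in Python of one such randomised acceptance rule (not the book's formulation; the neighbours and value functions and the temperature-like parameter T are placeholders to be supplied by the caller):

    import math
    import random

    def stochastic_local_search(start, neighbours, value, steps=10_000, T=1.0):
        """Maximise value(state) by local search with a random element.

        An improving neighbour is always accepted; a worse neighbour is
        accepted with probability exp(delta / T), so the search can wander
        off a local optimum instead of stopping at the first peak it reaches.
        """
        current = best = start
        for _ in range(steps):
            candidate = random.choice(neighbours(current))   # random, not steepest
            delta = value(candidate) - value(current)
            if delta > 0 or random.random() < math.exp(delta / T):
                current = candidate
                if value(current) > value(best):
                    best = current
        return best

A small T keeps the search nearly greedy; a larger T makes it more exploratory and more likely to step away from a local optimum.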

We also look at the power of many in problem solving, as opposed to a sole crusader. Population-based methods have given a new dimension to solving optimization problems.

Douglas Hofstadter says that humans are not known to have a head for numbers (Hofstadter, 1996). For most of us, the numbers 3.2 billion and 5.3 million seem vaguely similar and big. A very popular book (Gamow, 1947) was titled One, Two, Three … Infinity. Its author, George Gamow, talks about the Hottentot tribes, who had only the numbers one, two, and three in their vocabulary, and beyond that used the word many. Bill Gates is famously reputed to have said, ‘Most people overestimate what they can do in one year and underestimate what they can do in ten years.’

So, how big is big? Why are computer scientists wary of combinatorial growth? In Table 2.1 we looked at the exponential function 2^N and the factorial N!, which are respectively the sizes of the search spaces for SAT and TSP with N variables or cities. How long will it take to inspect all the states when N = 50?

For a SAT problem with 50 variables, 2^50 = 1,125,899,906,842,624. How big is that? Let us say we can inspect a million, or 10^6, nodes a second. We would then need 1,125,899,906.8 seconds, which is about 35.7 years! Likewise, there are N! = 3.041409320 × 10^64 non-distinct tours of 50 cities (each distinct tour has 2N equivalent representations).
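
The arithmetic is easy to verify with a back-of-the-envelope check in Python (a sketch; the constant names are ours, and the million-inspections-per-second rate is the one assumed above):

    from math import factorial

    STATES_PER_SECOND = 10**6             # assumed inspection rate from the text
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    sat_states = 2**50                    # truth assignments for 50 variables
    print(f"{sat_states:,}")              # 1,125,899,906,842,624
    years = sat_states / STATES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{years:.1f}")                 # ~35.7 years

    tsp_tours = factorial(50)             # non-distinct tours of 50 cities
    print(f"{tsp_tours:.9e}")             # 3.041409320e+64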

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2024

  • Stochastic Local Search
  • Deepak Khemani, IIT Madras, Chennai
  • Book: Search Methods in Artificial Intelligence
  • Online publication: 30 April 2024
  • Chapter DOI: https://doi.org/10.1017/9781009284325.006