Tags: random search

Random search algorithms are a class of optimization methods that utilize randomness to explore the search space efficiently, particularly for complex global optimization problems that involve continuous and/or discrete variables. These algorithms are often employed when traditional methods fail due to the complexity of the problem, such as nonconvex, nondifferentiable, or discontinuous objective functions. Here are some key points about random search algorithms:

- **Iterative Process**: They incorporate a random element within their iterative procedures, which helps in navigating large and complex search spaces.
- **Categories**: These algorithms can be categorized into global (exploration) versus local (exploitation) search methods and instance-based versus model-based approaches.
- **Examples**: Notable examples include simulated annealing, genetic algorithms, particle swarm optimization, and ant colony optimization, among others.
- **Trade-offs**: They typically trade a guarantee of finding the optimal solution for quicker convergence to a good solution, offering probabilistic convergence results instead.
- **Applications**: Widely used across various fields, such as engineering, scheduling, and biological systems, random search algorithms are particularly effective in "black-box" global optimization problems where the underlying structures are unknown.
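The "iterative process" described above can be sketched as pure random search: repeatedly sample a candidate uniformly from the search space and keep the best point seen so far. This is a minimal illustrative sketch, not any specific algorithm from the bookmarked sources; the function names and bounds are assumptions for the example.

```python
import random

def random_search(objective, bounds, n_iters=1000, seed=0):
    """Pure random search: sample uniformly within bounds, keep the best point.

    bounds is a list of (low, high) pairs, one per dimension.
    """
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(n_iters):
        # Draw a candidate uniformly at random from the box-constrained space.
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        val = objective(x)
        if val < best_val:  # keep the best sample seen so far
            best_x, best_val = x, val
    return best_x, best_val

# Example: minimize a simple objective on [-5, 5]^2 (works even if the
# objective is nondifferentiable or discontinuous, since only function
# values are used).
sphere_like = lambda x: sum(xi * xi for xi in x)
best_x, best_val = random_search(sphere_like, [(-5, 5), (-5, 5)])
```

Because the method only evaluates the objective at sampled points, it applies directly to the "black-box" setting mentioned above, where gradients and problem structure are unavailable.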


  1. The paper describes a method for discovering stable structures of materials using first-principles electronic structure methods, specifically density functional theory (DFT). The approach, called ab initio random structure searching (AIRSS), is applied to find structures of solids, point defects, surfaces, and clusters, with new results for iron clusters on graphene, silicon clusters, polymeric nitrogen, hydrogen-rich lithium hydrides, and boron.

  2. This article by Zelda B. Zabinsky provides an overview of random search algorithms, which are particularly useful for tackling complex global optimization problems with either continuous or discrete variables. These algorithms, including simulated annealing, genetic algorithms, and particle swarm optimization, leverage randomness or probability in their iterative processes, often falling under the category of metaheuristics. Such methods are valuable for problems characterized by nonconvex, nondifferentiable, or discontinuous objective functions, as they offer a trade-off between optimality and computational speed. Random search algorithms can be categorized by their approach to exploration versus exploitation, and their application spans various fields, including engineering, scheduling, and biological systems. They address challenges where traditional deterministic methods struggle, particularly in the absence of clear structures distinguishing local from global optima.

  3. This discussion explores the effectiveness of simulated annealing compared to random search for optimizing a set of 16 integer parameters. The author seeks to determine if simulated annealing provides a significant advantage over random search, despite the parameter space being too large for exhaustive search. Responses suggest plotting performance over time and highlight the ability of simulated annealing to escape local optima as its main strength.
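The highlighted strength of simulated annealing, escaping local optima by sometimes accepting worse moves, can be sketched for an integer parameter vector like the one in that discussion. This is a generic sketch, not the discussion author's code; the neighbor rule (nudge one parameter by ±1) and the geometric cooling schedule are assumptions for illustration.

```python
import math
import random

def simulated_annealing(objective, x0, n_iters=5000, t0=1.0, cooling=0.999, seed=0):
    """Simulated annealing over an integer parameter vector (a sketch).

    A neighbor is formed by nudging one randomly chosen parameter by +/-1.
    Worse moves are accepted with probability exp(-delta / temperature),
    which is what lets the search escape local optima; as the temperature
    cools, the search behaves more and more greedily.
    """
    rng = random.Random(seed)
    x, val = list(x0), objective(x0)
    best_x, best_val = list(x), val
    t = t0
    for _ in range(n_iters):
        y = list(x)
        i = rng.randrange(len(y))
        y[i] += rng.choice((-1, 1))  # nudge one integer parameter
        new_val = objective(y)
        delta = new_val - val
        # Always accept improvements; accept worse moves with temperature-
        # dependent probability.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, val = y, new_val
        if val < best_val:
            best_x, best_val = list(x), val
        t *= cooling  # geometric cooling schedule
    return best_x, best_val

# Toy example with 16 integer parameters, optimum at all-zeros.
obj = lambda x: sum(abs(xi) for xi in x)
best_x, best_val = simulated_annealing(obj, [10] * 16)
```

Plotting `best_val` against iteration count, as the responses in that discussion suggest, is the usual way to compare this against plain random search on the same budget.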


SemanticScuttle - klotz.me: tagged with "random search"
