computational complexity



computational complexity

(algorithm)
The number of steps or arithmetic operations required to solve a computational problem. One of the three kinds of complexity.
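The "number of steps" in this definition can be made concrete by instrumenting an algorithm to count its basic operations. Below is a minimal sketch (the function names and step-counting convention are illustrative, not from the definition) comparing a linear scan, whose comparison count grows linearly with input size, against binary search, whose count grows logarithmically:

```python
# Count the basic operations (comparisons) each search performs,
# illustrating O(n) vs O(log n) computational complexity.

def linear_search_steps(items, target):
    """Return (found, steps), where steps is the comparison count."""
    steps = 0
    for x in items:
        steps += 1
        if x == target:
            return True, steps
    return False, steps

def binary_search_steps(items, target):
    """Binary search over a sorted list, counting comparisons."""
    steps = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return True, steps
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, steps

data = list(range(1024))
print(linear_search_steps(data, 1023))  # roughly n comparisons
print(binary_search_steps(data, 1023))  # roughly log2(n) comparisons
```

For 1024 sorted elements, the linear scan needs on the order of a thousand comparisons in the worst case, while binary search needs about ten, which is exactly the kind of difference complexity analysis quantifies.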
References in periodicals archive
The integrated HPCL software will be an alternative software system to reduce the computational complexity of large sparse problems and to optimize visualization quality when predicting, visualizing, and observing parameter characteristics of the unique rubber-nanocomposite combination, saving resources, time, and space during the injection-molding process and the manufacture of assembled products.
An overall classification of the various techniques intended to reduce computational complexity is presented, followed by brief descriptions of approaches that handle scalability problems in ontology matching, methods that use machine learning, techniques that use the Semantic Web to determine matches, and techniques that reduce the search space by various other means.
In this section, we analyze the performance of the proposed assessment scheme in terms of computational complexity, time complexity, and applicability.
The issue of computational complexity is addressed in the following subsection.
However, the key relationship used extensively in the computational complexity analysis here is the following chain of inequalities (which, for our Algorithm 1, holds when α = 1 and β = 16):
Table 1: Computational complexity comparison between the PTS, SLM, and proposed schemes.
East Bay) present material they developed for a graduate course on computational complexity.
The report has been published on the website of the Electronic Colloquium on Computational Complexity.
Among his topics are search and computational complexity, structural approaches leading to natural language understanding and related topics, representing and manipulating uncertainty, neural networks as biologically inspired computing, and genetic algorithms and other evolutionary computing concepts.
The computational complexity of breaking PGP Disk encryption means that algorithms relying exclusively on computers' CPUs are effective only when executed concurrently on numerous workstations.
Though grid-cell methods have reached a high degree of maturity, their computational complexity is very high, and they suffer massively from the curse of dimensionality.
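The "curse of dimensionality" mentioned above can be illustrated with a short sketch (not taken from the cited work): if a grid-cell method discretizes each axis into k cells, the total number of cells grows as k^d, exponentially in the dimension d.

```python
# Illustrate why grid-cell methods scale poorly with dimension:
# with k cells per axis, a d-dimensional grid holds k**d cells.

def grid_cell_count(cells_per_axis, dimension):
    """Total number of cells in a regular d-dimensional grid."""
    return cells_per_axis ** dimension

for d in (1, 2, 3, 6, 10):
    print(f"d={d:2d}: {grid_cell_count(10, d)} cells")
```

With just 10 cells per axis, a 10-dimensional grid already requires ten billion cells, which is why such methods become computationally intractable as dimension grows.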
