Examples of using Average-case in English and their translations into Portuguese
The average-case analogue to NP-completeness is distNP-completeness.
Generic case complexity is similar to average-case complexity.
Average-case complexity: this is the complexity of solving the problem on average.
There are three primary motivations for studying average-case complexity.
A comparison sort must have an average-case lower bound of Ω(n log n) comparison operations.
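The Ω(n log n) bound in this example follows from the standard decision-tree argument; a minimal sketch (standard textbook reasoning, not taken from this page):

```latex
% A comparison sort on n distinct keys must distinguish all n! input
% permutations, so its decision tree has at least n! leaves. The average
% number of comparisons equals the average leaf depth, and for any binary
% tree with n! leaves the average depth is at least \log_2(n!):
\bar{D} \;\ge\; \log_2(n!) \;=\; \sum_{k=1}^{n}\log_2 k \;\ge\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \Omega(n\log n)
```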
Many algorithms with bad worst-case performance have good average-case performance.
The aim of this project is to study average-case lower bounds in restricted classes of Boolean circuits, such as monotone….
Such data-dependent algorithms are analysed for average-case and worst-case data.
The average-case performance of algorithms has been studied since modern notions of computational efficiency were developed in the 1950s.
Also, P ≠ NP still leaves open the average-case complexity of hard problems in NP.
A distributional problem (L', D') is distNP-complete if (L', D') is in distNP and for every (L, D) in distNP, (L, D) is average-case reducible to (L', D').
Advantages include: Comparable performance: average-case performance is as efficient as that of other trees.
Third, average-case complexity allows discriminating the most efficient algorithm in practice among algorithms of equivalent best-case complexity (for instance, Quicksort).
This remains the most efficient algorithm known for solving the problem, and for certain distributions of inputs its average-case complexity is even better.
This popular sorting algorithm has an average-case performance of O(n log(n)), which contributes to making it a very fast algorithm in practice.
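The O(n log n) average-case claim about Quicksort can be illustrated with a short sketch; the randomized-pivot variant below is a minimal illustrative implementation, not taken from this page:

```python
import random

def quicksort(a):
    """Quicksort with a random pivot.

    With a uniformly random pivot, the expected number of comparisons
    is O(n log n) on any input, which is the average-case behavior the
    example above refers to.
    """
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    # Partition into elements less than, equal to, and greater than the pivot.
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([5, 3, 8, 1, 9, 2, 7]))  # [1, 2, 3, 5, 7, 8, 9]
```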
This remains the most efficient algorithm known for solving the problem, and for certain distributions of inputs its average-case complexity is even better, O(n log log n).
For most problems, average-case complexity analysis is undertaken to find efficient algorithms for a problem that is considered difficult in the worst case.
Thus, it is desirable to study the properties of these algorithms where the average-case complexity may differ from the worst-case complexity and find methods to relate the two.
Now, both average-case analysis and benchmarks are useful in certain settings, but for them to make sense, you really have to have domain knowledge about your problem.
The fact that all of cryptography is predicated on the existence of average-case intractable problems in NP is one of the primary motivations for studying average-case complexity.
Average-case analysis requires a notion of an "average" input to an algorithm, which leads to the problem of devising a probability distribution over inputs.
In some cases (e.g., large prime-order subgroups of the groups (Zp)×) there is not only no efficient algorithm known for the worst case, but the average-case complexity can be shown to be about as hard as the worst case using random self-reducibility.
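The random self-reduction mentioned here can be sketched for the discrete logarithm: re-randomize a worst-case instance into a uniformly random one, solve that, then undo the shift. A toy sketch (the group parameters and the brute-force stand-in for an "average-case solver" are illustrative assumptions, not from this page):

```python
import random

# Toy group: the order-11 subgroup of (Z/23)* generated by g = 4.
P, Q, G = 23, 11, 4  # modulus, subgroup order, generator

def avg_case_dlog(y):
    # Stand-in "average-case" solver: brute force over the subgroup.
    # In the real argument this would be an algorithm that only works
    # well on uniformly random instances.
    for x in range(Q):
        if pow(G, x, P) == y:
            return x

def worst_case_dlog(y):
    # Random self-reduction: multiply by g^r for random r so the new
    # instance is uniformly distributed in the subgroup, solve it,
    # then subtract r to recover the original discrete log.
    r = random.randrange(Q)
    y_rand = (y * pow(G, r, P)) % P
    return (avg_case_dlog(y_rand) - r) % Q

print(worst_case_dlog(pow(G, 7, P)))  # 7
```

Because y_rand is uniform regardless of the input y, an algorithm that succeeds on average instances solves every instance, which is the sense in which the average case is "about as hard as" the worst case.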
Second, average-case complexity analysis provides tools and techniques to generate hard instances of problems which can be utilized in areas such as cryptography and derandomization.
The project has as its starting point works on monotone complexity developed by Razborov, Alon and Boppana, among others, as well as average-case results for monotone circuits developed by Rossman for distributions of random graphs.
The number of steps that the algorithm performs can be much smaller than, so on average (for constant) its performance is or even, depending on the random distribution on automata chosen to model the algorithm's average-case behavior.
As mentioned above, much early work relating to average-case complexity focused on problems for which polynomial-time algorithms already existed, such as sorting.
Development and choice of algorithms is rarely based on best-case performance: most academic and commercial enterprises are more interested in improving average-case complexity and worst-case performance.
In 1973, Donald Knuth published Volume 3 of The Art of Computer Programming, which extensively surveys average-case performance of algorithms for problems solvable in worst-case polynomial time, such as sorting and median-finding.
The fundamental notions of average-case complexity were developed by Leonid Levin in 1986 when he published a one-page paper defining average-case complexity and completeness while giving an example of a complete problem for distNP, the average-case analogue of NP.
First, although some problems may be intractable in the worst case, the inputs which elicit this behavior may rarely occur in practice, so the average-case complexity may be a more accurate measure of an algorithm's performance.
