Everywhere I look (on SO and other sources), amortized analysis seems to be applied only to data structures, for example dynamic arrays or splay trees. I have not seen amortized analysis applied to pure algorithms. Does it make sense to talk about amortized analysis of algorithms? Amortized analysis assumes a series of operations, which is true for data structures but not for algorithms.
CodePudding user response:
It only makes sense to amortise the costs of a series of operations if there is some mutable state which allows computations done in one operation to affect the performance of computations in another operation. If the performance of a series of operations can be analysed independently as the sum of the performances of the individual operations, each of which does not depend on the other operations performed, then no amortisation is being done.
So in that sense, amortised analysis only applies to data structures, not to pure algorithms. However, this assumes a very broad definition of "data structure" which encompasses literally any mutable state. For example, it is debatable whether the local variables of a coroutine, which persist while the coroutine's execution is paused, should be considered a data structure.
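To make that concrete, here is a minimal Python sketch of the classic dynamic-array argument (my own illustration, not part of the answer): the mutable size/capacity state is exactly what lets one append affect the cost of later appends, and amortisation is what turns the occasional O(n) copy into O(1) per append over the whole series.

    class DynamicArray:
        """Toy dynamic array: doubling capacity makes append amortised O(1)."""

        def __init__(self):
            self._capacity = 1
            self._size = 0
            self._data = [None] * self._capacity

        def append(self, value):
            if self._size == self._capacity:
                # Expensive step: copy every element into a larger store.
                # Doubling means this happens only after ~capacity cheap
                # appends, so n appends cost O(n) in total.
                self._capacity *= 2
                new_data = [None] * self._capacity
                new_data[:self._size] = self._data
                self._data = new_data
            self._data[self._size] = value   # cheap step: O(1)
            self._size += 1

The point of the sketch is that the cost of any single append depends on the state left behind by the previous appends, which is why it only makes sense to account for the whole series at once.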
Likewise, we normally wouldn't consider a function decorated with an LRU cache to "be" a data structure; rather, we would say it "uses" a data structure for the cache. When analysing the performance of a series of invocations (some of which hit the cache, others of which miss), we wouldn't say that we're analysing the data structure the cache uses; we'd say we're analysing a memoised algorithm.
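For illustration, a small sketch of that situation using Python's functools.lru_cache (the standard-library decorator); the interesting bound is on the whole series of calls, not on any single call.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def fib(n):
        # A single call may recurse (cache miss) or return immediately
        # (cache hit); what we bound is the total work over a series of
        # calls, which is an amortised statement about the memoised
        # algorithm, even though the cache itself is just a dictionary.
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    # The 100 calls below do O(100) total work, even though a cold call
    # such as fib(99) would look much more expensive taken on its own.
    values = [fib(i) for i in range(100)]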
CodePudding user response:
In many string algorithms you use amortisation to analyse the running time. Building a border array or a suffix tree in linear time, for example, or traversing a simulated suffix tree from a suffix array.
Suffix trees and suffix arrays are data structures, of course, but you don’t use amortisation in the usual sense of adding up the cost of a series of modifications; you use it when constructing them or for algorithms traversing them. You will have some expensive and some cheap steps, and amortisation makes it easier to count up the total running time.
Generally, you use it in algorithms where it is hard to bound the time used in each single step, because it can depend on the data, but where various amortisation arguments let you bound the total. It comes up a lot in string algorithms, which is what I am most familiar with, but it pops up all over the place.
Of course, it only makes sense when you have to do a series of operations from some well-defined starting point (so you don’t start with expensive operations and then stop). But there are many more applications than modifying data structures.
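As one concrete instance of the linear-time border array construction mentioned above, here is a sketch in Python (my own illustration, not the answerer's code). A single step may shrink the current border several times, but the border grows by at most one per character, so the total number of shrinks over the whole run is at most n, which bounds the total running time by O(n).

    def border_array(s):
        """ba[i] is the length of the longest proper border of s[:i + 1].

        The inner while loop can run many times in one iteration, but b
        increases by at most one per character, so the total number of
        decreases is at most len(s): a textbook amortisation argument.
        """
        n = len(s)
        ba = [0] * n
        b = 0                      # length of the current border
        for i in range(1, n):
            while b > 0 and s[i] != s[b]:
                b = ba[b - 1]      # expensive step, paid for by earlier cheap steps
            if s[i] == s[b]:
                b += 1
            ba[i] = b
        return ba

    # Example: border_array("abacaba") == [0, 0, 1, 0, 1, 2, 3]

Here there is no series of user-visible operations on a data structure at all; the amortisation is purely an accounting device inside the analysis of one algorithm.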