Last time I talked about the difference between simulations and models for dps computation. Today I am going to talk about gear optimization and its associated challenges.
Gear optimization is interesting because it belongs to a class of problems called NP-Complete: problems that we do not yet know how to solve in a reasonable time* and that may simply not be solvable in a reasonable time. What makes these problems hard is that we have no known way to guarantee an optimal solution without checking all possible solutions.
To demonstrate this, consider just the question of reforging and leave gems and enchants alone. For a given (non-trinket) item there are 6 possible reforges: each of the item's 2 stats can be reforged to one of 3 other stats. Setting aside trinkets, there are 14 pieces in a gear set. That means there are 6^14 possible combinations, or about 78 billion possibilities. If you could check 1 million possibilities per second it would take nearly 22 hours to check every combination.
With some care we can shrink this space somewhat by applying some logic; for instance, we should never reforge into a lower-EP stat unless it is a capped stat. In the best case (every item carrying our two most valuable non-capped stats, haste and mastery for assassination or combat) this lowers our total problem space to 4^14, which would only take about four and a half minutes. That isn't too bad.
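The arithmetic above is easy to check directly. This quick sketch computes both search-space sizes and the time an exhaustive check would take at the assumed rate of 1 million checks per second:

```python
# Size of the brute-force reforge search space, and how long an
# exhaustive check would take at an assumed 1 million checks/second.

ITEMS = 14              # gear slots, ignoring trinkets
FULL = 6 ** ITEMS       # 6 possible reforges per item
PRUNED = 4 ** ITEMS     # best case after discarding obviously bad reforges

RATE = 1_000_000        # assumed checks per second

print(f"full space:   {FULL:,} combinations, {FULL / RATE / 3600:.1f} hours")
print(f"pruned space: {PRUNED:,} combinations, {PRUNED / RATE / 60:.1f} minutes")
```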
Realistically we're probably looking at around 5 to 5.2 possible reforges per item, and 1 million checks per second is probably a bit optimistic for most people's home computers, so we're probably looking at a couple of hours per exhaustive search. That might not sound catastrophic until you realize that if you want to compare a variety of gear configurations, you need to repeat that search for each one.
You may be wondering: if this takes so long to solve, how do the various optimizers work as fast as they do? The answer is that they don't attempt to give you an optimal solution. Problems in the class NP-Complete have been a thorn in the side of computer scientists for a long time, and a large set of useful questions, such as finding the best route through many stops (the traveling salesman problem), fall into this class, so there are lots of approximate solutions. None of these solutions can guarantee optimality, but they can get "good enough." The folks at AMR claim that their optimization algorithm (which considers more than just reforges) gets within 0.1% of optimal.
Surveying all the various approaches to approximate optimization is well beyond the scope of this post, but as an example, one approach would be a greedy, cap-based solution. In this approach you would start by greedily attempting to reach each capped stat's cap, in order from highest to lowest value. Once the capped stats were handled, you'd simply reforge the remaining items from their lowest-value stat to their highest-value stat. This approach probably wouldn't be that good, and would likely result in some pretty substantial over- or under-capping of stats, but it should be a decent example of how an approximate algorithm functions.
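As a rough illustration of that greedy idea, here is a toy sketch. Every name and number in it (the EP values, the caps, the 40% reforge fraction) is invented for the example and isn't taken from any real optimizer; it just shows the shape of a "caps first, then best EP" pass over the gear:

```python
# Toy greedy reforger: prefer filling stat caps first, then convert each
# item's lowest-EP stat into the highest-EP stat not already on the item.
# All names and values are illustrative, not from any real optimizer.

EP = {"hit": 2.0, "expertise": 1.8, "haste": 1.6, "mastery": 1.4, "crit": 1.0}
CAPS = {"hit": 340, "expertise": 340}   # assumed rating caps
REFORGE_FRACTION = 0.4                  # assume 40% of a stat may be reforged

def greedy_reforge(items):
    """items: list of dicts mapping stat name -> rating on that item."""
    totals = {s: sum(it.get(s, 0) for it in items) for s in EP}
    plan = {}
    for i, item in enumerate(items):
        # The reforge source is the lowest-EP stat on the item.
        src = min(item, key=EP.get)
        # Prefer an under-cap capped stat as the target; otherwise
        # just take the highest-EP stat the item doesn't already have.
        targets = [s for s in EP if s not in item]
        need_cap = [s for s in targets if s in CAPS and totals[s] < CAPS[s]]
        dst = max(need_cap or targets, key=EP.get)
        amount = int(item[src] * REFORGE_FRACTION)
        totals[src] -= amount
        totals[dst] += amount
        plan[i] = (src, dst, amount)
    return plan, totals
```

Note the weakness the post describes: this happily keeps pouring points into hit until the running total crosses the cap, so it can overshoot, and it never revisits earlier decisions.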
So optimization is non-trivial, but it gets harder still. As pretty much everyone who has used ShadowCraft this expansion has seen, EP values do not stay fixed across the stat space of a single gear set. That is, a set of reforges may itself shift the EP values enough that the set of reforges is no longer optimal. This can cause substantial problems for static-value optimizers: when the suspected optimum lies between two points, the optimizer will ping back and forth between them.
This is a difficult problem, and one that we as a theorycrafting community are still grappling with. The obvious solution is to make the optimizer aware of these shifts. AMR recently unveiled an improved EP value system that handles caps of stats other than hit and expertise for certain classes. The system, while a clear improvement over the old one, still has some limitations. It works quite well for explicit haste caps like most casters have, but for something like the balanced haste and mastery that assassination rogues currently favor, it doesn't work so well.
Admittedly this is a hard problem, partially because it is badly under-defined. What does it mean for stats to be balanced, or more generally to conform to some arbitrary ratio? If the optimum lies exactly on the ratio, how much EP do you lose by being 1 point off? These answers are not obvious, nor are they static: changes in gear will almost certainly change how your character responds to these ratios. This gets to the heart of a broader problem with static gear optimizers. They require known relationships between current stat amounts and EP values, and those relationships may themselves be gear-specific, or at least not readily generalizable.
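To make the "EP values are not static" point concrete, here is a toy model in which dps rewards balanced haste and mastery. The dps formula is entirely invented for the illustration; the point is only that the marginal value of a stat (which is what an EP value estimates) depends on where you currently sit:

```python
# Toy illustration: when dps rewards *balanced* haste and mastery, the
# marginal (EP-style) value of each stat shifts as gear changes.
# The dps formula below is invented purely to show the effect.

def toy_dps(haste, mastery):
    # For a fixed total budget, haste * mastery is maximized when the
    # two stats are equal, i.e. it rewards balance.
    return haste * mastery

def marginal(haste, mastery, delta=1):
    # An EP-style estimate: dps gained per extra point of haste.
    return toy_dps(haste + delta, mastery) - toy_dps(haste, mastery)

# With little haste, one more point of haste is worth a lot...
print(marginal(100, 900))   # 900
# ...but once haste dominates, that same point is worth far less.
print(marginal(900, 100))   # 100
```

A static optimizer that measured haste's value at the first point and trusted it everywhere would badly misjudge reforges near the second point, which is exactly the oscillation problem described above.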
The obvious solution is an integrated calculation and optimization system like ShadowCraft; however, doing this in a performance-conscious manner is difficult. ShadowCraft uses a static optimizer partially because of performance issues. It would, in theory, be feasible to create an iterative reforger that recomputed EP values after each reforge. It would essentially do a local search for the optimum, similar, I suspect, to the systems used by other optimizers, recalculating as it went. In this approach the system would make a change, or set of changes, toward a higher-dps point, then recompute EP values, repeating until it reached a stable point.
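The iterative idea can be sketched as a simple hill-climbing loop. Here `dps()` stands in for a full model evaluation (the expensive part) and `neighbors()` for "all single-reforge changes"; both are placeholders I've invented, not ShadowCraft's actual interfaces:

```python
# Sketch of an iterative local-search optimizer: take any improving
# single change, re-evaluate the model, and repeat until no neighboring
# configuration is better. dps() and neighbors() are placeholders.

def iterative_optimize(gear, dps, neighbors, max_iters=200):
    best, best_dps = gear, dps(gear)
    for _ in range(max_iters):
        improved = False
        for candidate in neighbors(best):
            d = dps(candidate)         # expensive: one full model run
            if d > best_dps:           # take the first improving move
                best, best_dps = candidate, d
                improved = True
                break
        if not improved:               # local optimum: no neighbor is better
            break
    return best, best_dps
```

The catch, as discussed next, is the cost of that `dps()` call: every step of the loop is a full model recomputation, which is exactly what a hosted engine can least afford.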
The big issue with an iterative optimizer is performance. In ShadowCraft's case, server limitations are an issue. Since 5.2 the ShadowCraft engine has gone down more often than anyone would like, partially because of the increasing complexity of the model. Letting someone automatically run the model many times in quick succession would likely strain the server even more. I don't pretend to have a good solution to this gear optimization problem, but if I do come up with something I'll post it here.
I was planning to talk about a couple of other things, such as the difficulty of upgrade recommendations, but this is even longer than yesterday's post, so that will have to wait for another time.
*Reasonable time here means polynomial time or less. A more detailed discussion of computational complexity theory is beyond the scope of this post; the linked Wikipedia articles have more information if you are curious.