
10150:

Well, I’m not really happy with my solution for doublets. First I built up a graph for the entire dictionary using an STL map. Just building the graph exceeded the 3-second time limit, never mind trying to solve it. Switched to hash_map. Still too slow. Broke the thing out into a separate vector and hash_map/unordered_set as appropriate. Still too slow. Then I built an array of dictionaries based on word length — up to 16 separate dictionaries, using the same approach as before — and got my parse time down to 2.25 seconds. Solved the problem, with a presentation error, in 2.26 seconds. It doesn’t compile on the book judge; I’m not wasting any time looking into it, and I’m guessing it’s just compiler/library availability differences. The presentation error doesn’t bother me, beyond morbid curiosity, so I’m moving on.
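For reference, here is a minimal sketch of the length-bucketed idea described above: one dictionary per word length, then a plain BFS over the matching bucket. It recomputes adjacency during the search rather than building the full graph up front as I did, and the names `differ_by_one` and `by_len` are illustrative only, not from my actual submission.

```cpp
#include <iostream>
#include <string>
#include <vector>
#include <queue>
#include <unordered_map>

// True when two equal-length words differ in exactly one position.
static bool differ_by_one(const std::string& a, const std::string& b) {
    int diff = 0;
    for (size_t i = 0; i < a.size(); ++i)
        if (a[i] != b[i] && ++diff > 1) return false;
    return diff == 1;
}

int main() {
    // One dictionary per word length (words in 10150 are at most 16 letters).
    std::vector<std::vector<std::string>> by_len(17);
    std::string word;
    while (std::getline(std::cin, word) && !word.empty())
        by_len[word.size()].push_back(word);

    std::string from, to;
    bool first = true;
    while (std::cin >> from >> to) {
        if (!first) std::cout << "\n";
        first = false;
        if (from.size() != to.size()) { std::cout << "No solution.\n"; continue; }

        const std::vector<std::string>& dict = by_len[from.size()];
        // BFS over the bucket, remembering each word's predecessor.
        std::unordered_map<std::string, std::string> parent;
        std::queue<std::string> q;
        parent[from] = from;
        q.push(from);
        while (!q.empty() && !parent.count(to)) {
            std::string cur = q.front(); q.pop();
            for (const std::string& next : dict)
                if (!parent.count(next) && differ_by_one(cur, next)) {
                    parent[next] = cur;
                    q.push(next);
                }
        }
        if (!parent.count(to)) { std::cout << "No solution.\n"; continue; }
        // Walk the parent chain back from the target and print it in order.
        std::vector<std::string> chain;
        for (std::string w = to; w != from; w = parent[w]) chain.push_back(w);
        chain.push_back(from);
        for (auto it = chain.rbegin(); it != chain.rend(); ++it)
            std::cout << *it << "\n";
    }
    return 0;
}
```

The pairwise `differ_by_one` scan is quadratic in the bucket size, which is exactly the kind of thing that blows the time limit on the full dictionary — hence the precomputed graph and the per-length split in my actual attempt.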