I have no real need to improve it, it's just for fun. Right now it's taking about a second on a list of about 200K words.
I've tried to optimize it as much as I know how.
You can convert more of the list comprehensions into generator expressions:
all( [word_count[letter] <= rack_count[letter] for letter in word] )
...
sum([score[c] for c in word])
to
all( word_count[letter] <= rack_count[letter] for letter in word )
...
sum( score[c] for c in word )
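
For context, here is a minimal sketch of how the two helpers could look with the generator expressions in place. The Counter-based count_letters and the placeholder score values are assumptions for illustration; the real definitions are in the original code.

from collections import Counter

# Assumed stand-ins for names used in the original code
score = {chr(ord('a') + i): 1 for i in range(26)}  # letter -> points, placeholder values

def count_letters(s):
    # Counter returns 0 for missing letters, which spellable relies on
    return Counter(s)

def spellable(word, rack):
    word_count = count_letters(word)
    rack_count = count_letters(rack)
    # all() consumes the generator lazily and stops at the first letter
    # the rack cannot cover, so no intermediate list is built
    return all(word_count[letter] <= rack_count[letter] for letter in word)

def score_word(word):
    # sum() pulls one value at a time from the generator
    return sum(score[c] for c in word)

The saving per call is small, but over a list of roughly 200K words the avoided list allocations add up.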
In the loop, instead of creating the rack set on every iteration, you can create it once in advance, and it can be a frozenset.
rack_set = frozenset(rack)
scored = ((score_word(word), word) for word in words if set(word).issubset(rack_set) and len(word) > 1 and spellable(word, rack))
The same can be done with the rack_count dictionary. It doesn't need to be created on every iteration.
rack_count = count_letters(rack)
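
Putting the pieces together, the hoisted values and the filtering generator might end up looking something like this. It is a sketch that reuses the helpers above and assumes spellable is changed to take the precomputed rack_count so the rack is only counted once; the words and rack values below are placeholders standing in for the real data from the original code.

def spellable(word, rack_count):
    word_count = count_letters(word)
    return all(word_count[letter] <= rack_count[letter] for letter in word)

words = ["cat", "act", "a", "cart"]  # placeholder; the real list has ~200K words
rack = "tacky"                       # placeholder rack

# Both of these depend only on the rack, so they move out of the loop
rack_set = frozenset(rack)
rack_count = count_letters(rack)

scored = (
    (score_word(word), word)
    for word in words
    if set(word).issubset(rack_set)
    and len(word) > 1
    and spellable(word, rack_count)
)

print(list(scored))  # consume the generator, e.g. [(3, 'cat'), (3, 'act')]

A frozenset also signals that the rack letters never change while the generator runs, and the cheap issubset check weeds out most words before the full letter-count comparison in spellable.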