I have a list of pairs:
[0, 1], [0, 4], [1, 0], [1, 4], [4, 0], [4, 1]
and I want to remove any duplicates where
[a, b] == [b, a]
TL;DR:
set(map(frozenset, lst))
If the pairs are logically unordered, they're more naturally expressed as sets. It would be better to have them as sets before you even get to this point, but you can convert them like this:
lst = [[0, 1], [0, 4], [1, 0], [1, 4], [4, 0], [4, 1]]
lst_as_sets = map(frozenset, lst)
And then the natural way of eliminating duplicates in an iterable is to convert it to a set:
deduped = set(lst_as_sets)
(This is the main reason I chose frozenset in the first step. Mutable sets are not hashable, so they can't be added to a set.)
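Putting those two steps together with the example list from the question:

```python
lst = [[0, 1], [0, 4], [1, 0], [1, 4], [4, 0], [4, 1]]
lst_as_sets = map(frozenset, lst)  # lazy iterator of frozensets
deduped = set(lst_as_sets)         # building a set discards duplicates

# deduped == {frozenset({0, 1}), frozenset({0, 4}), frozenset({1, 4})}
```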
Or you can do it in a single line like in the TL;DR section.
I think this is much simpler, more intuitive, and more closely matches how you think about the data than fussing with sorting and tuples.
If for some reason you really need a list of lists as the final result, converting back is trivial:
result_list = list(map(list, deduped))
But it's probably more logical to leave it all as sets as long as possible. I can only think of one reason that you might need this, and that's compatibility with existing code/libraries.
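One concrete payoff of staying with sets: order-insensitive membership tests come for free, with no sorting needed.

```python
deduped = {frozenset({0, 1}), frozenset({0, 4}), frozenset({1, 4})}

print(frozenset([1, 0]) in deduped)  # True -- [1, 0] and [0, 1] are the same set
print(frozenset([0, 2]) in deduped)  # False
```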