I have some JavaScript that detects whether the page is loaded inside frames or not. I used top.frames[] etc. and everything works fine.
In this script I noticed window.self being used. What is the difference between window and window.self?
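For reference, here is a simplified sketch of the kind of frame check I mean (not my exact code; the function name is just for illustration):

// Returns true if this page is loaded inside a frame or iframe.
// If the current window is not the topmost window, the page is framed.
function isFramed() {
  return window.self !== window.top;
}

if (isFramed()) {
  console.log("Running inside a frame");
}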
Here's the explanation and example from the MDN page for window.self:
if (window.parent.frames[0] != window.self) {
// this window is not the first frame in the list
}
window.self is almost always used in comparisons like the one above, which checks whether the current window is the first subframe in the parent frameset.
Given that hardly anyone uses framesets these days, I think it's fair to say there are no useful cases left for self in this role. Also, at least in Firefox, comparing against window instead of window.self is equivalent, since window.self === window.
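As a quick illustration of that last point (a minimal sketch, not taken from MDN):

// In a normal browsing context, window.self and window refer to the same object,
// so the identity check below logs true
console.log(window.self === window); // true

// The MDN example rewritten against window instead of window.self;
// it behaves identically
if (window.parent.frames[0] != window) {
  // this window is not the first frame in the list
}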