:hover on all elements? really?).
I was very disappointed that the purported DOM improvements were limited to making attributes work correctly, making document.getElementById() work correctly, and adding the contentDocument property to iframes. You’ll excuse me if I can’t get overly excited about those. There’s still no DOM Level 2 support and we’re stuck with the IE event model. Would it really be that hard to create a facade for DOM Level 2 events, even if you didn’t support event capturing yet? Those of us in software engineering do this sort of thing all the time.
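Such a facade doesn’t have to be complicated. Here’s a minimal sketch of what I mean — addListener() is a name I made up for illustration, while addEventListener() and attachEvent() are the actual browser APIs — with capturing simply left unsupported:

```javascript
// Minimal sketch of a DOM Level 2 facade over the IE event model.
// addListener() is a hypothetical helper; addEventListener() and
// attachEvent() are the real browser methods. The capture phase is
// simply not supported, which is all I'm asking for at this point.
function addListener(element, type, handler) {
    if (element.addEventListener) {
        // DOM Level 2 browsers
        element.addEventListener(type, handler, false);
    } else if (element.attachEvent) {
        // IE's proprietary model; note the "on" prefix
        element.attachEvent("on" + type, handler);
    }
}
```

A dozen lines per method, and library authors have been shipping exactly this for years.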
My initial reaction to the XDomainRequest object was, “what the hell?” I didn’t understand why this couldn’t simply have been XMLHttpRequest, which could recognize “http://” at the beginning of a URL as a cross-domain request (similar to Firefox 3’s implementation). But the more I thought about it, the more I grew to like Microsoft’s approach. Same- and cross-domain requests are very different in terms of the information transmitted (cookies or no?) and their dependencies (higher risk of failure). Given how important cross-domain access restrictions are, it seems like a good idea to make this a separate object altogether, so there is no way to trick one object into thinking it should send cookies when it shouldn’t;
XDomainRequest doesn’t have the capability to send cookies so you’ll never need to worry about it. Overall, I’ve come to believe that making cross-domain requests explicit on the client-side is a very good thing. It forces you, as the developer, to really understand that you’re doing something different and it minimizes the opportunities for mistakes by the browser.
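You can see the benefit right in the calling code: something has to explicitly choose between the two objects, so the cross-domain case can never be stumbled into by accident. A rough sketch of that decision (createRequest() is a hypothetical factory of my own; XDomainRequest and XMLHttpRequest are the real objects):

```javascript
// Hypothetical factory illustrating the explicit split. In IE8,
// XDomainRequest handles cross-domain requests and never sends
// cookies; XMLHttpRequest remains the same-domain workhorse.
// Passing in "global" just keeps the sketch self-contained.
function createRequest(global, crossDomain) {
    if (crossDomain && typeof global.XDomainRequest !== "undefined") {
        return new global.XDomainRequest();  // no cookies, ever
    }
    return new global.XMLHttpRequest();      // cookies sent as usual
}
```

The developer has to opt in to the cookieless object by name, which is exactly the kind of explicitness I’ve come to appreciate here.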
Oh, and in case anyone didn’t notice, Microsoft has released a timebombed IE8 Virtual PC image so you can try out IE8 without worrying about destroying your machine. I still have memories of installing IE4 and then spending the next two days trying to breathe life back into my computer.