FrameLoader keeps a hash table of every URL loaded by the frame so it doesn't notify the client about the same URL more than once. There are two problems with this.
The first is that a malicious site can repeatedly load long, invalid URLs (valid ones work just as well). These are all kept in the hash table for the life of the frame. I can make Safari consume about 1 MB of memory per second by having an image's onload handler set its src to a random string ~1000 bytes long, which triggers the handler again. That said, there are many ways to waste memory in a browser, so this isn't a big deal on its own.
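The loop described above can be sketched as follows. This is an illustration, not the exact repro: the host name and helper are made up, and the handler is attached to both onload and onerror since an invalid URL typically fires the latter.

```javascript
// Build a random URL of roughly the requested length out of
// alphanumeric junk (hypothetical helper; host is illustrative).
function randomUrl(length) {
  let s = "https://example.invalid/?";
  while (s.length < length) {
    // "|| 'x'" guards against the rare empty chunk from toString(36)
    s += Math.random().toString(36).slice(2) || "x";
  }
  return s.slice(0, length);
}

// Browser-only part: every load attempt (success or failure) kicks
// off another one, and each ~1000-char URL stays in the frame's
// hash table for the life of the frame. Guarded so the sketch can
// also run outside a browser.
if (typeof Image !== "undefined") {
  const img = new Image();
  const bump = () => { img.src = randomUrl(1000); };
  img.onload = bump;   // the trigger described in the report
  img.onerror = bump;  // invalid URLs usually fire onerror instead
  bump();
}
```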
The more practical problem arises when a legitimate application loads a lot of URLs. I attached to the process in a debugger after using Gmail all day: the hash table contained 4255 URLs. Gmail often loads random-looking URLs, and every one of them is kept in the hash table until I close the tab.
I looked at the lengths of the loaded URLs. Ad URLs are ~296 chars, inbox URLs are ~112 chars, and there was a batch around ~150 chars that I couldn't identify. If we take 150 as the average and multiply by 2 (the hash table stores 16-bit strings), then 4255 × 150 × 2 means my hash table is holding about 1.276 MB of string data alone!
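The arithmetic above, spelled out (the figures are from the report; the two-bytes-per-character assumption is the 16-bit string representation mentioned there):

```javascript
// Back-of-the-envelope check of the memory estimate.
const urlCount = 4255;   // URLs found in the hash table
const avgChars = 150;    // rough average URL length, in characters
const bytesPerChar = 2;  // 16-bit (UTF-16) string storage
const totalBytes = urlCount * avgChars * bytesPerChar;
console.log(totalBytes + " bytes just for the string data");
```

Note this counts only the string payload; per-entry hash table and string-object overhead would push the real number higher.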
It seems like this could be an easy memory win, either by tightening the conditions under which these URLs are saved, or by removing the table altogether and notifying the client application about every load (I'm not actually sure why this hash table is necessary).