Yeah, it's a bit silly, really.
That said, I recently set up an old 500MHz machine with 128MB RAM running a small Linux variant called Puppy, for a friend's kids. It surfs the internet, handles OpenOffice, and does just about everything they need a computer for - on completely free, ancient hardware. Makes you wonder how much money and how many resources we waste just to keep performance up on ever-ballooning applications (bloated, most of the time, by features we'd never need anyhow).
The most insane thing at the moment is that a typical laptop CPU alone draws 35W while the GPU draws around 14W. If you think about it, a 5W CPU would handle the tasks you actually need, and putting that 35W budget into the GPU instead would let your lappy play some awesome games. As another example, I have a cheap old 45W AMD dual core paired with a decent HD5770, and I can run every new game maxed out. And yeah, I've kept the explanation simple, so don't ask for too much detail or the conversation will get too long.
Most of the blame for all this insanity falls on the newer OSes (Vista and later) and on screen makers. No, really - why does everyone need a 1920x1080 screen on a 15" laptop, besides making it impossible to read a single word, when the GPU in those machines sometimes struggles to run a slightly demanding application even at 1024x768? I once had XP Pro SP2 running quite nicely on a 75MHz Pentium with 92MB RAM and a 4GB HDD.
Regarding FF7: the game was originally developed for a console (the PlayStation 1) that had a 2x CD reader, a 33.8MHz CPU, a max resolution of 640×480, 2MB of RAM, and 1MB of video RAM.
Of course, in those days console games were usually optimized for the specific system they were built for (not so much anymore, though, since the Xbox 360, PS3, and Wii came out) and took 3~5 years to develop (versus 1~3 years nowadays).
So... yeah, the PC port's requirements couldn't be worlds higher than the original system's specs.