I remember running Vista betas (which should be at least as demanding as Windows 7 RTM) on a 1.9GHz P4 Willamette with 1GB of RDRAM and ... what was that card... it was an ATI All-in-Wonder, but I can't remember if it was a 9700 or 9800... and, for a test system, it was fine. Would I have tried running a real complement of applications in 1GB of RAM? Probably not...
And I remember rescuing a P4 desktop, putting Windows 7 on it and selling it to a friend (otherwise a Mac user) who needed Windows for something in, oh, 2014 or so. I forget how much RAM. It was fine...ish.
But... as others have said, why would your friend want to do this? This isn't even that great an XP system, let alone a 7 system. If you go for something like a C2D/C2Q or Ivy Bridge, you can have a retro system that will absolutely scream at XP, be solid at 7, and even run 10 or unsupported 11 passably. And C2D systems, at least with meh motherboards, are probably still plentiful out there. Does he need a good GPU? That would limit his ability to reuse random old business desktops.