It seems the technique works by generating a similar-looking person via StyleGAN2 and morphing them to look like the person in the photo. Since StyleGAN2 is trained on a database of modern people, this technique is kind of like asking what Abraham Lincoln would look like if he showed up in the FFHQ dataset. I suspect people in the FFHQ dataset would tend to have fewer blemishes or otherwise unattractive features than people in the past, for a number of reasons (such as better modern nutrition and skin care, or spending more time indoors).
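For readers curious what that projection step looks like in practice, here is a minimal sketch of StyleGAN2-style latent projection (GAN inversion). It assumes a pretrained FFHQ generator is available as a callable `generator`; the loss, latent dimensionality, and initialization are illustrative simplifications, not the authors' actual pipeline, which also models the degradation of the old photo and uses perceptual and identity losses.

    import torch
    import torch.nn.functional as F

    def project(target, generator, latent_dim=512, steps=500, lr=0.05):
        # target: (1, 3, H, W) image tensor in [-1, 1]; generator: a pretrained
        # StyleGAN2 generator wrapped as a callable latent -> (1, 3, H, W) image.
        # Crude initialization; in practice the generator's average latent is used.
        w = torch.zeros(1, latent_dim, requires_grad=True)
        opt = torch.optim.Adam([w], lr=lr)
        for _ in range(steps):
            synth = generator(w)  # render a candidate "sibling" face
            # Compare at reduced resolution so the loss is driven by identity and
            # pose rather than the scratches, fading, and grain of the old photo.
            loss = F.mse_loss(F.avg_pool2d(synth, 8), F.avg_pool2d(target, 8))
            opt.zero_grad()
            loss.backward()
            opt.step()
        return w.detach()  # latent whose rendering approximates the subject

Once a latent is found, the "restored" portrait is essentially the generator's rendering of that latent at full resolution, which is why the outputs inherit the look of the modern FFHQ training set.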
Link to paper, examples.

Abstract

Many historical people are captured only in old, faded, black-and-white photos that have been distorted by the limitations of early cameras and the passage of time. This paper simulates traveling back in time with a modern camera to rephotograph famous subjects. Unlike conventional image restoration filters, which apply independent operations like denoising, colorization, and super-resolution, we leverage the StyleGAN2 framework to project old photos into the space of modern high-resolution photos, achieving all of these effects in a unified framework. A unique challenge with this approach is capturing the identity and pose of the photo's subject and not the many artifacts in low-quality antique photos. Our comparisons to current state-of-the-art restoration filters show significant improvements and compelling results for a variety of important historical people.
This is really cool, although I'd be really interested in seeing them take a photo with an antique camera now, complete with the noise and limited wavelength sensitivity, and then run the restoration process on that to give us an idea of how accurate this actually is. Obviously we'd be missing the aging of the photograph, but still.
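As a rough illustration of that sanity check, one could degrade a modern portrait to mimic an antique capture, run the restoration on the result, and compare against the original. The channel weights, blur radius, and grain level below are guesses meant to evoke blue-sensitive early emulsions, not measured film characteristics, and the file names are placeholders.

    import numpy as np
    from PIL import Image, ImageFilter

    def simulate_antique(path_in, path_out, blur_radius=2, grain=0.06, seed=0):
        rng = np.random.default_rng(seed)
        img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32) / 255.0
        # Early orthochromatic film saw blue/green far better than red, so weight
        # the channels accordingly instead of using standard luma coefficients.
        mono = 0.1 * img[..., 0] + 0.4 * img[..., 1] + 0.5 * img[..., 2]
        soft = Image.fromarray((mono * 255).astype(np.uint8), mode="L")
        soft = soft.filter(ImageFilter.GaussianBlur(blur_radius))  # soft period lens
        noisy = np.asarray(soft, dtype=np.float32) / 255.0
        noisy = np.clip(noisy + rng.normal(0.0, grain, noisy.shape), 0.0, 1.0)  # grain
        Image.fromarray((noisy * 255).astype(np.uint8), mode="L").save(path_out)

    # simulate_antique("modern_portrait.jpg", "simulated_antique.png")
    # Restoring simulated_antique.png and comparing the result to modern_portrait.jpg
    # would give a rough measure of reconstruction fidelity, minus the effects of
    # physical aging of the print.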
This is way cooler than the title suggests.