
Photoshop Unblur!

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Hard to believe, if the pictures in the OP are the actual examples they used. I gotta check out the videos when I get back home.
 


Enhance, enhance, enhance...... 🙄
 
I know, I know. It's too much to ask for a bunch of off-topicers to read the actual article.

No, the picture shown by the daily mail was NOT used. That is complete bullshit on their part.

What the new filter does is to analyze a given picture. Using the way the color is smeared in the image (by comparing different colors and looking for similar motions between the different colors) it generates a mapping of how the camera was moved while the shutter was opened. It then uses that motion map to squash the colors back to where they belong.

The results are phenomenal. However, they do not create detail where there was no detail before (like in the daily mail images).

This will be a tremendous help especially in dark photos with long shutter speed, and in high speed photos where flash cannot be used. The researcher did state that it requires a LOT of processor power to do this, and I can understand why.
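To make that concrete, here's a toy 1-D sketch of the last step only (plain inverse filtering with an already-known motion kernel - this is not Adobe's actual algorithm, it skips the hard part of estimating the motion map, and all the names are mine; pure-Python DFT, no libraries):

```python
import cmath

def dft(x):
    # discrete Fourier transform, naive O(n^2) version
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    # inverse DFT
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def blur(signal, kernel):
    # circular convolution: each output pixel is a smear of its neighbors,
    # as if the camera moved while the shutter was open
    n = len(signal)
    return [sum(kernel[j] * signal[(t - j) % n] for j in range(n)) for t in range(n)]

def deblur(blurred, kernel):
    # inverse filtering: convolution multiplies spectra, so dividing the
    # spectra "squashes the colors back to where they belong"
    B, K = dft(blurred), dft(kernel)
    return [abs(v) for v in idft([b / k for b, k in zip(B, K)])]  # abs(): values here are nonnegative

row = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 2.0, 0.0]      # a "row of pixels" with two bright spots
motion = [1/3, 1/3, 1/3, 0.0, 0.0, 0.0, 0.0, 0.0]   # 3-pixel horizontal smear
smeared = blur(row, motion)
restored = deblur(smeared, motion)   # recovers the original row almost exactly
```

It only works this cleanly because this toy kernel's spectrum has no zeros and there's no noise; real deblurring has to estimate the kernel first and use regularized (e.g. Wiener) division instead of the raw b / k.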
 

I can see how it can be done with motion blur, but totally agree you can't add detail where there is none.

zoom, enhance, now zoom some more, now enhance...
 
man, this would come in handy after a heavy night of drinking and just before leaving the bar....

"hold, baby - just me 'unblur you' before we leave...."
 
I don't believe this. How can the filter add so much information that is just not there in the blurred shot? It just doesn't make sense, something is fishy here.
 

Because the data is still in the picture - it comes down to putting it back together again....

not totally sold on it - but i'm interested...
 
yes, it was Dwight Schrute.

Is it just me, or is that the most annoying set-up for a conference? seems so damn...intentionally hip. I wanted to barf.

That guy in the chair making smartass comments needs to shut the fuck up. Interesting technology: if you can estimate the shake of the camera, you can probabilistically recompose the image.
 
The raw data is there - it comes down to the program that can reconstruct it...

no, you are 100% wrong.

the "raw" data is the blurry picture, if that is the original one. there is no "raw" data as far as an un-blurry picture goes, in the blurry photo if that is the source.

there can be estimates made based on algorithms (as shown with this demo) but there will be no details that will be even REMOTELY close to the sunflower picture shown in the article. that right there is 100% bullshit.

i've actually done some image processing and wrote an algorithm for a "magic wand" back in my matlab days that stuck to the border of the object in the image, so i have a tad of knowledge in this area.

this is basically just going to be a glorified sharpening filter. again, it will work better on certain types of blurry photos, but it won't give you nearly the detail that they are leading you to believe in that article with the flower picture.
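For comparison, this is roughly what a plain sharpening filter (unsharp mask) does - a toy 1-D sketch of the generic technique, not any particular product's code: it adds back the difference from a blurred copy, which boosts edge contrast but invents overshoot rather than recovering real detail.

```python
def box_blur(sig, radius=1):
    # simple moving average with clamped edges
    n = len(sig)
    return [sum(sig[min(max(t + d, 0), n - 1)] for d in range(-radius, radius + 1))
            / (2 * radius + 1) for t in range(n)]

def unsharp_mask(sig, amount=1.0):
    # sharpened = original + amount * (original - blurred): exaggerates
    # local contrast at edges instead of restoring lost information
    blurred = box_blur(sig)
    return [s + amount * (s - b) for s, b in zip(sig, blurred)]

step = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # a clean edge
sharp = unsharp_mask(step)
# the "sharpened" edge now overshoots above 1 and undershoots below 0 -
# it looks crisper, but no real detail was recovered
```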
 
Well that's funny, because the picture in the OP definitely isn't motion blur.

If you watch the Adobe video they show the estimated blur kernels, and they're all motion blur esque kernels.

An out of focus blur kernel looks more like a Gaussian blur.

You can't deconvolve an image in general, but with a structured kernel like a motion-blur kernel you have a prayer of being able to reconstruct something close to the original image (I have no idea how they're doing this part - there's been a ton of work on it, but nothing looks anywhere near as good as what Adobe just demoed). With a Gaussian kernel there is nothing you can do.
 