esafak 2 hours ago

Let's see them replicate some famous lens blurs, like those of a Leica, and see if we can tell the simulation apart from the real thing. Imagine if we could transform the bokeh after the fact! That would be awesome.

  • klysm an hour ago

    They kinda did that, but for a Canon lens.

mcdeltat 6 hours ago

I've always wondered why lens blur is considered computationally hard. Lens mathematics seems pretty well understood, given that we can create quite complex lens designs with incredible performance (take a look at modern DSLR lenses; they often have 10+ elements). And blurs in general (e.g. Gaussian) are not complex algorithms. Are there situations where lens blurs are easier/harder? I've heard it's hard to add blur to 2D images - that seems true, since most smartphone artificial bokeh is horrible despite significant effort. Presumably because depth information is missing? Is it easier for raytraced 3D renders?

  • meindnoch 6 hours ago

    Lens blur without a depth map is an ill-posed problem. So the computation goes into faking depth information somehow.

  • zipy124 6 hours ago

    It is exactly that. Blur is a function of depth relative to the focal distance, so without depth you cannot "blur".
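
    A minimal sketch of that relationship, using the standard thin-lens
    circle-of-confusion formula (the focal length and f-number here are
    illustrative, not from any particular camera):

        import numpy as np

        def coc_diameter_mm(depth_m, focus_m, focal_mm=50.0, f_number=1.8):
            # Thin-lens circle of confusion: the blur disk on the sensor
            # grows with an object's distance from the focus plane.
            f = focal_mm / 1000.0      # focal length in meters
            aperture = f / f_number    # aperture diameter in meters
            coc_m = aperture * f * np.abs(depth_m - focus_m) \
                    / (depth_m * (focus_m - f))
            return coc_m * 1000.0      # blur disk diameter in millimeters

    With a per-pixel depth map you can evaluate this everywhere and size the
    blur kernel accordingly; without depth there is nothing to plug in.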

  • AlecSchueler 6 hours ago

    You've got it exactly. It's difficult to recover depth when all you have is a two-dimensional image. In a 3D environment it's as simple as you would expect.
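
    For illustration, in a ray tracer depth of field comes almost for free:
    jitter each ray's origin across the lens aperture and aim it at the
    focal plane. A generic sketch (none of these names come from any
    particular renderer):

        import numpy as np

        def thin_lens_ray(pixel_dir, cam_pos, focus_dist, aperture_radius, rng):
            # Point where this pixel's central ray crosses the focus surface.
            # (Simplification: treats the focus surface as a sphere of radius
            # focus_dist; a real renderer scales by the ray's z component.)
            focus_point = cam_pos + pixel_dir * focus_dist
            # Jitter the origin uniformly over a circular aperture
            # (assumes the lens lies in the camera's local xy-plane).
            r = aperture_radius * np.sqrt(rng.random())
            theta = 2.0 * np.pi * rng.random()
            origin = cam_pos + np.array([r * np.cos(theta), r * np.sin(theta), 0.0])
            direction = focus_point - origin
            return origin, direction / np.linalg.norm(direction)

    Averaging many such rays per pixel blurs out-of-focus geometry
    physically, with no depth-map guesswork needed.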

zokier 6 hours ago

Different camera models are an interesting field of research. For example, NASA uses the CAHVORE model for its scientific cameras to correct for all sorts of distortions, and I think OpenCV also has its own models. And now we have this novel model of lens blur that can be added to the mix.

This is all very relevant if you want to do e.g. 3D reconstruction or view synthesis from a set of images; I imagine the better your knowledge of the camera, the better you can do.
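
For reference, OpenCV's standard pinhole-plus-distortion model is easy to apply once you've calibrated; the intrinsics and coefficients below are placeholder values, not real calibration output:

    import cv2
    import numpy as np

    # Normally these come from cv2.calibrateCamera on checkerboard shots;
    # placeholder values here, for illustration only.
    camera_matrix = np.array([[1000.0,    0.0, 640.0],
                              [   0.0, 1000.0, 360.0],
                              [   0.0,    0.0,   1.0]])
    dist_coeffs = np.array([-0.12, 0.05, 0.001, 0.001, 0.0])  # k1 k2 p1 p2 k3

    img = cv2.imread("photo.jpg")
    undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)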

bborud 3 hours ago

Computing lens blur ought to be easier with more modern camera systems that add a LIDAR to capture a depth map. Does, for instance, Apple use the LIDAR system on the iPhone to do this?

  • klysm an hour ago

    Yeah, once you have depth, faking it is a lot easier. I find this most interesting from a correction perspective.

  • Analemma_ an hour ago

    They do, and it has made some noticeable improvements. Compared to when it first came out, "portrait mode" on recent iPhones is a lot less likely to blur individual hairs on a person, or to keep the background seen through a lock of hair in focus. But IIRC the iPhone LIDAR can only distinguish something like 16 depth layers, and at the end of the day the blurring is still computational and "fake"; I don't know if it will, or can, ever reach parity with what a large lens can do.

Daub 8 hours ago

I have always found it odd how in VFX we spend a lot of time degrading our perfect 3D renders: motion blur, film grain, sensor noise, and lens blur (which I would call defocus). I am interested in the applications of this research, and imagine a library of typical cameras and their associated blurs. We already have similar libraries of film grain and sensor noise.
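
To make the "degrading" concrete, a rough sketch of the grain and sensor-noise passes (the parameter values are arbitrary, not from any real film stock or sensor profile):

    import numpy as np

    def degrade(render, grain_sigma=0.02, read_noise_sigma=0.005, seed=None):
        # render: float RGB image in [0, 1]. Multiplicative noise stands in
        # for film grain; additive Gaussian stands in for sensor read noise.
        rng = np.random.default_rng(seed)
        grain = rng.normal(1.0, grain_sigma, render.shape)
        noise = rng.normal(0.0, read_noise_sigma, render.shape)
        return np.clip(render * grain + noise, 0.0, 1.0)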

  • klysm an hour ago

    Similar concepts show up in music production. It's best to start with a clean signal and degrade it, so the degradation isn't a one-way door.

hengheng 6 hours ago

I can't imagine there being enough information for true fingerprinting of individual devices. With ten million iPhones being made per month, surely the blur patterns have to have some overlap?

  • JackC 2 hours ago

    Amusingly this makes them more like actual fingerprints, which also lack enough information for "true fingerprinting" -- there seems to be little scientific knowledge of the error rates in matching human fingerprints in court. "Many have said that friction ridge identification is only legally admissible today because during the time when it was added to the legal system, the admissibility standards were quite low."[1]

    [1] https://en.wikipedia.org/wiki/Fingerprint

adastra22 8 hours ago

This would seem to have huge forensic applications.

Does this lens blur change over time for a given phone?

  • spaqin 7 hours ago

    It doesn't, unless the camera is damaged. How the blur looks is a consequence of the lens' optical design.

    • zokier 6 hours ago

      I would assume the fingerprint would be extremely sensitive to the alignment and positioning of the optical elements, and I don't find it far-fetched that those could shift minutely during the lifetime of a device without getting to the point where the camera is actually damaged.

      • jagged-chisel 4 hours ago

        Only if adhesives allow such movement

        • zokier 13 minutes ago

          No mounting is perfectly rigid and non-deformable.