The truth is that iPhones are not cameras in the traditional sense. Computational photography is a term that describes imagery formed from digital data and processing as much as from optical information. Each picture registered by the lens is altered to bring it closer to a pre-programmed ideal.

In late 2020, an executive at a consulting firm in the Washington, D.C., area bought a 12 Pro. She told me recently that the phone has been a disappointment and that she feels a little deceived. Some of the photos she takes of her daughter turn out blurry; she showed me one in which the girl's feet were a smeary mess. With her older digital single-lens-reflex camera, what she sees in life is what she sees on the camera and in the picture. She said she was serious: if she could, she would make the iPhone's camera less smart. She has taken to carrying a second phone, from another company's lineup, for the sole purpose of taking pictures.

Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me that he has tried to photograph on the iPhone when the light turns bluish around the end of the day. The device, he said, sees the things he is trying to photograph as a problem to solve. The image processing also eliminates digital noise, which may be the reason for the smudginess in the executive's photos of her daughter's gymnastics; the fix creates a distortion more noticeable than the flaw it was meant to correct.

The team at Apple agreed to brief me on the camera's latest upgrades. A staff member explained that when a user takes a photograph with the newest iPhones, the camera creates as many as nine frames at different levels of exposure. A feature called Deep Fusion, which has existed in some form since 2019, then combines the clearest parts of all those frames into a single image. The process is an extreme version of high dynamic range, or H.D.R., a technique that previously required some software savvy. The camera also analyzes each image with the help of a graphics-processing unit, which picks out specific elements of a frame and exposes each one differently.

On both the 12 Pro and the 13 Pro, I found that the image processing created skies that looked as if they belonged in a video game or an animated film, because clouds stand out with more clarity than the human eye can perceive. Andy Adams told me that H.D.R. is a technique that should be applied carefully. The average iPhone photo mimics artistry without ever getting there. We are all pro photographers now, but that doesn't mean our photos are great.
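For readers curious about what that kind of multi-frame merge looks like in code, here is a toy sketch. It is not Apple's Deep Fusion, whose internals are not public; it is a simple exposure-fusion pass written in Python with NumPy, in which synthetic frames stand in for the bracketed captures and the helper names (local_sharpness, well_exposedness, fuse) are hypothetical.

import numpy as np

def local_sharpness(frame: np.ndarray) -> np.ndarray:
    """Approximate per-pixel sharpness as the magnitude of the image gradient."""
    gy, gx = np.gradient(frame.astype(np.float64))
    return np.hypot(gx, gy)

def well_exposedness(frame: np.ndarray) -> np.ndarray:
    """Favor pixels near mid-gray; penalize blown-out or crushed ones."""
    f = frame.astype(np.float64) / 255.0
    return np.exp(-((f - 0.5) ** 2) / (2 * 0.2 ** 2))

def fuse(frames: list[np.ndarray]) -> np.ndarray:
    """Blend same-size grayscale frames, weighting each pixel by how sharp
    and how well exposed it is in that particular frame."""
    stack = np.stack(frames).astype(np.float64)              # shape: (n, H, W)
    weights = np.stack([local_sharpness(f) * well_exposedness(f) for f in frames])
    weights += 1e-12                                          # avoid division by zero
    weights /= weights.sum(axis=0, keepdims=True)             # normalize across frames
    return (weights * stack).sum(axis=0).clip(0, 255).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.integers(0, 256, size=(64, 64))
    # Simulate under-, mid-, and over-exposed captures of the same scene.
    frames = [(base * g).clip(0, 255).astype(np.uint8) for g in (0.4, 1.0, 1.8)]
    merged = fuse(frames)
    print(merged.shape, merged.dtype)

A real pipeline aligns the frames first and works on full-color data at multiple scales; this sketch only illustrates the basic idea the staff member described, taking the best-exposed, sharpest contribution from each capture and blending the results into one picture.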