It used to be that a new camera needed a new sensor to get improved image quality. Not any more
(Image credit: Gareth Bevan / Digital Camera World)
The new Sony A1 II is making headlines for its incredible speed and AI-driven subject recognition. There's no doubt it's an incredible flagship camera, but there's one thing Sony is a little less keen to promote: its 50.1MP stacked CMOS sensor has been directly carried over from the original A1, a camera now almost four years old.
Reusing a sensor in this way is nothing new, though. Plenty of DSLR and mirrorless cameras have inherited the sensor from the preceding model. But what I think makes the A1 II's sensor 'recycling' more noteworthy is that this is a flagship camera, with a price tag that's high even by flagship standards.
There are plenty of reasons why Sony may have chosen to retain the original A1's sensor. For starters, it's a darned good sensor! It's incredible that any sensor is able to maintain a high enough readout speed to facilitate 30fps continuous shooting at 50 megapixels, and it's arguable whether we actually need a sensor more potent than that right now. Some may resent the A1 II lacking a global shutter, given that this is already present in the slightly older (and significantly cheaper) A9 III. Masanori Kishi, general manager of Sony's Lens Technology & Systems Division, has answered this criticism, along with the general critique that the A1 II's sensor isn't new:
"The sensor used in the α1 has a fast readout speed comparable to a global shutter, and has very little image distortion. It was a sensor with high potential to begin with. This time, we improved the image processing algorithm to improve image quality. The α1 II has evolved to be worthy of a new flagship camera." (translated from the original Japanese)
And here we have the core reason why the A1 II's sensor wasn't changed: "we improved the image processing algorithm to improve image quality". It's long been the case in the camera phone world that the physical image sensor doesn't necessarily impact image quality to the extent people would assume.
The Samsung Galaxy S24 Ultra, for instance, uses the same 200MP primary image sensor as was in the S23 Ultra. Google is the master of recycling image sensors, though, as it used the same sensor in the Pixel's primary camera for four consecutive phone generations (Pixel 2 XL, Pixel 3 XL, Pixel 4 XL and Pixel 5). And yet despite this hardware stagnation, the image quality in these subsequent Pixel generations still noticeably improved, all thanks to constantly evolving image processing.
This is what we see in the case of the A1 II: Sony has not only tweaked and improved the image processing for the A1 II, it has also harnessed the power of AI, so you get more reliable autofocus and an improved hit rate. Given that the A1 II is priced exactly the same as the original A1, I don't think that reused image sensor is anything close to a deal-breaker, especially when it's still so capable, and when you consider how many other aspects of the new camera have been improved.
Ben is the Imaging Labs manager, responsible for all the testing on Digital Camera World and across the entire photography portfolio at Future. Whether he's in the lab testing the sharpness of new lenses, the resolution of the latest image sensors, the zoom range of monster bridge cameras or even the latest camera phones, Ben is our go-to guy for technical insight. He's also the team's man-at-arms when it comes to camera bags, filters, memory cards, and all manner of camera accessories – his lab is a bit like the Batcave of photography! With years of experience trialling and testing kit, he's a human encyclopedia of benchmarks when it comes to recommending the best buys.