I’ve started to look into a dedicated camera, but one thing I’ve noticed is that most of them have trouble shooting 4K at 60fps, and those that do seem to have a lot of rolling shutter issues. Why is that? I’ve heard it’s due to the larger sensors, but I feel like it’s more a processor issue than a sensor one, right?
For example, a DSLR sensor is not all that different from most other camera sensors. The main difference is what is being done on the sensor versus what is broken out for external access.
I’m certainly no expert here, but I tried building an astro photo setup old-school style with some old webcams. None of the sensors I had available broke out the features I needed. I could have done some external image stacking, but there were a lot of errors in the compressed output from the module. I basically learned that I needed to buy a sensor based on the features available in the Linux kernel driver to do what I wanted, and that randomly chosen cheap webcams didn’t have very much low-level access.
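In case it helps picture what I mean by external stacking, here’s roughly the kind of thing I was attempting: just averaging a pile of frames off a UVC webcam with OpenCV to beat down noise. The camera index and frame count are made up, and real astro stacking would also align frames before averaging.

```python
import cv2
import numpy as np

N_FRAMES = 64                      # assumed stack size
cap = cv2.VideoCapture(0)          # assumed UVC webcam at index 0

acc = None
captured = 0
while captured < N_FRAMES:
    ok, frame = cap.read()
    if not ok:
        continue
    f = frame.astype(np.float64)   # accumulate in float to avoid clipping
    acc = f if acc is None else acc + f
    captured += 1

cap.release()

# Average the stack and write it out as a single, lower-noise image.
stacked = (acc / captured).astype(np.uint8)
cv2.imwrite("stacked.png", stacked)
```

The catch in my case was that the frames coming out of the cheap modules were already compressed and full of artifacts, so stacking them didn’t buy much.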
From the hardware side, it is a ton of data output that can be hard to handle and process quickly enough. The frequencies are quite high, and that makes circuit design challenging too. It is easier to drop stuff from the stream early and output a much smaller final product like a finished image. At least, that was my experience as a maker mostly playing in a space that was over my head with a project like this.
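To put a rough number on “a ton of data,” here’s a back-of-the-envelope for an uncompressed 4K 60fps raw stream. The 12-bit readout depth is an assumption; real sensors vary, and this ignores blanking overhead.

```python
# Rough raw bandwidth off a 4K UHD sensor at 60fps, before any compression.
width, height = 3840, 2160
fps = 60
bits_per_pixel = 12  # assumed raw bit depth

gbits_per_sec = width * height * fps * bits_per_pixel / 1e9
print(f"~{gbits_per_sec:.1f} Gbit/s off the sensor")  # ~6.0 Gbit/s
```

Moving and processing several gigabits per second continuously is why so much gets thrown away or compressed as early in the pipeline as possible.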
https://www.youtube.com/watch?v=28S47EE_opA (start at about the 4:00 mark)