- cross-posted to:
- [email protected]
- [email protected]
cross-posted from: https://lemmy.crimedad.work/post/64887
cross-posted from: https://pixelfed.crimedad.work/p/crimedad/677711492673009332
A practice shot of the moon.
I’m trying to get ready for the eclipse that’s coming up. Supposedly the exposure settings you need for a good shot of a full moon will work for a total eclipse.
My setup is a little bit janky: it’s a 400mm Sigma telephoto lens intended for a Minolta 35mm SLR, mounted via a Minolta-to-Nikon adapter to a Nikon D3000 DSLR, which has a crop (APS-C) sensor.
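For anyone curious about the numbers, here’s a rough back-of-the-envelope sketch. These aren’t my actual camera settings, just the “Looney 11” rule of thumb (f/11 with a shutter speed around 1/ISO for a sunlit full moon) and Nikon’s ~1.5x DX crop factor:

```python
# Rough moon-exposure sketch. The figures below are rules of thumb,
# not settings from the shot above.

def looney_11_shutter(iso: int, f_number: float = 11.0) -> float:
    """Approximate shutter speed in seconds for a sunlit full moon.

    Start from 1/ISO at f/11 and scale by (N/11)^2, since required
    exposure time grows with the square of the f-number.
    """
    return (1.0 / iso) * (f_number / 11.0) ** 2


def equivalent_focal_length(focal_mm: float, crop_factor: float = 1.5) -> float:
    """Full-frame-equivalent focal length on a crop (DX) sensor."""
    return focal_mm * crop_factor


if __name__ == "__main__":
    # A 400mm lens on the D3000's DX sensor frames like ~600mm on 35mm film.
    print(f"Equivalent focal length: {equivalent_focal_length(400):.0f} mm")
    # e.g. ISO 200 at f/11 -> roughly 1/200 s for the full (uneclipsed) moon.
    print(f"Shutter at ISO 200, f/11: 1/{1 / looney_11_shutter(200):.0f} s")
```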
Samsung would give far better shots, idiots, smh.
/s (in case someone can’t tell)
I mean, probably? A new Samsung doesn’t have a big telephoto lens, but the coatings on the optics of mine don’t seem to be in good shape. Also, I have to focus manually, and I’m only able to get it good enough through the viewfinder. Furthermore, the sensor in my Nikon probably just isn’t as good.
They’re joking because newer Samsung phones “use AI to upscale Moon photos” when, in reality, they’re essentially faking Moon photos by replacing what people shoot with higher-resolution ones they already have.
Link
Web Archive Link
I was making fun of the Samsung moon pic scandal, hence the /s. Your work is really good.
I had to look it up and I completely missed that controversy. I realize I’m late to the game, but it’s absolutely insane that a camera app would essentially replace what it thinks is a bad photo taken by the user with an AI-generated photo of what it thinks the user actually intended to produce. Such a process is dishonest, arguably unethical, and stupid! If the app thinks the user took a bad photo of the moon, it should just say so with a prompt including a link to a good, public-domain photo.
And thank you! As far as I know, I did not have any help/interference from AI lol!