• Deconceptualist@lemm.ee · 2 months ago

    Yeah, so maybe both Neptune and Uranus look hazy and dull green in visible light. So what?! They’re both entire worlds, significantly larger than our own and vastly different in climate, tilt, composition, and weather phenomena, and we’re lucky enough to have seen them up close with probes. They have layered structure that can be seen in other wavelengths (e.g. infrared) and so many mysteries we haven’t yet conceived of. Plus a ton of moons each that are weird and fascinating in their own right. They’re not the least bit boring to a curious mind.

  • psycho_driver@lemmy.world · 2 months ago

    When the developers thought their model was so far outside the player’s potential view distance that they didn’t bother making a texture for it.

  • PhlubbaDubba@lemm.ee · 2 months ago

    How did it take until 2023 to discern the true color of a planet we’ve known about since before humans found Antarctica?

    • Deconceptualist@lemm.ee · 2 months ago

      Serious answer: the sensors in telescopes and probes don’t work exactly like human eyes. They pick up a different range of frequencies than our cone cells do in the first place, and they don’t have the same sort of overlapping response curves. There are a lot of tricks and techniques involved in converting an image into the sort of thing we’d see with the naked eye. You can sorta think of it like translating Japanese into English; there’s no perfect formula, and it requires some creative interpretation no matter what.

      The popular images that get published all over are simplistic composites and never really reflect the actual data astronomers rely on, so that was never a hindrance to scientific progress. It suddenly made the news because a research group decided to reevaluate the old data and reinterpret it against calibrations from other equipment (e.g. Voyager probe vs. the Very Large Telescope here on Earth). There’s a general-interest factor in “wow, that looks so much different than the old pictures”, even though the underlying data really hasn’t changed.
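
      To make the translation analogy a bit more concrete, here’s a rough sketch (in Python, with entirely made-up filter wavelengths and weights) of the kind of weighting step involved in turning narrowband filter frames into an approximate true-color image. This isn’t any mission’s actual pipeline; the frames and weights values below are illustrative assumptions.

      ```python
      import numpy as np

      # Three hypothetical monochrome frames, each taken through a different
      # narrowband filter (brightness values only, no color information).
      frames = {
          450: np.random.rand(64, 64),  # nm, a blue-ish filter
          550: np.random.rand(64, 64),  # nm, a green-ish filter
          650: np.random.rand(64, 64),  # nm, a red-ish filter
      }

      # Rough stand-ins for how strongly each wavelength would stimulate the
      # eye's red-, green-, and blue-sensitive cones. Real response curves
      # overlap heavily, which is why the conversion involves judgment calls.
      weights = {
          450: (0.05, 0.10, 0.90),
          550: (0.30, 0.85, 0.10),
          650: (0.90, 0.20, 0.02),
      }

      rgb = np.zeros((64, 64, 3))
      for wavelength, frame in frames.items():
          rgb += frame[..., None] * np.array(weights[wavelength])

      # Even the final scaling (and any gamma correction) changes the apparent
      # color, so two teams can get noticeably different "true color" results.
      rgb /= rgb.max()
      ```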

      • Skua@kbin.social · 2 months ago

        To add to this, we apparently always knew. The famous blue image is more or less the correct hue, but the saturation has been absolutely blown out, like a clickbait YouTube thumbnail, in order to show faint features more clearly. Somewhere along the line we stopped mentioning that that had been done. Irwin and co just re-calculated it to get the most accurate version yet, because we’ve got a lot more data to work with now than we did back when Voyager 2 did its fly-by.
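
        For anyone curious what a “blown out” saturation looks like in practice, here’s a tiny sketch using only Python’s standard library; the starting RGB value is a made-up pale cyan, not actual Voyager data. The point is that the hue stays put while the saturation multiplier does all the work.

        ```python
        import colorsys

        # A washed-out blue-green, roughly the kind of pale color both planets share.
        r, g, b = 0.72, 0.85, 0.88

        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        print(f"hue={h:.2f}, saturation={s:.2f}")  # low saturation, subtle color

        # Crank the saturation the way an enhanced composite might.
        enhanced = colorsys.hsv_to_rgb(h, min(s * 4.0, 1.0), v)
        print("enhanced RGB:", tuple(round(c, 2) for c in enhanced))  # vivid blue, same hue
        ```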

        • Deconceptualist@lemm.ee · 2 months ago

          Sort of? My understanding from reading a handful of articles is that Neptune has a bluish haze layer that’s absent on Uranus, but it’s fairly subtle and the overall color of both is a pretty similar frosty light green. So it’s not just that it got oversaturated, but that that particular blue hue got applied to the whole planet rather than just a thin layer.

      • wandermind@sopuli.xyz · 2 months ago

        Furthermore, it’s not that the original scientists failed to produce true-color images. The original published images of Neptune had deliberately enhanced colors to better show some of the features of the cloud surface, and the description text of the images said as much. But that nuance was quickly forgotten, everybody took the deep blue coloring to reflect the planet’s actual color, and that spread to depictions of it everywhere.

    • Stovetop@lemmy.world · 2 months ago

      Similar reason why these three photos look like slightly different colors: image sensor type, quality, and postprocessing.

      Older images are less accurate, especially if the type of image a system is capturing is not meant to be consistent with the human eye.
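
      As an illustration of how much the processing alone can shift things, here’s a toy sketch where the same raw pixel comes out a slightly different color under three hypothetical per-channel gain profiles (standing in for different sensors or white-balance choices); every number here is invented.

      ```python
      # One hypothetical raw pixel (linear sensor readings, 0..1).
      raw_pixel = (0.60, 0.70, 0.75)

      # Invented per-channel gains standing in for different sensors / processing.
      profiles = {
          "camera_a": (1.00, 0.95, 1.05),
          "camera_b": (1.10, 1.00, 0.90),
          "camera_c": (0.90, 1.05, 1.15),
      }

      for name, gains in profiles.items():
          out = tuple(round(min(v * g, 1.0), 3) for v, g in zip(raw_pixel, gains))
          print(name, out)  # same scene, three slightly different colors
      ```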

      • booly@sh.itjust.works · 2 months ago

        not meant to be consistent with the human eye.

        Even then, postprocessing is inevitable.

        As the white/gold versus blue/black dress debate showed, our perception of color is heavily influenced by context, and is more than just a simple function of which rod and cone cells were activated while viewing an image.

      • OutlierBlue@lemmy.ca · 2 months ago

        How would they even know, unless the ones doing the colour-blindness study are part of the 3%?

  • lath@lemmy.world · 2 months ago

    TFW you thought you were cool, but it turns out you had a yee yee ass haircut all along.

    • Vytle@lemmy.world · 2 months ago

      Oh my god dude, I forgot about ball mice. I don’t know how I let that slip my mind, wth.