• QuadratureSurfer@lemmy.world
    3 months ago

    Can you link to how they estimated it? I gave it a quick search and my results didn’t seem to be anything useful.

    It doesn’t matter what language it’s in or which encoding we chose. It’s not an exercise in optimization.

    In a way, it really does matter, or else these numbers are meaningless and won’t mean the same thing to someone from the future (or past). Just think about how big 15GB was ~20-30 years ago (before compression became popular for websites). Telling someone in that time period that a library held 13-15GB worth of information would have severely understated just how much information was actually contained (unless we’re strictly talking only about text here).
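    To make the point concrete, here’s a rough sketch (the sample string is made up for illustration) showing that the very same text occupies different numbers of bytes depending on the encoding and on whether it’s compressed, which is why a bare byte count carries little meaning without those details:

    ```python
    import zlib

    # The same text costs a different number of bytes depending on
    # encoding and compression, so a raw figure like "15GB" only means
    # something if those choices are stated alongside it.
    text = "In the beginning was the Word. " * 1000

    utf8_size = len(text.encode("utf-8"))
    utf16_size = len(text.encode("utf-16-le"))
    compressed_size = len(zlib.compress(text.encode("utf-8")))

    print(utf8_size)        # 31000 bytes as UTF-8
    print(utf16_size)       # exactly double as UTF-16 for ASCII-only text
    print(compressed_size)  # far smaller once compressed
    ```

    Same words, three very different sizes.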

    Why would they store it in 4k?

    I think you misunderstood my question. I wasn’t stating that they would (or should) store it in 4k, just wondering if their data storage estimates included maps/drawings/paintings that could have been in the library as well as what sort of quality they would have used for that kind of storage. Images can easily use up tons of data depending on what format you’re using.
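    Some back-of-the-envelope arithmetic (the bytes-per-pixel and bits-per-pixel figures below are generic illustrative assumptions, not taken from any study) shows how much the resolution and format choice swings an estimate for a single scanned image:

    ```python
    # An uncompressed RGB image needs width * height * 3 bytes; a typical
    # JPEG might average somewhere around 1-2 bits per pixel instead.

    def uncompressed_mb(width, height, bytes_per_pixel=3):
        """Raw RGB size in megabytes."""
        return width * height * bytes_per_pixel / 1e6

    def jpeg_estimate_mb(width, height, bits_per_pixel=1.5):
        """Rough JPEG size in megabytes at an assumed bit rate."""
        return width * height * bits_per_pixel / 8 / 1e6

    print(round(uncompressed_mb(3840, 2160), 1))   # ~24.9 MB raw at 4K
    print(round(jpeg_estimate_mb(3840, 2160), 1))  # ~1.6 MB as JPEG at 4K
    print(round(uncompressed_mb(1920, 1080), 1))   # ~6.2 MB raw at 1080p
    ```

    A 4K scan can cost roughly 15x more than a compressed one, so whether maps and paintings were counted, and at what quality, would dominate any total.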

    Let me try explaining it in a different way. Imagine we had a small device that created a pocket dimension while also being able to shrink objects inserted into it down to about half their size. Let’s also say that the pocket dimension was big enough to store 100m³, and somehow this fits the entire contents of the Library of Alexandria. Let’s call that device a pokédim. Some research paper could say, “the entire contents of the Library of Alexandria can fit in 100m³ of a pokédim!”

    A few years go by and the pokédim gets an upgrade: it can now shrink objects down to 1/100th of their original size. The problem for anyone reading that previous statement is that it is no longer accurate.

    What the statement should have been is, “the entire contents of the Library of Alexandria can fit in 100m³ of a pokédim when shrunk down to half its size!”

    • RedditWanderer@lemmy.world
      3 months ago

      Ok chatgpt, you just wanted to string words together while knowing very little about how computers work, or how studies/conclusions work for that matter.

      • QuadratureSurfer@lemmy.world
        3 months ago

        No ChatGPT (or any other LLM) was used for any of my replies to you.

        But, if you could please link to the study/conclusion so that I could read about it, I would greatly appreciate that. Especially since you seem to have easily found it after a quick search.

        I am honestly wanting to know more.

        • RedditWanderer@lemmy.world
          3 months ago

          For one, this post isn’t about any study. Second, dude said there are supposedly “some studies”; you’ll have to go find them. The ChatGPT thing was a joke about you just stringing words together, not that GPT actually wrote it.

          The questions you’re asking are laughable no matter the study. What good is a study that says it takes a TB to store it, if they went out of their way to use 4K images? Can I make a study saying they are wrong because I used 8K images? It’s not a world record or a challenge, so you use average, reasonable measures for the things unrelated to the question; the point isn’t the exact size down to the GB. All these studies do is average, so we can get a sense of how much it is today.

          The method would rely on statistics drawn from market averages, like the size of an ebook or whatever; it wouldn’t contain any of the info you mentioned. Your post is kinda right out of /r/iamverysmart, and you’ve made no effort to look these things up.