• SlopppyEngineer@lemmy.world
    link
    fedilink
    arrow-up
    106
    ·
    5 months ago

    They didn’t start it with rocks. The first calculators used gears. Those were hard to reprogram, so they started using relays. That worked but was very slow. Then they found out that lamps (vacuum tubes) could take the place of relays, but those wore out too fast. Then someone figured out that rock stuff (silicon) could do the same job as a vacuum tube. After that it became a race to make them as small as possible to cram more of them together.

    • Pelicanen@sopuli.xyz
      link
      fedilink
      arrow-up
      24
      ·
      5 months ago

      I took a course in computing systems engineering which was basically going all the way from semiconductors up to operating systems and it was incredibly interesting.

      One of the things that surprised me was how easy it was to abstract away the lower-level complexity as soon as you got one step up. It’s kind of like recursive Lego pieces, you only have to design one piece then you can use a bunch of those to design another piece, then use a bunch of those to design another, and so on. By the end you have several orders of magnitude of the fundamental pieces but you don’t really think about them anymore.
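      The recursive-Lego idea can be sketched in a few lines of Python: pick one fundamental piece (a NAND gate here), build the next layer only out of pieces already built, and never look down again. This is just an illustrative sketch, not how real hardware is described:

```python
# One primitive piece (NAND), then every higher-level piece is built
# only from pieces already built -- the layer below disappears.

def nand(a, b):          # the single fundamental piece
    return not (a and b)

def inv(a):              # NOT, built only from NAND
    return nand(a, a)

def and_(a, b):          # AND = NOT(NAND)
    return inv(nand(a, b))

def or_(a, b):           # OR, via De Morgan: a OR b = NAND(NOT a, NOT b)
    return nand(inv(a), inv(b))

def xor(a, b):           # XOR, built from the pieces above
    return and_(or_(a, b), nand(a, b))
```

      By the time you're writing `xor`, you've already stopped thinking about `nand` -- same as the course described, just many fewer layers.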

      • drosophila@lemmy.blahaj.zone
        link
        fedilink
        English
        arrow-up
        2
        ·
        edit-2
        5 months ago

        The thing about real world processor design though is that all those abstractions are leaky.

        At higher levels of design you end up having to consider things like the electrical behavior of transistors, thermal density, the molecular dynamics of strained silicon crystals (and how they behave under thermal cycling), antenna theory, and the limits and quirks of the photolithography process you’re using (which is a whole other can of worms with a million things to consider).

        Not everyone needs to know everything about every part of the process (that’s impossible), but when you’re pushing the limits of high performance chips each layer of the design is entangled enough with the others to make everyone’s job really complicated.

        EDIT: Some interesting links:

        https://www.youtube.com/watch?v=U885cIhOXBM

        https://www.youtube.com/watch?v=ljZt_TQegHE

        https://www.youtube.com/watch?v=rdlZ8KYVtPU

    • Aceticon@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      5 months ago

      Generally it’s only the wizards that deal with the physical side - such as rock shaping and rock combining - that get magic smoke, though if they did their part wrong, the wizards that make rocks think might get it, as can the people playing Skyrim on the thinking rocks.

        • Aceticon@lemmy.world
          link
          fedilink
          arrow-up
          3
          ·
          edit-2
          5 months ago

          You have to melt the whole thing back to raw rock and remake it from scratch, which is usually more work than just grabbing some fresh raw rock to build a new one so it’s seldom done.

        • archon@sh.itjust.works
          link
          fedilink
          arrow-up
          1
          ·
          5 months ago

          How do you get the yolk back in the egg? Well, just ask the chicken of course (or the Wizard in this case).

    • archon@sh.itjust.works
      link
      fedilink
      arrow-up
      3
      ·
      5 months ago

      Anon also forgot to infuse the lightning pixies after inscribing the runes, tsk tsk.

      They’re the ones who let the smoke out if you anger them!

  • BurnedDonut@ani.social
    link
    fedilink
    arrow-up
    56
    ·
    5 months ago

    When I learned how they make the new CPUs it blew my mind. Blasting a microscopic droplet of metal with lasers to make light, then shining it through a stencil-like thingy to create the nanometer circuitry. I was like, how the fuck did you even think of doing that?.. Technologies like these are really marvelous.

    • cucumber_sandwich@lemmy.world
      link
      fedilink
      arrow-up
      59
      ·
      5 months ago

      You start with macroscopic photolithography, add material science of semiconductors and then iterate a million times. It didn’t start at nanoscale.

      • BurnedDonut@ani.social
        link
        fedilink
        arrow-up
        17
        ·
        5 months ago

        Give me a break… I’m still trying to wrap my head around how transistors work. For a layman this is like magic.

          • Aceticon@lemmy.world
            link
            fedilink
            arrow-up
            7
            ·
            5 months ago

            It’s essentially the same as somebody with a couple of cans of spray paint and a handful of cardboard sheets with cutouts spray painting a multi-color tag on a wall.

            As the logo kept getting smaller and smaller, the process of just putting that cardboard in front of the wall and spraying the whole thing had too much imperfection for tiny logos, so they had to come up with more and more tricks to keep the tiny logos from ending up too distorted.

      • captainlezbian@lemmy.world
        link
        fedilink
        arrow-up
        2
        ·
        5 months ago

        Exactly, and “we need this as small and precise as possible” means “can lasers do it?” As an engineer, my default is that fast and precise means computer-guided laser, if possible.

        • cucumber_sandwich@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          5 months ago

          They use electron beams and extreme UV light nowadays. Lasers are not necessarily the best light source, even at other wavelengths.

    • bamboo@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      59
      arrow-down
      2
      ·
      edit-2
      5 months ago

      One switch can have two states. Switch on is a 1 and switch off is a 0. Group 8 switches together and you get a byte. Miniaturize the switches and put 8 trillion of them into the size of a fingernail, and ta-da, you have a 1TB micro SD card.

      Wire up two switches so that a light bulb will only go on when both switches are on (1). This wiring creates an AND gate. Adjust the wiring so that the light turns on if either of the switches is on, and you have an OR gate.

      Chaining the output of the light bulb and treating it like a new switch allows you to combine enough gates to make the other logic blocks: NOT, NAND, XOR, etc.

      Combine enough logic blocks and you can wire up a circuit so that you can add the value of two switches together, and now you can start to perform addition.

      This all naturally evolves to the point where you can play Skyrim with the most degenerate porn mods.
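      The chain described above - bulb outputs feeding new switches, gates feeding adders - can be sketched in Python. This is a hedged sketch, with the `^`, `&`, and `|` operators standing in for the XOR/AND/OR wiring:

```python
# Two one-bit switches in, a sum bulb and a carry bulb out.
def half_adder(a, b):
    return a ^ b, a & b          # sum bit is XOR, carry bit is AND

# Treat the bulbs as new switches and chain two half adders.
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2           # final sum bit and carry-out

# Chain full adders to add two 4-bit numbers, bit by bit (ripple carry).
def add4(x, y):
    carry, result = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result
```

      From there it's "just" a few billion more gates to the Skyrim mods.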

    • pigup@lemmy.world
      link
      fedilink
      arrow-up
      7
      ·
      edit-2
      5 months ago

    Simply put, the switching doesn’t do anything by itself. It’s the meaning we assign to the arrangement of on-off switches. Much like flag signals: the flags don’t do anything besides be visible and locatable, yet we can establish a communication protocol with flags, lights, fingers on a hand, etc. This signaling is done electronically, with many layers of meaning and complexity, and nowadays at unfathomable scale and speed.
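    A tiny sketch of that idea: the same pattern of eight on-off switches means nothing by itself, and three different protocols read three different things out of it:

```python
# One byte: eight switches, off-on-off-off-on-on-off-on.
bits = 0b01001101

as_number = bits                                  # read as an unsigned integer
as_letter = chr(bits)                             # read as an ASCII character
as_flags  = [(bits >> i) & 1 for i in range(8)]   # read as 8 independent yes/no flags
```

    Nothing about the switches changed between the three readings; only the agreed-upon meaning did.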

  • niktemadur@lemmy.world
    link
    fedilink
    arrow-up
    26
    ·
    5 months ago

    Well… since you put it that way, it is quite staggeringly improbable, isn’t it?

    “Through these terse, inter-connected runes, an invisible magic flows. You cannot change the rune, as then the spell will be broken.”
    “Where does the magic come from, mommy?”
    “From the highest point in the invisible topology of this magic, Billy: the Hoover Dam/Niagara Falls.”

  • halcyoncmdr@lemmy.world
    link
    fedilink
    English
    arrow-up
    18
    arrow-down
    1
    ·
    5 months ago

    No it’s even worse. We taught the rock how to think, and now force it to think what we want it to think. Millions of thoughts that we want, every second.

    • groet@infosec.pub
      link
      fedilink
      arrow-up
      9
      ·
      5 months ago

      How is software without a CPU useful? It’s literally a list of instructions for a CPU.

      Also a CPU can still calculate stuff if you just send electrical signals to the right connections. Software is just a way for the CPU to keep going and do more calculations with the results.
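      Taken literally, “a list of instructions” can be made concrete with a toy machine: one register, a program counter, and a loop that steps through the list the way a CPU would. The instruction names here are made up for illustration:

```python
# A toy machine: the program is just data until something steps through it.
def run(program):
    acc, pc = 0, 0                     # accumulator and program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":               # overwrite the accumulator
            acc = arg
        elif op == "ADD":              # add to the accumulator
            acc += arg
        elif op == "JUMP_IF_NEG":      # control flow: branch on the result
            if acc < 0:
                pc = arg
                continue
        pc += 1
    return acc

run([("LOAD", 10), ("ADD", -3), ("ADD", -3), ("JUMP_IF_NEG", 99), ("ADD", 1)])
```

      Without the `run` loop (the CPU), the list of tuples just sits there - which is the point.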

      • 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍@midwest.social
        link
        fedilink
        arrow-up
        9
        arrow-down
        2
        ·
        5 months ago

        Software is algorithmic instructions. We wrote and executed algorithms by hand long before we had calculating machines; and when we did get computers that could run more complex algorithms, they didn’t have CPUs. They had vacuum tubes (there were simpler, purely mechanical programmable computers even before vacuum tubes). CPUs didn’t come along until much later; we’d been writing software and programming computers for decades before the first CPU.

        And even if you try to argue that vacuum tubes computers had some collection of tubes that you could call a “CPU” - which would be a stretch - then it still wouldn’t have been made from silicon (rocks) as in the OP post.

        Before the first calculating machine, people were writing algorithms - which is literally what software is - and executing them by hand. Look up how we calculated the ranging tables for artillery in WWII. Algorithms. Computed by hand.

        The word “computer” literally comes from the word for the people (often women) who would execute algorithms using their brains to compute results.
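        For a concrete picture of what those human computers did: mathematical tables were commonly produced with the method of finite differences, which reduces evaluating a polynomial to pure repeated addition - easy to check and easy to split across a room of people. A rough sketch of the bookkeeping:

```python
# Tabulate a polynomial (coefficients low-order first) using only
# addition after the initial difference row is seeded.
def tabulate(poly, start, count):
    degree = len(poly) - 1
    f = lambda x: sum(c * x**k for k, c in enumerate(poly))

    # Seed: enough consecutive values to reach the constant difference.
    row = [f(start + i) for i in range(degree + 1)]
    diffs = []
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    # Every further value is just column additions, top to bottom.
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for level in range(len(diffs) - 1):
            diffs[level] += diffs[level + 1]
    return table

tabulate([0, 0, 1], 0, 5)   # the squares: x**2 at x = 0..4
```

        An algorithm, executed mechanically, with no machine required - exactly the kind of work those human computers did.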

        • zalgotext@sh.itjust.works
          link
          fedilink
          arrow-up
          5
          arrow-down
          1
          ·
          edit-2
          5 months ago

          I think you’re conflating “algorithm” with “software”. You’re right in saying that algorithms can be computed by hand, but I don’t think anyone would refer to that as “running software”. The word “software” implies that it’s run on “hardware”, and hardware usually implies some sort of electronic (or even mechanical*) circuit, not pen and paper and a human brain.

          • 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍@midwest.social
            link
            fedilink
            arrow-up
            2
            arrow-down
            1
            ·
            5 months ago

            Say I agree with your distinction - or restriction. There was still software written for, and programmed into, general-purpose, Turing-complete calculating machines long before there were CPUs.

            So let’s look at the technical details of the word. The term “software” was coined in 1958 by John Tukey. The computers in use at that time were machines like the IBM 704 and the UNIVAC I: vacuum tube computers that contained no silicon microchips and no CPUs in the modern sense. Even technically, the term “software” predates silicon and CPUs.

            Non-technically, I disagree with your premise on the basis that it’s often been argued - and I agree with the argument - that humans are just computers with software personalities programmed by social conditioning, running on wetware and a fair bit of firmware. And there’s increasing evidence that there’s no real CPU, just a bunch of cooperating microorganisms and an id that retroactively convinces itself that it’s making the decisions. Even if the term “software” wasn’t coined until 1958, software has been a thing since complex organisms capable of learning from experience arose.

            Unless we’re all living in a simulation, in which case, who knows if software or hardware really exist up there, or whether there’s even a distinction.

            • aubeynarf@lemmynsfw.com
              link
              fedilink
              arrow-up
              3
              ·
              5 months ago

              They called the box with all the tubes in it that executed instructions a “CPU”; memory, CPU, and IO subsystems were distinct and well-defined.

              I feel like you mean “microprocessor”

          • naeap@sopuli.xyz
            link
            fedilink
            arrow-up
            3
            arrow-down
            2
            ·
            5 months ago

            Software runs on processing power. It doesn’t matter whether that’s mechanical, electrical, or biological computing power.

            The important part is that something is processing it. And although software development, through all the abstraction, now feels disconnected from specialised algorithms, everything still breaks down into numbers and some form of algorithm processing the information.

        • areyouevenreal@lemm.ee
          link
          fedilink
          arrow-up
          4
          arrow-down
          1
          ·
          5 months ago

        We also had machines and computers based on relays and other electromechanical devices even earlier than vacuum tubes. If you follow Technology Connections, he breaks down the inner workings of a pinball machine using that technology, but programmable machines have also been made with it.

  • ColdWater@lemmy.ca
    link
    fedilink
    arrow-up
    12
    ·
    5 months ago

    People who are coding Skyrim porn mods are smarter than the one who invented CPU /s

  • TehWorld@lemmy.world
    link
    fedilink
    arrow-up
    10
    ·
    5 months ago

    The classic response is that you have to capture lightning first to apply to the rock that you want to do the thinking.