• j4k3@lemmy.world · 1 month ago

IIRC there was a blog post within the last year or so from a researcher at Intel who had been working on using FPGAs for AI. They mentioned that the issue they could not overcome was power and scaling. I seem to recall them saying analog had a similar problem with scaling and throughput, but that's the fuzziest part of my recollection; my main curiosity is why large FPGAs are not the present go-to tech.

I'm on the edge of my understanding here, but I think the issue with analog is the depth of tensor rank dimensions. Analog is great and super efficient for the first few dimensions, but AI models can have many rank dimensions, and there is no telling what future algorithms will hold. Building hardware optimized for the way models work right now is practically guaranteed to yield a useless product by the time it is fabricated and brought to market.
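
To make that concrete (a minimal sketch with made-up shapes, not anything from the post I'm remembering): an analog crossbar natively computes a single matrix-vector product, a rank-2 contraction. A typical transformer activation tensor is rank-4, so every dimension beyond the first two has to be handled by replaying the analog op in a loop:

```python
import numpy as np

# Stand-in for what an analog crossbar does natively: the weight
# matrix W is programmed into the array once, and each input vector
# is applied as analog voltages. One call = one rank-2 contraction.
def crossbar_matvec(W, x):
    return W @ x

# A transformer attention activation, by contrast, is rank-4:
# (batch, heads, seq, d_head). Hypothetical illustrative shapes.
batch, heads, seq, d_head = 8, 12, 128, 64
x = np.random.randn(batch, heads, seq, d_head)
W = np.random.randn(d_head, d_head)  # one programmed crossbar

# Mapping the rank-4 op onto the crossbar means looping the "extra"
# dimensions: batch * heads * seq repeated analog matvecs.
out = np.empty_like(x)
for b in range(batch):
    for h in range(heads):
        for s in range(seq):
            out[b, h, s] = crossbar_matvec(W, x[b, h, s])

# A digital accelerator expresses the whole thing as one contraction;
# the analog array only ever "sees" the innermost two dimensions.
assert np.allclose(out, np.einsum('ij,bhsj->bhsi', W, x))
```

So the efficiency win lives entirely in that innermost rank-2 step, and everything above it is scheduling overhead, which is at least consistent with the scaling/throughput complaint I remember.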