• coredumperror@alien.top · 1 year ago

    The photo shows the car getting a 16 kW charge rate while driving at 5 mph. So you could get 16 kWh of energy if you drove on this kind of road for an entire hour, at 5 mph. Pretty worthless.

    I’d love to know what sort of charge rate you’d get while driving at 35 mph, the standard speed on surface streets. Based on what little I know about inductive charging, I’d guess the charge rate would be dramatically lower.

    But if you get even 5 kW while driving at 35 mph, I think that’d actually be pretty useful. Assuming you get a significant efficiency increase at that speed (5 mi/kWh seems reasonable), going 35 mph would use up 7 kWh of energy per hour of driving. If you can restore 5 of those kWh from the roadway itself, you’re getting back over 70% of your energy usage while you drive.
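    (A quick back-of-the-envelope check of that arithmetic in Python; the 35 mph, 5 mi/kWh, and 5 kW figures are my own assumptions from above, not anything measured:)

    ```python
    # Back-of-the-envelope check (all inputs are the guesses from above).
    speed_mph = 35
    efficiency_mi_per_kwh = 5.0  # assumed consumption efficiency at 35 mph
    pickup_kw = 5.0              # assumed inductive pickup while moving

    consumption_kw = speed_mph / efficiency_mi_per_kwh  # 7.0 kW average draw
    recovered = pickup_kw / consumption_kw              # fraction replaced by the road
    print(f"Draw: {consumption_kw:.1f} kW, recovered: {recovered:.0%}")  # -> 7.0 kW, 71%
    ```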

    The question then is: how efficient is the energy transfer? Because I doubt it’ll be very good. That “charge while you drive” power will be rather expensive if you’re only getting 30-40% efficiency (at a guess). Some folks might be willing to pay that premium, but by the time this tech is ready for market, I imagine most people won’t really feel the need to “charge while you drive”.
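    (To make that concrete, a hypothetical cost calculation; the grid price and the 35% transfer efficiency are made-up illustrative numbers:)

    ```python
    # Hypothetical cost of road-delivered energy (both inputs are guesses).
    grid_price_per_kwh = 0.15   # $/kWh the road operator pays for electricity
    transfer_efficiency = 0.35  # guessed 30-40% wireless transfer efficiency

    # Every kWh that reaches the battery needs 1/efficiency kWh from the grid.
    effective_cost = grid_price_per_kwh / transfer_efficiency
    print(f"Cost per delivered kWh: ${effective_cost:.2f}")  # -> $0.43
    ```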

    • lessquiet@alien.top · 1 year ago

      All wireless chargers are inefficient and there isn’t a known way to make them efficient. There are some theoretical things you could do, like beam steering of the EM field so that more of the energy intersects with the receiving coils, but in practice that’s so insanely expensive as to be fantasy for something as large as a roadway. The efficiency will also never be as good as running the current through a wire.