Yes, but the microwave issue comes down to the specifications and requirements microwaves are actually built to.
Most people believe a microwave oven is a Faraday cage, so no wavelength significantly larger than the holes in the door mesh can escape. Since 2.45 GHz microwaves have a wavelength of roughly 12 cm, vastly larger than those few-millimetre holes, they shouldn't be able to escape.
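As a quick sanity check on that mesh argument, here's a minimal Python sketch of the numbers (the ~2 mm hole size is an assumed typical value, check your own door mesh):

    # Wavelength of a 2.45 GHz domestic microwave oven vs. the door-mesh hole size.
    C = 299_792_458            # speed of light, m/s
    F = 2.45e9                 # standard magnetron frequency, Hz

    wavelength_cm = C / F * 100
    hole_cm = 0.2              # assumed ~2 mm mesh hole; varies per oven

    print(f"Wavelength: {wavelength_cm:.1f} cm")                      # ~12.2 cm
    print(f"Mesh hole:  {hole_cm:.1f} cm")
    print(f"Wavelength is ~{wavelength_cm / hole_cm:.0f}x the hole size")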
The trouble is, it's not that simple. The truth is that most microwaves have a strict but still substantial leakage allowance, and that's assuming, and it's a big if these days, that they actually comply with local RF regulations.
A microwave may have a total leakage allowance measured in milliwatts, and a few milliwatts of 2.4 GHz is not going to pose any health hazard, nor cause any major RF interference on the larger scale, the kind that would affect radio comms, datalinks and telecoms.
However, even a few leaked milliwatts of 2.4 GHz microwave energy will play havoc with your 2.4 GHz Wifi.
Not all microwaves are equally bad, and the way to test your own is pretty simple. Put your phone inside the microwave (oven off, obviously) and test your Wifi signal. If you still get one, your microwave is shit; get a better one. If you can't see or contact the phone while it's inside, chances are your microwave will not interfere with your Wifi.
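If you prefer something more repeatable than eyeballing signal bars, the same test can be scripted by pinging the phone from another machine on the LAN while it sits in the closed, switched-off microwave. Just a sketch; the phone's IP address below is a made-up example.

    # Crude shielding test: can we still reach the phone over Wifi while it
    # sits inside the CLOSED, SWITCHED-OFF microwave? Replies = leaky oven.
    import subprocess

    PHONE_IP = "192.168.1.50"          # example only; use your phone's Wifi IP

    result = subprocess.run(["ping", "-c", "5", PHONE_IP],   # Linux/macOS ping syntax
                            capture_output=True, text=True)

    if result.returncode == 0:
        print("Phone still reachable inside the oven -> shielding leaks, expect Wifi trouble.")
    else:
        print("No reply -> the oven attenuates 2.4 GHz well enough.")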
In 99% of cases, 10G makes no sense in home servers.
Because in 99% of cases "homes" have predominantly 1G networking, with Wifi as the "access network", and only a handful of users on it, usually not at the same time.
To upgrade this standard "1G" home network to anything faster requires that the infrastructure be made available for higher speeds AND that the client devices are capable of taking advantage of it.
Concretely speaking, you will need to introduce a switch with higher-speed ports and upgrade the NIC in any currently 1G device that is supposed to benefit from the faster speed.
Practically, and budget-wise, 2.5Gb is the present-day sweet spot for consumer/pro-sumer "home" networking. It is relatively cheap compared to 10G, which "can" require a rewire if the existing CAT5e is substandard.
I have a single 10G link. It goes between two 1G + 2.5Gb switches. I used a simple “DAC” copper link between them.
I have 2 additional 10G ports, so I could link the two Proxmox servers, which would take advantage of that speed during migrations and backups, but nothing else on the network is 10G capable, or even close to needing to be.
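To put rough numbers on why that server-to-server link is the only place 10G would earn its keep here, a back-of-envelope transfer-time comparison (the 200 GB VM size and ~80% link efficiency are assumptions, not measurements):

    # Back-of-envelope: time to move a VM image at various link speeds.
    VM_SIZE_GB = 200       # assumed VM disk size
    EFFICIENCY = 0.8       # rough allowance for protocol/disk overhead

    for name, gbit in [("1G", 1.0), ("2.5G", 2.5), ("10G", 10.0)]:
        seconds = VM_SIZE_GB * 8 / (gbit * EFFICIENCY)
        print(f"{name:>5}: {seconds / 60:5.1f} minutes")     # ~33 / ~13 / ~3 min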
10G hardware typically costs a lot more to run. A 10G port might consume twice the power of a 1G port for the same data, even if it never pushes more than 900 Mbit/s.
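The running-cost difference is easy to estimate; the per-port wattages and tariff below are illustrative guesses, not measured figures:

    # Illustrative yearly cost of a port that is powered 24/7.
    HOURS_PER_YEAR = 24 * 365
    PRICE_PER_KWH = 0.30       # assumed electricity tariff

    for name, watts in [("1G port", 1.5), ("10G port", 3.0)]:   # assumed draws, 10G ~2x
        kwh = watts * HOURS_PER_YEAR / 1000
        print(f"{name}: {watts:.1f} W -> {kwh:.1f} kWh/yr -> {kwh * PRICE_PER_KWH:.2f} per year")

Per port it's pocket change, but multiply it across every port on a switch running year-round and it adds up.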
Use-cases which could justify a house-wide 10G upgrade mostly come down to video streaming. I don't mean playing a remote MP4 off the NAS via Plex; I mean real-time video streaming: lossless video conferencing, remote desktop with full 4K 60FPS playback, HDMI over Ethernet, live uncompressed security camera footage and so forth. Then there's remote gaming, i.e. having a single beast of a gaming server with enough bandwidth to stream to 2 clients at 60FPS. More bandwidth = less compression = less CPU = less packetisation delay = less latency = better gaming.
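To put a number on the "uncompressed" end of that list: raw 4K60 video, even with ordinary 8-bit 4:2:0 chroma subsampling, already blows past 1G and 2.5G (the subsampling choice is an assumption; 4:4:4 roughly doubles the figure):

    # Raw (uncompressed) 4K60 bandwidth, assuming 8-bit 4:2:0 chroma subsampling.
    WIDTH, HEIGHT, FPS = 3840, 2160, 60
    BITS_PER_PIXEL = 12        # 8-bit 4:2:0 averages 12 bits per pixel

    gbit_per_s = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e9
    print(f"Raw 4K60 (8-bit 4:2:0): {gbit_per_s:.1f} Gbit/s")   # ~6 Gbit/s
    # Far beyond 1G or 2.5G, but fits on a 10G link with headroom.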