Coax isn't really manufactured for any particular frequency limit. Rather, the higher the frequency, the higher the attenuation, which means the shorter the usable cable run. Put another way, if you know how long your cable has to be, you choose one based on the highest frequency you'll use and how weak a signal you can tolerate at that frequency.
So sure, the old RG-59 can be used up to 2 GHz, but after 100 feet you would be more than 16 dB down. (Most charts don't go above 1 GHz because that's considered the limit of its usefulness.) This is similar to voltage drop on a pair of conductors: you can supply them with 120 V, but what is left at the other end once you consider the load? It works the same way. If you want the most voltage at the other end you use bigger wire, or in this case a bigger, lower-loss cable.
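To see what "16 dB down" means in practice, here's a rough sketch of the math. The 16 dB per 100 feet figure for RG-59 at 2 GHz is an assumption taken from the discussion above; always check the manufacturer's attenuation chart for your actual cable.

```python
def remaining_fraction(loss_db_per_100ft, length_ft):
    """Fraction of the input voltage left after a given cable run."""
    total_loss_db = loss_db_per_100ft * length_ft / 100
    # dB is a log scale: divide by 20 for a voltage ratio (10 for power)
    return 10 ** (-total_loss_db / 20)

# Assumed ~16 dB per 100 ft for RG-59 at 2 GHz
print(round(remaining_fraction(16, 100), 3))  # ~0.158 of the input voltage
```

Doubling the run to 200 feet doesn't halve the signal again, it squares the loss ratio, which is why long runs at high frequency fall off a cliff.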
Most applications of coax in homes (not interconnections), from the street to the modem, are for broadband.
No . . . cables are not "custom" made to handle the particular frequency they are intended to be used at. You select one whose highest rated frequency covers what you need.
Again, for home use you won't (and can't) exceed 2000 MHz, because ISPs don't even transmit close to 2000 MHz.
It's more like cable rated for 600 volts that you are only using at 480 volts or less.
Attenuation occurs any time you transmit voice or data from the source to its destination.
All these issues are really non-issues, because data riding on the higher frequencies can be boosted, modulated, and demodulated any way you need.
The enormous broadband we now have can be modulated and demodulated, and channels can be separated for voice, video, and data. On top of that, the video resolutions are something we haven't seen before.
That's the improvement over the invention based on Heaviside's work . . . the physicist and mathematician who first experimented with it.
Losses in that small range are not a big concern, as I have said, since the available frequency headroom can be used to compensate or correct for the loss.
Data that rides on the wire cannot be compared with power that is delivered from a power source to an appliance that uses that power.
You can have voltage without (much) power. The important component is the frequency on which the data can ride.
Now, don't confuse this as something like:
How can power travel without current?
This violates OHM’s LAW. Would it not?
Well, data delivery doesn't depend on current. All it needs is continuity so that "PACKETS" can travel from the packet creator (the server) to the particular address embedded in the packet and encoded in the request when the "handshake" between client and server was initiated.
Continuity is assured, and can be monitored, by a small resistance at the furthest end that lets a very small current run through the wire. This small current is NOT there to run the electronics at the destination. The signals riding on the wire are those ZEROs and ONEs (binary).
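As a rough illustration of how small that supervisory current can be, here's an Ohm's law sketch. The 5 V supply and 5600-ohm end-of-line resistor values are hypothetical, picked only to show the scale; real monitoring circuits vary.

```python
# Hypothetical end-of-line resistor used only to prove continuity,
# not to power anything at the far end.
supply_volts = 5.0            # assumed supervisory voltage
eol_resistance_ohms = 5600    # assumed end-of-line resistor value

# Ohm's law: I = V / R, converted to milliamps
sense_current_ma = supply_volts / eol_resistance_ohms * 1000
print(round(sense_current_ma, 2))  # ~0.89 mA, well under a milliamp
```

If that tiny current stops flowing, the circuit knows the wire is broken; that's the whole job. The actual data is the pattern riding on top, not the current doing useful work.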
Think of old telegraph DOT and DASH signals.
So, you can discard the thought that data is lost because the conductor is too small or the power (amperes) is insufficient, the way it matters for toasting our morning bread.
Electrical engineering is mostly concerned with how we can run our machines with the power at our disposal, not with attenuation in decibels.
We will leave that to the physicists and scientists who have too much time on their hands to tinker with them.