Messages - DaveG

#1
Thanks for the responses to my question. I still think I must be missing something, however, as the technology has moved on immensely since the days when we had to use big lumps of iron and copper to handle large amounts of low-voltage power. For example:
1)   We now have very slick and extremely cheap grid-tie inverters (GTIs) available that will convert a voltage-wild input (anything from 10 V to 55 V for the units I've seen and used) to a standard 110 V or 220 V AC output at your choice of frequency. In addition they implement maximum power point tracking - great for getting the maximum out of an un-optimised alternator, and even better because we don't need to worry about tightly governing the engine speed.

2)   The GTI can be used off the grid if you wish, by paralleling it with a small inverter, so you don't need the hassle of making a deal with the power company.

3)   You can use any size (voltage) of battery stack you like with the appropriate charger, and this means we can trade higher battery voltage for lower cable current and hence far lower losses in the system. For example, given 0.01 Ω of cable and terminal resistance and a 2.4 kW load at 12 V (200 A), about 400 W is dissipated outside the load. With the same load at 48 V (50 A) the loss is 25 W. That's a reduction from losing about 16% of the energy to losing only 1%.

4)   Copper isn't getting any cheaper - electronics is!

5)   And given that wiring together lumps of electronics is far easier than welding lumps of metal together and getting them to balance (even if not as much fun), I still wonder why we would want to do that.
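The trade-off in point 3 is easy to check with a few lines of Python (a minimal sketch; the 0.01 Ω resistance and 2.4 kW load are the figures from the example, and the function name is just for illustration):

```python
def cable_loss(power_w, voltage_v, resistance_ohm):
    """I^2 * R power dissipated in the cabling for a load drawing power_w at voltage_v."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

R_CABLE = 0.01   # combined cable + terminal resistance (ohms), from the example
LOAD_W = 2400    # load power in watts

for v in (12, 48):
    loss = cable_loss(LOAD_W, v, R_CABLE)
    print(f"{v} V: {LOAD_W / v:.0f} A, loss {loss:.0f} W "
          f"({100 * loss / LOAD_W:.1f}% of load power)")
```

Running this reproduces the numbers in the post: 400 W lost at 12 V versus 25 W at 48 V, because the loss scales with the square of the current.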

Sorry if this is slightly off topic, but I can't find anywhere on the board that addresses this.

Best regards, Dave
#2
I've been hovering around this forum for a while (learning from the experts), but I find myself wondering more and more why there is so much emphasis on low-voltage systems. For example, generating 210 amps at 24 volts (around 5 kW) is going to involve substantial I^2R losses however tightly the connectors are torqued up, and transmitting it more than a few feet will require pretty thick cables. Even a loss of little more than 1% is still 60 W or so, and that's a lot of heat to shed safely. Is there a reason (perhaps particular to the USA) that I'm not appreciating?
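The "pretty thick cables" point can be made concrete by working backwards from a loss budget to the copper cross-section required. A rough check, assuming a 3 m cable run, the standard resistivity of copper, and a 60 W loss cap (the run length is illustrative; only the 210 A and 60 W figures come from the post):

```python
RHO_CU = 1.68e-8  # resistivity of copper at room temperature, ohm-metres

def required_area_mm2(current_a, length_m, max_loss_w):
    """Copper cross-section (mm^2) needed to keep I^2 * R loss under max_loss_w."""
    max_resistance = max_loss_w / current_a ** 2   # R = P / I^2
    area_m2 = RHO_CU * length_m / max_resistance   # R = rho * L / A
    return area_m2 * 1e6

# 210 A over a 3 m run, capping the cable loss at 60 W
print(f"{required_area_mm2(210, 3, 60):.1f} mm^2")
```

This works out to roughly 37 mm² of copper for just a 3 m run, which illustrates why high-current low-voltage distribution gets expensive so quickly.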