This question is confusing me. I can't work it out from the supporting information I've been given, so I'm asking here. The question is:
A 10 V, 250 A power supply feeds an inductor of 120 H, and operates as a voltage source set to 10 V. The leads to the inductor have a resistance of 20 mΩ.
- What is the initial rate at which the current increases, just after the power supply is turned on?
- When the current is approaching 250A, what is the rate at which the current increases?
All I've been given that seems relevant is V = L·(dI/dt). What would dt be? I guess all I need is the figure for dt just after the point the power is turned on, and then I can rearrange for dI? But I can't figure out what dt would be. Or am I completely missing the point here and getting the wrong end of the stick?
What do I use as a value for t (time)?
It's not homework as such, just an exercise in understanding. AND I WANT TO UNDERSTAND!!
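In case it helps anyone point out where I'm going wrong, here's a minimal Python sketch of how I think the numbers might slot into V = L·(dI/dt). The assumption (which may well be wrong!) is that the voltage actually across the inductor is the supply voltage minus the I·R drop in the leads:

```python
# Values straight from the question.
V = 10.0    # supply voltage (V)
L = 120.0   # inductance (H)
R = 0.020   # lead resistance (ohm)

# Rearranging V = L * (dI/dt) gives dI/dt = V_L / L,
# where V_L is the voltage across the inductor itself.

# Just after switch-on: I = 0, so no drop across the leads and V_L = V.
dIdt_initial = V / L
print(f"initial dI/dt = {dIdt_initial:.4f} A/s")   # 10/120 ≈ 0.083 A/s

# Near I = 250 A: the leads drop I*R, leaving V - I*R across the inductor.
# (This is my guess at what the question intends.)
I = 250.0
dIdt_near_250 = (V - I * R) / L
print(f"dI/dt near 250 A = {dIdt_near_250:.4f} A/s")  # 5/120 ≈ 0.042 A/s
```

If that interpretation holds, I never actually need a value for dt at all, since dI/dt is itself the rate being asked for. But I'd appreciate someone confirming or correcting that.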