2019-11-28 02:54 AM
The application is a solenoid driver, 12 V 0.5 A, running PWM at 1 kHz. It's a very basic low-side application with a diode across the coil. The signal is 5 V logic from a microcontroller.
I have read the datasheet and the application note AN4798; to be honest, some of the calculations are a bit above me.
What I am trying to confirm is the procedure for selecting the correct gate resistor value. From the datasheet I can see the minimum input series impedance is 330 Ω (Rin), and the input current is ±20 mA (Iin). I also see in the application note that digital inputs to the MOSFET require Rprot, selected as 1 kΩ < Rprot < (Vohuc,min − Vsupply,min) / (Iiss,max + Iih,max).
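Just to show how I've been reading that inequality, here is a minimal sketch that plugs numbers straight into the app-note bound as quoted above. Every value below is a placeholder I made up for illustration, not something taken from this device's datasheet, and I may well be misinterpreting what Vsupply,min refers to:

```python
# Sketch only: evaluating the AN4798-style bound
#   1 kΩ < Rprot < (Vohuc,min - Vsupply,min) / (Iiss,max + Iih,max)
# All values are assumed placeholders, not datasheet figures.

Vohuc_min  = 4.5e0    # min high-level output of the 5 V micro, volts (assumed)
Vsupply_min = 3.0e0   # the Vsupply,min term from the app note, volts (assumed)
Iiss_max   = 20e-6    # max input current term, amps (assumed)
Iih_max    = 10e-6    # max input-high current term, amps (assumed)

Rprot_max = (Vohuc_min - Vsupply_min) / (Iiss_max + Iih_max)
print(f"Rprot window: 1 kOhm < Rprot < {Rprot_max / 1e3:.0f} kOhm")
```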
Before reading the application note I spent a VERY long time looking at general calculations for gate current, charge, switching times, etc., but I have yet to make any of these calculations work with this device (e.g. Q = I × t).
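For reference, this is the textbook estimate I was trying to apply: switching time ≈ total gate charge divided by the average gate current set by the drive voltage and series resistance. A minimal sketch, with purely assumed numbers (plateau voltage and gate charge are not from this device's datasheet, and with a protected input stage the internal impedance and clamp probably invalidate the simple model anyway):

```python
# Sketch of the generic gate-charge timing estimate, t = Q / I.
# Values are illustrative assumptions, not taken from this part.

Vdrive   = 5.0      # logic-level drive voltage, volts
Vplateau = 2.5      # assumed Miller-plateau voltage, volts
Rgate    = 330.0    # external series resistance under test, ohms
Qg_total = 15e-9    # assumed total gate charge, coulombs

Ig_avg   = (Vdrive - Vplateau) / Rgate   # rough average gate current while switching
t_switch = Qg_total / Ig_avg             # t = Q / I

print(f"average gate current ~ {Ig_avg * 1e3:.1f} mA")
print(f"estimated switching time ~ {t_switch * 1e9:.0f} ns")
```

This is the kind of back-of-envelope result I could not reconcile with what the datasheet specifies for this device.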
I have checked the actual switching slope times on an oscilloscope, and with the 330 Ω they look fine, but I want to understand the correct procedure and actually prove it.
Can anyone help with the information required to complete this calculation?
One further point: a pull-down resistor on the MOSFET input seems to be used a lot on many devices and is shown in some datasheets, but there is no mention of one for this device?
Thanks