Hello vimal.
You want to say that you connected 4 LEDs in series, right? The positive of the battery goes to the anode of the first LED, the cathode of that LED goes to the anode of the next, and so on, and the cathode of the 4th LED goes back to the negative of the battery. Is that correct? We avoid connecting LEDs in parallel; it is bad practice.
So I trust that you connected them in series. Worst-case scenario: battery voltage 14.6V, which means each LED gets about 3.65V, and that is already too much! One of the LEDs will always be weaker than the others, and it will always be the first to die.
So, I'll give you 2 simple solutions. First is the (crappy) resistor, as you mentioned. Play it safe and under-power the LEDs; after all, 40W is already enough! Say you want to provide no more than 3.2V per LED, a total of 12.8V for the series string. You connect the resistor to "dump" the extra 14.6 - 12.8 = 1.8V. Here is the calculation:
R = V / I = 1.8 / 3 = 0.6 Ohms
3 amperes will go through this resistor, so it has to be able to dissipate:
P = I^2 x R = 3^2 x 0.6 = 5.4W
So you want a resistor around 0.6 Ohms (I would go a little higher though, say 0.8 or 1 Ohm) rated at 10 Watts. Remember that dissipating 5.4W is enough to give you a burn if you touch this resistor. Keep it away from plastics, and heat-sink it if necessary.
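If you want to play with the numbers yourself, here is a minimal Python sketch of the calculation above. The figures are the ones from this post; substitute your own battery voltage, LED count, and current:

```python
# Series resistor sizing for a string of LEDs on a battery.
# Figures from this post; substitute your own measurements.
v_battery = 14.6   # worst-case battery voltage (V)
v_led = 3.2        # target forward voltage per LED (V)
n_leds = 4         # LEDs in series
i_led = 3.0        # string current (A)

v_extra = v_battery - n_leds * v_led   # voltage the resistor must "dump"
r = v_extra / i_led                    # Ohm's law: R = V / I
p = i_led ** 2 * r                     # dissipation: P = I^2 * R

print(f"R = {r:.2f} Ohms, P = {p:.2f} W")
```

Then pick the next standard resistor value above R and a power rating with headroom (here, 10W for a 5.4W load) so the part runs cooler.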
---------------
Solution #2 - a transistor linear driver. You can use such a driver to "simulate" a resistor. The benefit is that the transistor automatically adjusts its resistance so that the current through the LEDs is kept constant. The resistor solution has the disadvantage that any fluctuation in the supply voltage shows up as a change in brightness on the LEDs. A constant current driver keeps the current through the LEDs constant (hence the name) regardless of the voltage.
Here is a good constant current driver for your case:
Transistor - MOSFET Constant Current Driver. For the MOSFET you can use the IRF520 or IRF540. Any small NPN transistor will do for the feedback. RG can be around 470 Ohms. RS controls the current. In your case I'd keep it as low as 2.6 amperes (keep the LEDs under-powered):
RS = VBE / I = 0.6 / 2.6 ≈ 0.23 Ohms
The 0.23 Ohm resistor must be able to dissipate this much power:
P = I^2 x RS = 2.6^2 x 0.23 ≈ 1.56 watts
So, choose a resistor around 0.2 Ohms (or slightly higher) rated at 5 Watts.
Again, the MOSFET HAS TO BE HEAT-SUNK, because it will be called on to dissipate all the extra power required to keep the LEDs properly biased, comparable to the 5.4W we calculated before. The tab will BURN!!!
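The RS sizing above follows the same pattern as the resistor calculation, so here is a quick Python sketch of it. The 0.6V figure is the NPN transistor's approximate base-emitter turn-on voltage, which is what sets the current in this feedback circuit:

```python
# Sense-resistor sizing for the NPN-feedback MOSFET linear driver.
# The NPN turns on when the drop across RS reaches ~VBE (~0.6 V),
# pulling the MOSFET gate down and clamping the LED current there.
v_be = 0.6    # NPN base-emitter voltage, approximate (V)
i_set = 2.6   # desired LED current, kept a bit low (A)

rs = v_be / i_set          # RS = VBE / I
p_rs = i_set ** 2 * rs     # dissipation in the sense resistor

print(f"RS = {rs:.2f} Ohms, P = {p_rs:.2f} W")
```

Note that p_rs works out to simply v_be * i_set, so a lower set current also means less heat in RS.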
Good luck!