### 20Wp battery-stored solar energy

#### Objectives

In the big picture, the cost of solar energy keeps falling and is now competitive with other energy sources; the recent surge in new solar farm installations is driven by the continuing price reduction of panels. For IoT applications, solar energy offers something no other source can: a small footprint and a simple, remote/standalone setup. This post will not go in depth on the technicalities of solar energy systems, but rather show how to get one simple system running in a short time, just for fun.

The four components of an IoT solar power system are:

• Solar panel: the two common types are monocrystalline and polycrystalline; monocrystalline costs more but has a higher conversion efficiency
• Solar charge controller: a step-down converter that regulates charging of the battery
• Battery: lead-acid or lithium based, where the harvested energy is stored
• DC converter to the application voltage: typically 3.3V, 5V, or 12V

#### How to design the specification of each component

This is not a straightforward process. Most often, it starts with the client's or end-user's energy requirement; for example, the device must run for one day, or for three days. This is certainly an oversimplification. Let's start with a device that draws about 200mA at 3.3V and must survive 3 days of operation in case there is no sunshine. The energy needed is:

$$E(Ws) = 0.2(A)*3600(s/h)*72(h/3days)*3.3(V) = 171072(J)$$

Assume the battery stores energy with 90% efficiency, and that converting solar panel output into battery charge is 75% efficient. The battery requirement is:

$$(171072(Ws)/0.9)/3.7(V)/(3600s/h) \approx 14(Ah)$$

or, with 2000mAh cells such as 18650s, you need 7 cells. Remember: 7 cells to run a device that draws only 200mA for 3 days. Also, the energy that must be delivered to fully charge the 14Ah battery is:

$$14(Ah)/0.75 \approx 18(Ah)$$
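The sizing chain above can be sketched in a few lines of Python. All numbers come from the worked example; the 90% storage efficiency and 75% charge-path efficiency are the assumptions stated above:

```python
# Back-of-envelope battery sizing for a 200 mA / 3.3 V device, 3 days of autonomy.
load_current_a = 0.2    # device draw (A)
load_voltage_v = 3.3    # device supply (V)
runtime_h = 72          # 3 days with no sunshine
battery_v = 3.7         # 1S lithium cell voltage
storage_eff = 0.90      # battery charge/discharge efficiency (assumed)
charge_eff = 0.75       # panel-to-battery conversion efficiency (assumed)

energy_j = load_current_a * load_voltage_v * runtime_h * 3600  # J (= Ws)
battery_ah = energy_j / storage_eff / battery_v / 3600         # Ah at 3.7 V
cells = battery_ah / 2.0                                       # 2000 mAh 18650 cells
input_ah = battery_ah / charge_eff                             # Ah the panel must deliver

# The post rounds battery_ah down to 14 Ah first, giving 14/0.75 ~ 18.7 Ah.
print(f"{energy_j:.0f} J, {battery_ah:.1f} Ah, {cells:.1f} cells, {input_ah:.1f} Ah in")
```

Keeping the arithmetic in a script makes it easy to re-run the whole chain when one input (say, the device current) changes.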

Next, the variation of solar flux introduces another layer of uncertainty. The flux is not the same at every location on Earth, nor at every time for a given location. Even at one location, the angle or tilt of the panel directly changes the amount of energy harvested. For a simple approach, let's say you went out and bought a 20W solar panel for \$15.

Suppose the panel operates at 18V and is rated at 20Wp. At this particular location, the solar flux is 1000W/m2, but with an un-optimized tilt and unstable flux during the day, only a net 500W/m2 reaches the panel, so the overall conversion efficiency drops to about 10%. In addition, the sun is up for 5 hours a day. Can a 20Wp panel charge an 18Ah battery (at 3.7V) in one day? It gets a bit confusing here, so let's unwind!

First, 18Ah at 3.7V is different from 18Ah at 18V. Using Ah for battery capacity is convenient for estimating charge or discharge time; for example, 18Ah (at 3.7V) means the battery can supply 1A for 18h at 3.7V. The capacity of an 18Ah battery at 3.7V is therefore equivalent, at 18V, to:

$$18(Ah)*3.7(V)/18(V)=3.7Ah$$

which means the battery can be fully charged at 18V with 1A in 3.7h. Note that I assumed a solar flux of 1000W/m2, which is the standard condition for solar panel efficiency testing. For a 20Wp panel at 10% net efficiency (assuming the rated efficiency of this panel is 20%, i.e. a 50% derating), the current produced is:

$$(20W/18V)*50\% = 0.55A$$

The time required to fully charge the 3.7Ah (at 18V) battery with the 20Wp panel is:

$$3.7(Ah)/0.55(A) \approx 6.7(h)$$

which is more than the 5 hours of sunshine I assumed. My point here is to walk through an exercise of calculating a battery requirement and double-checking it against the solar panel capacity. We can then adjust the conditions to best fit the available parts. For example, solar panels come in 20W or 30W, but not 21.1W; lithium batteries come in 1S (3.7V), 2S, 3S, or 4S configurations, but not 18V. In the end, how long the battery can run a particular system is the product of the best available trade-offs.
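The whole charge-time check can be sketched the same way. All numbers come from the example above; the 50% derating reflects the assumed 10% net versus 20% rated efficiency:

```python
# Can a 20 Wp panel replace 18 Ah (referenced to 3.7 V) in a 5-hour solar day?
panel_wp = 20.0     # rated peak power at 1000 W/m2
panel_v = 18.0      # panel operating voltage
derating = 0.5      # 10% net / 20% rated efficiency (assumed)
battery_ah = 18.0   # capacity to replace, referenced to 3.7 V
battery_v = 3.7     # battery voltage
sun_hours = 5.0     # assumed hours of usable sun per day

capacity_at_panel_v = battery_ah * battery_v / panel_v  # 3.7 Ah at 18 V
panel_current_a = panel_wp / panel_v * derating         # A delivered by the panel
charge_time_h = capacity_at_panel_v / panel_current_a   # hours of sun needed

print(f"{capacity_at_panel_v:.1f} Ah at {panel_v:.0f} V, "
      f"{panel_current_a:.2f} A, {charge_time_h:.1f} h to charge")
print("fits in a solar day" if charge_time_h <= sun_hours else "falls short")
```

Running it confirms the conclusion above: about 6.7 hours are needed against 5 hours of sun, so the panel falls short under these assumptions.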

Now, can we really count on a constant flux of 1000W/m2 for 6.7 hours and a 10% overall conversion efficiency?