Will switching to a flat screen monitor save on my electric bill?


You've probably heard that flat screen monitors for your desktop are more power efficient than their older CRT counterparts.  But how much more?  I decided to run a test of my own to get some real-world results.  How much power does the average flat screen save?  I wanted to know how much power my 3-year-old 19" CRT monitor uses over a typical 8-hour day, versus a 3-year-old Dell 1900FP UltraSharp 19" flat screen monitor, versus a Dell 20" UltraSharp flat screen currently in production (as of 2007).  The difference in power consumption between the 19" and 20" Dells was so slight that I'm focusing on the 19" CRT versus the 19" flat screen.  Evidently three years of flat screen advancement haven't resulted in any power consumption advances.

For testing purposes I used the P3 International "Kill A Watt" wattmeter.  This is a $30 device available from Altex Electronics off Midway Road near the Addison Air Field north runway.  It measures AC/RMS line voltage, frequency, amperage draw, total wattage, and power factor.  Armed with this, I should be able to calculate my monthly electrical cost to operate these devices.  The numbers below are for the monitor only, and do not include the desktop PC.  They were measured after about 30 minutes of runtime.  CRT monitors draw a brief surge of current when they degauss at power-up, but this lasts only a few seconds, and mainly when the monitor hasn't been running for several hours.

Here are the results:
                     Amps   VA     PF     Watts
19" CRT Monitor:     0.79   94     0.65   60
19" Flat Screen:     0.53   63.6   0.62   37
20" Flat Screen:     0.58   70     0.62   42

Determining how much it costs to operate a 60 watt CRT computer monitor 24 hours a day is fairly basic:

60 Watts x 24 Hrs a Day x 30.5 Days = 43,920 total Watt Hours
43,920 / 1000 = 43.92 Kilowatt Hours
43.92 Kilowatt Hours x $0.16 per Kilowatt Hour = $7.03 per month
Yearly Cost (12 months) = $84.33

8 Hours a day = $84.33 x 0.33333 = $28.11 per year
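
If you want to check this arithmetic yourself, here is a minimal Python sketch of the same formula (the monthly_cost helper is my own hypothetical name; the $0.16 rate and 30.5-day month are the figures used in this article):

    # Estimate the monthly electricity cost of a device from its wattage.
    def monthly_cost(watts, hours_per_day=24, days=30.5, rate=0.16):
        kwh = watts * hours_per_day * days / 1000  # watt hours -> kilowatt hours
        return kwh * rate                          # dollars per month

    crt_month = monthly_cost(60)   # 43.92 kWh -> about $7.03
    crt_year = crt_month * 12      # about $84.33
    crt_8hr = crt_year / 3         # about $28.11 for an 8-hour day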

To determine how much it costs to operate a 37 watt flat screen monitor 24 hours a day:

37 Watts x 24 Hrs a Day x 30.5 Days = 27,084 total Watt Hours
27,084 / 1000 = 27.08 Kilowatt Hours
27.08 Kilowatt Hours x $0.16 per Kilowatt Hour = $4.33 per month
Yearly Cost (12 months) = $52.00

8 Hours a day = $52.00 x 0.33333 = $17.33 per year
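
The same sketch covers the flat screen; only the wattage changes:

    lcd_month = monthly_cost(37)   # 27.08 kWh -> about $4.33
    lcd_year = lcd_month * 12      # about $52.00
    lcd_8hr = lcd_year / 3         # about $17.33 for an 8-hour day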

Your savings: $10.78 per year

Now, let's assume you pay about $250, including taxes, for an average 19" flat screen monitor.  At a savings of $10.78 per year, it would take about 23 years to recoup that price in power savings.  There are other factors: a CRT runs hotter, so your air conditioning has to work harder to offset the heat it produces.  On the other hand, in the winter the monitor helps keep your office warm, so the heat runs somewhat less.  Perhaps those advantages and disadvantages offset each other.  If you set Windows to power down the monitor when idle (not to be confused with screen savers, which don't save any power), the savings shrink and the payoff takes even longer.

Now if you run your screen 24 x 7, say for a file server, closed circuit TV, or a web server, the payoff comes quicker.  Your yearly savings of $32.33 will pay off the monitor in about 7.7 years.
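
To see where these payoff figures come from, divide the purchase price by the yearly savings (continuing the sketch above, with the $250 price tag as the assumption):

    savings_8hr = crt_8hr - lcd_8hr    # about $10.78 per year at 8 hours a day
    savings_247 = crt_year - lcd_year  # about $32.33 per year at 24 x 7
    payback_8hr = 250 / savings_8hr    # roughly 23 years
    payback_247 = 250 / savings_247    # roughly 7.7 years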

PCNS's recommendation: if your old fashioned glass monitor is in good working order (not dim, fuzzy, or blurry) and you can live with its bulk, there is no compelling reason to throw it out and buy a flat screen.  Your next monitor should be a flat screen, but buy one only when you need it.  Note that older monitors tend to draw more power; the best way to measure consumption is with a meter like the Kill A Watt.

Terminology

Amps:  The unit of measure describing the "flow" of electrical current traveling to the device.

Volt-Amps (VA):  The line voltage, in this case 120 volts AC, multiplied by the amperage draw; thus 120 volts x 0.79 amps = 94.8 volt-amps.  This is apparent power, not true wattage.

Power Factor (PF):  The ratio of true power to apparent power.  To get wattage, we multiply Volts x Amps x Power Factor.

In a nutshell, things you plug into electrical outlets use electricity in different ways.  A 100 watt electric light bulb has a power factor of 1.0, meaning it's purely resistive.  A more complex device, such as a computer or monitor, introduces inductance and capacitance, which use electricity differently.  A low power factor doesn't mean a device costs more to operate; residential meters bill you for true power (watts), not apparent power.
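
As a quick sanity check on the table above (these are measured values, so expect small rounding differences):

    volts = 120.0               # nominal AC line voltage
    amps, pf = 0.79, 0.65       # 19" CRT readings from the table
    apparent = volts * amps     # 94.8 VA (apparent power, volt-amps)
    true_watts = apparent * pf  # about 61.6 W, close to the measured 60 W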

Utility companies and electric providers measure the power we use in kilowatt hours.  To determine how much it costs to operate the monitor 24 hours a day, we have to measure it in kilowatt hours.  A typical residential rate (for TXU, at least, in Richardson) is 16 cents ($0.16) per kilowatt hour.  Your rate may differ depending on the provider and the volume of electricity you use.  Rates may also vary with the time of day, and some providers charge a minimum fee simply for the privilege of being their customer.  Check your residential (or business) electric bill to get your current rate.

External Links

What is Power Factor?
Saving Electricity with Computers