We’ve been harnessing the power of the sun for centuries, from learning how to start fires with glass lenses as early as the seventh century B.C. to the ancient Romans using windows to add solar heat to their famous baths.
While we’ve long known how to concentrate the sun’s power for more heat, using sunlight to generate an electrical current is a relatively new discovery.
Credit for the invention of solar panels varies depending on who you ask – a number of scientists made significant discoveries along the way.
In 1839, Alexandre Edmond Becquerel, a 19-year-old French physicist, was working in his father’s laboratory when he created the first solar cell.
After submerging silver chloride in an acidic solution and shining light on it, he observed that a current traveled through the attached platinum electrodes and generated voltage.
What was then called the “Becquerel effect,” we now call the “photovoltaic effect.”
Years later, in 1876, William Grylls Adams and his student Richard Day discovered that selenium exposed to light would generate an electrical current.
The selenium cells weren’t efficient, but they were a step in the right direction.
In 1883, a New York inventor named Charles Fritts began making selenium solar cells coated with a thin layer of gold, and by 1884 he had installed those cells in panel form on his rooftop.
Like Adams and Day’s selenium cells, Fritts’s panels weren’t nearly as efficient as their modern-day equivalents, but they represented a huge push forward in solar energy.
Modern solar panels can trace their clearest origins back to 1953, when three inventors at Bell Laboratories discovered that a semiconductive material like silicon was much more effective than selenium.
This silicon solar cell was 6% efficient, producing just enough power to run small electrical devices.
This discovery represented “the beginning of a new era, leading eventually to the realization of harnessing the almost limitless energy of the sun for the uses of civilization”, wrote the New York Times.
These cells were commercially available by 1956, although at $300 per watt, they were well beyond the financial reach of most consumers.
As Bell Laboratories searched for ways to appeal to consumers, solar cells were marketed as a novelty in toys and radios.
While solar cells remained largely a novelty due to their price, they proved key in the space race between the U.S. and the U.S.S.R. Satellites launched from the late 1950s through the early 1960s were powered by solar cells, and by the late 1960s solar power was considered the gold standard for powering spacecraft.
Solar technology became a hot topic again in the 1970s, prompted by fuel shortages caused by interrupted oil imports from the Middle East.
NASA worked with other government agencies to demonstrate the viability of solar panels while Exxon, surprisingly enough, led research to lower solar cell prices (they needed the panels to power warning lights on the tops of their oil rigs).
Prices fell from $300 per watt to $100 per watt, then to $20 per watt (today, an average panel might cost you around $3 per watt).
Opinions about the actual usefulness of solar have varied over the years – President Jimmy Carter had 32 solar panels installed on the roof of the White House in 1979, though his successor, President Ronald Reagan, later had them removed, reportedly considering them “a joke.”
President Barack Obama had new panels reinstalled on the roof in 2014.
From 6% efficiency to 46% efficiency in 2014
While a drop in traditional energy prices in the 1980s slowed solar industry growth, solar power has been steadily taking off since the 1990s.
Innovators have found more ways to boost efficiency, the government has subsidized costs, and both consumers and corporations have become increasingly concerned about their environmental impact.
While solar panels have come a long way since a young French physicist discovered that silver chloride in acid would react to light, the latest technology still finds surprising new ways to boost panel performance and make the switch to clean energy even easier.
From 6% efficiency in 1953 to a world-record high 46% efficiency in 2014, the future of solar panels looks bright.