We had a very ‘enlighting’ chat this week with Universal
Display Corporation’s Janice Mahon, who is their VP of Technology Commercialization
as well as GM of their PHOLED materials efforts, and it looks very good for
the promise of OLED technology to take effect over the next 2-5 years. Janice’s
contention is that it’s really less about anything revolutionary, and more about
the crank and grind of process engineering and capital commitment at this point,
which adds up to a lot more predictability than we might have seen before.
I presented my personal bias early on, which is that while I believe OLEDs
are the display technology of the future, I have some reservations that
they’ll be nearly as significant in lighting. Ms. Mahon was unfazed, and not
on the defensive, which is a good test of a leader’s belief in their technology.
We were in agreement on the display side, and when it came to lighting, also
in basic agreement that OLEDs and standard LEDs would be complementary
technologies. Obviously we’ve shown a preference for the "glowing panel"
approach to lighting our office spaces, so what’s the difference whether that
panel is lit by fluorescent tubes or LEDs behind it or along its edge, or
whether the panel itself glows? We suppose the same argument would hold true
with lamp shades. Give us a glowing shade with some light projected up and more
projected down, and do we really care to have a bright bulb in the center
of it all?
Of course, we haven’t had much choice in either area up to this point, since
prior to this millennium, we’ve had to have a lamp/bulb as the starting point
for those lumens regardless of our innermost need for light to just "be
there". Once we have the choice, do we choose a Star Trek glowing ceiling
or wall, or do we want tiny little projectors hidden about the room and throwing
lumens wherever we need them? Do we like shadows, or prefer the world without
them? I suspect the answer is the classic "sometimes yes, sometimes no"
that makes up so much of our technology transitions. One could argue that the
ultimate goal for lighting is to behave, to our senses, like sunlight (at least
in the daytime). Maybe shadows are important for that. When we get to night-time,
the goal may be contrary, such as having illumination that our brain doesn’t
interpret as daylight, so that our circadian clocks tick along most contentedly.
Maybe shadows become important to avoid. Very talented folks are looking deeper
into this whole lighting-physiology relationship, now that we have the ability
to control the wavelength like never before, and it will fill the decade with
lots of new revelations, to be sure.
So why should OLEDs be dominant in displays, and at least very strong in
the lighting space? As Janice explained, it’s a simple matter of the process
economics. Both organic and inorganic LEDs start life with a similar approach,
in the sense of a substrate that either is, or carries, a transparent
conductive layer, followed by the layers that do the light-generating work
(inorganic for LEDs, organic for OLEDs), completed by the other conductive layer.
For OLEDs and "flip-chip" LEDs, that is often a metalized layer that
ends up being the "bottom" of the structure and also serves to reflect
stray photons out the "top". Just as we use SiC, GaN or related materials
for the standard LED substrates, OLEDs also have choice in terms of glass, plastic
or "other" as things progress. For OLEDs, you typically carve that
finished sandwich into 6×6-inch panels, and then stitch them together into the
luminaire. And that’s where the cost advantage comes in. For standard LEDs, you’d
have the additional steps of etching the actual LED structure into the
layered substrate (known there as the epi-wafer), then testing, characterizing, and
mapping all the gazillion chips, dicing it up (don’t lose track of them!), packaging
all those little guys up, including matching them with appropriate phosphor
blends, possibly testing and characterizing them again, adding any needed optics, and
then integrating the whole thing into a luminaire. Ouch… all else being equal,
fewer manufacturing steps translate to lower costs. Got it.
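To make that step-count argument concrete, here is a toy tally of the two process flows described above. The step lists and the equal-cost-per-step assumption are purely illustrative, not UDC process data or real cost figures:

```python
# Toy comparison of the two manufacturing flows sketched in the text.
# Step lists and per-step costs are illustrative assumptions only.

OLED_PANEL_STEPS = [
    "prepare substrate with transparent conductive layer",
    "deposit organic light-generating layers",
    "deposit reflective metal back electrode",
    "seal against moisture and oxygen",
    "cut finished sandwich into panels",
    "stitch panels into the luminaire",
]

LED_LUMINAIRE_STEPS = [
    "prepare substrate with transparent conductive layer",
    "grow inorganic light-generating layers (epi-wafer)",
    "etch LED structures into the epi-wafer",
    "test, characterize, and map every die",
    "dice the wafer into chips",
    "package chips, matching phosphor blends",
    "re-test and characterize packages",
    "add any needed optics",
    "integrate packages into the luminaire",
]

def relative_cost(steps, cost_per_step=1.0):
    """Crude proxy: every step carries the same hypothetical cost."""
    return len(steps) * cost_per_step

print(relative_cost(OLED_PANEL_STEPS))     # 6.0
print(relative_cost(LED_LUMINAIRE_STEPS))  # 9.0
```

Real per-step costs obviously differ wildly, but the point survives the simplification: the OLED flow simply has fewer places to spend money.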
The overall state of the OLED technology sounds pretty positive. As far as
that manufacturing process, it isn’t something one does in ambient air (vacuum
thermal evaporation being the current mainstream technique), but it’s also not
all that exotic, with VTE described as the same process used to manufacture an
automobile windshield with its layered safety glass. The middle of the bell
curve is now something around 50 lumens/watt at the luminaire (system) level,
with L70 lifetimes of 30,000-75,000 hours. Janice acknowledged that standard
LEDs will likely always enjoy an advantage in overall lifetime, due to that
whole "organic things decay faster" challenge that makes humans pretty
short lived, compared to the rock we may have hit our head on. [Note to self,
wear an inorganic helmet]. OLEDs also need to be sealed up, in some way, to
prevent premature decomposition. In the case of the glass/metal sandwich, it’s
easy to visualize. UDC has done work with thin-film layers which can be "applied"
over the "open face sandwich", allowing virtually unlimited options
for shapes and curves, so yes, you can build a glowing lampshade out of them.
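As an aside on those L70 numbers: L70 is the time it takes for light output to fall to 70% of its initial value. If one assumes simple exponential lumen depreciation (a common modeling simplification, not something UDC specified), any quoted L70 figure pins down the whole decay curve:

```python
import math

def output_fraction(hours, l70_hours):
    """Fraction of initial lumens remaining after `hours`, assuming
    exponential depreciation calibrated so output is exactly 70%
    of initial at `l70_hours`."""
    rate = math.log(0.7) / l70_hours  # negative decay constant
    return math.exp(rate * hours)

# A luminaire at the low end of the quoted 30,000-75,000 hour range:
print(round(output_fraction(30_000, 30_000), 2))  # 0.7 by construction
print(round(output_fraction(60_000, 30_000), 2))  # 0.49
```

Under this model, running twice as long as the L70 point leaves 0.7² ≈ 49% of the original output, which is why lifetime ratings compound quickly.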
Unlike LEDs, where there are the dual technology development tracks of converting
more electrons into photons, and then getting more of the photons out, OLEDs
operate at essentially 100% internal quantum efficiency, and are simply focused on light extraction.
It may not be easier, but at least it’s just one challenge to solve at a time.
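A rough way to frame that single-challenge point: external efficiency can be treated as the product of internal quantum efficiency and light-extraction efficiency. That factorization is a textbook simplification, and the extraction numbers below are made up purely for illustration:

```python
def external_quantum_efficiency(internal_qe, extraction):
    """Simplified model: EQE = internal QE x light-extraction efficiency.
    Both inputs are fractions in [0, 1]."""
    return internal_qe * extraction

# Phosphorescent OLED: internal conversion ~100%, so EQE tracks extraction alone.
print(external_quantum_efficiency(1.00, 0.40))           # 0.4
# Inorganic LED: two knobs to turn -- conversion AND extraction (values invented).
print(round(external_quantum_efficiency(0.80, 0.80), 2))  # 0.64
```

With the first factor pinned at ~1, every gain in extraction shows up one-for-one in the OLED’s overall efficiency, which is the "one challenge at a time" point.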
Very interestingly, OLEDs and LEDs carry a common challenge, which is that
blue has proven the hardest color to produce. Obviously standard LEDs
have solved that, since blue LEDs driving phosphor are the basis for most white
implementations. OLEDs are still working on that, but since they don’t "pump"
phosphor but in themselves are phosphorescent materials (think jellyfish), blue
wasn’t an enabler, just one of the colors. They have gotten to light blue with
success, with more work to do for achieving the saturated blues standard LEDs
take for granted. What that also means is that OLEDs don’t suffer a loss of
efficiency with warm white implementations. OLEDs good at warm white, LEDs good
at cool white. Again, things sound pretty complementary.
Manufacturing technology, and cost effectiveness, are measured in generations
in the OLED world. Generations 3.5 and 5.5 are being implemented right now, with
3.5 delivering displays into our smart phone world, among other applications.
Ms. Mahon feels that Gen 6 will be the real enabler for something like a cost-effective
2×4 troffer replacement, and whether that is 2 years or 5 years will depend
on how aggressively the manufacturing base pushes its capital infrastructure.
Korea is a big driver in all this, with both Samsung and LG pushing to introduce
55-inch OLED TVs in time for the Olympics. Don’t expect those to be cheap, but
do expect them to be showpieces on how good the OLED display technology can
be. Manufacturing experience and materials volumes will ripple towards the white
OLED platforms, so it’s not out of the question to expect a display-cost plummet
that spurs volumes, which in turn drive a commensurate white-OLED cost plummet
in the not-at-all-distant future.
Choice is a good thing…