Will we ever get a full-spectrum display?

Instead of this nasty spectrum

Attached: 1637942446048.png (503x356, 53.61K)

lazors will destroy your eyes

Attached: snapshot_12.36_[2016.07.29_05.32.36].png (1920x1080, 3.56M)

low-pressure sodium backlight displays when

Attached: low-pressure-sodium-lamps.jpg (1280x720, 62.65K)

An LED's tight band of output is exactly why it's efficient
If you had to blast out power across every part of the spectrum, you'd have a space heater

Your shitty eyeball only has 3 types of cones anyway

Attached: 1633669165784.png (720x1280, 623.03K)

For a monitor?

It literally doesn't matter. Color encoding is fundamentally based on three primaries, and we can't even get 100% Rec 2020 yet.

>What is integration

I want this in my room. There's something soulful about a pure spectral hue instead of the dull, wide curves of LEDs.

pseud moment

Attached: 1652854298848.jpg (1920x1080, 74.29K)

Does widening the spectrum waste energy? Yes.

Brightness isn't determined by peaks but by the total energy weighted by the eye's wavelength sensitivity (i.e. the integrated result).
In fact, a high peak in the blue part of the spectrum is highly wasteful because the eye isn't very sensitive to it. You ideally want all the energy centered around the green portion.
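
Rough numpy sketch of that integration argument. V(lambda) is faked here as a Gaussian peaked at 555 nm; the real CIE curve is tabulated, not Gaussian, so the numbers are illustrative only:

[code]
import numpy as np

# Luminance = integral of spectral power S(lambda) weighted by the
# photopic sensitivity V(lambda). V is a crude Gaussian stand-in
# peaked at 555 nm, not the real tabulated CIE curve.

wl = np.arange(380.0, 781.0)  # wavelength grid, 1 nm steps

def V(l):
    return np.exp(-0.5 * ((l - 555.0) / 45.0) ** 2)

def rel_luminance(spd):
    return float(np.sum(spd * V(wl)))  # Riemann sum, 1 nm spacing

blue = np.exp(-0.5 * ((wl - 450.0) / 10.0) ** 2)   # narrow blue peak
green = np.exp(-0.5 * ((wl - 555.0) / 10.0) ** 2)  # equal power, in green

# Equal radiant power, very different perceived brightness:
print(rel_luminance(blue), rel_luminance(green))
[/code]

Same radiant power in both spikes, but the blue one comes out at a small fraction of the green one's luminance.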

Your eyes contain four kinds of light-sensing cells: ones that respond to a range of primarily red wavelengths, ones for green, ones for blue, and rods, which respond to any light at all but are mainly useful for night vision (which is why you lose most color perception when it's dark). These cells respond the same to the right combination of three wavelengths as they do to a full spectrum, so it makes no difference.
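
Same point in code: cone signals are just three integrals of the spectrum, so you can solve a 3x3 system to make three narrow primaries produce the exact same L/M/S response as a flat spectrum. Cone curves here are made-up Gaussians, not the real Stockman-Sharpe fundamentals:

[code]
import numpy as np

# Metamerism sketch: a weighted mix of three narrow primaries that
# gives identical cone responses to a flat "full spectrum" light.
# Cone sensitivities are hypothetical Gaussians for illustration.

wl = np.arange(380.0, 781.0)

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

cones = np.stack([gauss(565, 50), gauss(540, 45), gauss(445, 25)])  # L, M, S

def lms(spd):
    return cones @ spd  # three integrals, 1 nm spacing

full = np.ones_like(wl)  # flat full-spectrum target
prims = np.stack([gauss(630, 10), gauss(532, 10), gauss(465, 10)])  # R, G, B

M = cones @ prims.T                # cone response to each primary (3x3)
w = np.linalg.solve(M, lms(full))  # primary weights matching the target
mix = w @ prims                    # the three-peak metamer

print(lms(full))  # same cone signals...
print(lms(mix))   # ...from a completely different spectrum
[/code]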

What does make a difference is when you're using light to illuminate objects. For instance, you might have something yellow that reflects a specific wavelength of yellow light. In sunlight, which contains roughly even amounts of all wavelengths, it will reflect that wavelength. But if it's illuminated by light with just peaks at red, green, and blue, it may look darker, because it doesn't reflect much at those wavelengths. This is why you want a high color rendering index (CRI), or your light bulbs make colorful things look weird. Yet a monitor whose spectrum would make a terrible light bulb can still accurately reproduce pictures of the same colorful items.
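
Quick demo of the illumination problem, all made-up Gaussians: a paint that only reflects a narrow band near 580 nm returns plenty of power under a flat spectrum and almost nothing under RGB spikes of equal total power:

[code]
import numpy as np

# Reflected power of a narrow-band "yellow" surface under two
# illuminants of equal total power. All curves are illustrative
# Gaussians, not measured spectra.

wl = np.arange(380.0, 781.0)

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

paint = gauss(580, 8)          # reflects only a narrow band near 580 nm
sun = np.ones_like(wl)         # roughly equal-energy "sunlight"
rgb = gauss(630, 10) + gauss(532, 10) + gauss(465, 10)
rgb *= sun.sum() / rgb.sum()   # normalize to equal total power

for name, light in (("sun-like", sun), ("rgb-spikes", rgb)):
    print(name, float(np.sum(light * paint)))  # power bounced back
[/code]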

>what is quantum dot

what if i had more eyeballs

Attached: pepe_eyes.jpg (1024x930, 196.98K)

Why? That's more or less the spectrum your eye is sensitive to. Anything else is either going to be impractical or stupidly expensive.

when we get full spectrum eyes

>he uses 50 CRI bulbs

could work

Attached: 1631035849090.jpg (269x283, 19.62K)

Who actually uses DCI? I thought we all settled on Rec 2020 being the natural extension of sRGB.

Attached: color spaces.jpg (5678x1524, 1006.44K)

user, they'd still only have three types of cones

This is only needed in light bulbs, for better CRI.
Xenon lamps have output across the entire spectrum and a near-perfect CRI.
This display would be monochrome orange.
Not tight enough for 100-110% of Rec.2020, only 85-90%. Need lasers.
Rec.2020 is unattainable without lasers. Phosphor (CRT, WLED) is like a shotgun. LEDs and QDs are like a rifle. A laser is like a sniper rifle.
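
You can sanity-check the shotgun/rifle/sniper thing numerically. Crude Gaussian stand-ins for the CIE xbar/ybar/zbar matching functions (so the areas are illustrative, not real colorimetry), same three primaries, shrinking bandwidth, and the xy gamut triangle keeps growing:

[code]
import numpy as np

# Chromaticity of three Gaussian primaries at several bandwidths and
# the area of the resulting RGB triangle in xy. CIE matching functions
# are replaced by rough Gaussian stand-ins; illustrative only.

wl = np.arange(380.0, 781.0)

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

xbar = 1.06 * gauss(599, 38) + 0.36 * gauss(446, 19)  # rough x-bar (two lobes)
ybar = gauss(556, 40)                                  # rough y-bar
zbar = 1.78 * gauss(449, 25)                           # rough z-bar

def xy(spd):
    X, Y, Z = (spd * xbar).sum(), (spd * ybar).sum(), (spd * zbar).sum()
    return X / (X + Y + Z), Y / (X + Y + Z)

def area(a, b, c):
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2

for sigma in (20.0, 8.0, 1.0):  # phosphor-ish, QD-ish, laser-ish bandwidths
    tri = [xy(gauss(mu, sigma)) for mu in (630.0, 532.0, 465.0)]
    print(f"sigma={sigma:>4}: triangle area {area(*tri):.4f}")
[/code]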

QDs with electroluminescence would be the most feasible. Lasing is a whole other story: you need a DFB and a scattering layer, kind of like the newest blue LEDs... which aren't quite ready for display tech yet.

Shouldn't cyan LEDs coated with a red phosphor give off white light? After all, if you look at the CIE chromaticity diagram, the line between them passes through the white point. So why aren't they mass-produced? Especially since blue (460-470 nm) is more harmful to vision than cyan (490-500 nm).
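
You can check the chromaticity claim numerically: sweep the cyan/red mixing ratio against the same crude Gaussian CMF stand-ins as above and see how close the mixture gets to the equal-energy white point. (This ignores Stokes losses in the phosphor, and the stand-in curves are illustrative only.)

[code]
import numpy as np

# Sweep the cyan/red mixing ratio and find the one whose xy
# chromaticity lands nearest the equal-energy white point (1/3, 1/3).
# Cyan band at 495 nm, broad red phosphor band at 610 nm; CIE matching
# functions replaced by rough Gaussian stand-ins.

wl = np.arange(380.0, 781.0)

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

xbar = 1.06 * gauss(599, 38) + 0.36 * gauss(446, 19)
ybar = gauss(556, 40)
zbar = 1.78 * gauss(449, 25)

def xy(spd):
    X, Y, Z = (spd * xbar).sum(), (spd * ybar).sum(), (spd * zbar).sum()
    return np.array([X, Y]) / (X + Y + Z)

cyan, red = gauss(495, 12), gauss(610, 30)
white = np.array([1.0 / 3.0, 1.0 / 3.0])

# Find the mixing ratio whose chromaticity is closest to white:
dist, t = min((float(np.linalg.norm(xy(t * cyan + (1 - t) * red) - white)), t)
              for t in np.linspace(0.01, 0.99, 99))
print(f"best ratio t={t:.2f}, distance to white ~{dist:.3f}")
[/code]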