Google offers at least $880M to LG display for OLED investment (reuters.com)
148 points by jbuzbee on April 10, 2017 | hide | past | favorite | 70 comments



For the last 10 years or so, whenever I wander around a store and see all the latest stereos and TVs, I'm never very impressed by anything. TVs got thinner and flatter, and then they were curved, which I think is just a gimmick.

But recently I saw a 4K OLED TV from Samsung, and it stopped me in my tracks, actually on two different occasions. I've never seen colors or pure black like that, and the clarity is just stunning. Anyway, this is the first time I've ever felt the desire to save up and buy a new TV.

It would be nice if Apple started using OLED displays in iPhones, but I don't care so much about that. My iPhone 6S screen looks fine to me, and I don't feel any need to upgrade.


> It would be nice if Apple started using OLED displays in iPhones, but I don't care so much about that. My iPhone 6S screen looks fine to me, and I don't feel any need to upgrade.

It's a shame. My iPhones still feel like a downgrade from my years old Note 3 (display-wise).


One of the reasons I'm hanging on to a Lumia that barely got past a 2 digit price.

(Microsoft quit giving things away to consumers, is now pulling harder on enterprise teats.)


I've only tried the OLED display on a Samsung Galaxy S3, and while the colors were nice it was almost unusable in direct sunlight, due to the low brightness of the display. I was happy to upgrade to an iPhone 6, and regain the ability to use my phone in the sun.


I had a GS3 and hated that I couldn't use it in the sun, but my GS7 is usable in the sun.


Not only that, but OLED screens have perceptibly slow response times and burn-in (my old S3 screen has the Candy Crush UI burned in, and is also yellowish because blue LEDs burn in faster than green/red ones).

That said, I suppose newer OLED screens don't have as many problems, but for a long time OLED screens were inferior to LCD screens, and were used only because they were capable of higher contrast ratios, which made them much more appealing to the eye.


Perhaps it's typical of the sell-and-forget mentality of many Android manufacturers: phones are made to look as appealing as possible on a display next to other phones, but once the sale is over the love is usually gone. So a display that's super colorful in the beginning and yellow after a few months apparently isn't a problem at all for some manufacturers.

Lots of expensive looking big midrange phones suffer from this when I see people using phones in bars or public transport.


If you don't switch the Note 3 to movie mode the colors are super saturated. It's not accurate at all, but maybe that is what you like over the iPhone? Side by side the iPhone will look dull compared to the Note 3.


I bought an LG OLED TV last Black Friday. It's incredible. I rarely watched TV, but this winter I found lots of excuses to watch as much as possible.

Special mention to Amazon's The Grand Tour. Available in 4K HDR, some segments have the best quality video I've ever seen; sometimes the TV just looks like a window.


I also bought a 4K OLED TV last Black Friday. It's great. I played some games on it and it was amazing.

One thing that was off-putting: I noticed that the picture isn't perfectly square. Indeed, it seems that the TV warps the picture over time until you go through the one-hour cleanup thing in the settings. It's basically like old-school degaussing, I gather.

But the color reproduction and contrast are amazing. My old 23" LCD TFT panels just don't hold a candle.


I'm considering getting one of those LG 65" OLED TVs, after seeing one in-store and being completely blown away by the picture quality. Much like the GP I can't recall the last time I saw a display that actually wowed me, but watching a space shuttle take off, and bright stars as dots against a pitch black sky without the typical blue-ish backlight bleed, was just amazing. I stood there watching the same 4-5 minute video loop for what must have been half an hour or so.

But your comment now gives me pause – the picture warps over time? Do you have any sources on this so I can learn more?


Certainly it's nothing like degaussing; it's a panel of individually addressable OLEDs that do not move, not a vacuum tube.

My guess is that this is an artifact of some sort of 'motion control' or other processing the TV applies.


Samsung are not making OLED TVs. Maybe you saw an LG OLED TV?


Whoops, you're right! Haha the TV I was looking at wasn't even an OLED TV. I think it was this one: http://www.samsung.com/us/televisions-home-theater/tvs/4k-su...

I guess that's just a regular LCD, but whatever they're doing, it looks amazing.

P.S. I also looked at the LG OLED TVs, but the picture didn't look as nice. Of course, it was probably just the demo video.


The high-end LCDs can go very bright nowadays, and HDR content means that there can be a great contrast between the bright parts and dark parts of an image.

The Samsung you point at is one of the best of last year's TVs, so yes, a very good picture. The only downside is that it is edge-lit, meaning that the brightness of light objects can bleed into the surrounding areas.

Where OLED shines though is the low-end, the blacks/dark scenes, where you can go very dark right next to very light.

source: I'm looking for a new TV, so of course have to do extensive research...


I got a Samsung KS8000 and honestly couldn't be happier. I didn't see an image improvement in the 9000 series to justify the additional expense, and I didn't want anything bigger than 49".

I would love OLED but burn in concerns me still. Maybe in a few years.


What are your favourite options right now?


I've had my eye on the LG OLED55B6V for months now. I actually bought it when it hit £1500 in the UK, but the order was cancelled because they ran out of stock. A few months later it's now at £1750, probably because they've refixed GBP/USD on it after the Brexit disaster.

If it gets down to £1500 I'm gonna pick it up. I watch a lot of dark content and the OLED is amazing at showing it properly.


I think the only thing of note is that the B6 apparently uses a different SoC from the more expensive models (C6/E6 upwards), and apparently this has an effect on motion (potentially subtle stuttering on content not at native frequency).


I don't understand how inflation hasn't gone up by about 20% in the UK. Most things seem to have had a huge increase in price with the combination of VAT at 20% (2011) and the devaluation of the pound. I'd say we are already a much poorer country than we were 6-7 years ago.


Depends on the price range.

I'm looking at the LG E6 or Samsung KS8000 (which in the UK is the equivalent of the US KS9000) which are 2016 models, or the LG B7/C7 and Samsung Q7F series (2017 models).

One thing to note is that the OLEDs are seriously expensive. The smallest of the high-end ones run at £3k for a 55" set.


Really? The sweet-spot one, LG OLED65B6V (65") is about 24k SEK in Sweden (after 25% VAT). That's about 2100 GBP.

In the UK (with 20% VAT) this one seems to be about 2700 GBP:

https://pricespy.co.uk/product.php?p=3616500

I wonder what's going on here?


The big issue with OLEDs is differential aging of subpixels across space and time.

In other words, red, green and blue subpixels "wear out" at different rates over time as well as over the surface of the display.

This means that as time goes by you could have one corner of the display develop a green tint while the other corner goes red.

This might very well be one of those technologies that has all the makings of the greatest improvement in the field yet never gets there because of one nagging issue.


It seems fairly trivial to keep a lifetime count of the on-time for individual pixels (or even just subregions) and adjust the tint based upon that.

This could be fairly easily solved, although doing this in real time at 4K would require custom silicon.
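The wear-counter proposal above can be sketched in a few lines. This is a toy model under made-up assumptions (a linear degradation at a hypothetical rate `k`, and wear proportional to drive level); the reply below explains why the real problem is much harder, so treat this as an illustration of the idea, not a solution.

```python
import numpy as np

# Toy sketch of the per-subpixel wear-counter idea: accumulate each
# subpixel's drive level every frame, then derive a correction gain
# that dims the least-worn subpixels to match the most-worn one.
# The linear degradation model and the rate k are both assumptions.

def accumulate(wear, frame):
    """wear, frame: (H, W, 3) float arrays; drive levels in [0, 1]."""
    return wear + frame  # brighter pixels age faster in this toy model

def correction_gain(wear, k=1e-7):
    """Per-subpixel gain so every subpixel's effective output matches
    the dimmest (most worn) one."""
    remaining = 1.0 - k * wear           # fraction of brightness left
    return remaining.min() / remaining   # worst subpixel gets gain 1.0

# Uniform wear across the whole panel needs no correction at all:
wear = np.zeros((2, 2, 3))
for _ in range(1000):
    wear = accumulate(wear, np.full((2, 2, 3), 0.5))
print(correction_gain(wear).max())  # 1.0
```

Note that the gain is relative: a panel that has worn evenly keeps full gain everywhere, while a single badly worn subpixel drags every other subpixel's gain down, which is exactly the lowest-common-denominator effect described below.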


As with everything that should be trivial this becomes fairly non-trivial upon contact with reality.

At the most fundamental level you need three lookup tables with one entry per sub-pixel. On a 1920 x 1080 display that means a little over two million entries for each of R, G and B. In other words, six million entries.

Pixel period at 60Hz is going to be in the sub-10 ns range. This means you would need lookup table memory that can be randomly indexed and respond within less than 10 ns in order to be able to grab a correction value for each and every subpixel being displayed. Due to latency and other realities of DDR SDRAM this is impossible. DDR SDRAM does far better for burst access of sequential blocks of data within a row. It does not do well for fully random access of single memory locations. DDR4 can't do full random single location access at these data rates.
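The sizes and timings quoted above check out on the back of an envelope. This sketch ignores blanking intervals (the real 1080p60 pixel clock is higher, which makes the timing budget even tighter):

```python
# Back-of-envelope check of the LUT size and pixel-period numbers
# quoted in the comment above, for a 1920 x 1080 display at 60Hz.

width, height = 1920, 1080
refresh_hz = 60

# One lookup-table entry per subpixel, three subpixels per pixel:
subpixels = width * height * 3
print(subpixels)  # 6220800 -- "six million entries"

# Naive per-pixel time budget (no blanking intervals included):
pixel_period_ns = 1e9 / (width * height * refresh_hz)
print(round(pixel_period_ns, 2))  # ~8.04 ns, i.e. "sub-10 ns"
```

Each lookup must complete within that window, which is why fully random single-location access to DDR SDRAM can't keep up.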

Wikipedia article on SDRAM basics:

https://en.wikipedia.org/wiki/Synchronous_dynamic_random-acc...

Other memory technologies (QDR SRAM and RLD RAM) do exist that can accomplish the above requirement. They are extremely expensive and so they are not commonly seen in products such as televisions.

Yes, custom chips or an FPGA would be required in order to manage the process.

Even then, the problem isn't resolved.

The next issue is one of differential quantization. Each subpixel is either going to be driven by an 8 or 10 bit data stream. I'll use 8 bits for this example.

Say a blue sub-pixel is 25% "worn out". Assuming a linear transfer function (it isn't), this means that all other blue sub-pixels on the entire screen need to be limited to a max of 75% output in order to not be brighter than that one blue sub-pixel.

Yet, the problem doesn't end there.

We also need to knock down all red and green pixels to limit them to a maximum of 75% output. Why? Because, in order to make white light you now have to compensate all color channels, otherwise you'd have too much red and green against a weak blue output.

It very quickly becomes a lowest common denominator problem.

Going back to quantization. When we started with a fresh OLED display we had 256 levels from black to white. Now we are down to 192 due to the necessary compensation.

But, that's not all! There's more!

Let's say green sub-pixels on the opposite side of the screen have now "worn out" down to 60% max output. Now the entire screen needs to be brought down to this new limit. And total quantization levels are now down to 153.

The practical result of losing quantization levels is image degradation due to an inability to smoothly map input to output.
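The quantization arithmetic above can be written out directly. A minimal sketch, again assuming the (unrealistic) linear transfer function used in the example:

```python
# Worked version of the quantization-loss arithmetic above: once the
# worst subpixel is down to some fraction of full output, every channel
# is clamped to that fraction, and the number of distinct grayscale
# levels shrinks accordingly.

def usable_levels(worst_remaining, bits=8):
    """Grayscale levels left after clamping an N-bit channel to the
    worst subpixel's remaining output (linear model)."""
    return int((2 ** bits) * worst_remaining)

print(usable_levels(1.00))  # 256 (fresh panel)
print(usable_levels(0.75))  # 192 (a blue subpixel 25% worn)
print(usable_levels(0.60))  # 153 (a green subpixel down to 60%)
```

The lost levels are what show up as banding: the input still has 256 steps, but they must be mapped onto ever fewer distinguishable output steps.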

On first inspection this might seem to indicate that a high speed three channel lookup table isn't really necessary. If we trim the output of all pixels down to the worst case pixel, well, just adjust the entire display. Except this won't work. If you bring all pixels down to 60% you are also bringing down the bad pixels that could not produce full light output. The end result is that, yes, you do need a high speed lookup table in order to leave bad pixels alone (not quite) and adjust good pixels to match the bad pixels.

And yet, we are not done with this. There's a fundamental issue that hasn't been addressed, even after throwing all that technology at it: Calibration.

In order to be able to populate three lookup tables (red, green and blue) we need to be able to measure wear at the sub-pixel level. This is extremely time consuming and most definitely well outside the realm of consumers. At the factory? Sure. At home? Not likely. I can see gross measurement. For example, manually measure sixteen zones and accept the fact that the image might have visible color differences due to this. Or, maybe, use a high resolution sensor to measure a few hundred zones. And this is when, to reach for the popular saying, you start to find yourself up to your ass in alligators when all you wanted to do was drain the swamp.

Funny how things work.

EDIT: Forgot to add. A lifetime "on" time counter won't work. The degradation of OLED pixels isn't purely a function of time. You can end up with pixels that have been active for exactly the same time where one is significantly brighter than the other. You have to measure.


I'm not going to argue that it would be a reasonable engineering challenge, but it certainly does not need random access to the memory. Screen frames are updated in a predictable pattern, meaning we could take full advantage of the bursting capabilities of modern memory.


I feel as if the picture quality of most contemporary televisions has topped out for my tastes. OLED can really look wonderful.

I wish more effort was being made to have them integrate more smoothly into a living space, rather than remaining a great black square on the wall [0].

[0] www.dezeen.com/2017/03/14/yves-behar-design-samsung-television-look-like-frame-artwork-design-technology/



What is special about this TV?


Yeah, looking forward to OLED TVs and monitors that use the BT.2100 color space standard (BT.2020 + HDR).

http://www.flatpanelshd.com/news.php?subaction=showfull&id=1...


It'd be a nice upgrade for the phone as it'd help battery life, especially with night mode.


I'm not sure that this is the best move in the long term. Mass produced phone-sized MicroLED screens should be ready by 2019, and MicroLED can be up to twice as efficient per lumen compared to OLED.

A large purchase of Samsung OLED displays would be a much more sensible move than an investment in manufacturing for a display tech which may not be around for long.

One explanation could be of necessity. Perhaps Google initially wanted to make a large purchase of Samsung or LG OLEDs, but neither may have the production capacity to fulfill the order. Samsung is already making 70-92 million OLED displays for a recent Apple order, and LG probably doesn't have the needed production capacity that Samsung already has at scale.


Thanks for the reference to MicroLEDs, I've downloaded a bunch of stuff to catch up on them, I had not realized they had become a 'thing' as it were.

They are however a bit more speculative than OLED. I was looking at an LG OLED television in Fry's (just $6K :-)) and the contrast is really nice. I bet it really pops in a low light room as well. So an 'in production' house vs a 'production hopefully in a year' house seems the more prudent bet?

And the other interesting bit is that everyone is thinking phone, phone, phone. Why not a competitor to Surface Hub, which appears to be making Microsoft a solid margin? Why not a 'Google' TV which is just that: the entire screen and everything that hangs on the wall. You talk to it with Google Home, it shows stuff from the Internet and boring old TV shows, etc. The first Google TV effort was pretty doomed, but maybe as a way to augment your TV (the new YouTube TV is out), your phone, and your general notes place? That might be Google thinking the long game there.


> MicroLED can be up to twice as efficient per lumen compared to OLED

Do you have sources for that?

To my knowledge, inorganic LEDs have the advantage now, but both technologies approach the same limits which means that in the long term both should deliver the same efficiency. In the long run I don't see a deciding difference here.

One major complication for OLEDs is processing and sealing because they tend to have a short lifetime for various reasons. This sounds simple but in practice it makes everything much more complicated.


If your reasoning (necessity) is on mark, I wonder if this is evidence of the monopsonistic jiu-jitsu that Apple is famous for.

Supply-chain dominance wielded as a weapon.


Right, but you still need to secure OLED supplies until then, because Apple is now buying 70+ million screens a year from Samsung.


I thought they couldn't purchase from Samsung because Apple bought out the supply.


I agree.

I feel this is just a sign Google is playing a bit of catch-up (maybe even a tiny bit of desperation) in getting curved OLED displays. There's only so much capacity available, with Samsung and Apple eating up most of what's coming out.

Even if a billion dollar check was cashed today, that will only bring online one plant, which won't be ready to ship products for a few months at best...


I can understand why they chose OLED to invest in. But I think longer term, due to the low power characteristic electrophoretic or interferometric would be much more exciting for mobile displays. I'm still waiting on the roll-able E-Ink display we were promised in 2013. https://www.youtube.com/watch?v=94Ifhuc2bbQ


We should finally start to see OLED become mass market - the last patent is set to expire this year: https://www.thestreet.com/story/13398746/1/universal-display...


$880M is a huge number, isn't it? Can this number suggest how many phones Google is aiming for? Or is this number too big for just phones and suggest google has some other product up its sleeve?


It looks like Google is really going all out with the Pixel line. There has been so much advertising for this product. And they seem to be selling, with some people having trouble getting one.

Of course, numbers out suggest they might have sold around 2 million, which isn't that many, but still a step above Nexus, probably.


And the margins on Pixel must be huge.


I hope future OLED displays don't have PWM flickering. Such a negative crippling an otherwise great technology.


I had an S6 and never noticed any flickering. Nor on my Apple Watch, which uses an LG OLED if memory serves. Is it really an issue in practice?


How slow do these oscillators run? It seems like that would be masked by persistence of vision pretty much anywhere above 90Hz.


May not be visible if your phone is still, but you can see faster flickers if it moves or your eyes do. I don't have an OLED phone, but the backlight flicker on my Pebble Time is visible as I raise my arm up.

Not actually a problem for watch usability, but looks funny and takes away from the image of the pebble screen as a constant object. Can't vouch for how it affects phone usage.


After testing with (larger, but still OLED) LEDs I've got here, it seems to be far higher.

In my projects, for any frequency above 800Hz flickering wasn't observable, but still created nausea and headaches. Above 8000Hz nausea and headaches were gone.


Please specify what you are measuring, because the frequencies you are quoting are far outside human perception limits. These also don't match the numbers being quoted by the VR guys.

I'm extremely sensitive to refresh rates. I used to have two gigantic ViewSonic CRT monitors on my desk along with special Matrox video cards so that I could have high refresh rates.

60Hz gave me whacking migraines. 72Hz was fatiguing but didn't give me migraines. 80Hz+ and I was just fine.

I haven't had a migraine due to refresh rates on digital displays ever. Not even for incredibly slow displays.


Keep in mind that LEDs and CRTs are two completely different things. CRTs have "phosphor decay," where a pixel hit by the electron beam lights up and then gradually fades out until being refreshed again. LEDs are normally driven with PWM: turn them off (0% brightness) and back on (100% brightness) fast enough and people don't really notice that instead of 50% constant brightness, they're getting 0%, 100%, 0%, 100% ... with a duty cycle of 50%.

You can see that even with LCD monitors. LCD monitors with an LED backlight will have a display refresh (i.e. how often the pixels can change color) of, say, 60 Hz, but the LED backlight might be run at 4 to 8 kHz (i.e. how often the whole backlight turns on and off to give different brightness levels).
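The duty-cycle idea above is easy to show numerically. A minimal sketch (the sample count and the idealized 0/1 output levels are simplifying assumptions):

```python
# Toy illustration of PWM dimming: at a high enough frequency, a 50%
# duty cycle is perceived as half brightness, because the eye averages
# the on/off pulses over each PWM period.

def perceived_brightness(duty_cycle, peak=1.0):
    """Time-averaged output over one PWM period (0 <= duty_cycle <= 1)."""
    return peak * duty_cycle

def pulse_train(duty_cycle, samples=8):
    """One PWM period sampled into idealized on/off states."""
    on = round(duty_cycle * samples)
    return [1] * on + [0] * (samples - on)

print(perceived_brightness(0.5))  # 0.5
print(pulse_train(0.5))           # [1, 1, 1, 1, 0, 0, 0, 0]
```

The flicker complaints in this thread are about what happens when that averaging breaks down: if the PWM frequency is too low, eye or display movement lets you resolve the individual on/off pulses.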


Remember, these are PWM frequencies, not refresh frequencies.

A 60Hz LED display is usually driven by a 4 or 8 kHz PWM clock.


That's interesting because I have an Oculus DK2 with an OLED panel (Galaxy Note 3 panel) and I've experienced 0 issues in what many consider the most sensitive display environment.


800Hz PWM can cause nausea?

Is there any objective research supporting this? I haven't heard of anyone doing simple blind testing or controlling for other factors.


Wouldn't that make one unable to basically function in our world, given that a lot of things, even most basic LEDs, are PWM-driven, and not necessarily at very high frequencies?


I believe you're thinking of the pixel refresh rate, where 144Hz is about the best you'll get and 60Hz is normal, whereas PWM is used in the backlight and the frequency is indeed in the kHz.


I'm afraid he does indeed talk about PWM. Sadly, lots of LED fixtures and displays (especially in laptops) still have atrociously low PWM frequencies...


60Hz PWM wouldn't even work in laptops – you’d have, at 60Hz refresh rate, only the following colors available: black, red, green, blue, yellow, magenta, cyan, white. As, for each frame, you could turn each pixel only either on or off. 8 colors isn’t exactly 16 million colors.

To get 8-bit colors, with PWM, you need far higher refresh rates. For OLED, that’s 15.3 kHz minimum to get sRGB colors with PWM. If you only need to dim backlight, 4kHz is usually enough.
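The 15.3 kHz figure above follows from simple arithmetic: with PWM you need one on/off slot per brightness level per frame. A quick check:

```python
# The arithmetic behind the "15.3 kHz minimum" figure above: with PWM,
# expressing 2**bits brightness levels per frame requires 2**bits
# on/off slots within each frame.

def min_pwm_hz(frame_rate_hz, bits):
    """Minimum PWM frequency to get `bits` of per-subpixel depth at a
    given frame rate, assuming pure on/off PWM dimming."""
    return frame_rate_hz * (2 ** bits)

print(min_pwm_hz(60, 8))  # 15360 -> ~15.3 kHz for 8-bit color at 60Hz
```

A backlight that only needs coarse brightness steps (rather than full per-pixel color depth) can get away with much less, which is why the 4 kHz figure is enough for dimming.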


Huh? I'm talking LED backlights and lamps that regulate brightness with PWM. No color generation involved here. And these can get really ugly: https://www.notebookcheck.com/Test-Lenovo-Ideapad-310-15ISK-...


WTF

That’s all I can say about that.

That said, you were mentioning laptop displays, and the topic was OLED, that’s why I was mentioning why for OLED 60Hz is basically impossible.

And as the display of that Ideapad has around 50 brightness steps, you get a real refresh rate of about 1Hz. That’s insane.

As mentioned, I’ve tried this with the LEDs I’m using to light my room, and below a few kHz I get nausea, and so does the rest of my family.


You can get at least 240Hz if you're willing to pay.

http://www.benq.us/product/gaminggears/xl2540


How long until Apple buys a panel maker so that they can vertically integrate every part? If this investment goes well for Google, it seems poised to let them work closer with LG, so that they might acquire a piece.


Great question. You may already know that Apple has a long history of locking up LCD production capacity.

'After Apple's Q1 earnings call in January, Apple COO Tim Cook told the press and analysts that the company had entered a $3.9 billion component supply deal in a key area that was "an absolutely fantastic use of Apple's cash". Many speculated that, after flash storage supply deals and agreements, Apple identified high-resolution LCD displays as a key factor to iOS' devices manufacturing process. Back then, speculation and Tim Cook's own words suggested that Apple had entered a deal with three manufacturers, including Toshiba and Sharp. A month before the Q1 financial results, Apple was indeed rumored to be discussing with Toshiba an investment in a new $1.19 billion factory -- the same that Nikkan Kogyo Shimbun is mentioning today. But at the same time, several reports suggested that Apple was also considering a second investment in a $1.2 billion facility from Sharp -- with over $60 billion in cash, a double investment in LCD manufacturing wouldn't have surprised anyone. But today's report seems to confirm that the deal with Sharp hasn't gone through, implying that Toshiba has been chosen as the only Japanese manufacturer of iPhone LCD screens.'

From: https://www.macstories.net/news/apple-investing-in-toshibas-...


Yep, and it's not just LCDs; these deals happen with many of their suppliers. Business-wise it's often much better to do so than to buy them outright, because you avoid the risk of owning the wrong tech in a fast-changing technology environment. Just finance or invest in some facilities that make the parts you need now, and leave the long term business risk to the suppliers.

Unless there's some groundbreaking tech that is a key differentiator for Apple (e.g. CPU/GPU), it's better to NOT vertically integrate.


Best-in-class or near best displays are supporting and necessary but not sufficient for Apple. So acquisition doesn't make sense. The money is to guarantee supply for their huge product volumes and also sometimes a competitive measure to lock up supply so competitors can't get at it.

On the other hand, I would not expect Apple to farm out their software, OS, marketing, industrial design any time soon. Unlike the supporting value of LCDs these are all differentiators.


Google is planning to go big with Pixel devices; with the current generation they had a lot of supply constraints and have been unable to meet demand.


What other part does Apple make?


Apple has a bunch of internal SOC designs https://en.wikipedia.org/wiki/Apple_mobile_application_proce... (currently at least 16).


Apple designs their own SoCs, but they do not manufacture the chips themselves. Fabrication is contracted out to Samsung and TSMC.


But designing and manufacturing are different.


Curvy OLED displays are neat, but it ultimately doesn't mean anything.

It's crazy that Google isn't investing that money in software. Unless it's a hedge.



