Hacker News
Switch off bad TV settings (practicalbetterments.com)
421 points by DitheringIdiot on Dec 4, 2023 | 495 comments



As a former high-end audio/video salesperson, I just want to state for accuracy that local dimming, as a feature, is not dynamic contrast. Dynamic contrast is terrible. By having an LED backlight array dim spots that are darker in the source material, the display achieves better absolute contrast. It is not adjusting the exposure to fake it. It is instead getting closer to the contrast given by the source material. It created some halo issues, but it was a step in the right direction.
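A toy sketch of the per-zone idea (the zone count and the max-per-zone rule are my simplifications; real sets use fancier algorithms plus temporal filtering):

    import numpy as np

    def local_dimming(frame, zones=(8, 12)):
        """Toy full-array local dimming: frame is a 2D luminance array with values in [0, 1]."""
        h, w = frame.shape
        zh, zw = h // zones[0], w // zones[1]
        backlight = np.zeros(zones)
        for i in range(zones[0]):
            for j in range(zones[1]):
                # Drive each zone's LED no brighter than the brightest pixel it has to show.
                backlight[i, j] = frame[i*zh:(i+1)*zh, j*zw:(j+1)*zw].max()
        # The LCD layer is then opened up to compensate, so dark zones go truly dark
        # (LED nearly off) while bright details keep their level.
        return backlight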

This is not new tech at all - I was selling Samsung LED TVs with this feature in 2007 or so. Samsung, Sharp, and Sony had little choice but to improve contrast, because their LED sets were right next to Pioneer KURO plasmas that were just absolutely amazing - OLEDs are only now catching up to their PQ, 15 years later. First on the scene was the Samsung LN-T5781 - https://www.cnet.com/reviews/samsung-ln-t5781f-review/


Seriously. Imagine people going out and proudly buying a shiny new MiniLED TV only to have their half-educated HN jockey of a child come in and disable the entire point of that technological advancement.

Even normal LED-backlit LCDs can have FALD (Full Array Local Dimming, for those who don't follow this field), and that's not especially new, though it was hit or miss in effectiveness on earlier TVs.


Eh, depending on the size of the dimming zones, the halo can be much worse than the shitty LCD contrast it is trying to improve. At the extreme you have edge-lit displays, which are 100% useless at improving the contrast in any realistic scene while introducing very noticeable giant halos on mostly-black screens. You can get used to global shittiness, but local and temporal artifacts will always stand out.

It's all just an ugly hack compared to real emissive display technology where each pixel can be set to any value on the full brightness range individually.


I came here to warn people that they should not disable local dimming on their TVs. It'll make dark scenes look overly bright and washed out.


One thing I seemingly can't disable is how my Samsung TV gets louder when ambient noise is high.

I absolutely do not want my TV to get louder when one of my kids is shrieking. Just adds stress on top of stress


Does this help? It's related to the "Sound Sensor", but the menu setting is not called Sound Sensor. There's also apparently a physical switch you can use to turn it off? I can't check that, though.

https://www.samsung.com/latin_en/support/tv-audio-video/how-...


I couldn't find the physical switch on mine (the TV is hung too close to the wall, and just below it is our mantel), but there was a GUI option under the settings "privacy" area!


I’ll give it a shot, thanks!!


It makes me anxious just knowing that TVs exist that do this.


I believe there is a setting for that.

Using an external receiver can help too.


External bluetooth transmitters/receivers are also the cure for shitty PC bluetooth stacks.

They don't switch to garbage quality mode every time an app, website, or game queries the microphone. They don't re-enable shitty defaults every software update. They don't require text config files in linux and the critical settings in those files don't get ignored due to open source politics. They don't mess up pairing every time you reboot into a different OS. They just work. $50 will banish all your bluetooth troubles to the deepest pits of the underworld, where they belong.


I should have been more clear. An audio/video receiver.

Beyond Bluetooth, optical audio is quietly pretty decent.


Yes, TOSLINK is a godsend. It's immune to ground loops and motherboard manufacturers that don't give a shit, which is all of them, even ones that brand around having decent audio (ProArt I'm looking at you).


I have a different Asus motherboard and the audio hardware is on an apparently flaky USB bus (the motherboard has several, as they do). Even with an optical connection, the audio drops out sometimes. It was maddening to me when I first got the computer, because things like this are usually "not all the CPU pins connected to the motherboard" or "you know that RAM you bought on Amazon? yeah, it doesn't remember what you store in it! savings!". But... not this time. (I can pretty much kill USB on this machine by plugging in a bunch of unused USB cables; plugged into the computer, but nothing on the other end.)

I use an external DAC and I've learned which buses break USB when looked at the wrong way. But ... of course the on-board audio is just a USB device. Can't waste a PCIe lane on that!


This is for input, but my lav mic is really quiet and has a pretty high noise floor and a ground loop (not too bad tho) using the motherboard's 3.5mm port. Bought a USB sound "card" for 5.99 on Amazon a year back; it's not even close. Its output is mediocre at best, but I don't use that.


I tried a BT transmitter once (a few years ago) and the audio being out of sync with the video was unbearable.

Are there models that are recommended? I have an old Samsung TV (ca 2010) and would love to add BT to pair my wireless headphones.


I use a 1Mii B03 connected to my PC through TOSLINK. It has a physical switch to toggle between low latency and high definition mode.


Ensuring the Bluetooth version is up to date enough on both sides is essential.


> an external receiver

Yes, passthrough digital audio to an AVR if at all possible.


Open TV. Find microphone. Apply tape.


screw tape, cut it out. Any samsung TV with a microphone and an internet connection is probably sending everything it picks up to data brokers.


Tape? My television does not need a microphone; if I identify one and open it up, there is no reason to leave it there.


Just gotta find a proprietary Samsung screwdriver first.


This should be what you're looking for: :)

https://www.milwaukeetool.com/Products/Hand-Tools/Hammers/48...


Apply dollop of superglue, THEN tape for maximum quiet


I don’t hate this solution one bit


If nothing else works, you could open the TV up and disable the microphone.


A modern panel is a complex, powerful device with a wide range of capabilities. Human beings are diverse and even an individual's preferences can change over time. It's right that there are settings for this stuff.

What isn't right is that instead of using standard terms common across manufacturers and clearly related to what the settings actually do, manufacturers expect anyone who wants to change them to reverse engineer what Dynamic TruScungle means, how it interacts with Active ProHorngus and whether 100 means the ProHorngus is turgid or flaccid.


What else isn't right is when the setting is hidden four submenus down and the settings menu covers the whole movie.

Let me toggle the settings easily while the picture is displayed, so I can understand what it actually affects.


This article is timely because we just set up our new Cyber Monday Samsung TV. The screen was on for about 2 seconds before I recognized the frame interpolation. We spent a good 15 minutes going through the settings to find what we thought was doing it and seeing how it looked. Hate that they turn this shit on by default.

Even worse is I’m always the one who notices it when no one else does. It’s immediately obvious to me and everyone else is going “I don’t see what you’re talking about”, drives me bonkers. And then trying to find the setting to make me happy takes too long.


I feel you, I have the exact same experience. I always see this and I'm bothered by it, yet most of the time nobody else sees it and it's hard to really convey to other people what you mean.


I get irrationally mad at frame interpolation! :-D


I'm surprised it doesn't mention "sharpness". It's tricky to find the zero point: e.g. '0', '50', or '100', depending on whether it means 'add sharpening', 'smooth/sharp', or 'remove smoothing'.


Yeah, that's often inconvenient. There are test images, like http://www.lagom.nl/lcd-test/sharpness.php, that one can use for testing this. A USB stick with some test images can be useful.


This is a frustration shared with some monitors, too. Either the zero-point should be obvious or there should be a toggle that disables the setting altogether.


Agree on sharpness; I undo it on every TV I get a chance to touch at friends' and family's places.


You really go around to people's homes and change their TV settings?


As in I randomly knock some doors, ask to come in, find their TV, then their remote, then change settings to my liking and leave?


Not all heroes wear capes.


"Excuse me sir, Do you have moment to talk about your TV settings"


I mean sure, if that's your style. Personally I carry a universal remote for when I'm doing this.


I only do it for TVs that are accessible to my universal remote that I can see from outside the windows, that way I don't need to disturb the families


I once fixed the remote at a brother-in-law's place. Came back a week later and changed channels using the remote. He shouted "You mean it's been working the whole week and I've been on my knees changing channels?!"


Frame interpolation is so incredibly awful looking, in my opinion. Especially when it comes to animation. I cannot comprehend all of the people on YouTube who take a beautifully drawn animation that is intentionally 24 frames per second and increase it to 60, thus ruining the hand-crafted perfection of the drawn key frames.


A lot of good animators will also vary how many frames they animate per second. Trying to smear all of it into 60fps doesn't improve anything.

A good example is FUNKe [0]. He's got a style with very pronounced changes in framerate: one movement will be 3fps and the next 30, never mind that the lipsync tends to be at a high framerate no matter what the rest of the animation is doing. Imagine trying to convert that to 60fps and still have it look good.

[0] e.g. https://www.youtube.com/watch?v=maqIaT_ZUxs


Very little hand-drawn animation is done at 24 fps, which makes interpolating the actual movements even crappier.


I think the phrase they use is "drawing on 2s" when every second frame (12 fps) is drawn, "drawing on 3s" when it's every third (8 fps), and so on, depending on how many displayed frames each drawing is held for. Usually, different things onscreen will be animated differently for effect/budget.
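As a rough sketch of what that means in practice (the function and names are made up for illustration): the drawings are simply held across display frames, not interpolated.

    def expand_to_timeline(drawings, hold=2):
        """Hold each drawing for `hold` display frames: hold=2 is "on 2s", hold=3 is "on 3s"."""
        timeline = []
        for d in drawings:
            timeline.extend([d] * hold)   # the same drawing is simply repeated, not interpolated
        return timeline

    # One second of animation "on 2s": 12 drawings fill 24 display frames.
    second = expand_to_timeline([f"drawing_{i}" for i in range(12)], hold=2)
    assert len(second) == 24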


Question: Does this mean that animators draw every 2nd or 3rd frame and then some technology interpolates the missing frames? It sounds to me like that is not fundamentally different from interpolating 24 fps to 60 fps.

I suppose there are two differences:

1. The artist retains control of the interpolation between drawn frames.

2. The computational resources available to the artist far exceed those available in the TV.

Apologies if this is an uninformed question, I watch animation but don't know how it is produced.


The number of key frames in animation is not even close to 24 per second, and everything in between is called "tweens" for obvious reasons. From my understanding the tweens are actually drawn as well, but can be delegated to non-lead artists and thus don't require as much artistic decision making. You get the key frames and then draw what you believe would be the transition.

I suspect with modern digital animation you could do a lot of the tweening automatically on the computer, but even then you would be carefully selecting them and fine tuning it to match your vision.

Then at the end you would actually have 24 frames per second to turn into your final video. You shouldn't have any frames where something is mid-movement and blurred between the two positions.


Thanks for clarifying that. Sounds like a training ground for the artists (until they are fully replaced by digital tweening.)


I don't know if it carried over into 4K TVs, but I'm surprised that the dreaded HDMI overscan isn't mentioned anywhere. I'll never understand WHY and HOW someone thought that implementing this at all, let alone making it the default, is a good idea. You had one job, accept a 1080p signal and output it pixel-perfectly to the 1080p display panel. Yet somehow, someone thought that it would be a great idea to cut off the edges of the image and interpolate it so everything looks atrocious. And then every single TV manufacturer agreed and implemented it. Whenever I see this kind of cropped image on a TV, I grab the remote and set it to the only mode that should exist on digital displays, direct pixel-to-pixel mapping. Just blows my mind I have to do that.
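For a sense of the damage, a back-of-the-envelope sketch; the ~5% overscan figure is a typical default I'm assuming, not something from the article:

    # Assuming a typical ~5% overscan on a 1080p panel (the 5% figure is my assumption).
    src_w, src_h, overscan = 1920, 1080, 0.05
    crop_w, crop_h = round(src_w * (1 - overscan)), round(src_h * (1 - overscan))
    print(crop_w, crop_h)                    # 1824 1026 - the pixels that survive the crop
    print(src_w * src_h - crop_w * crop_h)   # 202176 source pixels simply thrown away
    # The surviving 1824x1026 region is then rescaled back up to 1920x1080,
    # so no output pixel maps 1:1 to a source pixel any more.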


I have an Aorus 4K display [Gigabyte], which is technically "not just a TV screen," and in order to defeat the overscan mismatch [having found no functional method either in the OS or in the TV settings] I plug the HDMI cable into a DVI->HDMI adapter [instead of HDMI direct from the TV to the HDMI output].

For some reason, this dongle hack de-necessitates the overscan settings [which never seem to work/hold]. But of course then there is no 4K output [which is fine].


From my previous reading on the subject, HDMI comes from the TV world, and DVI comes from the computer monitor world.

On a computer display, you want each pixel to be square and to be visible on the screen; a border around all the pixels doesn't matter.

On a TV, historically, pixels didn't exist, and the nearest concept to a pixel (I forget the term) wasn't square, it was rectangular. Showing a border around the content is not acceptable; it should go right up to the very edge. Also, in the TV world, the most important thing is the image, not the pixels.

So it's this philosophical difference, and separate histories, which means you should never try to get a TV to be a monitor, or a monitor to be a TV. Of course I only learnt this after buying a device which said it could be both.


It's rare, but some source material is created with the assumption of overscan - sometimes that affects where on-screen graphics are placed, and in some cases total garbage is produced in the overscan areas... it all varies, but from the TV maker's perspective, doing what you're describing means this garbage never appears and they never get calls from confused, angry customers who think their TV is broken because some broadcaster has garbage at the edge of the feed.


Can you name a single example from this millennium?

Overscan has made absolutely zero sense since the move to digital formats for transmission and local playback.


Consumer Reports has a "TV Screen Optimizer" that aims to give users optimal picture settings by Brand/Model.

Also nice that they mention how to turn off ACR and other privacy related features as well.

https://www.consumerreports.org/mycr/benefits/tv-screen-opti...


ack! you have to sign up for an account


For those with a Sony Bravia panel (2015 and later I think), you can enable a pro-mode with a key combo that can turn a laggy and unusable panel into mostly a dumb display.

Display, Mute, Vol +, Home

I used this to basically save a frustrating laggy panel.


Sadly, (mild) motion interpolation is necessary for those of us that get headaches from 24fps video, especially panning shots.

If only filmmakers started with decent frame rates. The few films that came out in 48 fps are so much nicer to look at.


It's funny, watching films in 48fps in theatres (specifically the first Hobbit movie that pioneered the concept) to me looks like the actors acted in 2x slow motion and then someone pressed fast forward. Everything looks incredibly unnatural.


I had a different experience. At first, I found things unnatural like you did. After a few minutes, I figured out that it was actually the opposite - it had a bit of a theatrical thing going on, i.e. live action vs. recorded and played back.

At that point I figured out that it was just because I've been so used to crappy frame rates that the more natural movements feel out of place.

I wonder what the first pass, with less motion blur, would have felt like. Maybe better, maybe worse. I kind of feel it would make it worse, in the same way that the transition from analog to high-definition digital made it look worse to me, since I could notice the transitions between frames. That is, at least at first. I'm used to it now.


I firmly believe most people complaining about the Hobbit in 48 FPS just don’t like change.

It’s not worse. In fact it’s so vastly better. Watching a 3 hour movie in 3D without getting a headache from 24 FPS judder (“magic”) was a revelation.

But it was different, and to many people different means bad.


Back in the day, I watched The Hobbit first in 3D 48 fps, and I hated it. Sure, it got less bad throughout the movie, but I just couldn't get used to it at all.

I then rewatched the movie (also in theaters) in 2D 24 fps and it was infinitely better.

I have the same with YouTube videos. I can't stand 60 fps videos (except for gaming content); it causes headaches for me. If it's something I really want to watch, I'll download the 24 fps version and watch it offline (YouTube has it on their servers, but only serves it to certain clients or something).

I know other people who just can't see a difference, or rather, they don't notice it much. I feel like I can spot a 60 fps YouTube video in about 2-3 seconds (usually I pause then to check and reconsider whether I really want to watch it). I also tend to notice 30 fps videos (compared to 24), although they don't really bother me.

Considering that I've always been the only one at movie nights to notice when the TV's frame interpolation setting is on, I guess I'm an outlier.


> If it's something I really want to watch, I'll download the 24 fps version and watch it offline (YouTube has it on their servers, but only serves it to certain clients or something).

This is going to be interpolated from 60 FPS. At least get a 30 FPS version that can just drop frames from 60.


I was really looking forward to 48 fps - the juddering in wide panning shots on 24 fps always takes me out of the immersion so I was hoping the cinema world could move forward.

There was just something very wrong with it. It kept feeling like parts of the movie were suddenly sped up and playing at 2x speed. To the point where I wonder if the cinematographer was just not experienced enough with the technology to make it work right (like, maybe there are corollaries to the 180-degree shutter rule that need to be experimented with).

I watched it in 2D, I'm sure for 3D it makes a whole different experience.


I didn't like it either. I think it looks too real and takes you out of the fantasy. Something about the way 24fps looks is different from reality and lets your imagination take you away a bit easier. I can't really explain it.

Like I can watch an animated show, and I have no expectations of it looking real. I can still get lost in the story and enjoy it. I don't need it to look like I'm actually there.


Also saw the first film in 2d in the theater at 48fps, also very excited going in, also felt like everything looked sped-up.

What it reminded me of was silent-era film that's been slightly sped up on purpose for comic effect. All the walking looked kinda jerky, like it does when you speed up footage of someone walking, for instance. Or very early hand-cranked film that was cranked a bit inconsistently. It was so distracting I could hardly focus on anything else the entire movie. If it'd been a better film, I'd say that gimmick ruined it, but… well.


Yeah it was very strange. Like, you don't get that effect from 60fps TV or video shot on your smartphone, so I wonder what went wrong to cause it.


Yeah, I'd seen higher frame rate video before, and since (though maybe not again in a movie theater?), but that's the only time I've noticed that particular problem. I spent a little time searching around after I saw it, trying to figure out what happened, but the chatter over 48fps in general drowned out any signal about what might have caused that specific issue (though many others did report experiencing a similar sped-up effect at the time - I never found an explanation, aside from just blaming the high frame rate, but I suspect there's more to it).


Maybe they should try dynamic frame rates, or even dynamic rates for different parts of the screen, since 24 is usually fine for most shots.


This is common in drawn animation, where some elements are effectively 12fps while others are 24fps or 8fps. This can even be occurring simultaneously. (This is known as “on twos” and “on threes” etc.)


I watched it in 3D HFR and it was terrible terrible terrible to me. I felt like I was watching a play. The special effects looked weird/bad, the acting felt worse, the makeup more obvious, everything yuck yuck.


I think people blamed the frame rate, but for me it was the rest of the effects that put me off: faces were too softened, lots of scenes had weird color saturation, and as others mentioned there was a lot of motion blur. Compared to LOTR, the VFX really pulled me out of lots of scenes.


I did not notice anything that I would attribute to the frame rate. Frankly I think the visuals of the movie were fine.

It was the writing that was the problem.


Peter Jackson added more motion blur in those because he said that it played better with a focus group. I think sticking to a normal 180-degree shutter angle would not feel so weird.
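For reference (my own numbers, not the parent's): the shutter angle sets how long each frame is exposed, which is where per-frame motion blur comes from.

    def exposure_time(fps, shutter_angle_deg=180):
        """Fraction of a second each frame is exposed for a given shutter angle."""
        return (shutter_angle_deg / 360.0) / fps

    print(exposure_time(24, 180))   # 1/48 s  - the "normal" film look
    print(exposure_time(48, 180))   # 1/96 s  - what 48 fps at a 180-degree shutter would give (half the blur per frame)
    print(exposure_time(24, 45))    # 1/192 s - the staccato Saving Private Ryan look mentioned elsewhere in the thread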


There was weird motion blur, but to me it looked far more normal than most films. I realise it’s a minority opinion.


There is the alternative of black frame insertion. It loses a lot of brightness, but helps a lot with the stuttering at 24 fps.

The problem that causes stuttering is (simplified) that your eye is moving smoothly to follow something on the screen, whilst the image is moving in discrete steps. So when your eyes expect something fixed inside your view, it's actually stuttering.

Black frames make use of a natural image retention in the eye, where you effectively continue to see the last bright thing. Hence what you expect to be stationary in your field of view does remain stationary.

This was actually key to film-based projectors working, because they need a period of black to advance to the next frame. Without image retention it would seem to flicker. Though 24 Hz is a bit too slow for that, so they actually added a black period (by just blocking the light) in the middle of each frame to even out the effect. They were already doing BFI - not for motion smoothing, but for flicker smoothing. It seems likely this is accidentally why 24 Hz film doesn't have stuttering whilst 24 Hz screens do.
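A minimal sketch of how a 120 Hz panel might schedule BFI for 24 fps content (the 2-lit/3-black split is my assumption for illustration; real implementations vary):

    REFRESH_HZ, CONTENT_FPS = 120, 24
    slots_per_frame = REFRESH_HZ // CONTENT_FPS    # 5 refresh slots per film frame

    def bfi_schedule(frame_ids, lit_slots=2):
        """Show each source frame for `lit_slots` refreshes, then black for the rest."""
        schedule = []
        for f in frame_ids:
            schedule += [f] * lit_slots + ["black"] * (slots_per_frame - lit_slots)
        return schedule

    print(bfi_schedule([0, 1]))
    # [0, 0, 'black', 'black', 'black', 1, 1, 'black', 'black', 'black']
    # Brightness drops to lit_slots / slots_per_frame (40% here), which is the trade-off above.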

Personally I care too much about the brightness loss of BFI, but it might be interesting for you.


That's not a valid alternative at all, IMHO.

It solves a problem, yes, but a different problem than what is solved by interpolation.

Black frames resolve ghosting. That's great! I want to see one frame at a time, no more.

Motion interpolation resolves jagged motion. Instead of making my brain do the extra work of filling in gaps of motion, it lets my brain see an image move smoothly through time.

The higher resolution gets, the larger the group of pixels that needs to be moved each frame. This causes two unique and interdependent problems:

1. Low shutter speed creates blurry images, which makes high resolution worthless.

2. Low framerate creates gaps in motion, which makes high detail painful.

The faster the shutter, the more detail is visible. Filmmakers are happy to solve this as extremely as they can. New cameras and lenses capture incredibly sharp and bright images with very short frame times.

The more detail in motion, the more confusing the gaps between frames become. Filmmakers (and the industry in general) refuse to increase framerate, so they give us ever-increasing gaps instead.


It helps a little and I prefer lower brightness anyway. But it doesn’t remove the panning judder entirely. Mild interpolation does better.


I'm personally fine with 24 and 48 FPS (without interpolation), but what I absolutely can't stand is a variable rate. I saw Avatar 2 in the cinema with this, and it ruined the experience for me. Switching down always felt like the projector is lagging.


Interesting. Did you ever go to the movies in the 90s? Those were all 24 fps. Did you get a headache then?


Cinema at 24 fps also gives me headaches in panning shots. It did as far back as I can remember.


Cinema has a black period in the middle of every frame, which actually prevents these problems.


Wait, are you saying that 50% of the movie is a black screen?


Yes. Also, most of your LEDs are off 50% of the time too. If it’s done quick enough, you don’t see it.


Should be 50% off in that case!


Did you get this symptom from the narrow shutter snapshot-like filming in Saving Private Ryan?

https://cinemashock.org/2012/07/30/45-degree-shutter-in-savi...


Even if they just moved to 30 like normal TV that’d be noticeably better.


I don't think many cinematographers would agree with you. We have the technology to make films at higher framerates, yet few choose to do so.

Interestingly, David Lynch shot Inland Empire at 60fps interlaced, but the film was released at 24fps.


Anything with CGI will be much more expensive at higher frame rates, right? I also have absolutely no idea how any of that works.


I would think so - double the render time for 48fps, which is how Cameron shot Avatar II.


Is the final render time really the major cost factor for CGI?


Just a note that NTSC is 30 fps (29.97, strictly). PAL is 25 fps.


Most movies and fiction TV are shot at 24fps. Reality, news etc are shot at 30fps.


TV in the UK is generally 50 Hz, at least.


That's interlaced though. Effectively half for a progressive scan.


I don’t think interlaced video is still broadcast, since analog was shut down.


https://en.wikipedia.org/wiki/List_of_HD_channels_in_the_Uni...

> All HD channels in the UK broadcast at 1080i, apart from Sky Sports Main Event UHD channel and the BT Sport Ultimate 4K


That surprises me. I guess I never noticed since I always have interpolation on nowadays.


I've never heard of headaches from 24fps video, must be rare.


I'd never heard of it growing up in the era when that's all there was. I don't know if it's rare, but it's only in the last decade or so I've heard people complaining about it.


We don’t perceive all types of screens in the same way. Film projectors and CRTs display parts of the frame, only part of the time. TFT and IPS screens introduce a lot of inertia and blend the frames. Both of these help the motion illusion. OLED on the other hand has the harshest frame transition - it displays the entire area for the entire time and switches frame content almost immediately.


> it displays the entire area for the entire time and switches frame content almost immediately.

I've heard this called the sample-and-hold effect. It looks a bit like a fast slide show, and really stands out in high-contrast, steady motion scenes.


I grew up in the same era, and I definitely knew people who could not watch TV because it induced headaches in them.


I get it too. I visually see tearing / juddering on most video lower than about 40FPS and it’s incredibly tiring.


Afaict it’s a kind of migraine. Everyone I know that gets headaches from low fps also gets migraines.


Are we including 3:2 pull down and frame doubling as interpolation?


No, that makes it much worse. Avoiding it is a big reason to get a 120 Hz TV.


Yes! Panning shots are such a pain. Motionblur them or something please.


Just get a modern Sony TV and be done with it. They perfected Motionflow to the point where you no longer think about framerate (neither choppiness nor soap opera effect). It's clearly a priority, probably because they are the only manufacturer with their own studios (Columbia/Sony Pictures). There is a reason people pay the $800+ Sony tax over any TV that has the same panel.


The Sony tax is because ads on Sony TVs can all be turned off. Plenty of TVs have their price subsidized by ads, whereas when going through initial setup, I've had Sony TVs with ads disabled by default and questions asking if you want to turn them on.

Sadly disabling "recommended content" on the Google TV launcher also disabled voice search from the remote, but I am pretty sure that is a Google problem and not something Sony chose.

(Also my Sony TV cannot stay connected to my WiFi network for more than half an hour before I have to toggle WiFi on and off again...)


It took me several tries to find it but the Projectivy launcher ( https://play.google.com/store/apps/details?id=com.spocky.pro... ) is a great replacement for the Android/Google TV launcher like it used to be, with no "suggestions" -- just your content.

But definitely use the "override current launcher" setting. The description implies this is a "only if you're having problems" option, but I find it makes a variety of subtle things work the way they should.


All TVs can have ads disabled: unplug the Ethernet.


Given that 100% of my TV usage falls into these categories:

1. Controlling Spotify
2. YouTube videos
3. Photo Albums from Google Photos

No network connectivity would render my TV completely useless.

Though I think I could show photos from a thumb drive, so I'd have that going for me, I guess.


Chromecast, Apple TV, Fire Sticks, Rokus, etc. could all help out here too. I connect a few of those but never my TV.


Fire Stick and Roku are worse for ads than what Sony ships.

To clarify, my TV shows a list of apps, and that is it, aside from a single "suggested channel" at top I cannot get rid of.

No content previews, no "watch now!" no special promos, just a list of apps.

To explain a bit more what I said up above, Sony TVs cost more than other TVs with identical specs, ~$200-$300 USD more, but compared to a mid-range LG or Samsung, Sony opts you out of advertising by default (the initial setup is hilarious, for the most part you'd have to manually select a bunch of checkboxes to opt into tracking!).


This is all good to know because I am in the market for a new TV. I am not a fan of anything LG has ever made. I always repeat the joke that LG stands for "Low Grade." I have had good success with LG monitors, but their laptops, phones, and other electronics in the past were nightmares.

So, my choices are really between Sony and Samsung, and I think I might just bite the bullet on the Sony and pay the extra amount.

Thank you.


To get the list-of-just-apps UI you need to enable an option in settings that will disable all content recommendations, if you want to go that far. It also turns off all the "continue watching" UI elements. I forget what the setting is called, but I do remember it is a setting for the Google TV launcher.


That behavior follows, given that Google is an ad company.


> Sadly disabling "recommended content" on the Google TV launcher also disabled voice search from the remote

WTF.


That doesn't make sense. Are you saying these TV's still butcher the original artistic intent of the creators for the sake of arbitrary petty consumer desires to have their expensive TV purchase be justified?

But they just do it better than the other manufacturers do?


It's a sort of truth. Creators' intention isn't so special for consumers.


C'mon, that's a reddit-level wilful misinterpretation of what he actually said. I mean, look:

Are you saying the original artistic intent of the creators to insert unskippable ads at the beginning of the disc is more important than the consumer's right to control the playback of the content they bought? Plus I heard it might kill babies.

See? It's just silly.


Not sure exactly what display I have, but it's a recent OLED Bravia. Motion smoothing is still awful.


Is it just that easy? Do you have any specific model recs?


I have an A80J and it’s possibly the best panel I’ve owned/seen in any house that didn’t have something similar. My dad who is a big movie buff with an Atmos HiFi actually got one like a month later after being a big LG fan.


It's easy, but expensive. ;) Search for Digital Trends on youtube, they have a lot of videos about this. Their current "budget" pick seems to be the Sony Bravia X90L.


Are they changing their interpolation settings based on source material? Some TVs will disable motion interpolation when they detect 24 fps content.


Yes, Sony TVs try to detect the original fps and display it accurately.

https://www.reddit.com/r/bravia/comments/7ztuwv/what_is_moti...


I own one, and can say that Motionflow produces uneven results. In certain scenes it kicks in, while completely ignoring others. Still has a way to go.


Bit the Sony tax almost three years ago on an A80J. Honestly the best panel I've had. Probably going to buy another Sony panel to replace a circa-2012 4K Samsung.


But then what framerate is it?


I treat the configuration settings from www.rtings.com as the Lord’s word.


Took me a while to find the configuration settings you mentioned. For anyone else, you go to a specific TV model's page, and there is a tab called "Settings."


They're pretty good but when it comes to things like color, that can vary from panel to panel so you might actually end up with something worse. That said, I'd start there and if there's any question about the result you should just buy or rent a calibration tool.


I put my Samsung TV behind Pi-hole and holy s** did it chatter. It stopped ads in the TV UI, and I can only imagine the other dystopian data exfiltration API calls.


How did you get the ads to go away? Some left for me but most stayed. I understand that this is because Pi-hole can only do DNS-based rejection. I actually had to reduce some blocking because the tighter restriction on DNS calls made my Samsung TV completely inaccessible (a la dumb TV, which may be desired by others but not if you want to use viewing apps like Netflix).

I've heard chatter on and off about ad blocking through packet inspection, but I suspect this would not be computationally feasible for a Pi. But way out of my wheelhouse to even know, tbh.


You can hook up an Apple TV or a PC to your TV and watch whatever you want.


I had to fine-tune my local pihole to achieve this with a Samsung TV. I basically turned on maximum filtering, then unblocked domains one by one until the needed functionality worked.


What do you mean by needed functionality? Do you mean apps like Netflix working, or something more fundamental in the TV itself not working without being able to connect to the Internet?


Right. For example, the built-in Netflix app wouldn't launch until I unblocked some Samsung-specific domain, despite Netflix itself not being blocked. So it was kind of like test feature > unblock domain > retest feature until everything worked with fewer/no ads on the dashboard.


I think I had to block and unblock some calls to figure it out but I don’t recall it being too tricky


> Motion interpolation or motion smoothing increases the frames per second from 24fps to 60fps

It's pretty common for TVs to have even higher refresh rates these days - my 4 year old mid-range LG OLED is 120hz, for example. Conveniently, 24 evenly divides 120, so when you turn off the interpolation you get perfectly consistent frame times.
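To spell out the arithmetic (my own sketch): 24 doesn't divide 60, so a 60 Hz panel has to hold alternate frames for different lengths of time, which is the judder; at 120 Hz every frame gets the same hold.

    def hold_pattern(panel_hz, content_fps, n=6):
        """How many panel refreshes each of the first n source frames is held for."""
        shown, holds = 0, []
        for i in range(1, n + 1):
            target = i * panel_hz // content_fps   # refreshes that should have elapsed by frame i
            holds.append(target - shown)
            shown = target
        return holds

    print(hold_pattern(60, 24))    # [2, 3, 2, 3, 2, 3] - the uneven 3:2 cadence, i.e. judder
    print(hold_pattern(120, 24))   # [5, 5, 5, 5, 5, 5] - every frame held equally long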

As a more general note, don't be afraid to experiment with the settings. If you're watching low-bitrate netflix streams, some of the artifact-reduction filters can be worthwhile, especially on the lower intensity settings.

For watching Blu-ray remuxes, however, "filmmaker mode" or an equivalent setting is generally the way to go.


I can not believe people still turn on Soap Opera Mode.

If I walk into a bar or restaurant and see a TV with it, it's instantly offensive from 20ft away. It boggles my mind how people really can't see it, or think it looks good.


My LG OLED had intermittent contrast issues that persisted after toggling every permutation of picture bell and whistle. After finally stumbling across the correct search term, it turned out the offending feature was hidden in a maintenance menu that requires a button that only exists on service remotes… The same issue was fixed in an update for newer models. Tons of forum posts on the subject resulted in RMAs and refunds. How did we get like this?


I can't quite phrase this right, but persistence of vision and "frames" are really quite distinct from I+P blocks, and what we have in modern LCD screens is anything but a fixed-rate scan.

I really liked old TV. Those wagon wheels running backwards were a direct artifact of PoV and frame rate.

The whole NTSC vs PAL vs SECAM debate, was about what aesthetically is best against what eyeballs do. Personally I didn't get the argument NTSC was better for live action. I just think PAL was superior to all the others.

Yes it was wasteful of bandwidth. Yes, it was analogue signalling of something which was at the fringes of what analogue and digital is (encoding colour as a layer over b&w signalling of grey levels, under a time code synchronisation for framing). But, it was also amazing, and I think we've lost something in MP2 -> MP4. We've also gained fidelity of what is sent to what is received and we've gained far more content.

I do really miss old analogue(ish) TV


What I miss is the motion clarity of impulse-driven displays vs. today's sample-and-hold displays. Smooth 60 Hz scrolling on a CRT is something so visceral that just can't be replicated on modern displays.


Scrolling on a modern 120 Hz display does feel extremely smooth though.


It’s smooth, but it suffers from motion blur.


Really depends on the display - pixel response times vary by a lot.


This is not about pixel response time. This is for the same image to be held over the whole time interval of the frame, instead of just momentarily, meaning that movements from one frame to the next suddenly “jump” at the interval boundaries, instead of being smoothly interpolated by the eye/brain between momentary flashes like with CRT. Even with instant pixel response, sample-and-hold displays inherently produce motion blur. This is why mitigation techniques like black frame insertion, backlight strobing and scanning backlights exist.
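A back-of-the-envelope for the effect being described (numbers are mine, purely illustrative): while the eye tracks motion, the image sits still for the whole hold time, so it smears across the retina by speed times hold time.

    def hold_smear_px(motion_px_per_s, refresh_hz, persistence=1.0):
        """Retinal smear width when the eye tracks motion on a sample-and-hold display.
        persistence = fraction of the refresh interval the image stays lit (1.0 = full hold)."""
        return motion_px_per_s * persistence / refresh_hz

    print(hold_smear_px(1200, 60))        # 20 px of smear per frame at 60 Hz, full persistence
    print(hold_smear_px(1200, 120))       # 10 px at 120 Hz - better, but not gone
    print(hold_smear_px(1200, 60, 0.2))   # 4 px with strobing/BFI at 20% persistence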


No, motion blur is not caused by that. Motion blur might be added intentionally to hide non-smooth movement at low framerates, but that is uncommon outside of games or pre-rendered content (movies).

And for a 120 Hz display (or even higher refresh rates), the jump between frames is small enough that any reasonable scroll speed will appear smooth without the need for motion blur.

If you are seeing motion blur on a high-refresh-rate display, then either it was already there in the source content, it was intentionally added by the application, or your display has a crappy pixel response time. It's not an inherent problem with high-refresh displays.


#CRTmasterrace


I've tried filmmaker mode and it is just another kind of bad smart TV setting, making everything way too dark instead of way too bright. Turning off most "features" of these TVs seems to be the only sane solution.

Of course perhaps it is the filmmakers that are to blame: https://www.avclub.com/how-to-watch-dark-movies-and-tv-shows...


The truth is the filmmakers want their movies to look dark (cinema projectors are kind of limited), to have no colors (use too much color grading) and to move like a slideshow (24fps) :)


Yeah, I'm finding filmmaker mode way too dark on my Samsung OLED.

I can't find any explanation of how it actually works. Does each movie get different settings set up by the director?! Doubt it.


I found FILMMAKER MODE (why is it capitalized in settings?) dark and muddy on a Samsung Frame. The "Samsung TV picture settings" section in the linked which.co.uk article [1] seem like decent advice.

[1] https://www.which.co.uk/reviews/televisions/article/getting-...


It's dark because OLEDs are not bright at all. Anything brighter than the filmmaker mode is modifying the source image to achieve it, or alternatively driving the pixels in a way that loses color accuracy.

If you care about film and getting the closest result to what the actual thing is supposed to look like you're going to need to couple correct settings with a light controlled room for best results. Or don't use OLED, because, it simply can't achieve the brightness of cinema projection, not even close.

Personally I like the results of a 120hz OLED so much better than other options that I strongly favor a light controlled space for movie watching. For lower grade junk it's usually easy enough to swap to another viewing preset that is brighter.


Typically, it means that if you put the TV in a dark room, it is calibrated to the same specifications that the monitors used in post-production used. Therefore, it is what the directors "intended" the video to look like since they were looking at the monitors (in a dark room).

However, if your room has even a little light in it, the settings would make the TV too dark.

It will also disable any effects the TV has that aren't "map video to screen 1:1", such as motion interpolation, upscaling algorithms, etc.


It supposedly makes the content look more "just as the director intended".

However, Samsung TVs are not exactly known for realistic colors. So turn up the eye candy and enjoy!


I found this to be an informative and well-laid out argument in favor of motion interpolation: https://www.wired.com/story/motion-smoothing-defense-hdtv/

I have come to prefer it. I think this comes as a combination of recent algorithms getting better (there are relatively few artifacts and they are far subtler than they once were) and because newer consoles have conditioned me to abhor any content running at less than 60 fps. The jaggedness of 24 fps grates my eyes. I’ve had my iPhone recording 60 fps video for years.


The thing I hate about "advice" like this is that it assumes that everyone likes the same things and feels the same way, and it comes across as an attempt to shame anyone otherwise.

I like motion interpolation. I always have it turned on, and I chose my TV based on the quality of its motion interpolation.

Screens are so big these days, that if you're watching a 24fps movie, any panning movement becomes a horrible juddering shaking movement. Judder judder judder... ugh. I can't stand it.

With motion interpolation turned on, everything is silky smooth, and I can actually see what's on the screen, even when the picture is moving.

So no, I won't be turning it off, and I suggest that the next time you watch a shakey-cam action movie, you try turning it on too!


It makes everything look like a cheap soap opera to me. I can't stand it. I think this might be either a generational thing or perhaps a cultural thing, or maybe some of both.


"It makes everything look like a cheap soap opera to me"

This is nothing more than conditioning. "Quality" TV shows "film" at 24 fps despite the fact that they were going to be viewed at 30/60. They did this because even though 3:2 pulldown incontestably is a dirty, ugly hack that reduced quality, people were conditioned to think that if something went through that hack, it's quality. If it didn't, it can't be quality.

So when people talk about the "soap opera" effect, what they usually mean is concise and clear.

The best example of this was The Hobbit when presented at the original, director-intended 48FPS. People were so conditioned to movies being a blurry mess at 24FPS that a frequent complaint about the Hobbit was that it had the purported "soap opera effect".


It's not conditioning. Frame rate is a tool in the toolbox, and more isn't always better, just like a colorized black and white film isn't better just because it has added information.

This is most easily apparent in animation, which frequently makes use of variable frame rates to achieve different effects and feelings, often even within elements of the same scene. A character might be animated at 12 fps while the background is at 8 fps. Bumping those all up to 60 would be an entirely different shot with very different vibes.


Surely 24 fps was made the standard to save money on film.

It's about the slowest frame rate that's not very distracting.

Needless to say, film is no longer the major expense it used to be. In the early days the film itself was a fairly big part of the production cost (and don't forget distribution - cost to replicate the film - and operating cost for the projectors - handling film rolls twice as big, etc.).

And standards are sticky, so the industry stayed on 24fps for decades.


> It's about the slowest frame rate that's not very distracting.

Except, it is very distracting. Around 60 it starts to become bearable for me and allow camera pans to be interpreted as fluid motion.


It's quite astonishing how terrible 24 FPS really is, but as the GP mentioned, it was "good enough" for the time. It's a scale, and costs and processing were prohibitive enough that it landed at the point where most scenes, especially dramas and human-interest type things, were served well by the format.

Action scenes are just brutal at 24FPS. Running, pans, and so on. Either it's a blurry mess at a 180 degree shutter, or it turns into a rapid slide show (like early in Saving Private Ryan).


This is as much about frame rate as it is about display technology.

24hz on an OLED with very quick pixels is a very different experience than 24hz on a CRT or 24hz on a film projector.


Actually, more is always better. Why do you think we try to run PC games at 144, or even 240 and 360 fps?


Games are different. Interactive media requires responsiveness. Saying that higher framerate is always better in cinema is a pretty bold statement, one that would fail a simple survey among moviegoers. What does "better" even mean, in this context? Fidelity is just one of many axes of expression when creating art. Higher isn't better than lower, or vice versa.


For me the higher frame rate is not really about response time, it is all about smooth motion. Panning the camera at 60Hz is just barely acceptable. 120Hz is where it starts looking real nice and I can really see things during the pan. 24Hz pans are unwatchable garbage.

A movie properly authored at 120Hz (not interpolated with potential artifacts) would be objectively better than a 24Hz one in every possible way.


Also with higher frame rate I can see subtle patterns in the movement and flow of things (clothes, water, humans). Also the eye can see more details and can recognize the texture of things, the different kinds of fabrics, all kind of details like this.


PC gamers pushing for higher framerates is mostly about reducing latency, which doesn't matter for a movie.


Only some people try to run at 144/240/360. I set my max FPS on my PC games to 60 because it 1) feels right and 2) uses less electricity/generates less heat. I only bump it for competitive FPS where a higher FPS makes target acquisition easier.


Need some way to justify an overpriced graphics card, is what I always assumed.


What's the point? Can people even tell?


I've never had a display with a refresh rate higher than 144, but maybe because I play games, I can definitely tell the difference between 30/60 and even 60/120. After 120 the difference between 120/144 starts to diminish, but there's still a slight difference, though it's more pronounced with very fast-moving objects like moving a mouse.


Of course they can. If you ever try to go back, it looks like garbage. There is no limit to improving the illusion of smooth motion, but it scales non-linearly (you need to double the frame rate each time to improve it noticeably). I can't personally tell apart whether it is running at 144 or 120 fps, but the larger jumps are very obvious.


Yea, 120 vs 144 isn't too big of a difference for me. Actually, until I got a new GPU that could actually output 120fps at 1440p, I left my monitor on 120hz. When AMD's AFMF drivers came out I went ahead and bumped the refresh to 144hz, as I was definitely starting to see screen tearing, but it was still pretty fluid. I'd be curious whether the next jump for me will be 4K 144hz or 1440p 240hz.


Absolutely they can, though I found anything above 240hz to be indistinguishable for myself personally. The monitor I'm writing this on is 300hz; its motion clarity is only bested by OLEDs and the $2000 AUD (twice what I paid for this) 360hz bigger brother, the PG27AQN.


Animation frames are almost totally orthogonal to what we're talking about here. In fact, I'd argue they're the exception that proves the rule. Animation had to develop tricks like spaghettification to look good at 12 fps because it looks inherently worse than higher frame rates. In fact, techniques like smear frames are often used to directly emulate higher frame rates. It's an art form born of limitation, not a neutral tool. Just look at any 3d animated show that just drops the frame rate of mocapped characters to 12fps (Dragon Prince is one that comes to mind) - it looks like jarring, choppy shit.


Those are the origins, but the result is a style that can look good or better than higher framerates depending on what you're going for. Art is not held back by fidelity - instead fidelity is part of what defines it, as well as other constraints.


> The best example of this was The Hobbit when presented at the original, director's intention 48FPS. People were so conditioned to movies being a blurring mess at 24FPS that a frequent complaint about the Hobbit was that it had the purported "soap opera effect".

The reason I didn't like the Hobbit was because they went overboard on CGI. They had to make Orlando Bloom (Legolas) appear younger than he was in the Lord of the Rings which was released a decade before.


> They had to make Orlando Bloom (Legolas) appear younger than he was in the Lord of the Rings which was released a decade before.

Tolkien's elves are literally immortal and the time between The Hobbit and The Lord of the Rings is less than a hundred years. Legolas' father is several thousand years old. There is no reason to expect Legolas to look younger in The Hobbit; you'd want him to look exactly the same.


Yeah, there were so many other things I did not enjoy about those movies that any potential weirdness from the higher framerate didn't even come into play. Rewatched them this past year while down with COVID and after finishing the LOTR extended versions.

Production-wise, it was the silly-looking CG characters. Even worse was the way they stretched one simpler story into three movies (while the LOTR trilogy managed a good job of adapting three long books into three long movies without too much oversimplification).


I watched The Hobbit in the cinema in HFR.

It looked like absolute soap opera crap. And the worst thing was that I could _easily_ tell when the actors were carrying light prop weapons compared to the real thing.

At 24 fps your eye can't tell, but at 60fps you see the tiny wobble that the foam(?) prop weapons have and it's really jarring. Same with other practical effects, they're easier to hide in the ye olde 24 fps version.

Watching movies in 60fps is the same as when porn moved to 4k. You see every little flaw in great detail that used to be hidden by worse technology. Everyone needs to step up their game in areas where it wasn't needed before.


It's not just concise and clear to me. It's that people tend to "pop" from the background too much, and it reveals the artificial lighting to my brain. Everything looks "over-lit". You know, like a soap opera.


This is an odd take.

Firstly: No one on the mass market actually knows what 3:2 pulldown is, so it's hard for people to see it as an indicator of 'quality' -- most of HN, a technical audience, probably doesn't know what it is either: For reference, it's when a 24 frame per second film is converted to 29.97 frames by by interlacing 4 frames together to create 5 frames. That and a tiny bit of a slowdown gets you to 29.97, which is the NTSC frame rate.
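Sketching the cadence (my illustration of the classic 3:2 field pattern):

    # Four film frames A, B, C, D become ten interlaced fields = five video frames.
    film = ["A", "B", "C", "D"]
    pulldown = [3, 2, 3, 2]                  # fields each film frame contributes
    fields = [f for frame, n in zip(film, pulldown) for f in [frame] * n]
    video_frames = [fields[i:i + 2] for i in range(0, len(fields), 2)]
    print(video_frames)   # [['A', 'A'], ['A', 'B'], ['B', 'C'], ['C', 'C'], ['D', 'D']]
    # Two of the five video frames mix fields from different film frames - that's the blended look.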

Secondly: Why do people in traditionally PAL regions also hate the soap opera effect? Again, for reference, PAL regions ran at 25 frames per second and so got away with a 4% speedup and a 2:2 pulldown that has no real frame-blurring effect.

Thirdly: Generally, I prefer higher frame rate content: I have a 144Hz monitor, and prefer content as close to that as I can, but I still hate watching The Hobbit -- a lot of this has to do with motion blur: 48 frame per second content is not fast enough to get away with appearing seamless, and not slow enough that the per-frame motion blur you get with 24 frame per second 180 degree shutter hides the stuttering.


>No one on the mass market actually knows what 3:2 pulldown is

People don't have to know what it technically is to know it when they see it, and the simple incantation of "soap opera effect" demonstrates that.

Again, almost all dramas shoot at 24 fps. There is zero technical reason for this (there once was a cost saving / processing reason for it, and then people retconned justifications, which you can see earlier in this very thread). They do this because, again, people are conditioned to correlate that with quality. It's going to take years to break us from that.

>I have a 144Hz monitor, and prefer content as close to that as I can

This is not meaningful. Preferring games at a higher framerate has zero correlation with how you prefer movies. And however odd you think the take is, you like 24 FPS because you've been trained to like it, Pavlov's bell style.


What you're suggesting is that you know better than every cinema-goer and every cinematographer, animator, producer, and director around what their preferences "really" are, which is a pretty wild thing to claim, especially in the face of people telling you exactly why they prefer unaltered 24 FPS content to horribly interpolated uncanny-valley messes.

The reason no one has changed process isn't because there's tonnes of better options that everyone is just studiously ignoring because of pavlovian conditioning. It's absolutely nothing to do with people liking the look of interlaced 3:2 pulldowns. It's because the current options for HFR content just plain don't look very good. Some of this is unrelated to the technical specification of the recording & due to things like action content in HFR looking cheesy -- there's going to need to be a wild change in how content is choreographed & shot before we're anywhere near it being as well understood as current practises.

There are exceptions: 4K 120FPS HDR content for things like documentary content looks pretty good on a high refresh rate monitor (note: no one said games), but we haven't reached an era where that's even nearly commoditised and the in-the-middle stuff you'd want to do for cinema or TV just can't cut it.


>better than every cinema-goer

Humorously this submission, and so many just like it, are about people who are outraged that their parents / friends / etc actually like motion smoothing. So...I guess? I remember a similarly boorish, ultimately-failed "no one should ever take vertical video!" movement from a few years ago, again pushed by people really, really certain of the supremacy of their own preference.

>and every cinematographer, animator, producer, and director

Now this attempt to appeal to authority is particularly silly. Peter Jackson -- you might have heard of him -- tried to do a movie at 48 FPS for a wide variety of quality reasons, to be lambasted by people just like you. People who are sure that the completely arbitrary, save-our-rolls-of-film 24 FPS is actually some magical, perfect number. It is insane. Everyone else is simply handcuffed to that completely obsolete choice from years ago, and will be for years more.

I'm not going to convince you, and have zero interest in trying. And I am certain you're not going to convince me. But your argument at its roots is "that's the way it's done, therefore that's the perfect way and the way it will forever be done". It's utter nonsense.


>People who are sure that the completely arbitrary, save-our-rolls-of-film 24 FPS is actually some magical, perfect number. It is insane. Everyone else is simply handcuffed to that completely obsolete choice from years ago, and will be for years more.

Instead of trying to jump to 48fps or 60fps, maybe they should just adopt 30fps as the new standard for a while. The 24fps fans won't have too much to complain about, because it's not that much faster (and it's the same as the old NTSC standard), and the HFR fans will at least have something a little better. Then, in a decade, they could jump to 36fps, then 42, then 48, etc.

As a bonus, the file sizes for movie files won't be that much bigger at only 30fps, instead of 60+.


> I remember a similarly boorish, ultimately-failed "no one should ever take vertical video!" movement from a few years ago

But that's more about what you're shooting and where you're watching it.

I typically don't like portrait video because I watch most video on a 16:9 (or wider) screen. 9:16 video leaves a lot of wasted space. I get why people shoot vertical - because they're only using cell phone screens to view and the content is "portrait-oriented" like a person talking to the camera.

But the other side of this is when you see someone shooting portrait orientation and they have to pan around, back and forth, constantly moving just so they can capture the whole scene. It doesn't make sense if the subject(s) are arrayed horizontally. Add to this the simple fact that you can just spin a phone sideways and even mobile viewers can see the whole thing without all the panning.

If anything, the easy switch from portrait to landscape should offer mobile-shooters more flexibility to match orientation to content rather than likely viewing device.


> people are conditioned to correlate that with quality

Are you sure it’s really just conditioning? Impressionist paintings are obviously a lower fidelity reproduction of reality than photorealistic paintings, yet people tend to like Impressionism more, and I don’t think that’s necessarily just cultural conditioning. Sometimes less is more.


I'm not complaining about the pulldown effect. I'm complaining that the interpolation makes people look like they "pop" from the background, and it emphasizes the lighting, making everything look "over-lit" to my eye, which I associate with soap operas, because soap operas (especially Latin American soap operas, for some reason) really emphasize lighting the actors' faces.


Maybe some people can pick up on the fact that those extra frames aren't real more easily than others. Some innate sense thrown off by the interpolation. Motion interpolation gives me an uneasy feeling more than anything.

For example, some people are more sensitive to rapid flicker and thus can't watch color-wheel-based DLP because they see rainbowing on the screen. I can't watch my old plasma in 48Hz mode because it flickers. My spouse can't see the flicker at all.


For me, the interpolation really seems to separate the "layers" in a shot, and it just completely destroys the illusion of them not being a set with some lights. Like I said, it feels like a cheap soap opera from the developing country no matter what movie or show I'm watching.

Some of the problem for me may be related to the fact that I worked as a camera operator and video editor during the years of the transition from the old NTSC standard to HD, and I paid hyper-attention to detail as HD came online.

For some reason, the interpolation just screams "this is fake" to me.


> For me, the interpolation really seems to separate the "layers" in a shot, and it just completely destroys the illusion of them not being a set with some lights.

Great description. It's the same for me.


Where does this nonsensical argument keep coming from? No idea what a "cheap soap opera" is even supposed to be. The only thing I notice is that the horrifying judder during camera pans is replaced by minor artifacts that don't distract anywhere near as much. It's literally not possible for me to interpret a 24hz pan as fluid motion.


See llm_nerd's answer below. Conditioning or not, it takes me out of the realism of whatever I'm watching if interpolation is applied. People don't really have to rationally justify it, they're used to a certain way of footage being played growing up and when some folk see interpolation it just feels off to them right away, and reminds them of those soap operas.


Kind of weird that you associate motion blur with realism.


Of course it seems weird... if you read what I said I implied it's not really rational.

But no one's brain is saying "wow the lack of motion blur really affects the realism here".

I walk into a house and see a TV show playing on a TV with interpolation turned on and it just looks weird because of how TV looked growing up. I mean that's just the simple reality of it. I understand when film nerds come along and explain motion blur etc but that's all very "system 2" thinking. I'm talking about pure system 1 subconscious processing.

What's even weirder is how some folks can never seem to grasp why their explanations of how motion blur is bad can't convince others to suddenly like interpolation.


I think it's as simple as the blurriness of 24 fps hiding props/backdrops/etc better than higher frame rates do. It's the same with resolution. More detail makes the fake things more obvious. An old film shown on a CRT from the 90s vs. an HD remaster on a modern OLED can be almost distracting in how obvious movie props are. This is a big component of the "soap opera effect", where props and scenes become more obvious and feel "less real", despite the higher detail.

With time, higher quality sets should balance this back out.


It's not the motion blur that is a problem for me. For me, interpolation seems to make the actors "pop" from the background. It separates them from the background and makes them look like they are "over-lit".


I wonder if you'd have the same reaction to a movie properly authored at 120Hz without the subtle artifacts.


There's a point at which these things balance out and work. I'm not sure that 120 frames per second is high enough, but it's much closer. We have anecdotal evidence (the fact that everyone hated The Hobbit) that 48 frames per second isn't.

One of the rules of cinema and shooting at low frame rates is to use a "180 degree shutter" (a term that comes from the spinning shutters used in old film cameras), or in other words a shutter that is open for half of each frame's duration. i.e.: If you're filming at 24 frames per second, use a 1/48th second shutter speed.

The reason for this is that 24 FPS @ 1/48" exposure film adds enough motion blur to each exposure that any movement that occurs over a small number of frames is naturally smeared by a consequence of the exposure time, but anything that is relatively steady across frames tends to look crisp. If you shoot at 24 FPS @ 1/1000", you end up with something that looks like a fast flipbook of static pictures. 24 FPS just isn't fast enough, and people can see that each individual frame doesn't contain movement.

Anecdotally, 120FPS @ 1/1000" on a 120Hz display doesn't exhibit this same problem, at least to me or people I've shown it to, although some people will notice it "feels fast".

48 FPS @ 1/96" seems to be the worst of both worlds: Too fast a shutter for motion blur to compensate nicely, too slow a frame rate to to make it look seamless, so it ends up in the uncanny valley where people know there's something missing, but not what.

The frame interpolation feature that people seem to hate is almost directly designed to fall into this horrible territory.


> We have anecdotal evidence (the fact that everyone hated The Hobbit) that 48 frames per second isn't.

Maybe, or maybe we just haven't adapted to 48fps yet. Something I heard a lot about The Hobbit was that the outdoor scenes looked great whereas the indoor scenes looked like film sets - which, well, they were film sets, and making a set that looks "natural" will require new techniques. Everyone hates early stereo mixes (e.g. The Beatles), not because stereo sound inherently sounds bad but because production takes time to adapt to new technology.


> Motion blur

Motion blur doesn't work on modern TVs (i.e. OLED). They will display a static, blurry frame for 1/24 of a second, then snap to the next one practically instantly with a clearly visible jump. What I get in the end is a juddering mess that is also blurry, so I can literally see nothing.


Motion blur doesn't really work in that way on any device, but what you're missing is that the content's captured frame rate has a relatively important relation to the display frame rate: You can't usually change one & not the other without making everything look worse, at least to most people, and this is what a lot of modern TVs do.

Naive interpolation of inter-frame content is awful, is what a lot of TVs do to implement their smoothing feature, and is why everyone complains about it.

The reason a lot of people hated The Hobbit may be partly because of this problem: It was shot at a 270 degree shutter to try to get a bit more motion blur back into each frame, which to a lot of people felt strange.


Interestingly, some of The Hobbit fan edits changed the frame rate back to 24fps; watching those on my regular IPS screens, they look just fine.


There ought not be any juddering since that is an artifact of a varying frame rate rather than a static frame rate, nor should you be "literally seeing nothing". You might be more visually sensitive than the average person, who seems to not care about these things. It reminds me of my spouse, who is really good at detecting motion at night where I see nothing, almost like a super power.


I wonder if these 120Hz panels are fast enough to do black frame insertion to mimic the effect of film projection.


I tried the black frame insertion mode on mine, but I can see the strobing, so it is extremely uncomfortable to look at.


Nah it’s much better at higher Hz



A visual reference [1] that shows what I'm thinking of. Pay attention especially to how they light the actors. They light from several angles so that there are no shadows, which flattens their faces. They also use a strong backlight or a rim light on them to give a shiny look around their head and shoulders to separate them from the background. That's what all interpolated video looks like to me. [1] https://www.youtube.com/watch?v=B9eLHmLRSq0&list=PL7N8vg4EqC...


Watch a soap opera from Latin America. The actors' faces are lit from every direction to remove shadows. And they are backlit to make them "pop". In other words, to make them stand out from the background.

To my eye, interpolation does the exact same thing. It's like they used too many lights and it makes the layers of depth in the shot "separate" from each other and look flat. It makes the whole shot look completely fake to my eye.


> No idea what a "cheap soap opera" is even supposed to be

https://youtu.be/MFlz5cCDMmc?t=25


I noticed this too after getting a 4K TV earlier this year. It really ruins a lot of films.


I suspect it has to do with the degree to which your past was plagued with cheap soap operas vs poorly performing video games.


It feels like the jump to 4K and 240hz tv at the same time as one groups normal and another groups exception may have something to do with it.

Maybe the group who doesn’t want it too life like is inoculated, or the other way around.


for as much debate there was on 60fps, I do agree. games and movies on 60fps do not feel "cinematic". I indulge in these to escape from reality.


I understand and agree with what you are saying, but I think your preference for motion interpolation is quite unusual.

Perhaps it is a preference that will change with generations.


Probably not, young people watch most of their content on phones/etc, not big TVs with motion interpolation.


This is absolutely the case with all of the nieces/nephews that I know. They prefer to watch the majority of their content on tablets, which 81% of kids have these days [1].

[1] https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&c...


That number is pretty high--looking at your link, living in a house with a tablet isn't the same as having a tablet. I have a tablet from work but my kids don't use it or any other. BTW that's really cool that you could direct the highlighting in the URL like that. If it's not specific to census.gov, what is that called? What level of the stack is interpreting it?


It's handled by the browser itself, not by the page:

#:~:text=In 2021%2C 81%25 of households,17 years old — owned tablets.

That query fragment is what triggered it. It's a URL text fragment (the #:~:text= directive), which the browser interprets natively, scrolling to and highlighting the matching text on the page.
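As a rough sketch (the page URL and helper name below are placeholders, and this only shows the basic textStart,textEnd form of the directive), a link like that can be assembled with nothing more than percent-encoding:

    from urllib.parse import quote

    def text_fragment_url(page_url, text_start, text_end=None):
        """Build a link with a #:~:text= directive that highlights text_start (optionally through text_end)."""
        fragment = quote(text_start)
        if text_end:
            fragment += "," + quote(text_end)
        return f"{page_url}#:~:text={fragment}"

    # e.g. https://example.com/report#:~:text=81%25%20of%20households,owned%20tablets
    print(text_fragment_url("https://example.com/report", "81% of households", "owned tablets"))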



Neat! I learned something new :) Thanks


Someone tell the film industry, so that they stop lighting and color-grading the movies and TV shows for ultrabright, turbo-HDR-enabled expensive TVs - almost no one has those, and even those who do mostly watch on their phones. Maybe if the industry gets that memo, we'll once again have films and shows in which one can actually see what's happening.


I’d rather my films be graded for home theater use than phone use. HDR TVs these days are basically all TVs. Why should we all take a step back in quality when the hardware to view it decently is more affordable than it’s ever been? I don’t care how Oppenheimer looks on your phone, and I know Nolan would care even less.


Because who has home theater? I'd think it was mostly killed by the Internet, streaming, and the overall trend of cable-cutting, ditching TVs, and switching to computer and laptop screens for one's entertainment.

(I guess at some point this must have reversed everywhere except the bubble myself and my friends and acquaintances live in...)


I have a home theater. I don't know how big the market is, but you can buy receivers from a half dozen manufacturers (Denon, Yamaha, Sony, Onkyo as well as a bunch of speciality brands), digital projectors (Epson, Sony, JVC, BenQ and also a bunch of speciality brands), and all manner of ancillary items (sound proofing, audio treatments, theater seating, etc).

/r/hometheater/ has nearly a million members.

https://www.avsforum.com/ is pretty active.

TVs are still getting larger and they've recently crossed the 100" in size.

So yeah, there's a least a few dozen of us!


Lots of people have TVs, not many are seriously watching Netflix on their phones (maybe their laptops) but generally on their TVs in the evening.


I personally have a 100” dropdown screen, a UST projector, and a Sonos surround. This last month, I helped my work buddy shop some deals and got him a 75” Dolby Vision TV and a surround sound system with a sub for $1200. These are affordable things.


> I understand and agree with what you are saying, but I think your preference for motion interpolation is quite unusual.

The number of TVs that have it enabled by default seems to indicate otherwise. I'm not saying that the manufacturers are correct, necessarily, but I don't think they would go through the effort if they didn't think enough people wanted it.


They are in fact quite incorrect. These poor upscale features are horrors crafted in the early days of post-processing nightmares. Half of it was pitched to make football games appear more real. They all murder the soul of the film and jack up delay. The industry even had to invent a synchronized audio mechanism to compensate for the summoning of these Eldritch horrors. What I don’t mind are modern upscalers which look very nice on expensive sets. Other modern features like HDR are also lovely because they focus on reproducing the movie with vibrant lighting. Anyhow, one summer working in a television station will improve your acuity for these things. You might even come to understand how these old creations will be the downfall of civilization. As an aside, do not forget to surgically remove Samba TV from any Android set with adb.


Or: it just "shows" better when they're demo'ing the TV to a potential customer, not unlike the hyper-saturated nature footage they use to get you to look at their TV and not the one next to it.


I don't buy that. Why would motion interpolation look better in the store than it does in your own home?


Because the store chooses to show footage for which it is optimized, not a movie. But there's also the consideration that it looks better at first sight when you walk past it, than it does when you watch a whole movie looking straight at it.


I think it’s because of the content used for it. Some things do look better. But movies absolutely do not.


Just as the people who design google maps navigation don't drive, the people who design parental control features for google and apple surely don't have kids, and the PMs who choose to turn on motion interpolation by default likely don't watch movies on their TVs.

Major, shockingly obtuse sensibility gaps are sadly the norm today in product management.


It seems the side effect of treating customers as users in the classical sense is the cross-industry adoption of the mantra "don't get high on your own supply".


Seriously... No! Our first LCD TV (several years old by now) had this, and watching LOTR on it was a "wtf is this, am I in a theater?" moment. My wife and I both sat and watched it, and we were secretly annoyed but didn't dare complain, because we had just spent tons of money on it... Then we found the setting and fixed it!

The new TV I've got has the `Filmmaker mode` - I wasn't sure what exactly that was, turned it on, and yes - that's how it should be. This article cleared it up for me.


You can also prefer to not like subtitles, and watch films dubbed from their original language.

You can also prefer to not like black & white films, and watch the colorized versions done decades later.

You can also prefer not to see nudity, violence, or profanity, and watch the edited-for-TV versions of those films.

Finally you can prefer a totally different story or shot composition or artistic choices altogether, and ask Generative AI to recreate, reedit, or augment scenes to your preference.

All of these are valid preferences and we have the technology to facilitate them for you.

But 1) They should never be the default on any technology you acquire. That's the PRIMARY sin of all of the technologies mentioned by OP - it's not that they exist, it's that they're on by default, and since most humans never change the default settings on ANYTHING, they experience content in a way that was not intended by the original artist behind the vision.

And 2) Well, this is subjective, and everything is a spectrum. But you are ultimately robbing yourself of the specific experience intended for you by the creator of the film. It's certainly within your right not to care and think that you know better than them, but on a spectrum of that philosophy, carried out across all of society, it's probably not a good thing.


The default should be what is more popular. The connoisseurs are probably savvy enough to change the settings from those defaults.

Most people don't give a damn about those things. They just want to be entertained, maybe they have other hobbies where they can exercise their particular snobbism, and movies are not particularly important for them as an art form or they don't care much about art at all.


Oh look the exact type of shaming OP was talking about.


And so they should be shamed. There are professionals working on every aspect of a film, and these options just shit all over their work, ruining their intended vision.

Now if you're talking about watching YouTube or similar content then it's a different story.


"Vision" implies that there is a choice. How many feature films have been made at a frame rate other than 24fps? Ever since the sound era, I suspect you can count them all on your fingers.

So I don't buy this "vision" argument. It's just holier-than-thou. Directors choose 24fps because it's literally the only option. It's no choice at all.

Forcing me to watch your movie as a jerky motion-sickness-inducing mess? I will indeed, to use your words, shit on it.


My mind is blown. I didn't think -anyone- could possibly like motion interpolation for watching movies. I hate it so, so much. I'm trying to understand your POV.

How do you feel about watching movies in a theater? The frame rate there is low but the screen is so much larger.


It's even worse in theaters. The screen is HUGE. Panning motions are so bad they often give me motion sickness.

There was one movie that they showed at 48fps - I think it was The Hobbit? I've forgotten. That was amazing. Blissful. My eyes have never been so happy.

Even if I forgot the plot already.


Yep. Hate theaters for this reason. Absolutely unwatchable. My LG G3 with motion smoothing provides a far better experience. Even in Avatar 2 they put some sections at a lower frame rate for no reason, and I noticed them instantly.


Avatar 2 in the 48 FPS scenes was jaw dropping, it looked so real, as if you were transported there. It does not look the same in the 24 FPS scenes and pulled me right out of the movie.


There really are a large range of opinions about this. I love high frame rate, but can't stand interpolation. I wish theaters used higher frame rates. I really enjoyed the parts of Avatar 2 that were smoother, and it felt jarring to me whenever it would switch back to low frame rates.

Probably it's just what you're used to and how much you've been trained to notice artifacts.


Movies have a different display though. Film shutters and whatnot. Helps a lot with keeping the motion from just being jerky. OLEDs don't have that, and attempts at black frame insertion don't really work there because they already struggle with brightness. Hence, a mild motion interpolation is useful.

Different display technologies need different things. No difference from CRT filters on old video games played on modern screens.


Nah, it is just that filmmakers avoid a lot of panning shots because it looks like crap.


Is anybody shooting or distributing film anymore?


Lots of people still shoot with film. But distribution will always be digital unless your screening is specifically advertised as a film showing. That's usually just big IMAX movies like Oppenheimer in select cities, or indie theaters showing old movies that can get access to the film.


I wish for theatre film releases to move on to HFR instead of sticking to 24 FPS which was arbitrarily set by ancient technology limits.


Even the ""HFR"" is a joke. What is 48 fps supposed to be? Every modern TV can do 120 already. Use that as a baseline.


48 allows theaters that haven't upgraded to show the movie by throwing away every other frame.


Well then, they can have that same alignment with an actual high frame rate movie at 120Hz, displaying every 5th frame.


Your mind is blown that someone might not like to watch movies at a prehistoric 24Hz on a screen that is capable of 120Hz, after video games have been running at this frame rate for a decade already?


Yes! Whenever I see motion interpolation on a TV, it gives me a sense of revulsion. It seems okay for sports, I guess?, but awful for movies, to the point where I would rather turn the TV off than watch a movie with motion interpolation. Perhaps what I'm thinking of is the 3:2 pulldown thing for 60Hz TVs and I haven't seen what it's like on a 120Hz screen?

I don't play games much, but visually I've always distinguished between games and movies. I expect them to look quite different, not at all similar.

And I thought my feelings about this were universal, and had confirmation bias from seeing postings before Thanksgiving like "if your parents' TVs look weird, here's a guide for turning off motion interpolation depending on make and model", etc. I've assumed that the whole point of motion interpolation was for new TVs to impress people with how they look for sports, that this is what sells TVs in Best Buy, and the over-application of motion interpolation to other content was an unfortunate byproduct.


How much of that high FPS or interpolated media have you seen? At first I too was jarred by it, as if it were sped up somehow, but after a few hours, I couldn't watch regular 24 FPS anymore as it literally looked like a slideshow. I distinctly remember watching Thor: Love and Thunder in theaters and when they panned over the pantheon, you literally couldn't see anything. In contrast, Avatar 2 with the 48 FPS scenes was one of the most immersive movies of my life.


A lot of lower end TVs have really bad implementations of it. Maybe try it again on a LG G3 or a Sony A95L and it might be completely different from what you remember.


I'm in agreement with them here, judder on movie screens is so bad to me it's distracting. I don't think I've had a better visual experience in a theater than at home in a long time. The screen is physically larger, but when I sit 4 feet from my TV it's the same size visually.

Audio still wins by a mile in a theater though. Though home theater audio has gotten pretty darn good. I just wouldn't know as long as I'm in an apartment.

You can also always choose when to have motion interpolation on or off. For sports and nature documentaries I think it's just better. For animation it's worse. For other films it depends, I usually prefer it in something fast and actiony, like John Wick.


Motion smoothing is great on my 120Hz TV.

Theater is fine -- I have a home theater w/ a 24p projector. I wish it had 120Hz smoothing, but at least it's not dim like my local theater.

24p content at 60Hz is really, really bad (3:2 pulldown and whatnot). At least we can agree on that. That seems to be what "turn off motion smoothing" is about, to me.

24p content at 24Hz is fine, but 120Hz is better.

EDIT: I should say that the input lag (gaming) is much greater for smoothing, so even at ~30 fps I'd run my PlayStation at 60Hz no smoothing (game mode on &c).


The problem with motion interpolation is that most of it is the cheap variety. I don't like motion blur, it is annoying, but bad motion interpolation is worse.


I like interpolation too, but it must be high quality, not the low quality that's built into TVs. For example, I use SVP 4 which does interpolation with your GPU, works great.

Movies are even worse; I can absolutely see the judder, especially when panning. It is one reason I prefer to wait for home releases and will only go to movies that are specifically irreplaceable at home, such as Oppenheimer in 70mm IMAX.


My hope is that thanks to stuff like DLSS frame generation in video games (or maybe AR/VR) that the opinion of a majority of people will change over time. So that eventually maybe... we might actually see movies being filmed in higher framerates. People's conditioning really stands in their way, imo.

The only bad thing about motion interpolation on most TVs in my book is the fact that the /implementation is often pretty bad/. If perfect frame interpolation was a thing, I'd watch everything upsampled to 165Hz of my monitor. Well, I do it anyway using the Smooth Video Project. But that also suffers from artifacts, so is far from perfect. Much better than judder though...


DLSS 3 really is fantastic technology. It works amazingly well for interpolating a game from 60 to 120 (or higher). It fails pretty hard if you start with 24 though. We'll need something like RIFE for that, but currently no hardware exists that can run it in real time...


I like the feature, too. I remember watching the Battlestar Galactica remake with the interpolation setting active and getting an even deeper sense of realism out of the scenes. They were already aiming for a documentary style with the camera work and special effects, so the higher prosumer framerate fit with the style very well. On other films and TV shows I like the interpolation for panning shots, which judder a lot at the standard framerate.


Isn't the manufacturer doing a worse/larger version of this assumption normalisation by having these things on by default?

More articles explaining how to customise the settings are better than fewer, because they draw attention to the fact that these options are being forced on all consumers.

Shaky-cam action movies are their own unique problem :)

I think they exist because it saves a fair bit of cash on fight choreographers and the time it takes to train the actors to do it realistically. On the flip side, it really increases the respect I have for properly choreographed action scenes that are consistent and visible.


Juddering 24fps footage means you have a display that isn't displaying 24fps properly (see here: https://www.rtings.com/tv/tests/motion/24p). The refresh rate and frame rate aren't matching; most modern TVs account for this and can correctly display it.
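A quick sketch of why the mismatch matters, assuming the TV simply holds each source frame for a whole number of refreshes (the usual pulldown approach); the helper below is just for illustration:

    def repeat_pattern(source_fps, display_hz, frames=6):
        """How many display refreshes each of the first few source frames occupies."""
        counts, shown = [], 0
        for i in range(1, frames + 1):
            total = (i * display_hz) // source_fps  # refreshes elapsed after frame i
            counts.append(total - shown)
            shown = total
        return counts

    print(repeat_pattern(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven 3:2-style cadence
    print(repeat_pattern(24, 120))  # [5, 5, 5, 5, 5, 5] -> every frame held equally long

Uneven hold times are exactly what reads as judder during a slow pan.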


It's not that, because I experience exactly the same in movie theaters. I'm always amazed when people say it doesn't bother them.


Most people see smooth movement. The brain usually interpolates for us as part of its normal wetware, such as blind spot filling, blur when moving your eyes quickly, low light vision, etc. People who are more visually sensitive and get motion sickness from moving pictures probably just have different circuitry that is more sensitive to the digital/analog mismatch.

I would say your visual experience is fairly rare, but common enough that I regularly encounter people that have various issues with digital motion. I have no problems with low framerates myself, but I do find camera shake (especially as so common in video games now) to be nauseating, though it seems most people have zero issue with it.


We could be confusing terms here... I mean 'judder' as uneven frame times, it looks like things skip slightly out of time. Uneven frame times shouldn't even happen with a cinema screening unless something has gone terribly wrong.


> The thing I hate about "advice" like this is that it assumes that everyone likes the same things and feels the same way

It's so ironic that at the end you gave advice of your own:

> I suggest that the next time you watch a shakey-cam action movie, you try turning it on too!

Are you assuming that everyone likes the same things you like? I guess not. The same for the article—it doesn't assume that.

No advice fits everyone, but good advice fits many people. And I'd argue that the article's advice fits many, if not most, people.

TV is probably the only screen many people have that has motion interpolation on by default. They watch movies on theaters, PC monitors, laptops, tablets, and phones probably more than on TVs.

Many people are already used to what movies "look like." Non-tech-savvy ones might not know what "FPS" means, but they would likely be able to notice that watching movies on their phone feels different from on a TV with the default setting. The article's suggestion to disable motion interpolation on TVs makes the watching experience more consistent across all screens.


Don't know how you do it. I can't stand the jelly effect. Motion interpolation was the first thing I turned off when we got our A95K earlier this year.


> it assumes that everyone likes the same things and feels the same way

I don't think this is about feelings. (But it just might be in a small minority of cases IMO, and if it's the case for you, then all the power to you.)

In the case of this article, all the points are about discernment. Some people not only notice the issues, but are also able to identify them and articulate precisely what is happening. The rest don't.


No one was shaming you, get over yourself. The author repeatedly framed recommendations in the context of how the creators produced it. If you feel shame that’s all you.


I recently went down the rabbit hole to find a dumb TV. It was surprisingly difficult. I ended up with a Sceptre 65 inch TV, to which I’ve plugged in a rooted, jailbroken Chromecast.

It’s been awesome. The TV is fast to boot up, responsive, doesn’t spy on me, and doesn’t need useless software updates.


What's the benefit of rooting and jailbreaking a Chromecast? You can already cast anything you want to them, so I'm assuming there must be added functionality.


Well, I didn't expect that to get so many comments.

It looks like there are a lot of people with perfectly valid reasons to keep some or all of these settings on. Which is great. I will rewrite the article to reflect that — and change the title to something more neutral and less absolutist.


I'm honestly shocked at the amount of pro-motion interpolation comments there are in here lol


Why? If you have a decent TV (hint: SONY), a little bit of motion interpolation can make stuff look amazing.



I find the headline to be seriously clickbaity, as the word "smart" (in the context of TVs) generally refers to a network connection that facilitates streaming, telemetry, ads, etc. but TFA is not discussing that category of features whatsoever. It's discussing features totally unrelated to the growing popularity of disabling "smart TV" features for the sake of privacy and fewer ads.

Unfortunately, I don't know that there's a generic non-jargon word for this collection of settings, but let's not solve for that by overloading the word "smart"!


That was my fault - see https://news.ycombinator.com/item?id=38526517. Fixed now. Sorry!


No worries; I only meant to comment on the headline within the work itself and didn't even notice a discrepancy!


Those are image-enhancement settings. Too jargon-y?


Not too jargony, but it could be conflated with the regular calibration settings (brightness, contrast, etc.), which are actually worth tweaking because they let you get closer to the original signal. The settings this article discusses generally let you stray farther from the original signal.


Attend a CES and you'll see this use of "smart" is standard in the TV industry. They had net connections before they had special apps and the word "smart" wasn't used back then.


So if I asked enough industry people at CES what makes a smart TV smart, ultimately I would more often hear about things like motion interpolation than things like network connectivity?


TVs were sold to us with some degree of smarts, but they weren’t really smart, and upgrading or replacing the smarts is too much work.


Or indeed being a sort of tech apologist by describing it as “weird”.


motion smoothing is the worst feature ever conceived by man


For me it’s the absolute most important setting to enable on any TV. Without it I really notice the tearing / juddering effect that footage <40fps has.


And you know what, you may both be right! Different TVs use different algorithms and tricks, so what looks great on one might look quite bad on another.


Yeah, absolutely true. I’ve seen some Samsung brand TVs do a really bad job of it (also their colours are terribly oversaturated by default), whereas I’ve found the LGs do a good job (at least the C and G series).


I use it on a game-by-game basis when playing my Switch, because it's so underpowered. It works surprisingly well when playing Zelda Tears of the Kingdom.


I hear it's good for sports.


Sports are already shot at 60 fps. Same for any live TV events. It makes movies look terrible, like it was shot on video, to me and many others. Soap Opera effect. But for things that were shot live on video, it looks like what it is.

For the high frame rate version of The Hobbit that I saw, in my opinion it looked bad for character shots but cool for overhead action views.


Nowadays, TV sets can display a lot more than 60 fps (120Hz and 144Hz sets are quite common).


It's only good for anime where the fps is extremely low


Counterpoint: this YouTube rant by an animation person called Noodle is a pretty good overview of why frame interpolation sucks. https://www.youtube.com/watch?v=_KRb_qV9P4g

Basically, low FPS can be a stylistic choice, and making up new frames at playback time often completely butchers some of the nuances present in good animation.


Low FPS can be a stylistic choice, but as a member of the audience, I tend to disagree with that choice.

Perhaps it depends on the quality of the execution, but there are shows where I wished I had frame interpolation.


Personally I'm dubious. There may be times when the low framerate is a stylistic choice, but the vast majority of the time it's purely a budget thing.


Which the style is then centered around.

If it had looked better with frame interpolation the studio could have baked it in, before release.


If you are a good director you can make the most of that low budget. Look at the first episodes of Scum's wish (https://www.imdb.com/title/tt6197170/) if you want a good example.


Animation is the worst use case for motion interpolation because the frames are individually drawn and timed by the animators to achieve a particular look and feel.


*Used to be. Now they're rendered


A lot of animation is still drawn. Some CG anime is also combined with 2D drawn elements (like in Spider-Man: Into the Spider-Verse).


The luddite hacker in me wants to try interpolation on early Ghibli movies.


Do it! I personally love the old Looney Tunes; it makes them so smooth it's almost unreal (but in a good way).


It amazes me that humans continue to have this perverse desire to fix what isn't broken.


1. Features on the box.

2. More pop on the store display.

It turns out that "fixing" these things does result in more people picking the TV. Just like an overstated display will typically be preferred in a side-by-side comparison.


And don't forget overscan.


The fact that it's still enabled by default on TVs being sold today is an unforgivable sin.


We had the perfect opportunity to dump it into the wastebin of history when TVs switched to HD, but for some damn reason the industry decided to carry it forward.


Provisioning for overscan on 1080i CRTs seems just as valuable as with 480i CRTs.

People want content to the edge of the screen, but not to pay a TV technician to come and calibrate their tube to exacting standards in their home. Content creators need to know that some of their broadcast is invisible to some of their viewers as a result.

Pixel-perfect TVs came later, so the transition to HD wasn't the right time. ATSC3 could have been a reasonable time to change, but then broadcasters couldn't use the same feed for ATSC1 and ATSC3 ... and who knows if ATSC3 will ever win over ATSC1, or if all the OTA TV spectrum will be refarmed to telcos before that happens.
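To make that trade-off concrete, here's a tiny sketch assuming a set that crops, say, 5% off each edge (the exact amount varies from set to set, so the percentage here is only an assumption):

    def visible_region(width, height, overscan_pct=5.0):
        """Pixels that survive if a display crops overscan_pct percent off every edge."""
        crop_w = int(width * overscan_pct / 100)
        crop_h = int(height * overscan_pct / 100)
        return width - 2 * crop_w, height - 2 * crop_h

    # 1920x1080 with 5% cropped per edge -> (1728, 972) actually visible
    print(visible_region(1920, 1080))

Which is why broadcasters keep titles and graphics inside a safe area well away from the edges.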


You would be surprised how many movies on DVD/Blu-ray have junk in the margins. Usually just a line, or a bit of the audio track. But a lot of them have it.

I turn off overscan as those artifacts do not bother me.


Sometimes you can see the closed captions line of the NTSC signal at the top of the frame of the video when watching an old show converted to digital from an NTSC source. It looks like a single line of black and white dashes which dances around quickly every time the onscreen captions would have changed and then sits static until the captions update again.


When buying or setting up a TV, I highly recommend looking to see if HDTVTest on YouTube has a review of it. He really does a good job of going into the effects that certain TV settings have on the picture. Be forewarned that he is in the UK, so TV model numbers won't line up a lot of the time, but with a little digging you can find the model of yours that corresponds to his.

https://www.youtube.com/@hdtvtest/featured


Every electronic device should be fitted with a large red mechanical switch labelled "Fuck Off" to reset the device to its most basic usable level. Mobile phones, TV, computers, cars, washing machines, any device where manufacturer or seller has options they can configure for you, should be required by law to have such a switch.

And FTR, we don't have a TV anymore since our 3-year-old tested the strength of the screen with a hammer. Don't miss it. Hypnotic on Linux for news TV, and Netflix for (mostly stupid) ways to waste an evening.


I'm one of those people who turns on motion smoothing. For some weird reason, it makes older shows like Friends or The Sound of Music crystal clear on my LG C2 OLED. I can't explain why.


Motion interpolation is an absolute essential for me. I can’t stand how choppy TVs (even high end models) are without them.

30fps videos look very jarring to me, I almost always notice “tearing” or “shuddering” - even when in a cinema. Enabling motion interpolation / frame rate up scaling usually fixes this for me.

It’s so distracting and at times almost painful for me to watch without it that at times I’ll use a tool (Topaz Video AI) to re-render videos to 50-60FPS.


Interesting!

I can't stand motion interpolation. Turned off on every TV I own. Will literally walk away and do something else if it's on a (non-sports) TV in public. There's something "too smooth" about it that irks me.


Same here. It feels like it takes everything, from classic B&W to modern SF extravaganzas, and turns them all into somebody's home videos.

At the same time, I'm pretty confident that this is a subjective phenomenon. My parents have it on all their TVs and my mother both prefers it and notices immediately if video isn't 60fps or equivalent, while my father says he doesn't notice the difference.


I tend to avoid it, but don't constantly try out newer devices and their settings. I always remember when I first saw it on a proud friend's new TV about a decade ago. I was deeply disturbed and asked him to turn off the feature.

We were watching an action/fighting movie with swords and other martial arts, and I distinctly saw these graceful arcs of the actors' limbs and weapons turned into polygons. The motion interpolation clearly inferred linear transitions between the different positions captured in the actual frames. Imagine a large swing tracing out an octagonal path with all the physical unreality that would entail.

It seemed like I was the only one in the room who perceived this travesty.


Usually TVs have bad motion interpolation, which ruins the concept for many people. I use SmoothVideoProject on my computer, which uses your GPU to analyze the motion vectors between frames via deep learning (Nvidia optical flow analysis), so it's much better.


The problem typically is that motion interpolation isn't consistently smooth. Generally a fixed framerate 30fps will seem smoother than something that goes between 40-60fps. Our brains are sensitive to changes in the pacing.


The motion of an object isn’t the same as the frame rate though. You can have a 60fps scene where an object is moving fast on one side of the screen and slow on the other. It only means that for a given object travelling from A to B - it will have more fine detail in its movement for a given distance.


It’s interesting to think how different our visual systems must be right? I’m always saying to friends “how can you watch this? It’s so choppy!” And some of them agree and others don’t see it at all.

Biology is weird so I say just give people the options to pick what works for them.


I read this (from HN?) awhile ago, and it boggled my mind about how much subjective reality is actually invisibly self-fabricated.

https://www.portsmouthctc.org.uk/a-fighter-pilots-guide-to-s...

I expect refresh rate is similar, given that... if a substantial portion of your subjective perception is mentally fabricated, then your brain physiology contributes, and that's set during childhood.

For reference, I grew up on NTSC screens (29.97 interlaced frames per second).


> For reference, I grew up on NTSC screens (29.97 interlaced frames per second).

Considering it as 30 interlaced frames per second isn't really accurate. It's 60 fields per second. A lot of content intended for interlaced broadcast is not 30 fps broken into fields, it's 60 distinct half pictures per second.

(Excuse my rounding)
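For the curious, a tiny sketch of the exact numbers behind that rounding (NTSC's 1000/1001 timing):

    frame_rate = 30000 / 1001   # NTSC color frame rate, ~29.970 fps
    field_rate = 60000 / 1001   # two interlaced fields per frame, ~59.94 fields/s
    print(f"{frame_rate:.3f} frames/s, {field_rate:.2f} fields/s")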


That link is an article I really, really wish I'd read while learning how to drive, and is something I'll teach my kid before he starts riding a bike with traffic. I hadn't seen it before, so thanks.


That and the Dutch(?) bike-safety trick [0] are minimal effort life hacks I got from HN.

[0] In urban/bike areas, always open a car door with your opposite hand (e.g. driver's side with right hand). It forces you to turn your body, which allows you to look behind you, which lets you notice bikers approaching from behind before you open the door and splat them.


Usually TVs have bad implementations, while on the PC, using something like SmoothVideoProject (which uses your GPU for motion vector analysis via optical flow) makes it much higher quality.

Also, you get used to it after a while. At first I was similarly jarred by the smoothness, as if it were fast forwarded, but after a few hours of getting used to it, when I saw 24 FPS content, it was literally unwatchable as it looked like a slideshow to me.


Decades of TV being filmed on cheap(er) video cameras that had lousy picture quality but captured at 60 fps vs. film that looked beautiful but only captured at 24 fps has taught people that blurry smeary motion is the ideal.


I used to think that but now I’m not so sure. Yeah 24fps is bad for panning and sweeping movements, but….

There is something about 24fps that I believe may have something to do with how the eye or brain works that makes viewing more immersive. Perhaps it’s due to historical cultural reasons, but I’m not sure that totally explains it.

FWIW I play Valorant on a 390Hz monitor, so I am not a “the eye can only see 60fps” truther.


24/30fps looks dreadfully unnatural and distracting to many people though. It’s almost painful to watch on larger screen sizes.


If it’s blurry or smeary then your TV / source is doing it very wrong or just can’t keep up or is too low resolution / lacks quality upscaling.


It's blurry and smeary in the movie theater. You just can't capture fast motion at 24 fps. Once you train yourself to look for it you will never be able to stop seeing it.


Oh sorry you mean it’s blurry and smeary at 24 fps in the cinema! Yes I agree. Sorry I thought you meant that higher frame rates looks blurry.


Same. It's more noticeable on large screens because more is in the peripheral vision. Screens are larger today (and perhaps people are putting bigger TVs into smaller rooms than before) so we see more of the screen in our peripheral vision than before.

Peripheral vision has a lot of rods (instead of cones) which are more sensitive to rapid motion. I can certainly pick up flicker and "perceive the frames" more clearly when looking in my peripheral vision.

Same goes for the old CRT monitors: 60 Hz was an absolute no-no, 85 was tolerable but higher was better.

Edit: CRTs were worse, of course, because they were constantly flashing light-dark, unlike LCDs which simply transition from frame to frame.


A major issue with motion interpolation is that it can’t be perfect, and is often far from it. The implementation on many TVs is jarring: you’ll see super-smooth motion while an object is moving at a slow or medium speed, but as soon as the patch of pixels that it’s tracking moves really fast, it assumes the patches are distinct, and the motion turns juddery. Individual objects switching from high framerate to low in the span of a half-second is quite noticeable to my eyes, but I admit that most people around me don’t seem to care.

Maybe one day the real-time implementation will be good enough, but I find that it’s shockingly bad most of the time.


Is it possible what you’re seeing is ‘judder’[1] or bad 3:2 pulldown? I really don’t think much actual ‘tearing’ [2] makes it to screens in theaters - that would be a big screwup!

1 - https://www.techtarget.com/whatis/definition/judder

2 - https://en.m.wikipedia.org/wiki/Screen_tearing


Yes sorry, Judder is the correct term.


Is tearing common? What video source are you watching where the frames recorded in the video do not match the frames displayed by the display?


Very common for video footage lower than 40FPS. It doesn’t matter what the source is (AppleTV, laptop with HDMI, nvidia shield, PS5) - this is very noticeable to a large chunk of the population.


Perhaps you're conflating juddering and tearing? - they are distinct. Judder is what you see with, for example, low-fps panning, but tearing is where one segment of the screen (usually a horizontal strip) is out of sync, still displaying the previous frame, while the rest of the screen has moved on. This is not normal on a correctly configured system.


Sorry - yes I am - I had to look up some examples but I’m talking about juddering.


What do you mean by “tearing”? Like, VSync-off tearing or a different effect?


Sorry I was conflating juddering and tearing, I meant juddering.


Must be vsync off otherwise nobody would watch movies.


For me, when motion interpolation is on, I can immediately see that it's interpolated. And then I keep noticing the artifacts along lines and boundaries. It's very distracting. I experimented with this setting while watching Koyaanisqatsi, and for me it was best with only very slight interpolation (3 on a scale of 1 to 10).


Check out SmoothVideoProject, it does the interpolation in real-time. I also use Topaz sometimes but it's too slow for most use cases unless you have the time to wait.


Very cool! Shame that Plex Server / AppleTV doesn’t implement this natively


Yes, maybe it could be done as a plugin but I generally use it on my computer. They also have an Android app which interpolates video on your phone too, pretty cool.


I have done this on family and friends' TVs a number of times already. Most of these settings are crappy and very visibly make the picture worse instead of better. The worst offenders in my opinion are noise reduction and sharpness. Noise reduction is incidentally also the one that makes smartphone or other cheap camera output worse.


> Noise reduction is incidentally also the one which makes Smart phone or other cheap camera outputs worse.

This is all subjective of course, but I think you'll find that in cheap cameras, overdone noise reduction is the culprit, rather than noise reduction itself. If you're able to look at the raw sensor data, I think you'll find something even worse still. Small sensors are inherently very noisy, in typical lighting conditions.

So yes, the images look worse than optimal, but not worse than if there was no filtering at all.


Modern smart TVs are so disappointing that I just prefer watching films on my 27" IPS computer monitor - no bloatware and every video just looks right.

Not to mention that after 6 years the TV becomes useless junk killed by bulky modern app updates. I think there is a market for something like "Framework TV".


Some webOS LG TVs, like mine, can be rooted. I can now do wonderful things like install an ad-blocking version of YouTube (still works in spite of recent changes), and SSH into my TV and mess around with the Linux system if I want to.


I don't want to hack my tv, I want to watch a film.


It's "hacker" news...


I suppose there's an opportune time to hack and a sub-optimal time to do so, subjectively.


Parent comment's point is that you shouldn't have to hack your TV just to get a half-decent user experience.


You shouldn't have to, but the fact that you can is part of the hacker ethos. We've already gotten past the most fundamental part of hacking, can it be done. The next is can it play Doom. The third is announce it. Then, we just have to get past the boring people asking "why"


I don't want to spend all my waking hours hacking all things in my life. That's not what "hacker ethos" means. Especially not when the "hack" is working around stupid wankery, which, to me, is completely boring and uninteresting and doesn't "build" anything.

I just want to watch a film.

I don't want to hack my forks either. Or my microwave. I just want to heat stuff and then shovel it in the appropriate foodhole so I can get on with things.


Then don't buy the thing in the first place????


I’d rather spend the time “hacking” something fun than a smart TV. My time is better spent elsewhere


Tell that to the VC posters, the corporate drama posts, the psychology links, the biology, space and popular science articles that are on here.

This place isn't for just hackers, not anymore at least. And that brings good and bad.


But can you make the UI not super laggy?

I recently acquired a very old Sony TV and it reminded me how lag-free TV interfaces are supposed to work. But nooooo, LG is like "we're going to make great TVs but the UI is a poorly written web page running on a 386, enjoy!"


Not really, but you can improve it a little by disabling automatic firmware updates. So it at least doesn't get worse.


LG webOS in my opinion is the least bad out of all the smart TV operating systems, and so far (we have a few of these at home & at work) I don't really feel the need to root them as they are ad-free and working fine even after several years of auto updates. This lack of enshittification has urged me lately to buy and prefer LG products over Samsung et al.


I agree, I'd rather use something else, even if only for philosophical reasons, but I couldn't find anything that didn't sacrifice something - or at least that could promise it didn't.

No separate box seems to do all of 4K, Atmos, Dolby Vision, HDR10+, HLG without some weird restriction like 'except the Netflix app'. Even worse if you wanted DV IQ I assume, but I think my TV doesn't support it anyway.

And honestly I think I prefer WebOS to Fire sticks' UI for example. Stock Android TV isn't great either. If I was into playing Xbox/PS games I imagine that would make a better 'hub', since it's presumably (I have no idea) more organised around your own content/games at least rather than random adverts and features you don't want. Also probably more money being spent on designing it.


If you have an LG TV, HDR10+ is not needed, as the dynamic handling of HDR10 on the TV makes the HDR10 layer look as good as HDR10+. With this in mind, an Nvidia Shield Pro is pretty much what you are looking for.


An external Chromecast stick also works fairly well (assuming you pay for YouTube premium and Netflix etc to avoid ads, but at least not much in the way of bloatware).


The webOS on my LG is so out of date that modern apps won't install. So instead, I don't use the webOS, and treat it as a dumb TV with an HDMI input by using an external device for the smart things. There is nothing wrong with the picture that would require any of the updates. So, just don't connect the smart TV to the rest of the world, and say thanks to all of the others that do connect and subsidize the price of your TV.


> they are ad-free

The home screen on my LG TV seems to devote the top 60% of the page to sponsored content. Is this not an issue for you?


I understand where you are coming from, but I personally think that it's reasonable for a TV to show me sponsored content that is TV shows. I don't see this any different to how Netflix or Spotify show sponsored TV shows or music.

But my friend has a Samsung TV and it shows him ads for things he can buy from eBay, ads for games etc. in a similar area .. and yeah that's nonsense, that would really get on my nerves.


I think you can switch it off. Poke around on menus


I don’t even have a Home Screen on my C9


It's not that hard to get around this. I kept my TV offline and plugged in an Apple TV immediately.


Some TVs constantly display nag screens or blink the power LED at you if you keep them offline.

I can't decide if it's better to check for such things before purchase, or just return it as defective if I end up getting a model that does this.


Absolutely return as defective. Make it sting a little.

If it doesn't say on the box "requires constant internet to function" in big bold letters, it should.


What about saying in big bold letters 'requires constant electricity to function'?

That would be a ridiculous requirement, because people expect that TVs need electricity. But the producer can argue that people these days also expect TVs to need the internet, can't they?


They sure as fuck cannot, I've never connected a single TV to the internet. That's something that needs to be explicitly declared.


That would be for a judge to decide.


They're going to take me to court for returning a TV?


They might refuse to accept your return, and you might have to enlist the help of the legal system, if you want to force them to accept it. Of course, details depend on your jurisdiction and on the retailer / manufacturer and what warranties they give you (or are forced to give you), etc.


I also use an Apple TV with my Samsung TV. I did give the TV WiFi credentials so that, if needed, it can occasionally reach the internet (e.g. a firmware download), but I set up firewall rules to block all traffic for the TV in steady state.


Trust me, you don't want your TV to download firmware. My Sony TV started freezing every so often after downloading some random update. Since then, it's gone off wifi and only the Chromecast is allowed to connect.


This is why I bought a Sony. They have a ton of features packed in but won't scream if you don't ever connect them.


> [...] blink the power LED [...]

You can fix that with a piece of tape or blu-tack.


Convert the Smart to DumbTV by offloading all apps to your favorite streaming device (for example, Apple TV or Nvidia Shield). If it helps, don't ever enable internet connectivity to the Smart TV.


Worse yet, now high end monitors (Samsung Odyssey G9 OLED) are offered with poorly implemented smart TV hub features.


The only thing that works well on their smart TV hub features is the ability to block signal from my current-gen Chromecast.

Coordinated asshole design.


How does that blocking work?


I have no idea, except that it blocks video output, and comes up with a prompt to log in to whatever service via the TV. I can cast just fine from e.g. a webpage or a podcast app. But Netflix or Max or Prime? Forget it.


It doesn’t become useless junk. You can still use it as a display without any of the smart features. Connect to a Roku or Apple TV (which also automatically turn on when the TV turns on) for smart features.


I've honestly never used the smart features of my TV. It's an old 4K LG (Netcast era), so it's been in service for 6+ years. The "smart" stuff always sucked: riddled with privacy policies, generally quite slow, and could honestly be replicated way better with a media stick / NUC.

A NUC is generally the way to go for smart TV functionality in general, IMO: you get way longer support, and it's much easier to diagnose problems as compared to some locked-down, proprietary, do-it-ourselves OS.


This gets most of the way there, with buying a monitor instead of a TV. You just need to take the next step and get a PG42UQ. OLED, super accurate color, great response time, and, unfortunately, the answer to the question of what modern TVs would cost without the app/telemetry subsidies.


It always surprised me that digital signage screens without all that TV crap were many times more expensive than plain consumer equipment.


There are some business/enterprise displays that offer a "framework" approach by using a Raspberry Pi CM4 to control everything. Unfortunately, that is out of reach for most consumers lacking a corporate sales account.


That also makes me wonder how much TVs are being subsidized with ads and sponsored content. Those business displays cost significantly more than a regular consumer TV.


Funny, I use an LG OLED 48" TV as my monitor. It's really the cheapest option for a 4k 120hz OLED, although I'd prefer 240hz as GPUs are getting more powerful.


Yeah, it's a shame: my TV worked so fast when I first bought it, but it becomes noticeably laggier with each update.


Doubt it. Most people only care that their TV is huge and cheap. Visual quality is a distant afterthought.


Well, an OLED will be pretty much dead after six years anyways.

Also, you can reset an old, slow TV, put it in "dumb" mode, then add something like a Vero V or another box.


This depends a LOT on both how, and how much the display in question is used.

You can make an OLED visibly burn in within a couple months if you max the brightness, cover it in static content and then leave it running constantly.

Or it can last a decade if used lightly, on lower brightness, with good burn-in compensation software, with few to no static images.


A decade is still a really short lifespan for a TV


Is it though?

2004 is the first year I remember watching a HD widescreen TV show (Lost). There were a handful of test clips and the like earlier, but they were a rare thing. E.g.: there's a HD video taken of the aftermath of 9/11 because the operator was just learning to use this (at the time) brand new technology.

That lasted about a decade, and then 4K became a thing around the 2013-2014 timeframe. I got a Sony 4K TV in late 2013 and a Dell 4K monitor in 2014.

However, both of those were standard dynamic range, albeit with a fairly "wide" colour gamut.

I got my first OLED HDR display in 2021, and an OLED HDR TV in 2022.

In other words, every "generation" of television technologies has lasted about a decade before being superseded by something significantly newer and better.

Display technology still has some ways to go. No current consumer panel reproduces 100% of the Rec.2020 colour gamut. Similarly, there are no consumer panels that can reproduce the full HDR10 or Dolby Vision brightness range of up to 10,000 nits. There are 8K displays that are close to that target spec, but they're either one-off experimental models or more expensive than a luxury car.

A decade from now, around the 2033 mark, I fully expect something like VR technology to have replaced TVs for many people. Apple's new headset (available next year!) will set a new quality bar that no traditional TV can ever hope to match, such as an extremely wide field of view, stereo, and very bright HDR in combination. Sure, they're expensive now, but after a decade of further development? They'll be amazing and cheap.


In your future, do households with more than one person exist? Because no matter what the tech is, that'll never happen. We have a family, and after dinner if we want to watch a sitcom and laugh, we ... what ... go put on our own individual headsets and lie in bed? What happens on a Sunday when it's icy outside, and we want to put a James Bond movie on with the booming sound and have a glass of wine while the kids make hot chocolate; how do we do that? If I'm a single dude with disposable income I can see the VR headset with insane quality being pretty awesome, but for ANY kind of household with a family or roommates, I just can't fathom that kind of dystopian nightmare.


Teenage kids will likely want to play video games or watch their own thing.

Many young people live by themselves and can't afford a "huge" TV. Meanwhile VR will give them a virtual TV as big as they like.

Another potential future technology that might obsolete the current 55-85 inch typical TV sizes is "OLED wallpaper", or something similar that would allow wall-sized televisions.

Even before the 2030s I expect something like quantum nanorods to replace OLED panels, potentially allowing up to 10,000 nits in an affordable panel that doesn't suffer from burn-in.

On another front, I've noticed a lot more people watching something like TikTok, YouTube, or Netflix on their hand-held devices instead of traditional TV. I can picture a future super-thin, super-light Apple iPad with an OLED HDR display becoming very popular, especially for people on a budget that can't afford a huge TV.


The comment you replied to wisely stated 'for many people', and not 'for all people'.

You are likely right about your specific use cases. Though you might also want to see how early on some people predicted that TV would never take off, because families just didn't have the time for it.

We don't have a TV at home, and we would just do different activities in the situations you described. (But, yes, those activities in these situations would likely not include hanging out with headsets on.)


We're outliers in the TV market. 95% of buyers don't care about Rec.2020 and keep using a TV until it breaks (maybe 80%?). OLED isn't for people who use it for a decade.


...it is? I buy a new TV every two to three years as notably better tech comes out.


We probably have different definitions of "notably better"


I definitely think notably better is fairly subjective here.

I have a 4K OLED from 2018. Since then, I think the only notable improvement in the models from the same manufacturer is maybe enhanced ARC (eARC)? That is a neat feature, but not sufficient imo. Maybe a 120Hz refresh rate, but very little content even uses 60, let alone 120 - at least in the cinema and TV space.


Anecdotally, my cheap old TCL dumb LED TV's backlight burned out after about six years, which is about average from what I'm seeing online, but maybe I can fix it. And I did use it as suggested, with a Mi Box.


> Well, an OLED will be pretty much dead after six years anyways.

Why is that?


They get dimmer as they age and might also develop "burn-in".


This really isn't an issue these days if you research your screen. RTINGS.com has a good series on their website and YouTube testing OLED burn-in, and there are many TVs where this is not an issue. Also check out Wulff Den on YouTube and his videos testing the OLED Nintendo Switch. He does get some bad burn-in after a year, but that's also because it's showing a still image at max brightness and never turns off. I don't mean a year of standard usage. I mean 8,765 hours of it displaying the same image.


Screensavers became necessary again?


Not just that. OLED TVs nowadays have a screensaver that kicks in when it detects stationary image for more than a few minutes.

They also shift the image around by a few pixels to prevent logos and tickers from permanently marking the screen.

And when turned off, they also try to level the brightness of individual pixels by displaying different patterns - again to combat "burn-in".


They were never necessary to begin with: automatically turning the screens off (or at least to 'black') is never worse.


For OLED computer monitors: yep!


> Well, an OLED will be pretty much dead after six years anyways.

How much TV are you people watching such that this statement becomes even remotely accurate? My 2017 LG OLED display says it has 6400 power-on hours and it looks as good as new.


The average American watches five hours of television every day, or just shy of 11,000 hours over six years.

https://www.marketingcharts.com/television/tv-audiences-and-...
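
For anyone checking the math (my own back-of-the-envelope arithmetic, not from the linked source), the figure works out like this:

    # ~5 hours/day over six years, ignoring leap days
    hours_per_day = 5
    years = 6
    print(hours_per_day * 365 * years)  # 10950 -> "just shy of 11,000 hours"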


That explains a few things.

Still, the objective tests from RTINGS seem to suggest that it's the LCD sets that look all fucked up after a few years, while almost all the OLED ones look perfect. And the OLED sets aren't showing a downtrend in overall brightness over time.

https://www.rtings.com/tv/learn/longevity-results-after-10-m...


Lots of people leave their television sets on for almost the entire day, even when not watching. Never understood that.


Gives a feeling of having other people around and staves off loneliness.


SO does that. She'll put the show on pause, go do shopping, come back and resume. All while leaving the TV on.

I've tried to get her to change oh so many times, all in vain...

No rational explanation either. Just... that's how she rolls I guess.


That doesn't kill LG OLEDs though; that results in a nearly black screensaver (just a small, slow "firework" effect). It wastes a touch more energy than putting the TV into sleep mode (since nothing actually turns off anymore), but it's hard to believe that would shorten its life much.


Enable energy savings and turn off the TV after 2 hours. Problem solved! Most TVs show a popup when they are about to shut off because of power savings, and with a button on the remote you can reset the timer.


This is extremely passive aggressive, tell me you’re single without telling me you’re single. People’s preferences being different than yours is not a technical problem to be worked around.


I’m not the one complaining here. My wife turns off everything before she leaves the place. Even if it's just a quick walk with the dog :)


I had a girlfriend who would flush the toilet before using it. She said "it's cleaner this way". I tried to get her to stop, but it was a futile effort.


Before and after, I'd hope. :)


That's insane to me. I can't stand watching TV. I feel my life being sucked away.


That actually sounds pretty standard. That would be from 5pm-10pm. Evening news, game shows, then like two 1-hour shows, and it's already been five hours.


Many people leave the TV running in the background while they do other things, like eat dinner.

For the lifespan of the screen, it doesn't matter if anyone actually directs their eyes at it.


Try sitting down.


Yeah, that's a wild claim to me too. My 2018 LG C8 has about 5800 hours on it and it is indistinguishable from new. (Though I've never run it at max brightness because it would be blinding.)


Confusingly, the contrast control has a higher effect on image brightness (in the ordinary English language use) than the brightness control does.
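
A rough sketch of why that is, under the common convention (an assumption on my part; vendors differ) that "brightness" maps to a black-level offset while "contrast" is a gain on the whole signal:

    # Toy model only: signal and light output normalized to 0..1
    def displayed_level(signal, contrast=1.0, brightness=0.0):
        return max(0.0, min(1.0, contrast * signal + brightness))

    mid_gray = 0.5
    print(displayed_level(mid_gray, contrast=1.2))     # 0.6  -> whole picture looks brighter
    print(displayed_level(mid_gray, brightness=0.05))  # 0.55 -> only a small lift
    print(displayed_level(0.0, brightness=0.05))       # 0.05 -> mainly raises the black level

So turning up "contrast" scales everything up, which reads as "brighter" in the everyday sense, while "brightness" mostly lifts the blacks.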


Another neat idea is to connect all “smart” equipment to an isolated VLAN and a separate WiFi network that can still be seen by your normal network devices.

For example, if your WiFi was called “Home”, an additional “Home-IoT” network is for all of those devices.

The IoT devices can then be prevented from sniffing your normal network, or even from connecting out at all if you want.

A good example of this is in this EdgeRouter setup guide, which is a pretty decent guide on how to plan a home network for more than just basic home browsing.

https://github.com/mjp66/Ubiquiti/blob/master/Ubiquiti%20Hom...
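
For illustration only, here's a toy Python model of the asymmetric policy that kind of setup gives you (the network names and helper are hypothetical, not anything from the linked guide; the real rules live on the router):

    from dataclasses import dataclass

    @dataclass
    class Packet:
        src_net: str    # "Home" or "Home-IoT"
        dst_net: str    # "Home", "Home-IoT", or "WAN"
        is_reply: bool  # reply to a connection the trusted side opened?

    def allowed(p, iot_may_reach_internet=True):
        if p.src_net == "Home":      # trusted devices may initiate anywhere
            return True
        if p.is_reply:               # IoT may answer LAN-initiated traffic
            return True
        if p.dst_net == "WAN":       # IoT calling out: your choice
            return iot_may_reach_internet
        return False                 # IoT may not initiate toward the LAN

    print(allowed(Packet("Home", "Home-IoT", False)))  # True: you can still cast to the TV
    print(allowed(Packet("Home-IoT", "Home", False)))  # False: TV can't probe your LAN
    print(allowed(Packet("Home-IoT", "Home", True)))   # True: reply traffic only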


I just want to turn the smart TV dumb: a simple input switch, volume up/down, brightness/contrast, and basic color adjustment. That's it.


My TV turns into this if I set it behind a Pi-hole and block all of its requests. I have heard some TVs will look for open networks (or known ones with deals, such as Xfinity), but there are none around here that I'm aware of, and it responds as if it is dumb, other than the features that can be turned off, like frame interpolation. But those do randomly reset themselves, which is quite frustrating.


The key is to block all network access. A lot of IoT devices have hardcoded DNS, so even running a Pi-hole will sometimes not work.
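
To see why just pointing clients at a Pi-hole isn't enough, here's a rough stdlib-only sketch (run from a laptop on the same network) that sends a hand-built DNS query straight to 8.8.8.8, the way a device with hardcoded DNS would. If it gets an answer, outbound port 53 to other resolvers isn't actually blocked:

    import socket

    def build_query(name):
        # Minimal DNS query: ID, flags (recursion desired), 1 question, type A, class IN
        header = b"\x12\x34" + b"\x01\x00" + b"\x00\x01" + b"\x00\x00" * 3
        qname = b"".join(bytes([len(p)]) + p.encode() for p in name.split(".")) + b"\x00"
        return header + qname + b"\x00\x01" + b"\x00\x01"

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(3)
    sock.sendto(build_query("example.com"), ("8.8.8.8", 53))
    try:
        data, _ = sock.recvfrom(512)
        print(f"{len(data)} bytes back: hardcoded DNS bypasses the Pi-hole here")
    except socket.timeout:
        print("no answer: outbound DNS to other resolvers looks blocked")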


Don't forget the sleep timer. Such a simple QoL improvement, and none of the TVs I've owned in the past 10-15 years have had one. Seems like every TV had it when I was a kid (or at least a good sized portion did).


The smartest thing you can do with a “smart tv” is to keep it unconnected from your WiFi and instead plug a Raspberry Pi into one of the HDMI ports and use that for your YouTube etc needs.


None of the functions TFA discusses has anything to do with connectivity.


No but my point is that if you keep your “smart TV” offline you don’t need to worry about any of the settings on it. And that’s just aside from all of the problematic things of allowing it to connect.


But their point is that your comment doesn't have anything to do with what the article is talking about. A TV's picture settings have nothing to do with connecting it to the network.


I mean sure, but then my TV wouldn’t have VRR that was enabled with a software update. Plus a lot of TVs will just connect to an open WiFi network anyways.


Not a smart TV problem specifically, but the inability to adjust the volume level while on mute (at least adjust it down). I often turn the TV on after the rest of the household has gone to bed and it's way too loud, so I hit mute. But as soon as I hit volume down, sound comes back on at the initial loud volume! Aargh. The only thing to do is navigate away to a silent input, adjust the volume, then navigate back to the previous channel. Annoying. I have seen this done right once before (Sony, I think).


The local dimming suggestion isn't fully in line with the rest.

It's about bringing some parts of the image closer to what's intended by the filmmaker, at the cost of other parts of the image (usually noticeable as gradients added to flat colors). That isn't going against the filmmaker's intention so much as respecting the contrast the filmmaker wants at the cost of some gradients. It's a different way to approximate the actual signal the TV should show.


Using the author's logic, we should also watch Kramer with the image shrunk to a quarter of the screen (TVs in the past were smaller) and with garbage color accuracy (I don't actually know how color-accurate TVs were in the past, but I suspect it was horrible). I am still waiting for 4K/8K 100Hz movies and YouTube, but until then I will use motion interpolation on my TV as the next best thing.


My Samsung “HDR” TV can only approach what an HDR image looks like on a studio display if a bunch of these settings are on and balanced just right.

It feels as if Samsung deliberately lowers contrast, for example, until their dynamic contrast is enabled, as the picture matches studio displays much more closely then.

No proof, not making this claim, but it did feel this way when I had my TV side by side to a studio display.


Those aren’t Smart-TV features, and they precede the introduction of Smart TVs. Smart TVs are internet-connected TVs with apps [0].

[0] https://en.wikipedia.org/wiki/Smart_TV


Ok, sorry - the submitter did have "Switch off bad TV settings", and I changed it later to "Switch off weird smart TV settings" because that phrase felt more informative and does appear in the article. It doesn't work if it's wrong though!


haha what a nit-pick


Important to know that Roku has an "adjust frame rate to content native rate" which, insanely, is off by default. It will make up frames on all content by default if it's less than what your TV is capable of. I just got a new LG C3, and this was making us nauseous until I figured it out. I thought it was the TV, but unbelievably it was the Roku.


Then Roku must be doing something _really_ weird.

What this setting is usually used for is to avoid "blinking" and the screen going dark when you leave menus and start watching videos — there is an unavoidable (...mostly. VRR kinda fixes it?) renegotiation period whenever going from 60Hz<->24Hz or 60<->30. So to avoid the TV going dark for a second to re-negotiate the connection, the device steps in and "pretends" it's all 60Hz all the time.

_Usually_ what a set-top box does is just frame double (if possible), or do 3:2 pull down, which _should_ be unnoticeable for human eyes, unless you're very sensitive.

Side-note: this is why HDMI 2.1 and 120Hz HDR is so good for everyone. 24, 30, and 60 fps all divide cleanly into 120, so you don't have to deal with 3:2 conversions ever.
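
A quick sketch of the cadence arithmetic behind that side-note (illustrative only):

    # How many panel refreshes each source frame gets
    for fps in (24, 30, 60):
        for hz in (60, 120):
            if hz % fps == 0:
                print(f"{fps} fps on {hz} Hz: every frame shown {hz // fps}x (even cadence)")
            else:
                print(f"{fps} fps on {hz} Hz: {hz / fps:.2f} refreshes per frame (uneven)")

    # 24 fps on 60 Hz averages 2.5 refreshes per frame, realised as the 3:2 pattern:
    pulldown = [3, 2] * 12
    print(len(pulldown), "film frames fill", sum(pulldown), "refreshes (one second)")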


I went back and watched content that had been jarring, and it's fixed. My wife, for whom all this is instinctual as opposed to technical, is able to watch now without feeling sick. IDK, it's weird.


Between this and the embedded crapware, I am glad I went the video projector route. I initially thought I would not be able to bear the sound of its fan, but since it is a constant noise the brain just filters it out. And I use my noise-cancelling headset when I am alone.


Thankfully, there are a few projectors with very quiet fans. I haven't kept up with the current models, but Sony made some roughly eight years ago.


HDTVTest [0] is a great YouTube channel if you want to go in some of the depths of this.

[0] https://youtube.com/@hdtvtest


Noise reduction and dynamic brightness aren't too bad if done tastefully. But it's really up to the TV manufacturers to do them properly, which is why the general advice is just to turn them off.


Does the same apply for audio settings as picture settings?

For example, Dialog Clarity/Enhancement, TruVolume (automatic volume leveling), and DTS Virtual:X?

Why or why not?

Do you use Spatial Audio on your Apple products (which sounds great to me)?


Motion features are the worst; I find they always make shows and movies look uncanny. I have turned this off on many friends' and family members' devices, and they seem not to notice until I turn it off.


It is surprising that manufacturers turn on motion smoothing by default, as it ruins the aesthetic of most content and makes it look like it was shot on a 1990s camcorder.


New generations will get used to more fps, and new content could then be filmed at 60 fps. I think that's a good thing. We can't be stuck in the past because of current habits.


True. However my theory is that filmmakers would not use the same shots in a 60fps world that they use in a 24fps world.

The aesthetic of it is so different, seemingly much about the creative aspects of a production would be done differently.

Motion blur is evidently used by filmmakers in an artistic/aesthetic way, similar to the way rock bands use guitar distortion, electronic musicians let some signals clip, and photographers use lens flare or limited camera dynamic range for artistic effect.

Adding a post-production step to remove the "limitations" that were used artistically does alter the aesthetics of the production.

Imagine if metal albums were played back on "better" stereo equipment that removed all of the distortion products produced by the amplifiers.


That's for sure. Still, if you want you can always use older tech, like they do with film cameras now.


> Wouldn't I have remembered Elaine being a 9ft tall blue humanoid alien with a tail?

I never observed what TFA is complaining about; does someone have a screenshot?


A couple years ago, my Samsung TV slowed to a crawl. Each click through a menu took multiple seconds. I eventually discovered a new setting buried deep to turn off "real-time anti-virus scanning". That immediately fixed the performance problems.

How would my TV get a virus? This was a Tizen TV, not an Android TV where I'm installing shady apps.

https://www.techspot.com/news/78967-samsung-loading-mcafee-a...


Or don't buy products that are subsidized by recording and selling your data! Not to mention these half-baked "features" produce thousands of hours of headaches, tech support calls, and general unhappiness. $tvManufacturer couldn't care less, because red line go up.

Build quality and software invasiveness are both going to keep trending in the wrong directions until people stop buying smart TVs. And it's not like you need to break the bank or order commercial displays - $150 on Amazon for a dumb 43" 1080p, $260 for a 55" 4K.


I don't necessarily disagree, but this article doesn't talk about any of that. It talks about picture settings, like motion smoothing, dynamic range, local dimming, etc.


It's a shame no one makes a control board for TV panels like the ones you can often find for old screens, such as those from digital photo frames.


At first, I bought into the "switch off weird (Ai) settings" talk.

Quickly, it became apparent that I couldn't come close to the overall picture quality potential with manual settings. This is for a late model qled. The chip and its auto-adjust capability are built for the panel.

So now I deal with some soap opera effect as a trade off for an amazing picture in every other respect. And I rarely notice the soap opera effect anymore. One gets used to it.


Imagine actually, unironically watching a movie with the motion smoothing off. Yes, I sure like to watch my movie on Google Slides and see every single frame during camera pans.

Perhaps once movies are recorded at a frame rate from the current decade this advice will be useful. We play PC games at 144Hz+ for a reason. There is no need for movies to still use a format derived from the technical limitations of a hundred years ago.


Basically, deactivate everything; not that complicated :) I must admit it's always a pain when friends ask me to acknowledge how awesome their brand new TV is, just to notice that it's all default: very catchy, but definitely not good, and tiring to watch.

Do you guys remember that hero putting papers on cars' windshields to explain to people how to fix their TVs? You are what you fight for!


I wonder what it would look like if you designed a movie to glitch these settings as badly as possible.


I'd be happy if my Samsung didn't put up an OSD overlay at power-on.


Interesting YouTube comparison:

https://www.youtube.com/watch?v=rNcNOgMwH_4


"At first this seems great. Why shouldn't 90s sitcoms seem like they were filmed in 4k at 60 frames per second? Then you start noticing things…"

Interpolation will never give the same results as actual capture, so the author is wrong here.


Obligatory PSA: if you have at least a 120Hz-capable panel, enable the setting on your source device that outputs the content's native frame rate.

E.g. if you're playing a 24fps movie from a device that's reporting 60fps uniformly across all content, you will have a bad time.

For Apple TV it can be set like this: https://support.apple.com/en-us/102277


Or just - get rid of your TV. I’m no Luddite but I’ve quite happily forgone a TV - and the concomitant ads and other irritants TVs bring - for nearly 20 years.


Motion smoothing is so horrible…I don’t understand how TV makers have persisted with it for so long. Surely the people who make televisions appreciate a good picture enough to not spread this nonsense, right?

The effect makes movies completely unwatchable. I was at a friend's house watching Elf and it felt like watching Will Ferrell host a Zoom call.


Genuinely tired of people telling me what an awful person I am for the weird shit I like. You do you.



