omnicognate
a day ago
It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.
It seems they want to make these settings usable without specialist knowledge, but the end result of their opaque naming and vague descriptions is that anybody who actually cares about what they see, and thinks they might benefit from some of the features, has to either systematically try every possible combination of options or teach themselves video engineering and work out for themselves what each one actually does.
This isn't unique to TVs. It's amazing really how much effort a company will put into adding a feature to a product only to completely negate any value it might have by assuming any attempt at clearly documenting it, even if buried deep in a manual, will cause their customers' brains to explode.
robhlt
20 hours ago
"Filmmaker mode" is the industry's attempt at this. On supported TVs it's just another picture mode (like vivid or standard), but it disables all the junk the other modes have enabled by default, without wading through all the individual settings. I don't know how widely adopted it is, but my LG OLED from 2020 has it.
burnte
19 hours ago
The problem with filmmaker mode is I don't trust it more than other modes. It would take no effort at all for a TV maker to start fiddling with "filmmaker mode" to boost colors or something to "get an edge", then everyone does it, and we're back to where we started. I just turn them off and leave it that way. Companies have already proven time and again they'll make changes we don't like just because they can, so it's important to take every opportunity to prevent them even getting a chance.
robhlt
16 hours ago
"Filmmaker mode" is a trademark of the UHD Alliance, so if TV makers want to deviate from the spec they can't call it "Filmmaker mode" anymore. There are a few different TV makers in the UHD Alliance, so there's an incentive for the spec not to have wiggle room that one member could exploit to the detriment of the others.
robotnikman
13 hours ago
Huh, I never knew this, they even have a website
Good to know there seems to be an effort to keep some consistency.
Crontab
12 hours ago
That's cool info. Thanks!
mkozlows
16 hours ago
It's true that Filmmaker Mode might at some point in the future be corrupted, but in the actual world of today, if you go to a TV and set it to Filmmaker Mode, it's going to move most settings to their correct values, and on at least some TVs, all of them.
(The trickiest thing is actually brightness. LG originally used to set brightness to 100 nits in Filmmaker Mode for SDR, which is correct dark room behavior -- but a lot of people aren't in dark rooms and want brighter screens, so they changed it to be significantly brighter. Defensible, but it now means that if you are in a dark room, you have to look up which brightness level is close to 100 nits.)
squeaky-clean
17 hours ago
On my Samsung film mode has an insane amount of processing. Game Mode is the setting where the display is most true to what's being sent to it.
sethhochberg
16 hours ago
Game mode being latency-optimized really is the saving grace in a market segment where the big brands try to keep hardware cost as cheap as possible. Sure, you _could_ have a game mode that does all of the fancy processing closer to real-time, but now you can't use a bargain-basement CPU.
Avamander
14 hours ago
Not "Film mode", but "Filmmaker mode". The latter is a trademark with specific requirements.
Game mode will indeed likely turn off any expensive latency-introducing processing but it's unlikely to provide the best color accuracy.
jbaiter
19 hours ago
Yup, it's great, at least for live action content. I've found that for Anime, a small amount of motion interpolation is absolutely needed on my OLED, otherwise the content has horrible judder.
kachapopopow
18 hours ago
I always found that weird. Anime relies on motion blur for smoothness when panning / scrolling, and motion interpolation works as an upgraded version of that... until it starts to interpolate actual animation.
gchamonlive
18 hours ago
On my LG OLED I think it looks bad. Whites are off and I feel like the colours are squashed. Might be more accurate, but it's bad for me. I prefer to use standard, disable everything and put the white balance on neutral, neither cold nor warm.
Nihilartikel
18 hours ago
I had just recently factory reset my samsung S90C QDOLED - and had to work through the annoying process of dialing the settings back to something sane and tasteful. Filmmaker mode only got it part of the way there. The white balance was still set to warm, and inexplicably HDR was static (ignoring the content 'hints'), and even then the contrast seemed off, and I had to set the dynamic contrast to 'low' (whatever that means) to keep everything from looking overly dark.
It makes me wish that there was something like an industry standard 'calibrated' mode that everyone could target - let all the other garbage features be a divergence from that. Hell, there probably is, but they'd never suggest a consumer use that and not all of their value-add tacky DSP.
mkozlows
16 hours ago
"Warm" or "Warm 2" or "Warm 50" is the correct white point on most TVs. Yes, it would make sense if some "Neutral" setting was where they put the standards-compliant setting, but in practice nobody ever wants it to be warmer than D6500, and lots of people want it some degree of cooler, so they anchor the proper setting to the warm side of their adjustment.
When you say that "HDR is static" you probably mean that "Dynamic tone-mapping" was turned off. This is also correct behavior. Dynamic tone-mapping isn't about using content settings to do per-scene tone-mapping (that's HDR10+ or Dolby Vision, though Samsung doesn't support the latter), it's about just yoloing the image to be brighter and more vivid than it should be rather than sticking to the accurate rendering.
What you're discovering here is that the reason TV makers put these "garbage features" in is that a lot of people like a TV picture that's too vivid, too blue, too bright. If you set it to the true standard settings, people's first impression is that it looks bad, as yours was. (But if you live with it for a while, it'll quickly start to look good, and then when you look at a blown-out picture, it'll look gross.)
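The "measurably correct but looks warm" point can be sanity-checked with the published CIE chromaticities. A minimal sketch (the D65 coordinates are from the sRGB/BT.709 standards; the 9300 K coordinates are approximate, and the helper function is just textbook xyY-to-XYZ conversion):

```python
# Video standards (sRGB/BT.709/BT.2020) specify a D65 white point,
# x = 0.3127, y = 0.3290, while many factory presets ship a much
# cooler white around 9300 K. Comparing their blue components shows
# why the standards-correct white reads as "warm" next to a preset.

def xy_to_XYZ(x, y, Y=1.0):
    """Convert CIE xyY chromaticity to XYZ tristimulus values."""
    X = x * Y / y
    Z = (1.0 - x - y) * Y / y
    return X, Y, Z

d65 = xy_to_XYZ(0.3127, 0.3290)   # standard video white (~6504 K)
cool = xy_to_XYZ(0.2831, 0.2970)  # ~9300 K, a typical "cool" preset

# The cool preset has a noticeably larger Z (blue) component, so the
# D65 white only looks yellowish by comparison, not in absolute terms.
print(d65[2], cool[2])
```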
SV_BubbleTime
18 hours ago
This is all correct.
“Filmmaker Mode” on LG OLED was horrible. Yes, all of the “extra” features were off, but it was overly warm and unbalanced as hell. I either don’t understand “Filmmakers” or that mode is intended to be so bad that you will need to fix it yourself.
Tadpole9181
17 hours ago
Filmmaker is warm because it follows the standardized D6500 whitepoint. But that's the monitor whitepoint it is mastered against, and how it's intended to be seen.
TV makers always ship their sets at a much higher color temperature by default because blue tones show off colors better.
As a result of both that familiarity and the better saturation, most people don't like filmmaker when they try to use it at first. After a few weeks, though, you'll be wondering why you ever liked the oversaturated neons and severely off brightness curve of other modes.
Or not, do whatever you want, it's your TV!
mkozlows
16 hours ago
The whites in Filmmaker Mode are not off. They'll look warm to you if you're used to the too-blue settings, but they're completely and measurably correct.
I'd suggest living with it for a while; if you do, you'll quickly get used to it, and then going to the "standard" (sic) setting will look too blue.
gchamonlive
14 hours ago
The problem is that comparing to all the monitors I have, specifically the one in my Lenovo Yoga OLED that is supposed to be very accurate, whites are very warm in filmmaker mode. What's that about?
mkozlows
12 hours ago
Your monitor is probably set to the wrong settings for film content. Almost all monitors are set to a cool white point out of the box. If you're not producing film or color calibrated photography on your monitor, there is no standard white temperature for PC displays.
empiricus
15 hours ago
still looks like yellow piss.
empiricus
15 hours ago
Disclaimer: I prefer movies to look like reality, but apparently this is far away from "artistic purpose".
Forgeties79
12 hours ago
What does “like reality” mean?
empiricus
3 hours ago
It means that the colors should be correct. The sky on tv should look like the sky. The grass on tv should look like grass. If I look at the screen and then I look outside, it should look the same. HDR screens and sensors are getting pretty close, but almost everyone is using color grading so the advantage is gone. And after colors, don't get me started about motion and the 24fps abomination.
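The 24fps complaint has a concrete mechanism on 60 Hz displays: 3:2 pulldown. A minimal sketch (the function and defaults are illustrative, not any TV's actual cadence logic):

```python
import math

# 24 film frames have to fill 60 display refreshes per second, i.e.
# 2.5 refreshes per frame. A frame can only be held for a whole number
# of refreshes, so frames alternate between 3 and 2 refreshes
# (3:2 pulldown) and motion advances unevenly -- that's judder.

def pulldown_pattern(film_fps=24, display_hz=60, frames=6):
    """Refreshes each film frame is held for, via cumulative rounding."""
    bounds = [math.floor(i * display_hz / film_fps + 0.5)
              for i in range(frames + 1)]
    return [b - a for a, b in zip(bounds, bounds[1:])]

pattern = pulldown_pattern()
print(pattern)  # the uneven hold times are the judder
```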
nevon
a day ago
I'm sure part of it is so that marketing can say that their TV has new putz-tech smooth vibes AI 2.0, but honestly I also see this same thing happen with products aimed at technical people who would benefit from actually knowing what a particular feature or setting really is. Even in my own work on tools aimed at developers, non-technical stakeholders push really hard to dumb down and hide what things really are, believing that makes the tools easier to use, when really it just makes it more confusing for the users.
consp
a day ago
I don't think you are the target audience of the dumbed-down part; the people paying them for it are. They don't need the detailed documentation on those things, so why make it?
TeMPOraL
a day ago
> It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.
It would also help if there was a common, universal, perfect "reference TV" to aim for (or multiple such references for different use cases), with the job of the TV being to approximate this reference as closely as possible.
Alas, much like documenting the features, this would turn TVs into commodities, which is what consumers want, but TV vendors very much don't.
Avery3R
a day ago
"reference TVs" exist, they're what movies/tv shows are mastered on, e.g. https://flandersscientific.com/XMP551/
0_____0
20 hours ago
I wonder if there's a video equivalent to the Yamaha NS-10[1], a studio monitor (audio) that (simplifying) sounds bad enough that audio engineers reckon if they can make the mix sound good on them, they'll sound alright on just about anything.
SilasX
18 hours ago
Probably not, or they don't go by it, since there seems to be a massive problem with people being unable to hear dialogue well enough to not need subtitles.
https://news.ycombinator.com/item?id=37218711
It was a real eye(ear?)-opener to watch Seinfeld on Netflix and suddenly have no problem understanding what they're saying. They solved the problem before, they just ... unsolved it.
olyjohn
14 hours ago
My favorite thing about Kodi is an audio setting that boosts the center channel. Since most speech comes through that, it generally just turns up the voices, and the music and sound effects stay at the same level. It's a godsend. Also another great reason to have a nice backup collection on a hard drive.
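The center-boost trick generalizes beyond Kodi. A minimal sketch of a 5.1-to-stereo downmix with a dialogue (center) gain, using common -3 dB coefficients (the gains and coefficients here are illustrative, not Kodi's actual DSP; the LFE channel is dropped, as stereo downmixes often do):

```python
import math

def downmix_stereo(fl, fr, c, lfe, sl, sr, center_gain=1.0):
    """Downmix one 5.1 sample frame to stereo.

    Mixes the center into both left and right with the common
    1/sqrt(2) (-3 dB) coefficient; center_gain > 1 boosts dialogue
    relative to the music/effects carried by the other channels.
    """
    a = 1.0 / math.sqrt(2.0)
    left = fl + a * center_gain * c + a * sl
    right = fr + a * center_gain * c + a * sr
    return left, right

# Same frame, with and without a center boost (gain values invented):
plain = downmix_stereo(0.2, 0.2, 0.5, 0.0, 0.1, 0.1)
boosted = downmix_stereo(0.2, 0.2, 0.5, 0.0, 0.1, 0.1, center_gain=2.0)
```

Only the center contribution changes between the two calls, which is why the voices get louder while everything else stays put.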
estebank
16 hours ago
It's similar to watching movies from before the mid-2000s (I place the inflection point around Collateral in 2004): after that you get overly dark scenes where you can't make out anything, while in anything earlier the night scenes let you clearly make out the setting, and the focused actors/props are clearly visible.
Watch An American Werewolf in London, Strange Days, True Lies, Blade Runner, or any other movie from the film era up to the start of digital, and you can see that the sets are incredibly well lit. On film they couldn't afford to reshoot and didn't have an immediate view of how everything in the frame turned out, so they had to be conservative. They didn't have per-pixel brightness manipulation (dodging and burning were film techniques that could technically have been applied per frame, but good luck doing that at any reasonable expense or in any reasonable amount of time). They didn't have hyper-fast color film stock they could use (ISO 800 was about the fastest you could get, and it was a clear downgrade from anything slower).
The advent of digital filmmaking, once sensors reached ISO 1600/3200 with reasonable image quality, is when the allure of the time/cost savings of not lighting heavily for every scene reared its ugly head, and by the 2020s you get the "Netflix look" from studios optimizing for "the cheapest possible thing we can get out the door" (the most expensive thing in any production is filming on location; a producer will want to squeeze every minute out of that, with the smallest crew they can get away with).
Y_Y
21 hours ago
$21k for a 55-inch 4K is rough, but this thing must be super delicate because basic US shipping is $500.
(Still cheaper than a Netflix subscription though.)
mjlee
13 hours ago
If you account for the wastage/insurance costs using standard freight carriers that seems reasonable to me as a proportion of value. I’m sure this is shipped insured, well packaged and on a pallet.
Walmart might be able to resell a damaged/open box $2k TV at a discount, but I don’t think that’s so easy for speciality calibrated equipment.
dylan604
20 hours ago
Reference monitor pricing has never been anywhere near something mere mortals could afford. The $21k for 55" you gave is actually well under half the $1k+ per inch I'm used to seeing from Sony.
BikiniPrince
18 hours ago
I disable all video processing features and calibrate my sets. Bought a meter years ago and it’s given me endless value.
Melatonic
17 hours ago
Yup - this is the way. Your room color and lighting affect your TV, so proper calibration with a meter is always ideal.
rsanheim
14 hours ago
These exist, typically made by Panasonic or Sony, and cost upwards of 20k USD. HDTVtest has compared them to the top OLED consumer tvs in the past. Film studios use the reference models for their editing and mastering work.
Sony specifically targets the reference with their final calibration on their top TVs, assuming you are in Cinema or Dolby Vision mode, or whatever they call it this year.
mjklin
21 hours ago
My local hummus factory puts the product destined for Costco into a different sized tub than the one destined for Walmart. Companies want to make it hard for the consumer to compare.
megablast
9 hours ago
You think the factory decided this?
mjklin
9 hours ago
The sizes were requested by the companies, the tour guide pointed this out in answer to questions.
lotsofpulp
21 hours ago
Costco’s whole thing is selling larger quantities, most times at a lower per unit price than other retailers such as Walmart. Walmart’s wholesale competitor to Costco is Sam’s Club. Also, Costco’s price labels always show the per unit price of the product (as do Walmart’s, in my experience).
mjklin
9 hours ago
The ones I’m talking about were only subtly different, like 22 oz vs 24 oz. To me it was obvious what they were doing, shoppers couldn’t compare same-size units and they could have more freedom with prices.
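The mismatch is easy to see with a per-unit comparison. A toy sketch (only the tub sizes come from the comment above; the prices are invented for illustration):

```python
# With different tub sizes (22 oz vs 24 oz) the shelf prices aren't
# directly comparable; only the price per ounce is.

def price_per_oz(price_dollars, ounces):
    return price_dollars / ounces

tub_a = price_per_oz(5.99, 22)  # hypothetical 22 oz tub
tub_b = price_per_oz(6.49, 24)  # hypothetical 24 oz tub

# Despite the higher sticker price, the bigger tub is cheaper per ounce
# here -- which you only see if you do the division.
print(f"22 oz: ${tub_a:.3f}/oz, 24 oz: ${tub_b:.3f}/oz")
```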
SoftTalker
20 hours ago
Often a false economy. My MIL shops at Sam's Club, and ends up throwing half her food away because she cannot eat it all before it expires. I've told her that those dates often don't mean the food is instantly "bad" the next day but she refuses to touch anything that is "expired."
RajT88
18 hours ago
My wife is the same way - the "best by" date is just a date they put for best "freshness". "Sell by" date is similar. It's not about safety.
My wife grew up in a hot and humid climate where things went bad quickly, so this tendency doesn't come from nowhere. Her whole family now lives in the US midwest, and there are similar arguments between her siblings and their spouses.
Scoundreller
19 hours ago
Also: freezer
drdec
20 hours ago
Showing a unit price on the label is a requirement of US law.
pixl97
18 hours ago
Which unit is the fun game that gets played. I've seen way too many products right beside each other that use different measurements.
lotsofpulp
14 hours ago
Most people will have devices that can easily convert measurements to the desired unit.
WillPostForFood
16 hours ago
There is no federal law requiring unit pricing, but the NIST has guidelines that most grocery stores follow voluntarily. 9 states have adopted the guidelines as law.
https://www.nist.gov/system/files/documents/2023/02/09/2023%...
mkozlows
16 hours ago
There is! That is precisely how TVs work! Specs like BT.2020 and BT.2100 define the color primaries, white point, and how colors and brightness levels should be represented. Other specs define other elements of the signal. SMPTE ST 2080 defines what the mastering environment should be, which is where you get the recommendations for bias lighting.
This is all out there -- but consumers DO NOT want it, because in a back-to-back comparison, they believe they want (as you'll see in other messages in this thread) displays that are over-bright, over-blue, over-saturated, and over-contrasty. And so that's what they get.
But if you want a perfect reference TV, that's what Filmmaker Mode is for, if you've got a TV maker that's even trying.
pmontra
a day ago
They will set up their TVs with whatever settings make them sell better than the other TVs in the shop.
vladvasiliu
20 hours ago
I don't particularly like that, but even so, it doesn't preclude having a "standard" or "no enhancement" option, even if it's not the default.
On my TCL TV I can turn off "smart" image and a bunch of other crap, and there's a "standard" image mode. But I'm not convinced that's actually "as close to reference as the panel can get". One reason is that there is noticeable input lag when connected to a pc, whereas if I switch it to "pc", the lag is basically gone, but the image looks different. So I have no idea which is the "standard" one.
Ironically, when I first turned it on, all the "smart" things were off.
Melatonic
17 hours ago
Sometimes PC mode reduces image quality (like lowering bit depth) in exchange for lower input lag.
vladvasiliu
2 hours ago
Is there a way to verify this other than breaking out a colorimeter (which I happen to have)?
thesuitonym
19 hours ago
I'm not certain this is true. TVs have become so ludicrously inexpensive that it seems the only criteria consumers shop on are a bigger screen and a lower price.
KronisLV
5 hours ago
The whole comment 100% matches my experience with any and every BIOS setup out there.
crabmusket
a day ago
"Our users are morons who can barely read, let alone read a manual", meet "our users can definitely figure out how to use our app without a manual".
deep_merge
17 hours ago
I just went through this learning curve with my new Sony Bravia 8 II.
I also auditioned the LG G5.
I calibrated both of them. It is not that much effort after some research on avsforum.com. I think this task would be fairly trivial for the hackernews crowd.
drob518
18 hours ago
Agreed. And I’m not going to flip my TV’s mode every time I watch a new show. I need something that does a good job on average, where I can set it and forget it.
fuzzy_lumpkins
4 hours ago
exactly. the only adjustment I need to be making is hdmi input and volume.
hsbauauvhabzb
a day ago
The purpose of the naming is generally to overwhelm consumers and drive long-term repeat buys. You can't remember if your TV has the fitzbuzz, but you're damn sure this fancy new TV in the store looks a hell of a lot better than your current TV, and they're really pushing this fitzbuzz thing.
shakna
a day ago
Cynically, I think it's a bit, just a little, to do with how we handle manuals today.
It wasn't that long ago that the manual spelled everything out in enough detail that a kid could understand it, absorb it, and decide he was going to dive in on his own and end up in the industry. I wouldn't have broken or created nearly as much without it.
But, a few things challenged the norm. For many, many reasons, manuals became less about the specification and more about the functionality. Then they became even more simplified, because of the need to translate it into thirty different languages automatically. And even smaller, to discourage people from blaming the company rather than themselves, by never admitting anything in the manual.
What I would do for a return to fault repair guides [0].
[0] https://archive.org/details/olivetti-linea-98-service-manual...
keyringlight
a day ago
Another factor is the increased importance of software part of the product, and how that changes via updates that can make a manual outdated. Or at least a printed manual, so if they're doing updates to product launch it might not match what a customer gets straight out of the box or any later production runs where new firmware is included. It would be somewhat mitigated if there was an onus to keep online/downloadable manuals updated alongside the software. I know my motherboard BIOS no longer matches the manual, but even then most descriptions are so simple they do nothing more than list the options with no explanation.
pixl97
18 hours ago
Yep, old features can disappear, new features can be added, the whole product can even be enshittified.
Updates are a mixed bag.
_factor
19 hours ago
Going a level deeper, more information can be gleaned from how closely modern technology mimics kids' toys that don't require manuals.
A punch card machine certainly requires specs, and would not be confused with a toy.
A server rack, same, but the manuals are pieced out and specific, with details being lost.
You’ll notice anything with dangerous implications naturally wards off tampering near natively.
Desktop and laptop computers sit in between, depending on sharp edges and design language, and whether they use a touch screen. Almost kids' toys; the manual is now collective common sense for most.
Tablet, colorful case, basically a toy. Ask how many people using one can write bit transition diagrams for or/and, let alone xor.
We’ve drifted far away from where we started. Part of me feels like the youth are losing their childhoods earlier and earlier as our technology becomes easier to use. Being cynical of course.
omnicognate
a day ago
That doesn't preclude clearly documenting what the feature does somewhere in the manual or online. People who either don't care or don't have the mental capacity to understand it won't read it. People who care a lot, such as specialist reviewers or your competitors, will figure it out anyway. I don't see any downside to adding the documentation for the benefit of paying customers who want to make an informed choice about when to use the feature, even in this cynical world view.
nkrisc
a day ago
That costs money.
hsbauauvhabzb
13 hours ago
Why let a consumer educate themselves as easily as possible when it’s more profitable to deter that behaviour and keep you confused? Especially when some of the tech is entirely false (iirc about a decade ago, TVs were advertised as ‘360hz’ which was not related to the refresh rates).
I’m with you personally, but the companies that sell TVs are not.
observationist
15 hours ago
TVs are on their way to free, and are thoroughly enshittified. The consumer is the product, so compliance with consumer preferences is going to plummet. They don't care if you know what you want, you're going to get what they provide.
They want a spy device in your house, recording and sending screenshots and audio clips to their servers, providing hooks into every piece of media you consume, allowing them a detailed profile of you and your household. By purchasing the device, you're agreeing to waive any and all expectations of privacy.
Your best bet is to get a projector, or spend thousands to get an actual dumb display. TVs are a lost cause - they've discovered how to exploit users and there's no going back.
hodanli
21 hours ago
worst is graphics settings for games. needs a PhD to understand.
ludicrousdispla
20 hours ago
They just need 3 settings for games, 1) make room hot, 2) make room warm, 3) maintain room temperature.
pitched
16 hours ago
I use that first setting to keep my basement living room warm in the winter.