Television viewing and cognitive decline in older age (nature.com)
289 points by casefields on Jan 5, 2020 | 142 comments



It's like when they say "wine is good for your heart". And then a study analyzed over three million grocery receipts:

"The people who bought wine were more likely to place olives, low-fat cheese, fruits and vegetables, low-fat meat, spices, and tea in their carts. Beer drinkers, on the other hand, were more likely to reach for the chips, ketchup, margarine, sugar, ready-cooked meals, and soft drinks."

Schatzker, Mark. The Dorito Effect: The Surprising New Truth About Food and Flavor (p. 155). Simon & Schuster. Kindle Edition.

EDIT: What I am saying is: yes, television shuts off the thinking part of your brain, but then you also don't move, don't exercise. The causality here is VERY shaky.


The authors of the study are clearly and explicitly aware of this (and other) confounding issues. Do you have reason to think the authors made stronger claims for causality than can be supported by their evidence?

In these sorts of studies, causality is not simple and is rarely established in one go; it is approached incrementally, as potential factors are investigated one at a time.


> Do you have reason to think the authors made stronger claims for causality than can be supported by their evidence?

They made no claims for causality.


Maybe. What are old people watching? If it's anything like what I see at my relatives' and at assisted living facilities, it's mostly cable news and weather.

Staring at repetitive, content-free crap cannot be good for your brain.


Totally. But who's to say that watching dumb television causes most of the decline, versus a neglected brain taking the shortcut to comfortable brain-dead TV?

I would wager it's more of a vicious cycle, with other factors at play.


That's a great question, and the conclusions I draw from observation are just conjecture.

The thing that bugs me about this genre of dumb TV is that the constant attention-grabbing and the little dopamine hits from whatever the crisis du jour is have to have some impact.

My in-laws called me this summer genuinely concerned about thunderstorms approaching some place in Alabama. We live like 1500 miles away.


Over 50 here, and I would like to believe content matters. After buying my first large screen, I tune in a bit of HDTV - cooking shows or NHK, which is ironic because that channel is filled with science/documentaries related to the aging population. Their 72 Hours episodes are amazing.

And I've stopped criticizing the parental units over the holidays for watching too many old westerns/noirs. There was an ancient Audie Murphy movie on that could be plopped into modern-day anti-corporate/anti-rich messaging, but the plot was so complex it would never get made today.


We LOVE NHK. Best app on the Apple TV. Just last night we watched a documentary about Edo fires from 400 to 200 years ago.


> Staring at repetitive, content-free crap cannot be good for your brain.

"Good for your brain" is an incredibly loose criteria. I think for many definitions of that statement there's no clear evidence to to support it.

Furthermore there's a huge cultural bias towards believing that statement - which makes me even more wary of accepting it unconditionally.


There's obviously some sort of spectrum regarding these things. If you were forced to sit in front of a screen for 4 hours a day showing you horrifying content or extreme slogans, I think there would be no question that doing so is not good for your mental health. (Think of the anxiety reported by social media content screeners.)

News isn’t that, but where does it lie on the spectrum? I’m not saying I have proof, just presenting conjecture, and I’d love to read about research in this area.


They did also mention that video games were better for cognition, so it could be that passive activities are worse than active. Not necessarily to do with exercise.


It totally makes sense that video games are good for cognition - reaction training, problem solving. BUT, when you are younger, your body can take a hit if you abuse it.

Younger people binge-drink and watch Netflix for hours, and can still be sharp and fit. But activities become much more important after 30, once you've drained the gift of genes and youth.

I bet there is not a lot of data on the results of 70-year-olds playing Half-Life :)

Either way, I always view these studies skeptically when they take two random things and tell you that they somehow directly relate to each other - especially with humans, where the number of variables at play is nearly infinite.


> yes, television shuts off the thinking part of your brain

Is there any scientific evidence to support that? At the MRI/brain activity level?


I swear I've read about it in one of the books I had, but I recently did a house cleaning of my Kindle collection and now I can't find it. It had to do with brain activity shifting to the right hemisphere when watching the box - the left side being responsible for logic. So.....


Don't forget to mention what people are actually watching on TV. Quite a difference between a documentary and Oprah.


A lot of documentaries are even worse than Oprah in terms of bias and misinformation.


Sure, there is bias and misinformation in everything. But if you watch enough different sources, you can form a balanced opinion yourself. The point I was trying to make, however, is that there is a difference in what you watch on TV.


Says the guy who didn’t read past the title. “Hard to prove” doesn’t mean “impossible to prove.”


Maybe television is the reason they don't move?


>Immediate and delayed verbal memory were assessed using a word learning task in which participants were presented with 10 common words by a taped voice, one every 2 seconds. Participants had to recall as many words as possible immediately and after a short delay during which they completed other cognitive tests.

This is the measure that declined in heavy television watchers. It is also a task that television watching doesn't require, since recall of previous spoken words is rarely needed to understand the next video sequence.

It's also a task that I don't do well on. I do much better if the words are presented in writing. I can watch a TV show and not remember a thing about it later. It may be a way the brain has of protecting memory from being overwritten by nonsense.


If your health is bad you probably watch more TV and listen to the radio, though, since you can't go out as easily.


True, although I know people who neither watch much TV nor have lots of social interaction. They engage in other passive activities like reading, or they have a variety of hobbies, sports, etc. Some people have little need to listen to chatter, either from a box or face-to-face.


The question is what causes what. We know that TV and cognitive decline in this population are associated.

Does television watching cause cognitive decline?

Does cognitive decline cause television watching?

Does some other factor cause both television watching and cognitive decline?

Knowing that some people are inactive and read books doesn't help much in probing this, because we're not asserting that cognitive decline always causes television watching, or whatever.


What if you are from a developing country and have to rely on subtitles to understand the sequence, and you can't read fast enough, so you sometimes need to rewind to retain the sequence? Is this good or bad?


The linear presentation of audio or video is inefficient because it is tedious to refer back in order to recover a train of thought or search for previous material.

Some MOOC courses are given primarily by video lectures with a side-by-side transcript. I can read faster than the audio, even with the audio set to 1.5x speed. It's also easier to skip back and re-scan the transcript. Reading the transcript at the same time as listening to the audio enhances retention.


> I do much better if the words are presented in writing. I can watch a TV show and not remember a thing about it later.

Maybe that is because when we read a story, we imagine much more about it than when watching it on a screen, and that effort makes more of an impression?


Is there a reason to believe that there's a difference between TV viewing and passive textual content consumption on the web? I used to think so, but nowadays I'm not so sure. Passive consumption of media designed to entertain, be it visual or textual, seems too similar.


The article addresses this directly:

"Indeed, it is of note that other screen-based activities such as video gaming and using the internet which involve more interaction can in fact have cognitive benefits, leading to higher levels of frontal and central EEG activity(21) and enhanced visual-spatial skills, problem-solving skills and cognition in older age(16), suggesting that it is this alert-passive interaction that is a key feature of the effects of television viewing."

Also note in Table 1 there are activities such as "using the internet" and "reading a daily newspaper" that were examined.


Note that the article uses "verbal memory and semantic fluency" as a proxy for general cognition, so readers will almost automatically score better than viewers. They also seem to have modeled all factors as linear, and "cannot rule out (...) the possibility of reverse causality".

So to answer your question: it depends on whether you consider all assumptions of the article relevant. One thing the article certainly doesn't contradict is the old adage "use it or lose it".


Reverse causality would be an interesting conclusion. Essentially asserting that articulate people with good memories are less likely to find TV appealing and would prefer to read or do something more interactive instead.


There are probably a lot of reasons why people may watch a lot of TV, including depression or other forms of mental and emotional pain.


Or: it’s a physiology / social sciences paper, so probably nonsense / narrow in scope / “more research is needed”.


You could probably say the same thing about 90% of scientific papers.

Almost all research is narrow in scope, and more research is always needed.

And calling other research fields "nonsense" is also something that researchers from all fields do.


Older people have had more practice operating the TV. Just getting the phone to work is often a mental challenge.

A ton of them are constantly and inadvertently opening Control Center and then accidentally turning on airplane mode or whatever. Another issue is that software changes constantly, and after any slight interface change they have no clue what to do.


At the farmers market earlier this fall, the seller was having a hell of a time getting his phone to work so that he could swipe our card on his Square.

After looking a little, I saw that he was wrapping his hand behind the phone far enough that the fingers of that hand were touching the screen while he tried to poke buttons with his other hand.

Since there were multiple fingers on the screen, it kept interpreting them as gestures and not doing anything.

Further, his case made it so the Square dongle thing wouldn't stay plugged in all the way. So when he finally did get it to the swipe screen, it wouldn't read the card.

I finally ended up offering to look at it myself and manually entering my card info. The guy was so visibly relieved to be out of that situation.

Smartphones aren't nearly as user-friendly as a lot of tech people like to think.


> he was wrapping his hand behind the phone far enough that the fingers of that hand were touching the screen

Yeah, the perfect amount of bezel is significantly more than zero; I really don't get the hype for maximising screen/body ratios. It's just impractical, at least without serious measures against wrap-around touches, like having a fully touch-enabled side just to know which touches near the edges of the screen to ignore.


Although I'm sure this gets said fairly often around HN, smartphone "advancements" stopped being advancements a while ago. They're closer to fads now, and the cheaper brands have to chase the luxury brands, because the luxury brands get to define what is trendy. Much the same way that the rich get to set fashion trends, and the rest have to follow.

I'm not denigrating the upper class, fashion, or trends in general -- just pointing out that fashionable trends are different from technological advancement.


On zero-bezel phones the software makes a huge difference. I have very small hands and can barely grip friends' XL phones. If they have some cheap off-brand, discount prepaid, or even a fancy Chinese Android, I am constantly activating the screen despite barely touching the edges, whereas I never have the problem on a Google Pixel or an iPhone.


Any tool requires learning and understanding- a time and effort investment. Using a cash register also requires training.

The more power you put into a tool, (frequently) the more complex the interface becomes. This is a trade-off that must be weighed. Either you make your devices do less, or you require your users to learn how to use their tools.

I do not think we should sacrifice the power of our devices in exchange for improved ease of use.

Final note: Of course better interface design can help mitigate the increase in complexity of the tool with the increase in power of the tool, but I believe this has a limited effect.


Sure, you think that now, because you're in the group that interfaces are designed for. But once you're the guy at the farmer's market, you'll think differently. Power only matters if one can get something done with it. A kilogram of plutonium is enormously powerful, but that's no reason to carry it around in your pocket.

I'll also note that we have been sacrificing the power of our devices in exchange for improved ease of use since the 1980s, and it's great. Manually configuring X-Windows in those days was enormously powerful, and enormously user-hostile. We sacrifice vast amounts of computing power on the altar of usability, and that's exactly what we should be doing. Contrary to my early computing experience, I have never once had to rebuild the kernel on my phone, and I am very happy with that.

And third, you mostly create a false dichotomy here. If we're talking about a typewriter or a steam loom or something, yes, power and user complexity go together. But the true power of software is the ability to hide most of the complexity most of the time. Most of the work of software development is pushing complexity down, wrapping it in abstractions that provide the next level up with a clean interface.

A perfect example here is Google Search. It's a magic box into which I can type (and now, just say) anything, and it will apparently intuit what I want. The "powerful" version of that interface was 1980s search, where you carefully specified the various fields you wanted to search (because all data had to be carefully fed into the system's structure) and a painfully constructed, manually stemmed, boolean query. For maybe 1 search in 500, I still want that kind of power. But for the rest of the time, I'm very grateful that thousands of nice people at Google have made it so that their search's power has increased continuously, with users needing to learn less and less to get good results.


People should know that that power exists though. It should be available even if it's not always needed.


> Any tool requires learning and understanding

No, it's just bad design. Holding a phone only by the sides is neither comfortable nor very safe for the phone.


> Another issue is that software changes constantly, and after any slight interface change they have no clue what to do.

I have my own story from today: I've had iOS devices since forever, and still I had to google how to access some option. The damn things move between versions, and some become invisible until you read somewhere where they are. It's really that bad.

Previously on HN: “perfectly cropped”:

https://news.ycombinator.com/item?id=21353920


How does one passively consume textual content?

Reading engages the brain in an entirely different way than viewing images does. When you read you interpret meaning from text and construct a mental model of what's being described.


I've long struggled with retaining what I read. It feels like I've trained my brain to enjoy the quick dopamine hits from, for example, reading short stories or, say, seeing a problem someone has that my brain can pretend to solve. Sites like Reddit or StackExchange.

Yes, maybe I interpret text in my head and build mental models, but my brain just throws them away as soon as the story is over. Ask me what I read about 5 minutes later and I swear, I won't be able to tell.


Ingesting data and evaluating it according to some model sounds like what a good cluster node would do. No need to permanently store either the data or the output on the node itself, but you might want it to update its process via learning.


Scrolling through reddit or HN threads takes almost no mental effort. I’d dare say it’s the sort of thing people do to relax after reading “hard” content like novels or research. Speaking personally, I can feel burnt out from reading a novel for a while. On the other hand, I can mindlessly scroll through internet comments, read them in full, and not notice that hours have passed and feel zero exhaustion from it.


I don’t feel exhausted, rather: bored by fiction and non-fiction alike.

Having said that, it often feels like the internet speaks with one voice, even when it’s disagreeing with itself.

Rarely does anyone say or write anything original.

I guess HN feels more familiar though, more like family. And we’re probably driven by millions of years of evolution to seek that out.


Yes, the Internet's voice is very homogenous. I crave the tiny YouTube videos posted by people who have truly different views and articulate them well.

There's too much everyday garbage on the web; it's disappointing that quantity is trumping form and quality. If YouTube loosened up its monetization rules and went back to letting producers set the form, we could have more focused videos. The ten-minute spam videos are the plague.

HN is more rational and quantitative than other sites in general so I prefer this as a counter to other more abstract information sources.


I believe you can read a novel more or less mindlessly, as you can read HN. But HN is more surprising, and it's like a slot machine (in the analogy that Tristan Harris makes with social media).

For me it is like System 1 or System 2 taking charge (as in the book "Thinking Fast and Slow", by Kahneman).


You can't passively consume anything. You say you have to interpret meaning from words. Sure, but what about faces? Tone of voice? Body language?


You're correct. It seems, though, that there's something important about the level of - I can't think of a less ridiculous phrase than - adaptive engagement you marshal while performing tasks.

A director watching a cut of a movie weighs every detail with a focus that is quite unlike the mental state of the public audience. I think that changes the way they watch all films: always dissecting.

So I'm assuming the importance of television isn't that it's television, but that it's a minimally stimulating activity, so easily within your cognitive capacity as to require neither physical nor mental discipline nor adaptation, requiring just enough from you to displace sleep.

I don't watch a lot of TV, but I choose to take this as a warning that I need to do less scrolling.

Edit: One thing in particular that stands out is the discussion of television being fast-paced, fleeting information. I wonder if I'm viewing adaptation incorrectly here - what if we are exhausting our adaptive capacity, training ourselves on static?


> A director watching a cut of a movie weighs every detail with a focus that is quite unlike the mental state of the public audience. I think that changes the way they watch all films: always dissecting.

Viewers do this as well. Analysis of literature and of television/film overlap to some degree, and both require an active consumer of the material.

I think the question is: Given an equal amount of effort on the part of the consumer, does one medium convey more benefits than the other?


My completely unfounded theory is that reading and watching both exercise different portions of the brain, like running and swimming. But like running and swimming, there is a wide spectrum of available exercise intensity - and even in their most intense forms they don’t exercise the whole of the brain.


1965: He is so smart, he is watching so much TV

1990: He is so dumb, he is watching TV all day

2000: He is so bright, he spends a lot of time on the Internet

2020: He is so dumb, he spends all day on the Internet.


> 1965: He is so smart, he is watching so much TV

I'm not sure this was ever the case.

> Newton N. Minow spoke of the "vast wasteland" that was the television programming of the day in his 1961 speech.

https://en.wikipedia.org/wiki/Social_aspects_of_television#N...


One reason for this pattern is classism: the novelties of the upper classes are held up as glamorous and even virtuous, and that goes away when something is mass-marketed and becomes popular among the lower classes. Eventually, one might even hear words like "disgusting" and "poison" used to describe something that was once considered perfectly wholesome.


The difference is in quality and quality.

Both media began with precious little content, most of it relatively high in value for the times and circumstances. (Even if it could be argued some content was low in quality or utility, the simple access to a much wider array of thought was valuable.)

But once they matured, of course they filled up with trash. Just like every medium.

Given the quality of most books these days, someone “reading a lot” doesn’t mean much anymore, for instance.


Do you mean quality and quantity?


> 1965: He is so smart, he is watching so much TV

This never happened.


Well, you have to remember the memes so you can exhale more air from your nose when you see them next time.


If you're talking about listicles (i.e. 'Top 10 X'), then yeah, they're probably just as bad, but the other reply you got makes the good point that reading is fundamentally an active cognitive activity in a way TV is not.


I’m not really convinced. That just sounds like an issue with quality of content. Viewers and readers can both be active or passive.


They can be. However, reading is probably "by itself" more active than watching TV, irrespective of the content. Probably in the same vein that handwriting is better for your brain than typing.


> handwriting is better for your brain than typing.

{{citation needed}}



This citation does not support the claim. The paper is about children at the start of literacy development.

Abstract:

> Digital writing devices associated with the use of computers, tablet PCs, or mobile phones are increasingly replacing writing by hand. It is, however, controversially discussed how writing modes influence reading and writing performance in children at the start of literacy. On the one hand, the easiness of typing on digital devices may accelerate reading and writing in young children, who have less developed sensory-motor skills. On the other hand, the meaningful coupling between action and perception during handwriting, which establishes sensory-motor memory traces, could facilitate written language acquisition. In order to decide between these theoretical alternatives, for the present study, we developed an intense training program for preschool children attending the German kindergarten with 16 training sessions. Using closely matched letter learning games, eight letters of the German alphabet were trained either by handwriting with a pen on a sheet of paper or by typing on a computer keyboard. Letter recognition, naming, and writing performance as well as word reading and writing performance were assessed. Results did not indicate a superiority of typing training over handwriting training in any of these tasks. In contrast, handwriting training was superior to typing training in word writing, and, as a tendency, in word reading. The results of our study, therefore, support theories of action-perception coupling assuming a facilitatory influence of sensory-motor representations established during handwriting on reading and writing.


thank you.


I'm not sure how you can remove the reader/viewer and the content from the act of reading/viewing to make a claim that reading is 'by itself' the more active activity.

At best I think you could make the claim that, for the average person, as typically consumed, reading results in more activity in the brain.


I know a ton of these older adults. I think the true problem is that they are not socializing enough, which is the cause of the mental decline and, often, paranoia.

Some of them have moved somewhere cheaper to retire, where they don't have many friends, but most of the ones I know just watch TV instead of calling a friend to do something. The friend is doing the same, even though they would both love to hang out. I've tried nudging them, with no success.


My grandmother fought against leaving her house for a retirement home for a long time. When she was about 85 she finally gave in and was surprised to find out she loved it there.

In her home she mostly passed the time watching TV and knitting. In the retirement home, she joined a choir, she ate with friends every day, she took advantage of regular trips to restaurants and shopping malls, and she was constantly playing cards.

Having friends around and things to do with them was huge for her happiness.


I think the problem is deeper than just watching TV. The real question is what activities it replaces. 3.5 hours per day is a significant portion of one's day once you subtract sleep, food, shopping, doctor visits, and other basic needs.

I wouldn't be surprised if those who watch less TV fill that time with creative hobbies, socialization, reading thought-provoking books, etc. As a matter of fact, I would love to see a paper trying to categorize those activities and correlate the prevalent groups with cognitive performance.


Roald Dahl, the famous writer, used to love reading news and the like, saying he could do it all day. This problem has likely been around for a long time.

I'd be curious what people do in cultures without the drip feed and their reaction to it.


I've often wondered about the effect of refresh rate. Back in the NTSC/PAL days, American TV ran at 60 Hz and UK TV at 50 Hz (the same frequencies as the respective electricity supplies, by the way). American friends/colleagues coming to the UK would, for the first few weeks of watching TV, complain about the flicker. The reason was that their visual systems had gotten used to the higher refresh rate, so suddenly seeing something lower stood out - and once noticed, it was hard to un-notice, making for a very visible flicker.

I've often wondered what effects refresh rate has upon the brain and how fast/how we process information and equally, how we adapt.

So with all that in mind, I wonder how old people who grew up with 60 Hz TV compare to those accustomed to 50 Hz, and whether the statistics of cognitive decline differ - more interesting questions.


To me, both 50 and 60 Hz are horribly flickery. Back when I used a CRT I couldn't stand anything below ~85 Hz and would get headaches. Now that screens no longer flicker, I guess in terms of motion fluidity there's a perceivable difference between 50 and 60 Hz. Something I personally cannot unsee, though, is 2:3 pulldown, which you don't see in Europe (at the expense of slightly faster movies).


I'm pretty convinced by the theory that some people see "faster" than others. Most people can be taught to notice flicker, but I really feel like some just see faster naturally.


"Watching television for more than 3.5 hours per day is associated with a dose-response decline in verbal memory over the following six years, independent of confounding variables."

associated

Not "caused".

It is still an interesting study, but this should be the first thing to recognize, whether it claims association or causation.


I also wonder if the type of programming matters: whether watching quiz shows that jog memory and recall is different from watching soap operas, sports, or learning programs, for instance.


It might not be that simple. I think the intellectual effort of these shows varies more within a type than across types. Tipping Point and Only Connect are both quiz shows, but the first has questions like:

How many degrees are there in a circle?

The latter has questions like: what can come fourth in this sequence?

All this happened more or less.

Sherlock Holmes took his bottle from the corner...

There were four of us - George and William Samuel Harris...


To summarise: BBC4 - good, ITV - not so much.


Maybe there’s a progression though?

I have no idea what comes fourth in the sequence. Does that mean I should expect rapid age-related cognitive decline?

I'm probably going to blame that on all the hard drugs I've abused, rather than my lack in other areas.


It was the best of times, it was the worst of times...


The brain thrives on two things: movement and questions. Passively letting an uncritical experience wash over the mind wipes the chalk from the slate.

Having active shared experiences with other people while asking and answering questions is what keeps us us.


> Passively letting an uncritical experience wash over the mind

I don't watch much television but I think that consuming and processing narrative is not entirely passive. Your brain is still doing work to connect dots in stories, you're building internal networks of events and characters.

I think there's a lot more interaction on the receiving end there than you're giving credit.

Truthfully what television removes (though not entirely) from story processing versus reading is a huge chunk of imaginative work and some memory requirements.


You may have a point, but it's very challenging to prove: even the most engaging program can be viewed passively, simply by disengaging focus.


> consuming and processing narrative is not entirely passive.

While I agree with this, modern television requires far less "interaction on the receiving end" than you give it credit for. It lacks novelty.


> modern television [...] lacks novelty.

Can you elaborate on what you mean by “modern”? We’re in a golden age of television, according to some. I can think of at least a dozen shows released in the past few years that have deep and complex storylines.


I think it works both ways: we have so much choice now that you can easily seek out both ends of the spectrum. There is no shortage of mindless TV for people to watch if that is what they are after.


I mean modern broadcast television, the kind with ads.


Wow! I'm 57 years old, so I'm already in the sample set (they started at age 50).

I wonder if my Sunday Afternoon TiVo binges of "Dr. Pimple Popper," "Below Deck," and an occasional "Judge Judy" are making me dumb!? I hope not.

Oh, and I tried the new podiatric reality show: "My Feet Are Killing Me." I think I'm smarter because of it.

Granted, I watch about 4-5 hours of TV a week, not 3.5 hours/day. But only because I don't have enough hours in the day. I enjoy television, and look forward to my Sunday binges.


Anecdotal evidence. Source: my own grandmother (80 years old in 2019).

I have observed in my grandmother that she displays markedly lower cognitive decline when compared to her peers (friends and family around the same age).

Here are some (maybe biased) observations.

In the early 80s (at 40+yo) my grandmother was working as a secretary in a small office and went through the change from mechanical typewriters to electronic ones. While she was doing this, the majority of her friends were 100% housewives.

In the early 90s (at 50+yo) my grandmother learned to use a T9 mobile phone by herself. Her friends didn't have mobile phones for years, and even I would only get a mobile phone much later than her. (In fact, I remember her teaching me how to navigate a Nokia that she gave me when she upgraded.)

In the early 2000s (at 60+yo) my grandmother bought a Windows XP PC, learned how to use it from scratch (she had never used a computer in her life), and started to browse the internet, write emails, copy photos from digital cameras onto the computer, etc. At the same time, her friends had no clue how to use a computer.

In the 2010s my grandmother quickly transitioned to an iPad to browse the web and an iPhone to keep up with grandchildren on social media. Her friends would only come to have smartphones and join the social media revolution in the mid-2010s.


It's not clear what point you're trying to make here.


I'm not really trying to make any point =) I'm just presenting some anecdotal observations.


Your comment started with the phrase "anecdotal evidence". Anecdotal evidence for what?

Note: I think no one is minding the unscientific nature of your anecdote. I just don't understand what point the anecdote is supposed to underline. Your grandma likes technology and therefore she likes technology? Or something?


It is pretty clear what the evidence (though one should call it an observation) suggests: brain stimulation through change, challenge, and active learning kept an individual's cognitive abilities higher than those of peers who went through less brain stimulation.

Sharing this observation is not very scientific, as it would need larger samples and a better-controlled environment, but it's an interesting anecdote. Many of us perhaps see the same trend.

I think it's pretty clear that not stimulating your brain leads to a drop in cognitive abilities. Parallels are made between a trained muscle and a trained brain. We teach kids to memorise poems to stimulate their memory. But we forget about stimulation as we grow older; we tend to retire and accept the status quo that there is the active career, and some day we quit and no one expects anything to be done by retired folks.

I find my own cognitive abilities dropping after a few months of sliding into brain laziness. I find my cognitive speed and performance increase after a few months of rather intense focus on multiple challenging projects. I also find myself unable to perform at high cognitive levels, with my concentration dropping, if I've been burning hard for months without much brain rest and relaxing time. Again, perhaps a good parallel with how muscle performance works.

Anecdotes are interesting, as they help us see trends when correlated with other anecdotes. I rarely hear stories about the opposite phenomenon.


Yup, sorry - I probably phrased it wrongly by including the word "evidence". Too late to edit now.

In my view an anecdote is just that, not supposed to support any broader point or generalization.

It's an individual story; you can make of it what you will! =)


If you’ve got something to say: say it.

If you don’t: probably don’t.

But it says more about HN that your comment is still the top comment after an hour+.


I see, thanks for the explanation. I guess vague comments like mine will tend to be buried after a couple of hours =)


How much television does she watch?


A lot! Mostly football (not the US sport), really any sport that is on TV, plus news and Hollywood movies.

Also, she is a chronic late sleeper (as in going to bed late and waking up late), which both my mom and I seem to have inherited.


There are always outliers.

Some tiny percent of the population can smoke and drink non-stop, never exercise, eat shit constantly, and die by getting hit by a bus while jogging at 02:30.

Still not real helpful for us who didn’t win the genetic lottery.


In a morbid sort of way, I have always admired those who never exercise and still manage to be killed by a bus while jogging. :)


I don't know enough to know the answer to this, but does the paper address the chance that people who have cognitive decline watch more TV - that it is an effect, not a cause?


They say in the study that they looked for reverse causality, and that they checked for study participants who acquired a medical diagnosis of cognitive decline (e.g. dementia) during the study period.


That's exactly what I was thinking. The assumption that the TV is the culprit is so powerful that it is not even questioned.


I wonder where the effect comes from:

- Is it the medium itself (and the fact that we watch it passively)?

- Is it the fact that it replaces other activities?

- Is it the mindless content it shows?

- Is it the stressful content that appears in the news all the time (crimes, natural catastrophes, war, terror attacks, etc.)?

- Is it the way it's shot (with non-stop movement, even when nothing happens, just to keep us watching)?


It's probably because overload of long-term memory is related to Alzheimer's and other dementia: https://www.researchgate.net/publication/12269167_Does_longe...


Instead of watching TV, what if I browse Reddit or HN and keep commenting? How is that any different? Sure, I am not using my brain much when watching TV, but I am not really turning it off either. I am still processing the visuals and in some cases trying to understand or predict what might happen. How different is that from mindless browsing?


A randomized controlled experiment: find a collection of 200 people whom you will be able to test 5 years from now. Send copies of this report to a random selection of 100 of them, and a similar article not about television to the other 100. Test everyone after 5 years.
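
For concreteness, a minimal sketch (in Python; the roster and seed are made up) of just the random-assignment step of that design:

    import random

    def assign_groups(participants, seed=2020):
        """Shuffle once with a fixed seed, then split in half: treatment vs. control."""
        rng = random.Random(seed)        # fixed seed keeps the assignment reproducible
        shuffled = list(participants)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        return shuffled[:half], shuffled[half:]

    roster = [f"participant_{i}" for i in range(200)]    # hypothetical roster
    gets_report, gets_other_article = assign_groups(roster)
    # Mail the TV report to gets_report and the unrelated article to the rest,
    # then test both groups after 5 years and compare outcomes.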


What else are you going to do but watch tv when your brain starts to go?


Perhaps I should curb my Netflix binging habits.


The very definition of bingeing is excess. That it somehow became a normal activity tells you something.


Of course, cognitive decline could also cause television.


Me watch many tv and have great word speaking.


Ma was right, it’ll rot your brain!


I am surprised that 3.5 hours a day is the threshold where decreased performance begins, as in my little life bubble that seems like an abnormally large amount of TV watching. However, Wikipedia[1] tells me the average USA "consumer" watches 4 hours a day! Yikes.

To a layman like me this seems like great research. A lot of effort was put in to draw a rigorous conclusion about the effects of TV watching on some aspect of mental performance. I just read Amusing Ourselves to Death though, and I find myself compelled to be a lot more dramatic about the negatives of TV watching, on the late author's behalf.

Yes, it's pretty bad that (excessive) TV watching literally damages your cognitive ability to some degree, but I would say (and please point me to research) that before that damage starts to show, it has already nearly devastated your personal 'knowledge base'. To try and paraphrase Postman, the content on TV is either trivial, or non-trivial but delivered in such a way as to keep you from holding onto any significance, meaning, or theory. TV as a medium is terribly fractured (commercial breaks, 'Now... This' segues) and fundamentally visual (rather than typographic). Human minds thrive in connectivity, interrelation, and continuity. Our minds thrive in the typographic world. At his most polemic, Postman argues that TV at its worst is totally incoherent, conferring basically no knowledge whatsoever. A person's mind might still perform just fine in tests, but having been starved of knowledge it is practically unintelligent.

Think also of the opportunity cost. At an average of 4 hours a day, a USA 'consumer' watches 1460 hours of TV a year. An average person reads 200-250 words per minute. Taking the lower bound of 200 words per minute, and the average non-fiction book as 50,000 words, they could finish that average non-fiction book in about 4.2 hours. So, they could be reading almost 1 book a day for a whole year instead. Now, that napkin-math result seems implausible to me, but imagine if I was off by an order of magnitude. Say this person could read 35 non-fiction books in a year if they wholesale swapped TV for reading non-fiction: 35 non-fiction books at ~42 hours per book. They would be enormously enriched by this alternative behaviour.
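
Spelling that napkin math out as a tiny script - every number below is one of the assumptions above, nothing more:

    # Napkin math: hours of TV per year vs. books readable in the same time.
    tv_hours_per_year = 4 * 365                  # 4 h/day -> 1460 h/year
    words_per_book = 50_000                      # assumed average non-fiction book
    reading_speed_wpm = 200                      # lower bound of 200-250 wpm

    hours_per_book = words_per_book / reading_speed_wpm / 60   # ~4.17 h
    books_per_year = tv_hours_per_year / hours_per_book        # ~350, almost 1/day

    print(f"{hours_per_book:.2f} h per book, {books_per_year:.0f} books per year")

    # Even an order of magnitude less (35 books/year) still means
    # ~42 hours of attention per book:
    print(f"{tv_hours_per_year / 35:.1f} h per book at 35 books per year")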

As an extreme (but unfortunately not uncommon) example, instead of getting all red in the face hearing about Socialism coming to the USA from Fox News, our dear average USA 'consumer' could have read Das Kapital, and The Soul of Man Under Socialism, and The Road to Wigan Pier, and Capitalism and Freedom, and On Liberty, and... etc etc. You get the point.

So yeah, I'm really onboard with the thesis of Amusing Ourselves To Death.

1. https://en.wikipedia.org/wiki/Television_consumption


They would be enormously enriched by this alternative behaviour.

Why are you assuming that watching television isn't enriching? What makes books better?


I say a bit about why books are better in the paragraph preceding the one containing what you’ve quoted.

I can make a positive case for reading over TV watching, and the book I mention in the comment does just that, but first I’d like to quibble with your first question. I don’t say that watching TV is not enriching in general. Certain television and certain movies are awesome and even indispensable.

I do think books are better than watching TV though. For starters, you are in a thread about how TV is associated with cognitive decline. Reading on the other hand is promoted by the results of a number of studies showing its relationship to improved cognitive performance.

Perhaps the simplest argument I can think of now is this. When we as a society look to create great educators, lawyers, doctors, scientists, and philosophers, we make sure they’re reading a lot. We don’t have them watch TV. When society builds education systems they fill their curriculums with books, whether at the primary, secondary, or tertiary levels. This is still true despite television existing for almost 100 years. The replacement of reading by television has been actively resisted by educational leaders generally. In the most potent cases, professors would find it totally laughable that their graduate curriculums be facilitated by TV watching rather than books.


In the most potent cases, professors would find it totally laughable that their graduate curriculums be facilitated by TV watching rather than books.

If someone told me that the only way to learn linear algebra or calculus was from a textbook, I'd have believed them when I was at university (about 25 years ago). If someone told me that now, I'd send them some great YouTube content that explains it far better than any math textbook I've ever seen. I think that says more about professors than about how people are able to learn difficult subjects.

I don't think books are going to go away, and I think it's important to cultivate an enjoyment of reading in children so they can read complex subjects later in life, but the common line that "books are better for you than TV" in adults who aren't reading for education is just something that I think is hard to justify. People always end up comparing low-brow TV with high-brow books to show how TV is bad. That's what I think is unreasonable.


You don't seem to be talking about TV at all, though, but online videos.


Yeah, I now realise that people replying are not agreeing that TV != any audio-visual content.

When TV is attacked, there are defenses bringing up digital textbooks.


> Perhaps the simplest argument I can think of now is this. When we as a society look to create great educators, lawyers, doctors, scientists, and philosophers, we make sure they're reading a lot. We don't have them watch TV.

Not the GP.

You're right that the argument is simple; it's also not compelling. The reason for that is that most of the information is currently encoded in the form of literature.

It might be that well-made TV/computer programmes can convey the information, and build knowledge, better; we've just not tested that yet. Is an anatomy book better (alone) for learning than interactive text and visuals - maybe VR - for example?

If you take the people who read and make them watch TV, it seems likely to me that they'll watch stimulating television, given that they read stimulating books.

In short maybe it's the content, and one content group prefers one medium.

I'd totally agree that TV with advertising breaks is anathema -- but Netflix documentaries (say) seem like they might be better than a Mills & Boon bonkbuster?

If it is solely the medium, the problem is significant and we need to quickly find out what specifically about the medium is the issue. It's unlikely to get solved, though; TV advertising is a cornerstone of capitalist society, and keeping the masses "sedated" by TV (raging about soaps or MCU plotlines instead of real life) helps to control society.

MOOCs are huge and often don't use books as primary media; they rely on lectures delivered through video. My most recent learning has all been this way and I find it works well for me. But presumably you include "screen-based text" with "books"?


In Amusing Ourselves To Death the author does go into a lot of detail about how "the medium is the message", or as he extends it, "the medium is the metaphor". Postman contends that television as a medium has significant problems.

Given that you're bringing up lectures and MOOCs, it's definitely worth being clearer about what Postman understands as constituting the medium of TV. It's not merely the transmission of audio-visual content; it is the whole technology, taking the concept of "technology" in the broad sense. For example, the usage of music in TV is part of the technology, and it is usually conspicuously absent from MOOC lectures.

Postman would not consider a digital textbook as TV, but if such a textbook were to borrow from TV, for example by using conventional drama narrative as a device, then he would be against that.

Does this more or less resolve the problem? TV != digital textbooks or MOOC lectures.


People watch TV for completely different reasons than they read. One big thing about TV is that it can run while you do other boring chores. And it makes you hear spoken language when you are lonely. It is closer to human communication than books.


> People watch TV for completely different reasons than they read.

No argument here. There's a reason the book I reference is called "Amusing Ourselves to Death".

> One big thing about tv is that it can run while you do other boring chores.

True. I want to say that I'm not literally saying we should replace all TV watching with non-fiction book reading. I was making a point.

However, you can also listen to the radio or podcasts or audio-books when doing boring chores. It matters whether those alternatives are better.

> It is closer to human communication then books.

It's impossible for this to be true, given that books are human communication and further that written language is the only domain of communication unique to humans.


> It's impossible for this to be true, given that books are human communication and further that written language is the only domain of communication unique to humans.

You were never alone for an extensive period of time with little chance to hear other people talk, were you? Uniqueness among animals has nothing to do with anything.

> However, you can also listen to the radio or podcasts or audio-books when doing boring chores. It matters whether those alternatives are better.

There is zero difference for most podcasts and most for-fun audiobooks. None at all. And it may be my age, but I find most podcasts quite annoying. Some are good, but too many have a cringy style.


I'd also wonder how that "watching" time gets counted.

For example, I know of people who would turn on the TV, then go about their chores, but they aren't paying attention to whatever is on. Would that be better or worse, from a cognitive point of view, than intently watching a program?

As a different example, I like "watching" films or series while doing some work, as in watching for a few minutes when I get stuck on something, then hitting pause and going back to work. Other times, I'd just stare at a blank wall for a couple of minutes. Would that count as daily watching time?


There was another study recently that indicated a high percentage of people have a second screen up when "watching" TV. I think your question is a good one to ask.

My wife and I have something like The Office literally playing all day while we go about doing normal daytime things like work, chores, etc. If we do find time to really watch TV, we may watch one or two hour-long shows before bed.


I'd guess that such viewing would be counted, which would inflate the numbers. I couldn't find a clear answer when I searched for one though.


For people from generations preceding ubiquitous web access, TV is the most common (or only) low-effort, low-cost entertainment.

If you counted stuff like browsing HN/Reddit/social media in addition to TV/streaming, I'm sure almost everyone would average more than 4 hours daily (maybe more on weekends, less on weekdays).


I wouldn’t be surprised. I myself use HN a lot, daily. It’s all reading on HN though, not TV.

By Neil Postman’s thesis in Amusing Ourselves To Death, audio-visual content consumption on social media is actually worse than TV. He would be aghast at say, TV being replaced by TikTok.


There's radio. Here in the UK, some older people have Radio 4 on all the time at home. It's solid talking, some spontaneous (experts discussing some topic, ...), some scripted (plays, documentaries, ...).


Why was this voted down? I'm fairly certain that radio is just as low-effort and low-cost as TV, and it's been around for longer. If you have poor eyesight, as a lot of old people do, radio could be a lot less effort than TV, particularly if you choose a station in which professional speakers speak standard English (you may also have poor hearing).

I think a comparison with radio would be instructive.


Interested in explanations for downvotes, given that this post is -2 despite contributing to discussion (has 5 replies, 15 comments in tree).


Interesting that the threshold is at 3.5 hours; 3 hours or less is fine.

I wonder if there's a way to have TVs with watch timers, for better self-control.

I notice at my parents' house that the TV is almost always on. It personally drives me insane. I think they want to curb it, but it's a habit that's hard to break... so maybe a screen-time app for the TV could help?


TVs have sleep timers where they turn off after the set amount of time. Or at least they used to. You might be able to use an Alexa plug between the TV and the wall to the same effect.


Yeah, but it's something you actively need to set every time, and it doesn't count accumulated time over a 24-hour period... anyway, I was just toying with the idea (inspired by iPhone Screen Time).


The streaming services will present a dialog asking if someone is still watching after a period of time. If I don't respond to that dialog, my ATV will eventually turn off, which turns off my TV.


That sounds like a variation of a sleep timer. I'm talking about measuring (and maybe even alerting on or stopping) excessive watch time - i.e. if someone is actively flipping channels for 5 hours a day, that's not healthy (according to this research)...

Streaming services sadly have the reverse incentive in this case, to actually make you watch more.
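
For what it's worth, here's a rough sketch of how such an accumulated daily watch timer could work with a smart plug that reports power draw. read_power_watts is a hypothetical stand-in for whatever the plug vendor's API actually provides:

    import time

    POWER_THRESHOLD_W = 30      # assumption: the TV draws well above this when on
    DAILY_LIMIT_HOURS = 3.5     # the threshold from the study
    POLL_SECONDS = 60

    def read_power_watts():
        # Hypothetical stub: replace with your smart plug vendor's API call.
        return 0.0

    watched_seconds = 0
    day = time.strftime("%Y-%m-%d")
    while True:
        today = time.strftime("%Y-%m-%d")
        if today != day:                          # new day: reset the accumulator
            watched_seconds, day = 0, today
        if read_power_watts() > POWER_THRESHOLD_W:
            watched_seconds += POLL_SECONDS       # TV judged "on" for this interval
        if watched_seconds > DAILY_LIMIT_HOURS * 3600:
            print("Over 3.5 hours of TV today")   # or cut power via the plug
        time.sleep(POLL_SECONDS)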


The amount of TV watched might be correlated with cognitive decline, but is certainly not the cause.

I doubt there is any activity that can accelerate the rate of decline apart from the things we put in our bodies (food & drugs). It's better to think of cognitive decline as proceeding at some baseline rate, with the mental and physical activities we engage in acting to buffer this pace. Watching TV happens to be a poor buffer, especially over such long stretches of passive stimulation (it requires little mental or physical exertion). You would get the same low buffer effect staring at a wall, flipping through a picture book, or knitting a scarf for 3.5 hours. This might be obvious, but TV isn't causing decline; it's just not helping to prevent it.


"is certainly not"

What do you base your certainty on? It's certainly not fact.

"I doubt there is any activity that can accelerate the rate of decline apart from the things we put in our bodies (food & drugs)."

You missed oxygenation.


Yes, oxygen goes inside our bodies. Also liquids.

I base this certainty partially on my expertise in the field of cognitive decline, but mostly on common-sense principles. Cognitive decline is a natural part of aging. It happens to everyone regardless of how much TV one watches; even people who watch no TV whatsoever cannot avoid cognitive decline. Thus our best estimate of what causes cognitive decline points towards the aging of the underlying biological apparatus that supports cognition. Furthermore, there are likely to be infants and young children who watch as much TV, and view programming geared towards learning, and as they mature they will see gains in cognitive abilities. How is that explained if TV causes cognitive decline?



