>We use up so many natural resources in such process that it's not worth it if it is not practically useful.
We already have 128-core CPUs -- they're not something out of science fiction. It's just that we don't have them for the average consumer.
Besides, one could say the same about today's 4 and 8 cores back in 2000.
>I really don't like this imperative style. You're responding to a stranger who has not said anything offensive in his comment.
And "speak for yourself" is offensive, how? The message was while this might hold true to you it's not the general case.
>I said "a personal machine" in my comment. A machine which is used to browse the web, take notes, watch movies, write documents, etc. 3D and high-performance audio is not every-day, personal computer use, it's a specialised case.
And yet it's done on general purpose, personal machines.
And not just by high-end studio engineers either (although most recording studios I've been to also use consumer PCs, nothing special there either, usually just more expensive cases and fans to keep things quiet): millions of people recording themselves and their bands do DAW/audio work that pushes the CPU hard and can leverage more power.
Even more millions do video editing, and soon every consumer camera will shoot 4K (the most popular phones already do).
This is not the era of Boomers and Gen X, when such things were uncommon. Video, photography, and music as hobbies have exploded in an era when everybody can self-publish on YouTube, Bandcamp, etc., and younger kids are growing up with those things all around them.
And all those people do it on their personal machines. Not some special workstation, and not at work.
So defining computer use as "web, take notes, watch movies, write documents" is too 1999.
And while the more CPU/GPU-intensive stuff (video and audio work) is not as widespread as passive consumption, I'll give you that (though nowhere near a fringe activity), the argument breaks down with stuff like 3D games -- which a large majority of people under 30 play regularly, and a huge chunk of those under 20 religiously upgrade CPUs and game consoles to get the latest and greatest.
>> Advanced 3D gaming [...] virtual reality interactive porn [...] local speech recognition and AI processing [...]
>I recall saying useful
That one has no use for something doesn't make it useless. I mentioned some very popular stuff -- speech recognition and AI assistants like Cortana and co are used by hundreds of millions, and 3D gaming is done by billions.
Your general response is close to the "no true Scotsman". When someone replies with stuff we could use that CPU power for, some of it is not really "personal", the rest is not really "useful", etc.