> Similarly, I haven't seen anyone actually articulate what the risk from TikTok is.
Profiling of a large population: you put people in cohorts, then slowly shift what you show each cohort (based on its preferences, worldview, etc.) to nudge it toward a worldview you'd like. It won't be 100% effective, but it can definitely shift perceptions. If each cohort is siloed into its own reality bubble through what you show it, you can stochastically nudge it toward the view you want it to hold, based on its preferences.
If marketing works even on people who are aware of how it works, a concerted effort using someone's profiling data (what they like, what they dislike) will definitely work on a majority of users.
It won't be blunt; it only has potential if you use this data to slowly shift views with whatever is most effective for each cohort. With a large amount of data you can be quite precise in defining these cohorts, and you can use different strategies and tactics for each one depending on what works best.
Have you ever worked on anything that did profiling based on accumulated data? I worked on a few such projects back in the early 2010s, and even then it was scary how much you could infer about your users from some 100-200 data points collected over a period of 2-5 years. Weaponising that is not the complicated part; the data collection is.
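The cohort mechanics described above can be sketched in a few lines. This is a purely hypothetical illustration (the cohort centroids, preference vectors, and per-cohort content weights are all made up): users are represented as vectors of inferred preference scores, assigned to the nearest cohort, and each cohort gets its own content-weighting strategy that can be drifted over time.

```python
import math

# Hypothetical cohort centroids over three inferred preference scores
# (e.g. affinity for topics A, B, C). A real system would cluster
# millions of user vectors rather than hard-code centroids.
COHORTS = {
    "cohort_1": [0.9, 0.1, 0.2],
    "cohort_2": [0.2, 0.8, 0.3],
    "cohort_3": [0.1, 0.2, 0.9],
}

# Per-cohort content weights; "nudging" amounts to gradually
# shifting these weights for each cohort independently.
STRATEGY = {
    "cohort_1": {"topic_a": 0.7, "topic_b": 0.2, "topic_c": 0.1},
    "cohort_2": {"topic_a": 0.1, "topic_b": 0.7, "topic_c": 0.2},
    "cohort_3": {"topic_a": 0.2, "topic_b": 0.1, "topic_c": 0.7},
}

def assign_cohort(user_vector):
    """Return the cohort whose centroid is nearest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(COHORTS, key=lambda name: dist(COHORTS[name], user_vector))

user = [0.85, 0.15, 0.25]          # inferred from accumulated data points
cohort = assign_cohort(user)
print(cohort, STRATEGY[cohort])    # content mix served to this user
```

The point of the sketch is that the assignment step is trivial; the hard (and valuable) part is accumulating enough behavioural data to build the preference vectors in the first place.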
This is fascinating. I think this nuanced approach to shifting the perspectives and beliefs of the population of an adversarial nation is exactly the threat that is being missed by other commentators saying "what does TikTok do that's so bad anyway?" The point is that it is extremely subtle and yet very powerful...if China can convince US citizens that China deserves to rule Taiwan, for instance, the US government may find itself without the popular support or political will to take action to protect Taiwanese democracy in the event of an incursion by China.
>if China can convince US citizens that China deserves to rule Taiwan, for instance, the US government may find itself without the popular support or political will to take action to protect Taiwanese democracy in the event of an incursion by China
What is so awful about the idea that people in the United States might be convinced of something? What does it matter who is doing the convincing? You just don’t like the hypothetical outcome you suggested.
Are you opposed to a Taiwanese propaganda campaign, conducted through a newly popular Taiwanese social media app and directed at convincing U.S. citizens to support Taiwan in the event of an incursion by China? What’s the difference?
I find it scary that the U.S. government would try to protect its citizens from anyone's speech or ideas. The best response to speech you don't like is to argue forcefully against it, not to suppress it. We can make up our own minds.
I don’t want the government trying to suppress or protect me from thoughts or ideas it thinks are bad.
Because it's 10x harder to debunk bullshit than to claim it. You don't know what you don't know, and unfortunately the majority of people are too lazy to critically evaluate their views. For example, how many people actually read linked articles as opposed to just commenting based on the title?
That's how modern misinformation works: you simply bombard social media networks until the truth is lost in a sea of misinformation.
The difference between truth and lies, though, is that when you eventually have to implement policy or predict something, lies tend to collapse in on themselves. Credibility accrues to the people, institutions, and frameworks whose predictions and results consistently reflect reality. But that can take years or even decades, while geopolitical decisions need to be made today.
You might be right, but the existence of a problem doesn’t mean that government intervention will make things better.
I don’t want government deciding, on my behalf, what is or is not bullshit — and then taking legislative steps to suppress ideas it doesn’t like.
Is Communism bullshit? Is anti-Americanism bullshit? How about liberalism? Conservatism? Homosexuality?
Maybe. But those are for me to decide, based on whatever information people want to use to try and convince me. It is not appropriate for government to legislatively suppress ideas or information it thinks is wrong.
If you think otherwise, do you have a problem with the Chinese internet firewall? From their perspective, China is protecting its citizens from harmful, wrong information. You just disagree about their value judgments. (I assume.)