This video is a full hour. Could you give a summary of what's covered 🙏?
In particular, does he ground any part of his talk in Manufacturing Consent's Propaganda Model? Most tech discussion of 'fake news' and social media is embarrassingly ignorant (or deliberately ignorant) of strong left theories of how these problems arise and why they're harmful.
I watched it last night before bed and I will try my best to summarize:
To start he gave a basic overview of machine learning, its current limitations, incredible potential, and how it sometimes goes very wrong.
He then moved on to talking about "the algorithm" at YouTube (the machine learning black-box behind the recommendations and advertising/demonetization systems there) and how it has gone wrong in the past (largely due to biased data), the repercussions and human cost of those mistakes, and then talked about how YouTube has tried to fix those issues. After which he specifically focused on radicalization via YouTube, since that's a pretty hot topic, a major problem, and one that YouTube has still failed to adequately address.
IIRC in the next part he talked about the parasocial relationships of online media personalities, the commoditization of friendship, the need for some people to develop emotional connections with online personalities, and how that is being exploited.
Then he moved on to discussing online moderation/community standards, and how both extremes (echo chamber vs absolutist free speech) can lead to problems. E.g. On the absolutist free speech end, how that leads to the "Nazi Bar" problem, i.e. if a local bar suddenly started allowing Nazis to congregate there and openly express their ideology to the patrons, eventually all the non-Nazi patrons will abandon the place, which leads to the place becoming a Nazi Bar (where only Nazis go), even if that wasn't the owner's initial intent.
And that's all I remember, since I think I fell asleep at that point. :P The talk was kind of all over the place, since it covered so many varying yet related topics, but was still a worthwhile listen, IMO.
I watched the talk as well, and this seems like a relatively accurate summary.
Although I didn't feel it was 'all over the place'. The goal seems to have been to give a view of science communication, and how there is no single right and perfect way to do this. And he had to cover a number of concepts and explanations to make his points.
Two stand out for me (two days or so after a casual viewing):
The 'echo chamber' <-> 'Nazi bar' dichotomy (highly regulated versus unregulated speech, heavy versus no moderation). Both extremes are bad, but there is no one right place on that spectrum to be. Every forum has to decide on a policy that works for its goals.
The dilemma of popular presenters versus actual experts [edit: Scott analyses this through the concept of 'parasocial relationships', which seems to be something a number of YouTubers are thinking about]. Popular presenters don't really know what they're talking about, and investing in them puts authority where it maybe shouldn't be. But if you're only going to let actual experts communicate science, then fewer people will listen. Again, Scott's point was that this is a dilemma without a single right answer, but one every media producer should be conscious of (and he seems to feel you should do your best to lean toward the expert side of this spectrum as far as you can get away with).
To be fair, I was very tired when I watched it (which was probably not the best idea), so me thinking it was "all over the place" may be due more to my scatterbrained state of mind, rather than that actually being the case. :P
Thanks for summarising 🙂.
Seems this was pretty squarely in the ballpark of every other discussion of 'truth in the social media age'. Still worthwhile to discuss, but not the full picture by any means. After all, every YouTube celeb in parasocial relationships with ~millions is getting their worldview from somewhere, and being incentivised by some system bound by certain norms and laws.
Tom is spectacularly correct. The fact that he does it in such an esteemed location is, I suppose, an added bonus.