Gina Neff
Gender | Female
---|---
Place of birth | Campton, Kentucky, United States
Citations | 3,301
Spouse | Philip N. Howard
Doctoral advisor | David C. Stark
Education | Columbia University; United World Colleges
ID | 1526080
Gina Neff: Life story
Gina Neff is a media and communication scholar whose work centers on the social and organizational impact of new communication technologies. Trained as an organizational sociologist, she conducts research at the intersection of work, technology, communication and organizing.
Can Rishi Sunak's big summit save us from AI nightmare?
... "We're concerned about the threats to people, communities, and frankly, the planet," says Prof Gina Neff, who runs an AI centre at the University of Cambridge...
Child hepatitis cases falsely linked to Covid vaccine
... Looking for answers: Events that are both distressing and unexplained make fertile ground for confirmation bias - when people look for information to support what they already believe - according to Prof Gina Neff, a senior research fellow at the Oxford Internet Institute...
Should bad science be taken off social media?
... This, Prof Gina Neff, a social scientist at the Oxford Internet Institute, explained, was to "ensure that people still can speak their mind" - they just aren't guaranteed an audience of millions...
Should bad science be taken off social media?
How do you solve a problem like bad information?
When it comes to understanding science and making health decisions, it can have life-or-death consequences.
People dissuaded from taking vaccines as a result of reading misleading information online have become seriously ill or died.
And inaccurate or completely made-up claims about 5G and the origins of Covid-19 have been linked to real-world harm, including arson attacks on phone masts.
But completely removing information can look a lot like censorship, especially for scientists whose careers are based on the understanding that facts can and should be disputed, and that evidence changes.
The Royal Society is the world's oldest continuously operating scientific institution, and it is attempting to grapple with the challenges posed by our newest ways of communicating information.
In a new report, it advises against social media companies removing content that is "legal but harmful". Instead, the report's authors believe, social media sites should adjust their algorithms to prevent it going viral - and stop people making money off false claims.
But not everyone agrees with that view - especially researchers who are experts in tracking the way misinformation spreads online, and how it harms people.
The Center for Countering Digital Hate (CCDH) maintains there are cases when the best thing to do is to remove content - when it is very harmful, clearly wrong and spreading very widely.
The team points to Plandemic - a video that went viral at the start of the pandemic, making dangerous and false claims designed to scare people away from effective ways of reducing harm from the virus, like vaccines and masks - which was eventually taken down.
Social media companies were better primed for the video's sequel, Plandemic 2, which fell flat after being restricted on major platforms, achieving nothing like the reach of the first video.
"It's a political question... what balance we see between individual liberties and some form of restrictions on what people can and cannot say," says Prof Rasmus Kleis Nielsen, director of the Reuters Institute for the Study of Journalism at the University of Oxford.
Prof Nielsen acknowledges that, although it's a relatively small part of people's media diets, science misinformation can lead to disproportionate harm.
But, he adds, a lack of trust in institutions is itself a big driver of misinformation: "I imagine that there are quite a lot of citizens who would have their worst suspicions confirmed about how society works, if established institutions took a much more hands-on role in limiting people's access to information."
'Harder to reach'
Echoing this concern, the Royal Society says: "Removing content may exacerbate feelings of distrust and be exploited by others to promote misinformation content." This "may cause more harm than good by driving misinformation content... towards harder-to-address corners of the internet."
The fact that those corners are "harder to reach", though, is arguably part of the point. It reduces the risk that someone who is not already committed to potentially harmful beliefs, and isn't seeking them out, will be exposed to them by chance.
Some of the most widely shared false claims had their origin not in obscure corners of the internet, but on Facebook. And there is little clear evidence that removing content drives people further into harmful beliefs.
Change the algorithm
Scientific misinformation is nothing new.
The incorrect belief in a link between the MMR vaccine and autism came from a published (and later retracted) academic paper, while widespread unevidenced beliefs in the harm of water fluoridation were driven by the print media, campaign groups and word of mouth.
What's changed is the speed at which false facts travel, and the huge numbers of people who can end up reading them.
Rather than removing content, one way of tackling misinformation suggested by the report's authors is to make it harder to find and share, and less likely to appear automatically on someone's feed.
This, Prof Gina Neff, a social scientist at the Oxford Internet Institute, explained, was to "ensure that people still can speak their mind" - they just aren't guaranteed an audience of millions.
"They can still post this information, but the platforms don't have to make it go viral."
Fact-checking
The Institute for Strategic Dialogue (ISD), a think tank which monitors extremism, points out that a substantial proportion of misinformation relies on the appropriation and misuse of genuine data and research.
"This is sometimes more dangerous than outright false information, because it can take substantially longer to debunk by explaining how and why this is a misreading or misuse of the data," its spokesperson says.
That's where fact-checking comes in - another tool which the Royal Society supports.
One of the most common pieces of vaccine misinformation over the past year - which the BBC has fact-checked - was the notion that people are being harmed in high numbers by the jab. This claim is based on a misreading of raw safety-report data.
De-platforming individuals
The ISD says research has shown that a small group of accounts spreading misinformation had a "disproportionate influence on the public debate across social media".
"Many of these accounts have been labelled by fact-checkers as sharing false or misleading content on multiple occasions, yet remain live."
The Royal Society did not investigate removing the accounts of "influencers" who are especially prolific spreaders of harmful misinformation.
But this is seen as an important tool by many disinformation experts, and research suggests it can be successful.
When David Icke, a prolific spreader of Covid misinformation as well as anti-Semitic conspiracy theories, was removed from YouTube, research from the CCDH found his ability to reach people was considerably reduced.
While his videos remained on the alternative video-hosting platform BitChute, their views fell from an average of 150,000 before the YouTube ban to 6,711 afterwards. On YouTube, 64 of his videos had been viewed 9.6 million times.
Separate research found that the de-platforming of a former nurse who was a prolific spreader of Covid misinformation decreased her reach in the short term.
"Part of the issue is that current models of de-platforming need to be developed. It's not enough to just take down a piece of content, or a small number of accounts," one of the paper's authors, Prof Martin Innes, explains.
Research into organised crime and counter-terrorism shows the need to disrupt the whole network, he says.
But he believes "this level of sophistication isn't embedded yet" in the way we tackle disinformation that could put people in danger.
Source of news: bbc.com