Technology and information literacy

I am very curious to see cwebber’s strategy on this.

While not exactly the same thing, this reminds me of some of George Lakoff’s work. For those unfamiliar with him, he studies the intersection of language, neurology, psychology, and political groups, specifically how the choice of words used to frame an idea colors people’s reactions to it, even when the underlying ideas are the same.

Many of the library-science people and former philosophy students I know are very much of the opinion that a lot of the technology-driven solutions to these kinds of problems are always going to be flimsy band-aids at best.

I’m sympathetic to the notion that a lot of this is being driven by an information literacy problem and a complete polarization and lack of unity around cultural values.

Though I’m not exactly overflowing with solutions to those underlying currents, either.

4 Likes

You know, I was sort of aware of the problem in an “oh man, doesn’t it suck that the world works this way?” sense, but I never even thought of it as a problem associated with any particular field, i.e. library science.

Speaking of band-aid solutions, I wonder if you could apply some sentiment analysis on top of (say) Google’s related searches feature.

“Hey, you searched for ‘are vaccines bad’ which produced results with an average of 76% negative sentiment. Related search ‘foundational vaccination studies’ might produce more neutral or positive results”

And vice-versa
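
Just to sketch what I mean (the snippets, related queries, and threshold below are invented for illustration; this isn’t a real Google integration), something like this with an off-the-shelf sentiment scorer:

```python
# Rough sketch only: score the sentiment of result snippets for a query and,
# if they skew heavily negative, nudge the user toward a related search.
# Uses NLTK's VADER scorer (run nltk.download("vader_lexicon") once first).
# The snippets, related queries, and threshold are made-up stand-ins.
from nltk.sentiment.vader import SentimentIntensityAnalyzer

def average_negativity(snippets):
    """Mean 'neg' score (0..1) across a list of result snippets."""
    analyzer = SentimentIntensityAnalyzer()
    scores = [analyzer.polarity_scores(s)["neg"] for s in snippets]
    return sum(scores) / len(scores) if scores else 0.0

def suggest_alternative(query, snippets, related_queries, threshold=0.5):
    """If results for `query` skew negative, suggest the first related search."""
    neg = average_negativity(snippets)
    if neg >= threshold and related_queries:
        return (f"You searched for '{query}', which produced results averaging "
                f"{neg:.0%} negative sentiment. Related search "
                f"'{related_queries[0]}' might produce more neutral results.")
    return None

# Hypothetical usage:
snippets = ["Vaccines are dangerous and cause terrible harm...",
            "The awful truth they are hiding from you..."]
print(suggest_alternative("are vaccines bad", snippets,
                          ["foundational vaccination studies"]))
```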

1 Like

My hot take on solutions like this is that what’s good for the goose isn’t good for the gander. I intuitively liked the example you cited, but being somewhat far-left politically, I often find that such strategies end up countering all radical / non-mainstream positions equally, which I find somewhat disconcerting. I know some of Google’s early attempts at identifying “fake news” (I dislike this term) have already done this to some degree.

2 Likes

Yeah, it’s definitely a problem that isn’t easily band-aided over. ML only helps until you realize you didn’t understand what bias you put into it. And political opinions don’t really work as well as science problems (vaccinations, pregnancy issues, earth shape, ancient aliens) with this kind of thing because, as you noted, positive and negative sentiment toward the sides of an issue run really high in political writing, and promoting more neutral sentiment only gets you to the middle of the road.

Germane to some things going on with my wife and me, one valuable resource has been a site called Evidence-Based Birth. Sample article: the pros and cons of induction in gestational diabetes cases.

They give breakdowns of the research and provide references, and even comment on sampling and testing practices in each study, e.g. “This study doesn’t compare mothers whose GD was unmanaged vs. those managing GD with diet and insulin,” “Note that this indicates a rise in relative risk; absolute risk remains quite low,” etc.
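
For anyone not used to that distinction, here’s a tiny worked illustration (the numbers are invented, not from the article):

```python
# Invented numbers purely to illustrate relative vs. absolute risk;
# they are not taken from the Evidence-Based Birth article.
baseline_risk = 0.005   # 0.5% of the comparison group has the outcome
studied_risk  = 0.010   # 1.0% of the studied group has the outcome

relative_risk     = studied_risk / baseline_risk   # 2.0 -> "the risk doubles!"
absolute_increase = studied_risk - baseline_risk   # 0.005 -> half a percentage point

print(f"Relative risk: {relative_risk:.1f}x")          # Relative risk: 2.0x
print(f"Absolute increase: {absolute_increase:.1%}")   # Absolute increase: 0.5%
```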

On the science-problem side, we could do with a lot more sites like this. But it still requires a certain amount of science literacy that we as a society are having a hard time promoting.

2 Likes

I’d wager, though, that it could be just as contentious in a more objective field like science. Such a system doesn’t account well for paradigm shifts or new discoveries. For example, what would the results for “Is Pluto a planet?” have looked like in 2005 or 2006?

Such a system kind of bakes in present established thinking across a whole lot of fields as the truth, which isn’t always going to stay true.

2 Likes

That’s true, ML training and its read of sentiment are always going to lag behind reality. And going a different route, making sure to blend equal amounts of each sentiment, is always going to be hard as well.

I’m reminded of Anathem by Neal Stephenson, and also Starfish by Peter Watts, where long ago the internet became so overrun with crap, fake news, fake information, viruses, and the blabbering of machines to each other that pretty much only specialists worked with the internet directly, using specialized tools to pull the truth from the crap and clean it up for public consumption. It’ll be a sad day if/when we reach that point.

3 Likes

To get back to what I was saying before: that’s why there is a sentiment that it’s an information literacy problem.

We need to be educating people, at a young age, in how THEY can distinguish trusted sources from untrusted sources, engage in critical thinking, and be properly skeptical. Our education system doesn’t appear to be well geared for these things, and we have compounded this problem culturally in many ways across a long span of time.

We’ve simply made the unfortunate mistake of doing this around the same time we invented the technology to drown ourselves in information.

We all might need to be data scientists to a certain degree.

2 Likes

Agreed, that sort of literacy is going to have to become a common skill really quickly. Part of my public school education (from the librarians, actually) was about how to use different reliable databases to get good resources on various topics, but usually the databases and resources were ones you could only get to if the institution you were a part of had a license to them.

Honestly, I wish that were a whole two-quarter course high school students were required to take, focused on public resources, rather than a half-hour-in-the-library lecture. AKA Google-Fu 101.

That’s a funny way to spell DuckDuckGology!

4 Likes

The thing I’m kinda worried about is how you instill that sort of thing quickly across a whole lot of people when the adults don’t actually THINK it’s an information literacy problem, nor are they willing to invest in education.

1 Like

I wish I knew. The ultimate, cynical solution is just to teach the children and wait for the people who are too stubborn to learn to die.