1
00:00:00.000 --> 00:00:00.650
Liam Wyatt (WMF): ok.
2
00:00:02.780 --> 00:00:09.957
Maryana Pinchuk: Alright. Thank you. Hello, everyone. I am Maryana Pinchuk. I am the lead of Future Audiences,
3
00:00:10.440 --> 00:00:16.489
Maryana Pinchuk: and we are going to be giving our monthly updates to you all.
4
00:00:16.869 --> 00:00:30.860
Maryana Pinchuk: As you're joining in the chat (thank you for those who are already doing so), please introduce yourselves. Drop a note about where you're coming from, what projects you work on. If this is your first time at a Future Audiences call, please let us know,
5
00:00:31.010 --> 00:00:34.609
Maryana Pinchuk: and I will bring up our agenda
6
00:00:35.010 --> 00:00:36.769
Maryana Pinchuk: here in a second.
7
00:00:37.400 --> 00:00:38.060
Maryana Pinchuk: See?
8
00:00:38.860 --> 00:00:39.810
Maryana Pinchuk: Go.
9
00:00:40.480 --> 00:00:41.400
Maryana Pinchuk: Okay.
10
00:00:42.930 --> 00:01:11.139
Maryana Pinchuk: so yes, please drop your note in the chat, and we'll have some time for Q&A, both around the new experiment that we're working on, which we'll get to in a minute, and hopefully some time for open discussion on any topics you want to bring up. So please save your questions for then. But if you do have thoughts that you want to share and don't want to forget, we have a notes document that you're welcome to drop your notes onto.
11
00:01:12.160 --> 00:01:15.800
Maryana Pinchuk: Mr. SJ, I'm gonna mute you, 'cause I can hear your keyboard.
12
00:01:16.710 --> 00:01:25.630
Maryana Pinchuk: There we go. Perfect. All right. Welcome, everyone. Thanks for joining us. I'm Maryana Pinchuk. I am the lead on Future Audiences.
13
00:01:25.630 --> 00:01:51.689
Maryana Pinchuk: This is our agenda today. Every call, we try to recontextualize what Future Audiences is in case you're new here. We're going to give you an update on the last fiscal year, which was the first year that this team existed (very exciting), and give you some insights on our last experiment, Citation Needed, which we demoed in this very venue in, I believe, April.
14
00:01:51.910 --> 00:02:03.680
Maryana Pinchuk: And then we're gonna talk about a new experiment that we're working on. So I'll turn it over to my colleague Francisco, who's here, to give you a walkthrough, and we'll have some time for questions and feedback,
15
00:02:04.023 --> 00:02:19.859
Maryana Pinchuk: and also we're gonna be asking you all to test this thing out for us, please. So we'll have some instructions for how you can get access to it. And then, like I said, at the end, any other questions you have, please feel free. We have time. We'll try to get to them.
16
00:02:20.220 --> 00:02:48.600
Maryana Pinchuk: So just to start: what is Future Audiences? It is an initiative of the Product and Technology department at the Wikimedia Foundation. We've been around for about a year, and we have a mandate to test new ways of serving future generations of knowledge consumers and contributors, and learning what can be done differently, what kinds of
17
00:02:48.940 --> 00:02:56.209
Maryana Pinchuk: features or technologies, or just ways of approaching knowledge sharing and knowledge consumption,
18
00:02:56.210 --> 00:03:19.779
Maryana Pinchuk: can be brought into our movement, and where we can go next. But our team doesn't build products like a lot of the other product and tech teams, which is a little confusing, I know. We build experiments: experimental features or tools that are live for a temporary period of time in order to learn something valuable in
19
00:03:19.780 --> 00:03:36.679
Maryana Pinchuk: whatever space we're testing in. And the ultimate goal of this is to make recommendations for bigger investments that all the other product and technology teams should be making, to help us really continue to evolve as a movement, to continue to serve the needs of
20
00:03:36.680 --> 00:03:49.099
Maryana Pinchuk: everyone in the world with free knowledge. So, like I said, we've been around for about a year, and our fiscal year at the Wikimedia Foundation starts in July. So we're in
21
00:03:49.100 --> 00:04:12.839
Maryana Pinchuk: the very beginning of the new fiscal year, and we did a lot of experiments last year, which we've shared in this venue. If you want to go and read about all of them on Meta, go to Future Audiences; all of them are listed there. But our goal is to make strategic recommendations in the kind of draft annual planning period, which runs from April through June.
22
00:04:13.335 --> 00:04:30.409
Maryana Pinchuk: So I'll just give you a quick rundown of everything we learned last year, the recommendations or insights that we brought into the annual planning period this past April to June, and what we're working on next in this new fiscal year which just kicked off.
23
00:04:31.082 --> 00:04:39.279
Maryana Pinchuk: so the 1st thing we learned is that we can, in fact, run quick experiments and learn and turn things off
24
00:04:39.809 --> 00:05:03.190
Maryana Pinchuk: which is not necessarily the way that we approach sort of standard product building at the Wikimedia Foundation. All of the other product teams, first of all, are much larger. They have more resources for software engineers and designers and user research. And it's a big cross-functional team usually working on a very
25
00:05:03.190 --> 00:05:14.329
Maryana Pinchuk: complex, involved project that has to think about things like multi-language support. So we can't just build things that work for English Wikipedia; we have to build things that work for every language Wikipedia.
26
00:05:14.330 --> 00:05:41.659
Maryana Pinchuk: We have to consider things like the kinds of browsers that people are using, accessibility, usability. All kinds of considerations have to go into making real products that are fully scaled to meet the needs of our current audiences. But our team is different. We're not trying to build new products. We're trying to learn quickly, and learn about things that are very emergent and new and changing very rapidly, such as AI.
27
00:05:41.660 --> 00:05:55.490
Maryana Pinchuk: So last year we ran a lot of small-scale AI experiments, starting with a plugin for ChatGPT that brought information from Wikipedia to ChatGPT users in kind of a new way,
28
00:05:55.750 --> 00:06:18.090
Maryana Pinchuk: and we ran that experiment, learned some stuff, and we turned it off. We did not invest in continuing to support that or build that out further, because that is not the role of our team. And a big learning, in addition to just learning that we could, in fact, run quick experiments, learn and get what we needed, and not have to preserve things forever if they weren't meeting
29
00:06:18.448 --> 00:06:34.950
Maryana Pinchuk: the needs of users in real life. What we really explored and learned a lot about was AI, the capabilities of AI. And you know, when ChatGPT first came around, everyone was really excited, and there were a lot of,
30
00:06:34.980 --> 00:06:50.919
Maryana Pinchuk: you know, probably a little overhyped press pieces on this idea that, oh, AI is just gonna generate all of this content. Maybe it'll just write Wikipedia for us, and we won't even need humans anymore, because it'll be so good, and it'll just create all this new content.
31
00:06:51.480 --> 00:07:08.369
Maryana Pinchuk: We have not so far found that that is a reality with the tools that exist today. However, one thing that we kept learning (we learned this through the ChatGPT plugin, and also through Citation Needed, another experiment that we ran pretty quickly, which I'll talk about in a minute)
32
00:07:08.764 --> 00:07:20.420
Maryana Pinchuk: was that the thing about AI that's really interesting is that it can really help to augment the navigation and parsing of a lot of really complex content,
33
00:07:20.735 --> 00:07:44.370
Maryana Pinchuk: which, if you've ever tried to read a long Wikipedia article or tried to search Wikipedia for something, you will know is a big problem that exists on our projects today. We've generated so much content over the last 23 years as a movement, and it's big and long and all over the place, and not necessarily structured in a way that people can navigate easily.
34
00:07:44.460 --> 00:08:05.509
Maryana Pinchuk: So we see a lot of opportunities for using AI in this particular way: not to create new stuff, necessarily, but to really figure out how to get people to the information that they need and help them make sense of it better. So for the other product teams, some of the stuff that they're going to be working on this year, starting now in July,
35
00:08:05.903 --> 00:08:18.489
Maryana Pinchuk: are things like researching and prototyping new ways that AI could be used to facilitate better content discovery and browsing on our projects. And on the contributor side,
36
00:08:18.490 --> 00:08:25.240
Maryana Pinchuk: trying to see if there are ways that we can train LLMs to detect known article issues, for example,
37
00:08:25.240 --> 00:08:53.999
Maryana Pinchuk: peacock language: "This new product that was released is the best product in the world." "This amazing, revolutionary new product that's going to change everything was released on January 20." You know the drill about peacock language. So we're trying to see if we can use AI tools to detect that kind of article issue, and then serve that through existing experiences geared towards our contributors, such as Edit Check and Suggested Edits.
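To make that concrete, here is a minimal sketch of how an LLM might be prompted to flag peacock language. The `call_llm` helper and the prompt wording are hypothetical illustrations, not the team's actual implementation:

```python
# Sketch: flagging "peacock" (promotional, unsourced-praise) language with an LLM.
# call_llm() is a hypothetical stand-in for any chat-completion API.

PROMPT = (
    "You review Wikipedia sentences for 'peacock' language: promotional wording "
    "such as 'best', 'revolutionary', or 'world-class' that praises a subject "
    "without attributing the praise to a source.\n"
    "Answer with exactly one word, PEACOCK or OK, for this sentence:\n\n{sentence}"
)

def flag_peacock(sentence: str, call_llm) -> bool:
    """Return True if the model judges the sentence to contain peacock language."""
    reply = call_llm(PROMPT.format(sentence=sentence))
    return reply.strip().upper().startswith("PEACOCK")

# e.g. flag_peacock("This amazing revolutionary product will change everything.",
#                   call_llm) would be expected to return True.
```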
38
00:08:54.517 --> 00:09:11.480
Maryana Pinchuk: And the Add A Fact experiment, which I will be talking about in a second, is another way in which we want to learn more about how AI can be used to facilitate parsing through a bunch of content, on wiki and off wiki, making sense of it, and helping an editor to make a decision.
39
00:09:12.890 --> 00:09:30.790
Maryana Pinchuk: So, Citation Needed. This was the last experiment that we conducted; we ran it over the last couple of quarters and just wrapped it up now. There's a report on wiki if you want to read more about all of the things that we found through the course of this experiment. It's a really interesting project.
40
00:09:31.060 --> 00:09:35.229
Maryana Pinchuk: Just as a reminder, if you weren't here in April when we talked about this:
41
00:09:35.580 --> 00:09:51.510
Maryana Pinchuk: our goal was to see if we could use Wikipedia, plus the kind of searching and retrieving power of AI, to allow people to verify content on websites that they were getting information from online.
42
00:09:51.510 --> 00:10:09.240
Maryana Pinchuk: And we didn't know if people would be interested in this at all. Certainly Wikipedians are interested in this, but when it came to kind of the broader public, we weren't sure if this is a behavior that people wanted to do, if they would trust Wikipedia as a resource in verifying content,
43
00:10:09.575 --> 00:10:15.560
Maryana Pinchuk: if we could reach new audiences, and if AI could actually facilitate this at all. These were all
44
00:10:15.630 --> 00:10:42.160
Maryana Pinchuk: big, open questions as we embarked on building a browser plugin that would allow you to, essentially, highlight claims on websites, such as the one that you're seeing on the screen on the right here. So you can highlight this claim on any website that you find, and use Citation Needed to check what Wikipedia has to say about this claim: whether it's already there and there's a reference, or whether it's
45
00:10:42.210 --> 00:10:46.530
Maryana Pinchuk: not on Wikipedia, or whether Wikipedia has something that contradicts that claim.
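As a rough sketch of the pipeline being described: find a candidate Wikipedia article for a highlighted claim with the public MediaWiki Action API, pull its intro text, and ask an LLM for a verdict. The three verdict labels and the `call_llm` helper are assumptions for illustration; the actual extension's retrieval and prompting almost certainly differ.

```python
# Sketch of the Citation Needed flow: claim -> Wikipedia search -> LLM verdict.
# Uses the public MediaWiki Action API; call_llm() is a hypothetical LLM helper.
import requests

API = "https://en.wikipedia.org/w/api.php"

def wikipedia_extract(claim: str) -> tuple[str, str] | None:
    """Return (title, plain-text intro) of the top search hit for the claim."""
    hits = requests.get(API, params={
        "action": "query", "list": "search", "srsearch": claim,
        "srlimit": 1, "format": "json",
    }).json()["query"]["search"]
    if not hits:
        return None
    title = hits[0]["title"]
    pages = requests.get(API, params={
        "action": "query", "prop": "extracts", "titles": title,
        "exintro": 1, "explaintext": 1, "format": "json",
    }).json()["query"]["pages"]
    return title, next(iter(pages.values())).get("extract", "")

def check_claim(claim: str, call_llm) -> str:
    """Classify a claim as SUPPORTED / NOT_PRESENT / CONTRADICTED vs. Wikipedia."""
    found = wikipedia_extract(claim)
    if found is None:
        return "NOT_PRESENT"
    title, text = found
    return call_llm(
        f"The Wikipedia article '{title}' begins:\n{text}\n\n"
        f"Does it support, not mention, or contradict this claim?\n{claim}\n"
        "Answer SUPPORTED, NOT_PRESENT, or CONTRADICTED."
    ).strip()
```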
46
00:10:47.697 --> 00:10:59.500
Maryana Pinchuk: So what we found when we built this plugin, launched it, put it in the Chrome store, and publicized it in various kind of small, lightweight ways:
47
00:10:59.820 --> 00:11:05.610
Maryana Pinchuk: we wrote a blog post that mentioned it. We asked our staff members with a lot of
48
00:11:05.620 --> 00:11:20.940
Maryana Pinchuk: clout on social media to publicize it among their friends and peers. We emailed some donors who had been interested in keeping tabs on what the Wikimedia Foundation is doing, just to let them know about this experiment.
49
00:11:21.251 --> 00:11:40.878
Maryana Pinchuk: And what we got back was an interesting mix of insights. So, on the one hand, people loved the idea. We got lots and lots of great feedback from users from all over the place: that this is a really cool idea, really needed; they liked the concept a lot. And people did, over a thousand people took the,
50
00:11:41.300 --> 00:12:02.260
Maryana Pinchuk: took the extra step of actually installing it and trying it out. So they didn't just say that they liked the idea; they installed it in their Chrome web browser, which is a pretty big step. I mean, browser extensions are kind of finicky, not super easy to use. So that was great, to know that there's something here.
51
00:12:02.260 --> 00:12:13.039
Maryana Pinchuk: But what we saw is that, you know, we would let people know about this thing, they would test it out, and they didn't keep using it after that. They would use it once or twice, and then they kind of went away.
52
00:12:13.290 --> 00:12:38.679
Maryana Pinchuk: So I think there's a lot more we can learn about how to actually build this in a way that delivers value to these people who really want something like this to exist. And there are probably a lot of different approaches we could take, whether it's the kind of content that this is being presented on, how it's being presented, how manual it is versus something that kind of just runs in the background and checks for incorrect claims.
53
00:12:39.027 --> 00:13:04.029
Maryana Pinchuk: But one kind of next step we're thinking about is showing this to third-party platforms who we know might have some issues with not-super-reliable content appearing on their apps, their services, and really seeing if we can continue to learn through putting this in a place where it could be a little bit more intuitive and easy to use,
54
00:13:04.459 --> 00:13:12.259
Maryana Pinchuk: and work a little bit more specifically rather than generally. So that's one thing we're gonna be working on.
55
00:13:13.158 --> 00:13:22.750
Maryana Pinchuk: Another thing that we learned, though, super crucial, and this really gets back into the next experiment that we'll talk about, and also our overall
56
00:13:22.890 --> 00:13:24.929
Maryana Pinchuk: findings about AI
57
00:13:25.790 --> 00:13:26.639
Maryana Pinchuk: LLMs
58
00:13:26.780 --> 00:13:42.100
Maryana Pinchuk: can be really helpful in finding and analyzing and retrieving information, especially if you have a huge, messy corpus of information as the Wikimedia projects do. But they're not perfect by any means. They make mistakes. They make stuff up.
59
00:13:42.120 --> 00:13:47.099
Maryana Pinchuk: They don't have the kind of context and judgment that a human does.
60
00:13:47.431 --> 00:14:09.640
Maryana Pinchuk: And for sure, there could have been a lot more prompt engineering and careful refinement of the model that we were using in Citation Needed to try to minimize those mistakes. But we feel that, at the end of the day, it's never going to be perfect, and there are just some things that human judgment is really, really required for. And this is it,
61
00:14:09.640 --> 00:14:27.020
Maryana Pinchuk: this being really analyzing facts that can be complex and nuanced and multifaceted. We're not in a world in which AI can replace a human being to really make a good judgment about the factuality of something, or how good the reference is.
62
00:14:27.380 --> 00:14:47.940
Maryana Pinchuk: So that was a really interesting finding. And in that vein, we're gonna talk a little bit about a new experiment that we're starting, and I'm gonna hand it over to my colleague Francisco to talk about this LLM, human-in-the-loop, off-platform contribution experiment. Take it away, Francisco.
63
00:14:47.940 --> 00:14:53.090
Francisco Navas: Alright! Alright! I assume you can hear me. Maybe someone say Yes.
64
00:14:53.770 --> 00:14:54.620
Liam Wyatt (WMF): Yes.
65
00:14:54.750 --> 00:14:59.174
Francisco Navas: Thank you very much. Look at those thumbs. Yeah, hey, folks, I'm Francisco.
66
00:15:00.600 --> 00:15:18.010
Francisco Navas: I've just been on Future Audiences for a little bit now, jumped on to work on this project, super exciting. And Add A Fact, as you can see, as Maryana was saying, very much comes from some of these learnings from Citation Needed. Liam wrote, "Knowledge is human." Maryana said,
67
00:15:18.420 --> 00:15:23.519
Francisco Navas: is the LLM (really, is AI at all, or the LLM as a subset)
68
00:15:23.700 --> 00:15:32.039
Francisco Navas: strong enough to tell what is factual or not? Well, I think if we say no, then we can use the human to do something else.
69
00:15:33.310 --> 00:15:44.279
Francisco Navas: It's always been the work of Wikipedians to find information and collate it, collect it, rewrite it, organize it, judge it, and put it onto the wiki where necessary.
70
00:15:44.300 --> 00:15:50.529
Francisco Navas: So we are just at the point of having an MVP ready for Add A Fact.
71
00:15:51.370 --> 00:15:54.289
Francisco Navas: I tried to put the basics down here in this slide
72
00:15:54.310 --> 00:16:04.319
Francisco Navas: You will recognize it looks a lot like Citation Needed. Of course, it's built off of that framework and the same LLM engine, even,
73
00:16:04.640 --> 00:16:07.040
Francisco Navas: tweaked, but very similar.
74
00:16:08.030 --> 00:16:19.450
Francisco Navas: of course, maybe Daniel, our main engineer on this project, may differ, and he will have more specifics, but it's a very similar engine for telling the user who selected a statement
75
00:16:19.650 --> 00:16:27.339
Francisco Navas: what is going on in the Wikipedia article. So the main difference, of course, between Citation Needed and Add A Fact is: Citation Needed
76
00:16:27.420 --> 00:16:30.330
Francisco Navas: was trying to help you understand if
77
00:16:30.440 --> 00:16:36.030
Francisco Navas: the text you selected was on the Wikipedia article.
78
00:16:36.040 --> 00:16:41.849
Francisco Navas: And so we're using that function now in Add A Fact to help you decide whether a statement you selected
79
00:16:41.860 --> 00:16:54.299
Francisco Navas: is on or not on, or how much the article agrees with that statement you selected; whether that selected text belongs on an article that you chose.
80
00:16:54.870 --> 00:16:57.019
Francisco Navas: And so, as you can see here
81
00:16:57.310 --> 00:16:59.400
Francisco Navas: we use the LLM to help you
82
00:16:59.480 --> 00:17:02.459
Francisco Navas: speed up that process or help you make some of those decisions.
83
00:17:03.905 --> 00:17:04.910
Francisco Navas: And
84
00:17:05.190 --> 00:17:11.700
Francisco Navas: You can see how it works here, and I think the next slide means that we have to try some live demo. Okay, we'll get to the live demo in a second.
85
00:17:12.316 --> 00:17:14.700
Francisco Navas: Like with all these experiments,
86
00:17:15.430 --> 00:17:21.800
Francisco Navas: they're specific-question-based. We have some hypotheses we want to check on.
87
00:17:23.040 --> 00:17:25.250
Francisco Navas: Has there ever been,
88
00:17:25.500 --> 00:17:26.710
Francisco Navas: I don't think so.
89
00:17:27.243 --> 00:17:33.159
Francisco Navas: a productive way to contribute from not-wikipedia.org onto wikipedia.org?
90
00:17:33.580 --> 00:17:37.210
Francisco Navas: Can we try that? What results does that produce?
91
00:17:37.822 --> 00:17:39.570
Francisco Navas: That obviously
92
00:17:39.740 --> 00:17:45.059
Francisco Navas: raises the question of who that could be for. That's the third question here:
93
00:17:45.420 --> 00:17:58.190
Francisco Navas: could there be a tool for productively contributing to Wikipedia that maybe goes to, or should be used by, non-Wikipedia editors, people who don't have experience, the norm, as I call them?
94
00:17:59.550 --> 00:18:08.080
Francisco Navas: Simultaneously, of course, we have to understand if having the AI LLM in the loop is actually helpful. Does it harm the process?
95
00:18:10.060 --> 00:18:12.733
Francisco Navas: What do we mean? What can we do here? Yep,
96
00:18:13.560 --> 00:18:14.889
Francisco Navas: it's a good question.
97
00:18:15.470 --> 00:18:22.339
Francisco Navas: I contribute often to Wikipedia, through editing Wikidata. I only say Wikipedia here because,
98
00:18:22.500 --> 00:18:24.339
Francisco Navas: so far, Add A Fact
99
00:18:24.540 --> 00:18:29.820
Francisco Navas: is just for Wikipedia, in particular English Wikipedia, at least for this MVP launch.
100
00:18:29.840 --> 00:18:31.010
Francisco Navas: And
101
00:18:31.210 --> 00:18:34.190
Francisco Navas: but yeah, why not the Wikimedia
102
00:18:34.390 --> 00:18:38.500
Francisco Navas: editor world? Or the non-Wikimedian, I will say.
103
00:18:39.730 --> 00:18:41.789
Francisco Navas: And then, finally, a super important question is:
104
00:18:41.920 --> 00:18:47.849
Francisco Navas: what about sources? How can a tool produced by the WMF
105
00:18:49.660 --> 00:18:50.960
Francisco Navas: support
106
00:18:51.270 --> 00:18:55.279
Francisco Navas: source decisions across different Wiki projects?
107
00:18:55.290 --> 00:18:57.099
Francisco Navas: And what place does
108
00:18:57.440 --> 00:19:00.049
Francisco Navas: a tool have in doing that?
109
00:19:00.460 --> 00:19:05.659
Francisco Navas: You know, it's all these questions about what should be automated and what should not be automated.
110
00:19:06.351 --> 00:19:10.659
Francisco Navas: Cool. Details, important details: like I said, it's only on English Wikipedia,
111
00:19:10.730 --> 00:19:13.289
Francisco Navas: and it has a post limit per day
112
00:19:14.590 --> 00:19:17.599
Francisco Navas: trying to stop some spam. You could easily see how
113
00:19:17.830 --> 00:19:21.500
Francisco Navas: someone could just spam the heck out of this. We don't want that.
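The call doesn't say how that limit is enforced; below is just one plausible shape for a per-user daily cap, with the limit value itself an assumption:

```python
# Sketch: a per-user daily post cap, one plausible anti-spam guard.
from collections import defaultdict
from datetime import date

DAILY_LIMIT = 5  # assumed value; the real limit isn't stated in the call

class DailyPostLimiter:
    def __init__(self, limit: int = DAILY_LIMIT):
        self.limit = limit
        self.counts: dict[tuple[str, date], int] = defaultdict(int)

    def try_post(self, user: str) -> bool:
        """Record a post attempt; return False once the user hits today's cap."""
        key = (user, date.today())
        if self.counts[key] >= self.limit:
            return False
        self.counts[key] += 1
        return True

limiter = DailyPostLimiter()
assert all(limiter.try_post("alice") for _ in range(DAILY_LIMIT))
assert not limiter.try_post("alice")  # the next post today is rejected
```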
114
00:19:22.090 --> 00:19:27.800
Francisco Navas: The first step we took for reliability: Daniel implemented the Headbomb gadget
115
00:19:27.930 --> 00:19:29.060
Francisco Navas: to
116
00:19:29.070 --> 00:19:33.759
Francisco Navas: create a little signal, once text from a source is selected, about
117
00:19:33.840 --> 00:19:37.409
Francisco Navas: where that source might sit on the perennial sources list.
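A toy version of that signal, assuming a small local mapping from domain to an enwiki perennial-sources rating; the real gadget works off the community-maintained list, and the sample entries below are illustrative:

```python
# Sketch: mapping a selected page's domain onto a perennial-sources rating.
# Entries are illustrative; the real gadget reads the live enwiki list.
from urllib.parse import urlparse

PERENNIAL = {
    "infowars.com": "deprecated",
    "reuters.com": "generally reliable",
    "forbes.com": "mixed (contributor pieces considered unreliable)",
}

def source_signal(page_url: str) -> str:
    """Return a short reliability signal for the page the user is reading."""
    domain = urlparse(page_url).netloc.removeprefix("www.")
    return f"{domain}: {PERENNIAL.get(domain, 'not listed')}"

print(source_signal("https://www.infowars.com/some-headline"))
# -> "infowars.com: deprecated"
```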
118
00:19:37.720 --> 00:19:44.429
Francisco Navas: And finally, but very important: thank you to Ilana and SJ, who are here, who
119
00:19:44.470 --> 00:19:49.890
Francisco Navas: birthed this idea. We sat for 12 hours on a train going to Toronto.
120
00:19:50.270 --> 00:19:58.780
Francisco Navas: The 12 hours was a mistake, but hanging out with them was surely not. We would not have even started this idea had it not been for them. So
121
00:19:58.860 --> 00:20:07.750
Francisco Navas: thank you very much for your help, support, and general mentorship. And so here we are. So those are some important basic facts.
122
00:20:08.590 --> 00:20:10.539
Francisco Navas: And I think we could do a live demo now.
123
00:20:11.360 --> 00:20:12.450
Francisco Navas: Daniel.
124
00:20:12.840 --> 00:20:13.810
Francisco Navas: can we
125
00:20:14.090 --> 00:20:15.049
Francisco Navas: think we can?
126
00:20:15.300 --> 00:20:19.069
Francisco Navas: I think we're confident. We're confident. I'm happy to run it.
127
00:20:21.390 --> 00:20:22.020
Daniel Erenrich: Sure.
128
00:20:22.770 --> 00:20:23.550
Francisco Navas: Okay.
129
00:20:23.680 --> 00:20:25.900
Francisco Navas: let's see what it
130
00:20:26.330 --> 00:20:29.270
Francisco Navas: does when I try to share my screen.
131
00:20:30.300 --> 00:20:32.800
Francisco Navas: Oh, some privacy problem.
132
00:20:32.860 --> 00:20:34.110
Francisco Navas: Hmm.
133
00:20:34.630 --> 00:20:36.810
Francisco Navas: zoom, yes.
134
00:20:37.515 --> 00:20:42.399
Maryana Pinchuk: Liam, maybe you need to make Francisco co-host so he can share his screen. Possibly.
135
00:20:42.480 --> 00:20:44.270
Liam Wyatt (WMF): He already is.
136
00:20:44.270 --> 00:20:45.460
Maryana Pinchuk: Oh hmm!
137
00:20:46.880 --> 00:20:48.220
Francisco Navas: How about now?
138
00:20:49.462 --> 00:20:51.000
Francisco Navas: But papa, papa!
139
00:20:51.350 --> 00:20:53.200
Liam Wyatt (WMF): That's what you get for the live Demo.
140
00:20:53.410 --> 00:20:57.829
Francisco Navas: There we go, Daniel. I think you gotta do it, cause I can't share
141
00:20:57.900 --> 00:20:59.360
Francisco Navas: my screen on Zoom
142
00:20:59.660 --> 00:21:00.460
Francisco Navas: apologies.
143
00:21:00.460 --> 00:21:07.039
Liam Wyatt (WMF): I think this is not a Zoom privacy problem; this is your computer needs configuration for Zoom. Yeah.
144
00:21:08.498 --> 00:21:13.863
Daniel Erenrich: Okay, I hadn't been anticipating demoing. So this is gonna be interesting. Oh, wait! Do I have the same problem
145
00:21:14.690 --> 00:21:15.889
Francisco Navas: Oh, I got it!
146
00:21:16.190 --> 00:21:16.690
Daniel Erenrich: Okay.
147
00:21:16.690 --> 00:21:17.670
Maryana Pinchuk: Screen, alright.
148
00:21:17.670 --> 00:21:19.970
Liam Wyatt (WMF): Confirming: I can see your screen.
149
00:21:19.970 --> 00:21:20.620
Francisco Navas: Great.
150
00:21:20.840 --> 00:21:22.880
Francisco Navas: All right, we're on wikipedia.com
151
00:21:23.000 --> 00:21:25.670
Francisco Navas: wikipedia.com, I always say that. wikipedia.org.
152
00:21:27.340 --> 00:21:31.219
Francisco Navas: And here's the extension with a very cool dark.
153
00:21:31.500 --> 00:21:33.239
Francisco Navas: This is the dev mode.
154
00:21:33.810 --> 00:21:35.070
Francisco Navas: So right now
155
00:21:35.800 --> 00:21:37.558
Francisco Navas: if you try and run it on
156
00:21:37.960 --> 00:21:41.910
Francisco Navas: any Wikipedia page, or on the extension,
157
00:21:42.420 --> 00:21:45.440
Francisco Navas: the Chrome extension itself, it won't work.
158
00:21:45.610 --> 00:21:47.059
Francisco Navas: So first things first:
159
00:21:47.220 --> 00:21:48.620
Francisco Navas: it prompts me to log in.
160
00:21:51.260 --> 00:21:54.560
Francisco Navas: So I will do that. Hopefully I can log in, no problem.
161
00:21:54.970 --> 00:21:56.500
Francisco Navas: Don't look at my password.
162
00:22:02.100 --> 00:22:03.650
Francisco Navas: Okay, that's the test.
163
00:22:05.140 --> 00:22:07.740
Francisco Navas: Alright. So I should be logged in now.
164
00:22:08.590 --> 00:22:10.259
Francisco Navas: So now oop
165
00:22:10.470 --> 00:22:13.080
Francisco Navas: trying to move this guy down here.
166
00:22:13.896 --> 00:22:24.159
Francisco Navas: Okay, so say you're a reader of InfoWars, and for some reason you think that information from InfoWars belongs on Wikimedia,
167
00:22:24.170 --> 00:22:25.830
Francisco Navas: on any Wiki project.
168
00:22:26.270 --> 00:22:28.819
Francisco Navas: Please hold, folks, I'm being facetious.
169
00:22:30.690 --> 00:22:32.020
Francisco Navas: and come down here.
170
00:22:32.270 --> 00:22:37.789
Francisco Navas: Let's just choose the headline. So, according to InfoWars, US universities are going to require the COVID vaccine.
171
00:22:40.060 --> 00:22:40.970
Francisco Navas: Okay.
172
00:22:41.420 --> 00:22:44.900
Francisco Navas: So Add A Fact immediately tells you the source is unreliable.
173
00:22:44.950 --> 00:22:46.879
Francisco Navas: If we were to hit "learn more,"
174
00:22:47.390 --> 00:22:50.859
Francisco Navas: it takes us to the perennial sources list.
175
00:22:51.410 --> 00:22:54.390
Francisco Navas: The idea is, of course, if you're not someone
176
00:22:54.480 --> 00:22:56.110
Francisco Navas: who edits the wikis,
177
00:22:56.720 --> 00:22:58.290
Francisco Navas: this can help you learn about
178
00:22:58.390 --> 00:23:03.039
Francisco Navas: what volunteers have decided around sources
179
00:23:03.640 --> 00:23:06.220
Francisco Navas: For starters, we'll check the statement.
180
00:23:07.000 --> 00:23:17.279
Francisco Navas: So it's not gonna tell you if the statement is true or false; that's not the point here. What the LLM is doing right here is analyzing the presence of the statement in
181
00:23:17.490 --> 00:23:21.040
Francisco Navas: different English Wikipedia articles
182
00:23:21.750 --> 00:23:26.780
Francisco Navas: and trying to help you understand what it mentions. And then it makes an assessment.
183
00:23:27.100 --> 00:23:33.879
Francisco Navas: So, for example: the source mentions that many US colleges and universities have required it, but does not provide the exact number. So, okay,
184
00:23:34.100 --> 00:23:35.110
Francisco Navas: that's fine.
185
00:23:35.560 --> 00:23:42.170
Francisco Navas: That's a pretty good one. Here are just some other relevant ones, just about US colleges and COVID-19 in general.
186
00:23:42.290 --> 00:23:44.369
Francisco Navas: Again, this one is less specific.
187
00:23:44.430 --> 00:23:49.720
Francisco Navas: So maybe you think these are not relevant at all, and actually you may want to add this
188
00:23:49.890 --> 00:23:51.520
Francisco Navas: fact to the InfoWars
189
00:23:52.300 --> 00:23:53.250
Francisco Navas: article
190
00:23:53.660 --> 00:23:56.030
Francisco Navas: so you can search on Wiki.
191
00:23:56.580 --> 00:23:58.100
Francisco Navas: Your search will then,
192
00:23:59.180 --> 00:24:02.760
Francisco Navas: the LLM will then again search through that article that you chose.
193
00:24:02.820 --> 00:24:06.090
Francisco Navas: And yeah, let's put this on there for now. Let's see.
194
00:24:06.440 --> 00:24:15.489
Francisco Navas: I should say that anything coming from the dev version of Add A Fact will add the fact to
195
00:24:16.140 --> 00:24:21.210
Francisco Navas: the testwiki version of the article, so I am not putting this on Wikipedia right now.
196
00:24:21.380 --> 00:24:22.419
Francisco Navas: Don't worry.
197
00:24:23.536 --> 00:24:26.389
Francisco Navas: so we generate this form. You can link
198
00:24:26.550 --> 00:24:28.100
Francisco Navas: to the article itself.
199
00:24:28.300 --> 00:24:31.479
Francisco Navas: if you want to read it for any more context that you may need.
200
00:24:31.980 --> 00:24:36.089
Francisco Navas: And this is a form that shows you what the post
201
00:24:36.400 --> 00:24:38.009
Francisco Navas: to a
202
00:24:38.520 --> 00:24:43.850
Francisco Navas: talk page for this article, in this case to the testwiki, would look like,
203
00:24:43.860 --> 00:24:45.589
Francisco Navas: and you can add your additional comment.
204
00:24:47.460 --> 00:24:48.800
Francisco Navas: I truly think
205
00:24:48.940 --> 00:24:50.170
Francisco Navas: it's important
206
00:24:50.710 --> 00:24:55.460
Francisco Navas: to add info about COVID to the InfoWars
207
00:24:55.640 --> 00:24:56.380
Francisco Navas: article.
208
00:24:57.540 --> 00:24:58.779
Francisco Navas: Okay, great.
209
00:25:01.680 --> 00:25:04.679
Francisco Navas: So now it's been sent to the talk page for InfoWars.
210
00:25:05.580 --> 00:25:07.150
Francisco Navas: You can see it in your new tab,
211
00:25:07.990 --> 00:25:10.879
Francisco Navas: and it automatically generates this,
212
00:25:11.560 --> 00:25:12.589
Francisco Navas: this post.
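For context, posting a new talk-page topic like the one generated here is standard MediaWiki Action API usage; a minimal sketch, assuming a `requests.Session` that is already logged in to testwiki:

```python
# Sketch: creating a new talk-page section with the MediaWiki Action API.
# Assumes `session` is a requests.Session already logged in to test.wikipedia.org.
import requests

API = "https://test.wikipedia.org/w/api.php"

def post_new_topic(session: requests.Session, talk_page: str,
                   subject: str, body: str) -> dict:
    """Append a new section (topic) to a talk page."""
    token = session.get(API, params={
        "action": "query", "meta": "tokens", "format": "json",
    }).json()["query"]["tokens"]["csrftoken"]
    return session.post(API, data={
        "action": "edit", "title": talk_page, "section": "new",
        "sectiontitle": subject, "text": body + " ~~~~",  # ~~~~ signs the post
        "token": token, "format": "json",
    }).json()
```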
213
00:25:12.900 --> 00:25:17.679
Francisco Navas: Here's an interesting one, because normally
214
00:25:18.241 --> 00:25:26.390
Francisco Navas: Add A Fact would use Citoid to create a reference that you can then post, but I guess it doesn't work on InfoWars. That's new information to me.
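Citoid is Wikimedia's citation-metadata service, exposed through a public REST endpoint; a minimal sketch of querying it for a URL (the fields returned vary by site, which is presumably why some sites fail):

```python
# Sketch: asking Wikimedia's Citoid REST endpoint for citation metadata.
import requests
from urllib.parse import quote

def citoid_citation(url: str) -> dict:
    """Fetch mediawiki-format citation data for a URL from the Citoid service."""
    endpoint = (
        "https://en.wikipedia.org/api/rest_v1/data/citation/mediawiki/"
        + quote(url, safe="")
    )
    resp = requests.get(endpoint, headers={"accept": "application/json"})
    resp.raise_for_status()  # some sites, as in the demo, simply fail here
    return resp.json()[0]    # first citation object: title, url, date, etc.
```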
215
00:25:26.510 --> 00:25:29.799
Francisco Navas: So why don't we try it on a different site?
216
00:25:30.240 --> 00:25:32.039
Francisco Navas: Here's a Kamala Harris
217
00:25:32.600 --> 00:25:38.420
Francisco Navas: article. There was an interesting point here about the importance of Michigan and statistics,
218
00:25:39.870 --> 00:25:41.400
Francisco Navas: just to run it again.
219
00:25:43.430 --> 00:25:47.730
Francisco Navas: I think it's very interesting and very valuable to read the
220
00:25:47.900 --> 00:25:50.870
Francisco Navas: assessments that the LLM makes,
221
00:25:50.900 --> 00:25:53.299
Francisco Navas: and they could be helpful in choosing an article.
222
00:25:53.840 --> 00:26:03.270
Francisco Navas: For example, here the claim is partially correct, as Democrats did take control of both houses of the Michigan Legislature for the first time in 45 years, but it's incorrect about the 2018
223
00:26:03.420 --> 00:26:12.980
Francisco Navas: midterms and 2020 presidential elections. So again, MVP: I don't love "correct" and "incorrect" here, but what I think it's trying to say is that in this article
224
00:26:13.170 --> 00:26:15.240
Francisco Navas: those specific points are not present.
225
00:26:15.960 --> 00:26:18.520
Francisco Navas: So that will be something to correct.
226
00:26:19.920 --> 00:26:21.360
Francisco Navas: Here's a Citoid test.
227
00:26:22.390 --> 00:26:25.249
Francisco Navas: So Citoid should generate a,
228
00:26:28.930 --> 00:26:29.810
Francisco Navas: here we go.
229
00:26:30.640 --> 00:26:32.149
Francisco Navas: And so the idea is.
230
00:26:33.410 --> 00:26:36.549
Francisco Navas: there it is, signed by me, added by Add A Fact.
231
00:26:37.600 --> 00:26:41.899
Francisco Navas: It creates a subject for the new topic.
232
00:26:42.410 --> 00:26:44.440
Francisco Navas: And ideally.
233
00:26:44.590 --> 00:26:49.149
Francisco Navas: if this was a fact that folks on this page wanted to add.
234
00:26:49.370 --> 00:26:54.999
Francisco Navas: they could argue it here, discuss it, take this information and make it into an actual edit,
235
00:26:55.687 --> 00:26:57.910
Francisco Navas: and then ideally use the reference as a citation
236
00:26:58.060 --> 00:26:59.799
Francisco Navas: in the article itself.
237
00:26:59.920 --> 00:27:03.519
Francisco Navas: So the idea, of course, as I said, is to support
238
00:27:04.000 --> 00:27:05.220
Francisco Navas: adding facts
239
00:27:05.610 --> 00:27:07.990
Francisco Navas: into parts of the projects
240
00:27:09.220 --> 00:27:10.300
Francisco Navas: from off
241
00:27:10.820 --> 00:27:12.190
Francisco Navas: wikipedia.org
242
00:27:13.011 --> 00:27:14.980
Francisco Navas: and seeing what we learn from there.
243
00:27:15.481 --> 00:27:18.199
Francisco Navas: I think there was maybe one or 2 more slides.
244
00:27:18.250 --> 00:27:21.400
Francisco Navas: and then we'll open it up for some questions. Should I be looking at the chat?
245
00:27:22.086 --> 00:27:27.293
Maryana Pinchuk: I'm kind of trying to drop some notes in here in the chat as well.
246
00:27:27.690 --> 00:27:29.697
Maryana Pinchuk: yeah, Mike, you asked.
247
00:27:31.088 --> 00:27:55.240
Maryana Pinchuk: Given that we know that some talk pages can be pretty inactive, would it help to show when the talk page was last edited? Yes, absolutely. I think that the talk page posting is kind of our very simple proof-of-concept implementation of something like this. But we know that, unfortunately, there are lots of pages that aren't monitored very closely, and talk pages that aren't necessarily acted on. So
248
00:27:55.240 --> 00:28:14.479
Maryana Pinchuk: we're really interested in hearing ideas about other places that the fact could go. As you can see, we're very hesitant to add anything directly to Wikipedia, because we're trying to keep a kind of open-funnel, but very careful, actual-posting-to-project model.
249
00:28:14.480 --> 00:28:34.079
Maryana Pinchuk: But if you have other ideas (like, I don't know, WikiProjects maybe associated with the article, or maybe an entirely new queue of facts that can be reviewed by experienced editors), we'd really love to hear thoughts on that. I'm gonna share my screen again, if I can find it. Where is it?
250
00:28:34.600 --> 00:28:35.670
Maryana Pinchuk: There we go.
251
00:28:36.800 --> 00:28:37.430
Maryana Pinchuk: Okay.
252
00:28:37.430 --> 00:28:39.129
Francisco Navas: Thank you, Carolina, for the props.
253
00:28:42.140 --> 00:28:51.679
Francisco Navas: So hopefully that was smooth, and not horrible to listen to me talk through this for a while. I couldn't see your faces because I was looking at the screen, so I couldn't judge. But anyway,
254
00:28:51.850 --> 00:28:55.096
Francisco Navas: you are here. Thank you for coming
255
00:28:56.240 --> 00:28:57.810
Francisco Navas: We'd love your feedback.
256
00:28:57.830 --> 00:28:59.350
Francisco Navas: There's a couple ways
257
00:28:59.680 --> 00:29:01.970
Francisco Navas: number one, just from this demo.
258
00:29:02.080 --> 00:29:06.570
Francisco Navas: You can, you know, text me if you have my number, email me, post on the Meta talk page.
259
00:29:06.890 --> 00:29:14.630
Francisco Navas: There's the link. It's pretty easy to find if you try; Add A Fact comes up on Google as well. I mean,
260
00:29:14.680 --> 00:29:24.470
Francisco Navas: we can send it to you all later as well, if we have permission to email you. And then, of course, we would love it if you could trial our dev version.
261
00:29:24.570 --> 00:29:26.510
Francisco Navas: we have a
262
00:29:27.230 --> 00:29:32.500
Francisco Navas: It's running, and we have a goal of getting a better version out, ideally for Wikimania
263
00:29:33.200 --> 00:29:37.060
Francisco Navas: in Poland, where Maryana will be, ideally presenting the
264
00:29:37.110 --> 00:29:49.650
Francisco Navas: prod version that totally works, and it's incredible. And so, yeah, we'd love all your feedback. For that you'll need an enwiki and a testwiki account, and autoconfirmed status.
265
00:29:50.430 --> 00:29:54.850
Francisco Navas: I did not have testwiki autoconfirmed status, but we can
266
00:29:54.920 --> 00:30:01.060
Francisco Navas: give you testwiki autoconfirmed status, no problem. You just have to add your name to the list.
267
00:30:01.070 --> 00:30:02.350
Francisco Navas: The list
268
00:30:02.490 --> 00:30:04.769
Francisco Navas: is, where's the list
269
00:30:05.750 --> 00:30:08.030
Francisco Navas: should be at the top of the chat.
270
00:30:08.910 --> 00:30:10.600
Francisco Navas: and I'll repost it now.
271
00:30:13.640 --> 00:30:15.269
Francisco Navas: If you go in this doc,
272
00:30:15.780 --> 00:30:18.090
Francisco Navas: right below the
273
00:30:18.810 --> 00:30:21.179
Francisco Navas: agenda, there's a sign up
274
00:30:21.950 --> 00:30:23.310
Francisco Navas: for Add A Fact.
275
00:30:23.700 --> 00:30:26.540
Francisco Navas: And then, actually, I will post right now
276
00:30:27.260 --> 00:30:30.030
Francisco Navas: the yeah URL for downloading
277
00:30:30.530 --> 00:30:31.939
Francisco Navas: in the chrome store.
278
00:30:33.360 --> 00:30:34.930
Francisco Navas: So you can go straight there.
279
00:30:38.220 --> 00:30:42.319
Francisco Navas: If you're interested, you should be able to download it straight from that page.
280
00:30:43.850 --> 00:30:45.440
Francisco Navas: and yeah, leave us
281
00:30:45.620 --> 00:30:49.090
Francisco Navas: your name or your username, your enwiki username,
282
00:30:49.150 --> 00:30:52.619
Francisco Navas: in that doc, if you want to give this a run,
283
00:30:53.600 --> 00:30:56.309
Francisco Navas: or email me, or whatever, you know, reach us however you would like.
284
00:30:56.370 --> 00:31:01.730
Francisco Navas: And yeah, you can also leave feedback directly through a form which can be anonymous
285
00:31:02.120 --> 00:31:06.990
Francisco Navas: inside the extension itself. It says "feedback" on the bottom, next to "login," so you can use that.
286
00:31:07.320 --> 00:31:09.160
Francisco Navas: And I think there's 1 more slide
287
00:31:09.180 --> 00:31:12.199
Francisco Navas: things for me to say, and I'll stop yapping at you.
288
00:31:14.150 --> 00:31:14.950
Francisco Navas: yeah, I can.
289
00:31:14.950 --> 00:31:33.830
Maryana Pinchuk: I can take this one. Thanks, Francisco. Yes, so we have this dev version available, and we're giving it to you, our loyal Future Audiences followers, first, before publicizing it more widely next week. So I'll be at Wikimania in Poland,
290
00:31:34.220 --> 00:31:57.349
Maryana Pinchuk: and we'll have a Future Audiences talk on August 10th. So if anyone is coming to Wikimania, please come to my talk. It will be on the last day, so I'm guessing attendance might start to fizzle, but please come. And our plan is to make this available to anyone who has autoconfirmed rights on English Wikipedia, which I believe just means
291
00:31:57.350 --> 00:32:05.210
Maryana Pinchuk: you have created an account that's older than 4 days, and I think you've made at least 10 edits. Correct me if I'm wrong,
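Those two thresholds are what make up the English Wikipedia "autoconfirmed" implicit group, and the Action API can report it directly; a minimal sketch of checking a username:

```python
# Sketch: checking enwiki autoconfirmed status (an implicit group) via the API.
import requests

API = "https://en.wikipedia.org/w/api.php"

def is_autoconfirmed(username: str) -> bool:
    """True if the account is in the implicit 'autoconfirmed' group on enwiki."""
    users = requests.get(API, params={
        "action": "query", "list": "users", "ususers": username,
        "usprop": "implicitgroups", "format": "json",
    }).json()["query"]["users"]
    return "autoconfirmed" in users[0].get("implicitgroups", [])
```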
292
00:32:05.210 --> 00:32:21.769
Maryana Pinchuk: experts on autoconfirmed status. 10 edits, right? So you don't have to be an English Wikipedian who's been around forever; if your home wiki is another project but you do meet that bar on English Wikipedia, you'll be able to use the
293
00:32:21.770 --> 00:32:39.810
Maryana Pinchuk: production version, and we will be putting out a note on the English Wikipedia Village Pump to let people know that this is happening. And this new version that we'll be demoing starting next week, our intention is to make it post to English Wikipedia talk pages rather than to testwiki.
294
00:32:39.810 --> 00:32:56.979
Maryana Pinchuk: This is to kind of test the full experience a little more, and to publicize it a little bit more broadly among the community of active editors. So in the talk page post there will be a note with a link to the project, an explanation of what it is, and an ask for more people to test.
295
00:32:57.368 --> 00:33:08.871
Maryana Pinchuk: So we're trying to see if this, in its current state or something like it, could be valuable to Wikipedians who are already active, who are already looking for
296
00:33:09.240 --> 00:33:18.419
Maryana Pinchuk: sources outside of Wikipedia to bring into Wikipedia, and also soliciting feedback on ways that this could change or improve.
297
00:33:18.490 --> 00:33:38.140
Maryana Pinchuk: So, for example, we know that some of the AI assessment language is maybe not quite perfect, gets a little murky, so we'd love your help in making sure that we're communicating the right things to users who are kind of going through this workflow.
298
00:33:38.450 --> 00:34:01.429
Maryana Pinchuk: Also, sort of bigger thinking and suggestions around how this might look if we were to open it up to more than just experienced Wikipedians, if we were to, say, make this available to anyone in the world who wanted to add a fact. I think there are a lot of pros and cons to that, a lot of opportunities to enable participation on Wikipedia for new audiences,
299
00:34:01.440 --> 00:34:19.230
Maryana Pinchuk: but also a lot of risks that we're very aware of, around people abusing it, or contributing InfoWars or other unreliable sources to our projects and creating a lot of work for the community to have to kind of clean up
300
00:34:19.300 --> 00:34:20.649
Maryana Pinchuk: and deal with.
301
00:34:20.659 --> 00:34:38.839
Maryana Pinchuk: So we really want to think through: what could a good experience look like? Could there be a way to moderate that type of content or triage it in some way, so it doesn't create a bunch of work? And with the power of AI, I think we can start to do more of that kind of synthesis
302
00:34:38.840 --> 00:34:52.679
Maryana Pinchuk: and analysis that we're bringing to show to editors who use this product. So thoughts around that are going to be really welcome. And yeah, I'm going to stop now and see
303
00:34:53.475 --> 00:34:54.130
Maryana Pinchuk: what
304
00:34:54.310 --> 00:35:04.189
Maryana Pinchuk: what the room is feeling. I've been kind of briefly scanning chat. So if anyone wants to be brave and unmute and speak to your impressions, ideas, thoughts.
305
00:35:04.510 --> 00:35:07.519
Maryana Pinchuk: Yes, Samuel, please. You have the floor.
306
00:35:08.467 --> 00:35:16.529
Samuel Breslow: Thanks so much for having this conversation and demoing this. I think these features look really cool.
307
00:35:16.980 --> 00:35:22.270
Samuel Breslow: I was curious, just since you talked at the beginning about the Future Audiences team's role
308
00:35:22.460 --> 00:35:26.219
Samuel Breslow: as doing these kind of quick experiments rather than
309
00:35:26.814 --> 00:35:29.149
Samuel Breslow: more built out products.
310
00:35:29.606 --> 00:35:37.519
Samuel Breslow: I'm glad to see that, since I think there's a tendency to have this kind of gulf between
311
00:35:37.620 --> 00:35:42.749
Samuel Breslow: things that are just, like, user-maintained scripts, and
312
00:35:42.780 --> 00:35:44.779
Samuel Breslow: so these very
313
00:35:44.970 --> 00:35:48.200
Samuel Breslow: can-be-janky type products
314
00:35:48.210 --> 00:35:57.090
Samuel Breslow: that can be made quickly, on the one hand, and the features that the Foundation makes, which are very well considered, very cautious,
315
00:35:57.583 --> 00:36:00.686
Samuel Breslow: but tend to take a lot longer.
316
00:36:01.550 --> 00:36:02.600
Samuel Breslow: and
317
00:36:03.370 --> 00:36:06.050
Samuel Breslow: yeah, it's interesting to be able to have some
318
00:36:06.140 --> 00:36:10.669
Samuel Breslow: some things in a more middle ground. I'm curious whether there's any pathway
319
00:36:10.720 --> 00:36:19.500
Samuel Breslow: currently that you are considering, for if any of these experiments go particularly well, and there's interest in
320
00:36:19.780 --> 00:36:30.679
Samuel Breslow: having the feature be available beyond the scheduled end date: would you hand them off to another team to, like, build out more fully, or
321
00:36:30.770 --> 00:36:33.450
Samuel Breslow: be able to keep them around in some way
322
00:36:33.860 --> 00:36:35.350
Samuel Breslow: that would.
323
00:36:36.030 --> 00:36:42.260
Samuel Breslow: you know, hopefully, not create maintenance work, but allow their benefits to continue.
324
00:36:42.600 --> 00:37:08.040
Maryana Pinchuk: Yeah, that's a really great question. Thank you. Yes, so the process right now kind of looks like the latter of what you outlined. So if we hit on something that's really successful, let's say we put this out there and every Wikipedian out there wants to use it, and not just English Wikipedians but every Wikipedian, and Wikidata, and lots of other folks in our movement are really excited about this,
325
00:37:08.409 --> 00:37:28.740
Maryana Pinchuk: we would then come to the kind of planning process that the Foundation has, which occurs every year. It sort of starts really in January, when we start thinking about what we're gonna prioritize for the next fiscal year, and goes into April to June to, like, actually write the draft
326
00:37:29.010 --> 00:37:52.520
Maryana Pinchuk: and, you know, start to, like, collaborate with teams kind of pitching what they think is really important and gonna be impactful. We would come to that process and say, "Hey, look! We're getting a ton of great feedback from our community. People really want this thing, and they want this thing for real. They don't just want, you know, a browser extension that's only for one browser and one language." We really need, like, to give this to a full
327
00:37:52.520 --> 00:38:19.640
Maryana Pinchuk: product team, and then that product team would take it and would build it in the "right way," quote unquote, right? So they would make it available to more users, more browsers, and more languages. But the idea is it would be one of many that are kind of going into that process, through internal discussion and through discussion with the communities. So we publish a draft of our annual plan on Meta every year
328
00:38:19.640 --> 00:38:33.629
Maryana Pinchuk: and take feedback, and do really try to incorporate community feedback as well to decide: okay, like, yes, this is important, lots of people want this, but there are 20 other things that are maybe even more important and impactful, and going to really,
329
00:38:33.630 --> 00:38:58.029
Maryana Pinchuk: you know, move the needle of increasing editors or readers, or whatever. This is a collaborative discussion that we have to engage in with our movement. So yeah, it's really just listening, getting insights, and making recommendations in the sort of normal annual planning process. But there are also kind of
330
00:38:58.170 --> 00:39:18.650
Maryana Pinchuk: other ways that this could show up. So we might not see that every Wikimedian in the world wants to use something like this, but we're really hoping to learn something valuable about the process. So there might be other kinds of recommendations that come out of it, such as, and we've seen this kind of floated again and again in various other contexts, but we're kind of hitting the same
331
00:39:18.973 --> 00:39:29.010
Maryana Pinchuk: issue while building this product, which is that we don't really have, like, a structured reference bank, or a structured bank of claims with references attached to them, on Wikipedia or
332
00:39:29.010 --> 00:39:42.599
Maryana Pinchuk: anywhere. Wikidata is kind of sort of like that, but not quite. And this idea has been floating around for a while in lots of different ways. You know, the WikiCite community has been really advocating for that,
333
00:39:42.890 --> 00:40:11.979
Maryana Pinchuk: and it may be the case that that's something we'll learn a lot more about, and have something to bring to an annual planning conversation around. So it could look like: yes, this is the product, we found it, we found the perfect product that's going to really solve a major need, and we're gonna recommend that it's built out for real. Or: we've learned some things by doing this that touch on other parts of the whole product ecosystem, and we need to make recommendations about those.
334
00:40:12.850 --> 00:40:14.900
Maryana Pinchuk: I see Marshall has a hand.
335
00:40:15.000 --> 00:40:17.380
Maryana Pinchuk: Would you like to chime in, Marshall?
336
00:40:17.780 --> 00:40:34.249
Marshall Miller: Yeah, thanks. I think I just wanna take this opportunity to, like, point out a little bit more of the mechanics of how something like this works in our organization. So, for instance, I'm a director of product, and Maryana reports to me. And so I'm in constant contact with Maryana about this work and about these learnings,
337
00:40:34.250 --> 00:41:02.339
Marshall Miller: and other teams that report to me include, like, the Editing team, the Growth team, the Web team. And these are some of those teams that are building those, like, longer-term, high-scale features. And so, you know, it's through that direct communication among the various product managers and teams involved that we're exchanging information. And, like, we've deliberately set up our org structure so that it's not like Maryana has to be off on the sidelines insisting that she's got something valuable. It's like,
338
00:41:02.610 --> 00:41:07.689
Marshall Miller: work-structure-wise, she's plugged into the teams that would actually be doing that implementing.
339
00:41:07.980 --> 00:41:18.760
Marshall Miller: And in terms of a specific example, kind of like Maryana referenced at the beginning: working on the ChatGPT plugin, we learned a lot about how LLMs interact with Wikipedia content.
340
00:41:18.910 --> 00:41:42.779
Marshall Miller: And it gave us confidence that we can figure out how to make them interact productively. And so then, for the year that we're starting now, this fiscal year, there are teams that are working on features like this Edit Check feature that's going to attempt to use an LLM to guide new editors to make better edits. And we're doing that because of the confidence we gained from the Future Audiences experiment with the ChatGPT plugin.
341
00:41:42.930 --> 00:41:47.950
Marshall Miller: So it's not like we directly translated the feature; it's like we translated the learnings.
342
00:41:51.940 --> 00:41:52.960
Maryana Pinchuk: Thank you, Marshall.
343
00:41:54.390 --> 00:42:02.090
Maryana Pinchuk: Alright. Other questions or feedback on Add A Fact? Or meta process questions, also welcome.
344
00:42:02.600 --> 00:42:03.890
Maryana Pinchuk: Go ahead, Ilana.
345
00:42:05.110 --> 00:42:15.155
Ilana Strauss: Yeah, I don't have anything super technical to say or anything. But I just think this is so cool. Like, I don't know, it's very cool to, like, see it live.
346
00:42:15.740 --> 00:42:19.949
Ilana Strauss: I don't know, I could just see it having an impact on
347
00:42:20.080 --> 00:42:23.440
Ilana Strauss: kind of so many other users who would never
348
00:42:23.560 --> 00:42:25.440
Ilana Strauss: spend that much time.
349
00:42:25.500 --> 00:42:30.394
Ilana Strauss: you know, just all of a sudden being able to be so involved, so great job.
350
00:42:33.380 --> 00:42:45.200
Maryana Pinchuk: Thank you. And for the benefit of the folks in the room, do you wanna say a little bit more about how you came into the early kind of design process of this project, and sort of where you're coming from in the kind of
351
00:42:45.810 --> 00:42:47.350
Maryana Pinchuk: wikiverse?
352
00:42:48.620 --> 00:42:51.339
Ilana Strauss: Oh, man, yeah. Well, I mean, just to,
353
00:42:51.610 --> 00:42:56.470
Ilana Strauss: just total coincidence, we were all just, like, sitting on a train going,
354
00:42:56.760 --> 00:43:03.809
Ilana Strauss: the two of them were coming from, like, a conference, and we just started chatting about it. It was kind of organic.
355
00:43:04.470 --> 00:43:11.739
Ilana Strauss: But yeah, I run a crowdsourced fact-checking organization. So, sort of,
356
00:43:12.260 --> 00:43:13.700
Ilana Strauss: I've described it as like
357
00:43:13.760 --> 00:43:29.799
Ilana Strauss: "Wikipedia for fact-checking" to people. So yeah, I kind of came to a conference just to sort of meet people and, you know, see how everybody is thinking in this crowdsourced information space.
358
00:43:39.920 --> 00:43:42.230
Maryana Pinchuk: Other questions? Thoughts?
359
00:43:45.785 --> 00:43:46.439
Maryana Pinchuk: Pratik?
360
00:43:47.350 --> 00:43:49.929
Maryana Pinchuk: Marshall, do you still have your hand up, or is this a legacy hand?
361
00:43:49.930 --> 00:43:55.339
Marshall Miller: It's a new hand, in which I have a question to prompt the group when we run out of stuff people already want to say.
362
00:43:55.590 --> 00:43:57.090
Maryana Pinchuk: Well, Pratik, please take it away.
363
00:43:59.300 --> 00:44:01.420
Pratik M: Marshall can go first if he likes.
364
00:44:01.930 --> 00:44:03.500
Marshall Miller: No, no, I think you should.
365
00:44:04.340 --> 00:44:20.488
Pratik M: Okay, I had a few thoughts. I am new to wiki in general; I have obviously known about it for a long time, but I haven't edited or contributed in that sense. So I'm looking at it from that lens a little bit.
366
00:44:21.000 --> 00:44:36.980
Pratik M: I was thinking in terms of this LLM-in-the-loop, LLM-as-an-assistant kind of way of looking at things. It could be really interesting if this tool could be a tutor of sorts for the ways of
367
00:44:36.980 --> 00:45:00.750
Pratik M: Wikimedia: you know, what the rules are, how to learn about sources, how to learn about where things go. Because adding a fact to an article is really contextual, like where something is missing; you need to have that information first if you're going to add something meaningful. It can't be just: you find something online and you think it might be useful over there. So I think
368
00:45:00.770 --> 00:45:10.150
Pratik M: that could be an interesting ramp to having more full-time contributors, or people who want to actually contribute properly long-term.
369
00:45:10.160 --> 00:45:18.840
Pratik M: And from the other side of things, I was also thinking it might be interesting to have these facts come in, not,
370
00:45:18.840 --> 00:45:42.259
Pratik M: I mean, obviously this is a prototype, so the talk page is just what's going. But it could be, like, a collection of sorts, like an Are.na channel, or a Pinterest board, or something of that nature, where, instead of adding a fact, you're adding a recommendation, or adding a direction or a thread of some sort from where things can go further.
371
00:45:42.410 --> 00:45:46.260
Pratik M: I think those are some of the thoughts I wanted to put forward.
372
00:45:50.610 --> 00:45:53.700
Maryana Pinchuk: Awesome, thank you. Yeah, I think
373
00:45:53.980 --> 00:46:03.909
Maryana Pinchuk: the 1st point that you have about kind of using this as a way to educate newer contributors or people who are newer to the wikiverse is a really interesting one.
374
00:46:04.224 --> 00:46:28.720
Maryana Pinchuk: And it makes me think, you know, we've got this kind of very simple message about the quality of the source that we show to the user. But there could be a lot more there, around not just saying, like, oh, sorry, this source doesn't work on Wikipedia, but maybe bringing in more of that instruction, learning about how Wikipedia does work, and who decides which sources are notable, and how
375
00:46:28.720 --> 00:46:30.610
Maryana Pinchuk: you can contribute to that discussion. And
376
00:46:30.906 --> 00:46:45.739
Maryana Pinchuk: yeah, I think there are a lot of really interesting directions to explore there. And then the other point you made, about where the facts could go, creating new ways of collecting these contributions, I think is also very interesting. And
377
00:46:46.122 --> 00:47:07.609
Maryana Pinchuk: I'm curious, once people get a chance to play around with it a little more. That'll be one of our prompt questions in the feedback: where could these things go to really be maximally useful? Both for the contributor, to make them feel like it went somewhere, but also for our communities, to have a place to sift through this stuff and find the really good, valuable contributions.
378
00:47:07.700 --> 00:47:09.019
Maryana Pinchuk: Yeah.
379
00:47:09.300 --> 00:47:13.230
Maryana Pinchuk: What is, what is on the pad, dare I ask?
380
00:47:15.850 --> 00:47:18.470
Sam Klein: Them into a scratch space, a scratch space for
381
00:47:19.540 --> 00:47:33.589
Sam Klein: notes and other things. I feel right now, for a lot of test cases, we're overloading talk pages the way that we did for notifications and template messages. And maybe we can do something
382
00:47:34.130 --> 00:47:34.870
Maryana Pinchuk: Hmm.
383
00:47:35.250 --> 00:47:36.259
Sam Klein: Better than that.
384
00:47:36.970 --> 00:47:38.609
Maryana Pinchuk: Yeah, a really good point.
385
00:47:40.656 --> 00:47:41.123
Maryana Pinchuk: Yeah.
386
00:47:41.600 --> 00:47:43.504
Maryana Pinchuk: There's so many wikis
387
00:47:44.723 --> 00:47:58.589
Maryana Pinchuk: Marshall, did you, or did anyone else, have any thoughts around that topic? I know it's a lot to think about, like, where should this go? But I'm curious if, even in just looking at it, you had some thoughts around that. Anyone in the room?
388
00:48:01.290 --> 00:48:08.510
Maryana Pinchuk: Does anyone feel like talk pages on English Wikipedia would absolutely not be the place, even for an experiment?
389
00:48:10.050 --> 00:48:23.010
Samuel Breslow: I think I share the concern that others have brought up, that a lot of talk pages are very inactive, and that putting things there can make them go into a void. I think both of the ideas that you mentioned,
390
00:48:23.443 --> 00:48:48.550
Samuel Breslow: putting them on project talk pages, which, while still sometimes inactive, are at least a little more active, could be good; and if there was a dedicated queue for it, that could also work really well. The question would be how much of a backlog that queue would form, which is probably a function of just how interesting it is to be a reviewer doing this task.
391
00:48:48.873 --> 00:48:51.000
Samuel Breslow: What we tend to see is that
392
00:48:51.170 --> 00:48:53.329
Samuel Breslow: the fun type of jobs
393
00:48:53.500 --> 00:49:10.453
Samuel Breslow: get done a lot, or the easy jobs get done a lot, whereas ones that are challenging or unrewarding, like, you know, answering COI edit requests, or approving articles at AfC, those can develop extremely long backlogs.
394
00:49:11.210 --> 00:49:12.310
Samuel Breslow: I'm
395
00:49:12.650 --> 00:49:24.899
Samuel Breslow: like, I think this would probably be most analogous to just the backlog of citation-needed tags on claims, where what we've seen is that,
396
00:49:25.640 --> 00:49:27.979
Samuel Breslow: if there's attention paid to an article.
397
00:49:28.610 --> 00:49:51.899
Samuel Breslow: the citation-needed tags will get addressed. But there's not really interest among the editor community in going through the full backlog and actively searching out citation-needed tags in order to try to fix them. There are just so many of them, and there are a few tools that allow you to do that, but it's not something that most editors seem to want to spend their time doing.
398
00:49:53.560 --> 00:50:12.710
Marshall Miller: Samuel, one of the motivations behind the idea of putting it on the talk page is, I think, that we've built products in the past that created some new queue that people had to go through, and we were loath to add another queue to all of the queues that Wikipedians have to think about,
399
00:50:13.090 --> 00:50:22.890
Marshall Miller: and it was sort of like, what existing queue can this get tacked onto, so it's not another thing on the to-do list? Like, it might come across people's watchlists naturally.
400
00:50:22.940 --> 00:50:34.030
Marshall Miller: So that's sort of the motivation. And maybe there's a different, better queue that people look at as a matter of their wiki work that something like this could feed into.
401
00:50:36.434 --> 00:50:40.919
Liam Wyatt (WMF): If I could jump in. I actually can't find my hand-up thing as well, Marshall. I don't know where it is.
402
00:50:40.920 --> 00:50:42.539
Marshall Miller: Under the react button.
403
00:50:44.270 --> 00:50:46.249
Liam Wyatt (WMF): Okay, I will
404
00:50:47.120 --> 00:50:55.900
Liam Wyatt (WMF): preemptively react by holding my hand up. To Marshall's point: there have been previous products that created new tabs.
405
00:50:56.010 --> 00:51:08.499
Liam Wyatt (WMF): I think it was the Article Feedback Tool that had, alongside the edit and talk tabs, a tab for suggestions that came from the general public. But of course, creating a new tab is a new workflow, a new place people haven't
406
00:51:09.310 --> 00:51:14.139
Liam Wyatt (WMF): worked in before and also doesn't have the ecosystem of tools and templates
407
00:51:14.660 --> 00:51:15.610
Liam Wyatt (WMF): on board.
408
00:51:16.177 --> 00:51:19.780
Liam Wyatt (WMF): There has been a suggestion, as I put in the chat here, that these
409
00:51:19.950 --> 00:51:21.429
Liam Wyatt (WMF): citation needed
410
00:51:21.740 --> 00:51:28.929
Liam Wyatt (WMF): sorry, Add a Fact, suggestions should go into a central to-do list, and then that could be translated to the respective talk pages.
411
00:51:29.150 --> 00:51:31.480
Liam Wyatt (WMF): And people could work through that to-do list.
412
00:51:31.590 --> 00:51:38.289
Liam Wyatt (WMF): That's an idea. Currently, it's designed around the idea that the suggestions go onto the relevant talk page,
413
00:51:38.940 --> 00:51:41.360
Liam Wyatt (WMF): and that's where they live, and people respond to them
414
00:51:41.440 --> 00:51:45.969
Liam Wyatt (WMF): on the basis that the responder is interested in the subject,
415
00:51:46.230 --> 00:51:48.759
Liam Wyatt (WMF): not interested in the idea of
416
00:51:48.990 --> 00:51:52.339
Liam Wyatt (WMF): off-site submissions as a worklist per se
417
00:51:53.368 --> 00:51:55.520
Liam Wyatt (WMF): So the people, the
418
00:51:55.970 --> 00:52:02.439
Liam Wyatt (WMF): Wikipedians, would be interacting with the new suggestions on the talk page of the article they care about,
419
00:52:02.530 --> 00:52:09.979
Liam Wyatt (WMF): rather than going to hunt for a list of new suggestions from off-site and working through them one by one
420
00:52:10.880 --> 00:52:15.410
Liam Wyatt (WMF): That might be a valid use case, but this is the one we're testing now.
421
00:52:18.798 --> 00:52:22.830
Marshall Miller: Here's something I wanted to ask the group before we have to depart. So
422
00:52:23.300 --> 00:52:31.670
Marshall Miller: the team's kind of doing something counterintuitive here: we're trying to investigate future audiences, like people that don't currently participate in Wikipedia.
423
00:52:31.890 --> 00:52:36.999
Marshall Miller: But we're building the Add a Fact extension. And then we're gonna
424
00:52:37.050 --> 00:52:39.689
Marshall Miller: pitch it to existing Wikipedians.
425
00:52:39.700 --> 00:52:58.210
Marshall Miller: because we think that we can get those people to use it, right? Existing Wikipedians will get the concept; they'll give it a try. At least that's what we're hoping. And then that kind of usage, and those reactions, will help us learn, like, could we imagine non-Wikipedians picking something like this up and getting involved that way?
426
00:52:58.330 --> 00:53:03.179
Marshall Miller: And so, given that we've got a bunch of Wikipedians in the room, my question is, like,
427
00:53:03.310 --> 00:53:19.890
Marshall Miller: you know, concretely, the team intends to take this to the English Wikipedia Village Pump and say, like, here's something we've been working on, do people wanna try this out? And so I wanted to kind of take your temperature on, like, how do you think that'll be received? Do you think English Wikipedians will want to do this?
428
00:53:19.920 --> 00:53:25.590
Marshall Miller: What advice do you have on like how to explain it, or how to encourage that kind of usage? Or maybe what might they be worried about?
429
00:53:32.930 --> 00:53:39.180
Samuel Breslow: Yeah, I think any tool that is ultimately destined for
430
00:53:40.060 --> 00:53:44.520
Samuel Breslow: like people who are not currently Wikipedia editors is going to
431
00:53:45.120 --> 00:53:46.849
Samuel Breslow: raise some
432
00:53:47.040 --> 00:53:50.504
Samuel Breslow: hesitations just because of the history of
433
00:53:51.490 --> 00:53:56.337
Samuel Breslow: Was it, like, suggested edits, or something, like in 2011, way far back?
434
00:53:57.030 --> 00:53:58.190
Maryana Pinchuk: The feedback tool, yes.
435
00:53:58.190 --> 00:54:05.077
Samuel Breslow: That's what it was. And so yeah, I'm sure you all have that like very much in mind.
436
00:54:05.880 --> 00:54:06.950
Samuel Breslow: and
437
00:54:08.090 --> 00:54:14.470
Samuel Breslow: yeah, I think, like, for me personally, I find it always persuasive when there are comparisons of, like,
438
00:54:16.560 --> 00:54:27.489
Samuel Breslow: I forget which recent tool it was, but it was looking at, like, the revert rate between newcomers not using the tool and newcomers who are using the tool, and
439
00:54:27.830 --> 00:54:33.260
Samuel Breslow: how it's a success for us when we have a lower revert rate for
440
00:54:34.585 --> 00:54:35.440
Samuel Breslow: editors
441
00:54:35.450 --> 00:54:44.049
Samuel Breslow: who are using the tool, even if newcomers in general are still being reverted or making deleterious edits at a fairly high rate.
442
00:54:45.590 --> 00:54:46.880
Samuel Breslow: and so.
443
00:54:48.490 --> 00:54:55.179
Samuel Breslow: yeah, I think Wikipedians know that there's a hugely pressing need to
444
00:54:55.600 --> 00:55:05.709
Samuel Breslow: bring in more editors and get more people contributing to the project, because there's just so much editing work to be done, and such a lack of editors to do it.
445
00:55:06.850 --> 00:55:09.300
Samuel Breslow: I think for other Wikipedians
446
00:55:09.690 --> 00:55:15.190
Samuel Breslow: it's there too, and I try not to fall into this trap myself, but when it comes to the specifics of
447
00:55:15.630 --> 00:55:19.079
Samuel Breslow: trying to actually get newcomers involved, then it's like,
448
00:55:19.330 --> 00:55:22.050
Samuel Breslow: Oh, my gosh! We've got this lot of people coming in
449
00:55:22.290 --> 00:55:28.790
Samuel Breslow: doing not great edits. And of course they're not going to be great because they're newcomers and don't know everything yet.
450
00:55:29.492 --> 00:55:31.539
Samuel Breslow: And that can
451
00:55:32.000 --> 00:55:38.870
Samuel Breslow: be where there's then trepidation and the community not wanting tools to be turned on.
452
00:55:39.020 --> 00:55:40.050
Samuel Breslow: So
453
00:55:40.420 --> 00:55:43.679
Samuel Breslow: yeah, I would just emphasize, like, this is
454
00:55:44.030 --> 00:55:47.410
Samuel Breslow: something that, if it comes to fruition in a more finalized form, could
455
00:55:47.430 --> 00:55:54.610
Samuel Breslow: help us address all the content gaps that we have and the need for there to be more editors.
456
00:55:58.630 --> 00:56:00.650
Liam Wyatt (WMF): Samuel, yes. And that
457
00:56:00.930 --> 00:56:07.870
Liam Wyatt (WMF): slowly, surely, careful, that's-not-scary approach is exactly why it's going to Test Wiki,
458
00:56:07.970 --> 00:56:10.829
Liam Wyatt (WMF): and only to autoconfirmed users
459
00:56:10.930 --> 00:56:18.099
Liam Wyatt (WMF): initially, even though, exactly as Marshall said, that's not the intended eventual destination for this tool,
460
00:56:18.250 --> 00:56:22.650
Liam Wyatt (WMF): but by getting the existing community to feel comfortable with its
461
00:56:23.380 --> 00:56:24.490
Liam Wyatt (WMF): systems.
462
00:56:24.500 --> 00:56:25.819
Liam Wyatt (WMF): that will hopefully
463
00:56:26.241 --> 00:56:34.690
Liam Wyatt (WMF): ease the transition, so that when we hopefully open it up to a wider audience, people will go: okay, we understand how this is being done, and it comes at a careful,
464
00:56:35.190 --> 00:56:38.320
Liam Wyatt (WMF): gradual rate, and won't
465
00:56:38.580 --> 00:56:55.269
Liam Wyatt (WMF): scare everyone into immediately shutting down experiments. The other aspect, and this might be a question for the audience and a request of the audience, is: precisely because existing Wikipedians are the 1st people who are going to be using it,
466
00:56:56.770 --> 00:57:04.280
Liam Wyatt (WMF): the 1st thing we've seen a couple of people do is try it on Wikipedia itself: they write a sentence or a paragraph in Wikipedia,
467
00:57:04.430 --> 00:57:21.160
Liam Wyatt (WMF): and expect that the tool will then go onto the Internet and find a footnote somewhere out there that will help improve the citations of the sentence already written in Wikipedia. That's not how this tool works; that's looking at the telescope through the wrong end.
468
00:57:21.400 --> 00:57:25.259
Liam Wyatt (WMF): And we need to make sure people understand that that's,
469
00:57:26.440 --> 00:57:30.209
Liam Wyatt (WMF): you know, thank you for trying, but that's not how it's intended to work.
470
00:57:30.220 --> 00:57:33.304
Liam Wyatt (WMF): And if you see people complaining about that
471
00:57:33.840 --> 00:57:34.880
Liam Wyatt (WMF): we need to
472
00:57:34.970 --> 00:57:38.549
Liam Wyatt (WMF): find better ways of reorienting the behavior.
473
00:57:40.350 --> 00:57:42.609
Liam Wyatt (WMF): because we expect that will happen
474
00:57:42.760 --> 00:57:48.660
Liam Wyatt (WMF): more with advanced Wikipedians than with the eventual intended audience.
475
00:57:52.380 --> 00:58:14.760
Maryana Pinchuk: Yeah, thanks. Thanks, Samuel and Liam. I think we are at time, but thank you all so much for participating in this conversation. A reminder that we do these monthly-ish; we had a little gap the last couple of months, but we're trying to get back on schedule. So please follow us; we post on the talk page of
476
00:58:14.760 --> 00:58:27.740
Maryana Pinchuk: Meta: Future Audiences. So that's how you can stay up to date with what we're doing. For those of you going to Wikimania, I'll see you there, and for everyone else, I'll see you next month, and we'll let you know how this goes. So thank you.
477
00:58:28.140 --> 00:58:29.020
Maryana Pinchuk: Ciao.