English subtitles for clip: File:10 Future of Wikipedia.webm

1
00:00:00,329 --> 00:00:05,639
Thank you for staying with us the entire day. We
really appreciate it. We know that Sunday on a

2
00:00:05,669 --> 00:00:11,459
long weekend is not the most perfect time for
Wikipedia Day, but it shows how much you folks care

3
00:00:11,519 --> 00:00:16,739
about learning about Wikipedia, and we're so glad to
see so many familiar and new faces. So we

4
00:00:16,739 --> 00:00:22,199
thought that for the final session today, we'd
love to have a conversation with Noam Cohen and

5
00:00:22,199 --> 00:00:28,079
some of our panelists that we've had throughout
the day, because we've had a lot of catching up

6
00:00:28,079 --> 00:00:34,649
with folks who've heard a lot of new ideas, and AI
looms large in our consciousness. Now in 2024, I

7
00:00:34,649 --> 00:00:38,909
thought it'd be really interesting to hear some of
the reflections of the speakers that we've had

8
00:00:38,909 --> 00:00:44,189
this Wikipedia Day, who spoke, especially
starting with Noam Cohen. And originally we were

9
00:00:44,189 --> 00:00:48,089
going to have Noam in one of the earlier
sessions, but we thought no, if you don't know

10
00:00:48,089 --> 00:00:53,879
him, he has been for many years kind of like the
Wikipedia correspondent, which is not a common

11
00:00:53,879 --> 00:00:58,409
thing to hear of. But he used to come to all of
our Wikimania conferences, which is our

12
00:00:58,409 --> 00:01:03,599
international conference, and cover Wikimania as
part of his job with the New York Times and as a

13
00:01:03,599 --> 00:01:09,299
tech correspondent. So we always appreciated the
care and the thoughtfulness that Noam put into

14
00:01:09,599 --> 00:01:14,849
understanding our community, and always seeming
to have his finger on the pulse of what was

15
00:01:14,849 --> 00:01:19,109
going on that particular year when you saw us
at Wikimania. So I thought maybe we could start

16
00:01:19,109 --> 00:01:24,869
with some of Noam's reflections on being here
today. I haven't seen Noam in many years,

17
00:01:24,869 --> 00:01:30,569
and we used to talk all the time given his
coverage of our events. But in 2024, Noam, what

18
00:01:30,569 --> 00:01:36,119
are some of your thoughts about seeing this
community here in New York, and what you see

19
00:01:36,119 --> 00:01:37,439
as trends in the tech industry?

20
00:01:37,980 --> 00:01:42,360
I mean, thanks for inviting me, I'm really, you
know, honored to be here. And, you know, it was

21
00:01:42,360 --> 00:01:49,110
a really nice introduction. The thing I would say
is, my first Wikimania was in 2006,

22
00:01:49,110 --> 00:01:53,700
that I went to in Boston, and it was like, a
revelation, because basically, it was seeing all

23
00:01:53,700 --> 00:01:59,970
the people behind the articles, it's so easy to
see Wikipedia as just, you know, a bunch of text

24
00:01:59,970 --> 00:02:04,710
and people writing notes and to actually see the
people that did it was like mind blowing to me,

25
00:02:04,710 --> 00:02:09,750
and it kind of made me realize it's a vibrant
community. And it has like, you know, it has its

26
00:02:09,990 --> 00:02:14,100
various figures, you know, Jimmy Wales was
always a fascinating figure to me like, well,

27
00:02:14,130 --> 00:02:18,570
you know, sometimes they call him a benevolent
dictator. Was he loved or feared or revered? It was

28
00:02:18,570 --> 00:02:23,070
always like... it just seemed like an organic
community. And I was just talking to Annie, you

29
00:02:23,070 --> 00:02:26,520
know, depths of Wikipedia and how, like, in a
lot of ways she feels this role is like a

30
00:02:26,520 --> 00:02:32,970
storyteller for the community. And it's just a
very organic community. Wikipedia is people,

31
00:02:33,060 --> 00:02:37,200
that's like the story, I would say, from the
start, I always wanted to tell about Wikipedia.

32
00:02:37,410 --> 00:02:44,040
So cut to 2024, we're talking about AI. And, you
know, I think of AI as not being about people.

33
00:02:44,040 --> 00:02:47,880
So I kind of think it's missing the story. And
maybe I'm gonna go down in history as, like,

34
00:02:48,120 --> 00:02:53,250
you know, missing the big story of the world and
like not getting it. But what Wikipedia is

35
00:02:53,250 --> 00:02:58,410
giving, what is special, is that it is people, it
is people making decisions. And like, I know why

36
00:02:58,410 --> 00:03:04,380
people consult the chatbots and ChatGPT, it's
helpful. It can help you write things quickly.

37
00:03:04,380 --> 00:03:10,230
It definitely can save time on questions. But
fundamentally, it isn't a source for truth. And

38
00:03:10,230 --> 00:03:16,980
so I think the story, I think, of Wikipedia in
2024, is that we have a like, I would say,

39
00:03:16,980 --> 00:03:21,780
almost like a new Cold War between people who
believe in truth, and people who believe in

40
00:03:21,780 --> 00:03:27,960
controlling truth. And I have a story I've been
working on about Russia and how in Russia,

41
00:03:27,990 --> 00:03:32,550
right? I wrote a story for Bloomberg in the summer
about how Russia is very, you know, Putin has

42
00:03:32,550 --> 00:03:37,650
always hated Wikipedia for many, many years. And
he, you know, recently created the basically

43
00:03:37,680 --> 00:03:43,140
government-sponsored commercial rival, it's a
fork, an illegal fork of Wikipedia. They hired away

44
00:03:43,140 --> 00:03:49,890
the head of Wikimedia Russia to run it. They're
getting lots of funding, and they've also been,

45
00:03:49,920 --> 00:03:55,620
you know, prosecuting Russian Wikipedia, and
there have been fines. And lately, the most recent

46
00:03:55,620 --> 00:03:58,620
thing that happened, which I think is worth,
like taking note of is that the person who

47
00:03:58,620 --> 00:04:05,580
replaced the Russian Wikimedia person who went
to the commercial operation, somebody Stanislav

48
00:04:06,300 --> 00:04:10,890
Kozlovsky, you know, he's a professor who lost
his job over this. Now, he just said he was

49
00:04:10,920 --> 00:04:15,630
fearing that he's gonna become a foreign agent
being listed as a foreign agent, and Wikimedia

50
00:04:15,630 --> 00:04:19,170
Russia's closing down. Basically, it hasn't
happened yet. But they're filing the paperwork,

51
00:04:19,320 --> 00:04:23,700
because they really feel like it's jeopardizing
anyone who's involved with it. So if you count

52
00:04:23,700 --> 00:04:28,620
like Putin, and you count, you know, Xi in China,
where Wikipedia is not allowed to

53
00:04:28,620 --> 00:04:33,480
publish in any language. You know, Modi, in
India, there are like lawsuits that are related

54
00:04:33,480 --> 00:04:38,220
to what Wikipedia publishes. You know, you can
throw in Trump and the right

55
00:04:38,220 --> 00:04:42,510
wing in America, there is a war on truth. And
like, it's kind of I don't want to like,

56
00:04:42,540 --> 00:04:47,730
overstate it, but, like, Wikipedia kind of
becomes the only kind of rival. There are

57
00:04:47,730 --> 00:04:52,200
definitely news organizations that are committed
to the truth, but they're commercial operations,

58
00:04:52,200 --> 00:04:58,590
and, you know, this idea of a global community
that somehow is like working in the shadows.

59
00:04:58,830 --> 00:05:05,550
that is trying to create a record of what is
happening, and it is accurate, is really, really

60
00:05:05,550 --> 00:05:10,680
important. And I feel like it is reaching like a
level that's beyond just sort of utility, but

61
00:05:10,680 --> 00:05:15,150
about like, are we going to be able to
democratically tell our story? Or is it really

62
00:05:15,150 --> 00:05:20,940
under threat? So in some ways, I feel like
the AI story takes away

63
00:05:20,940 --> 00:05:26,070
from that, because AI is about just harvesting
data and making predictions and doing lots of

64
00:05:26,070 --> 00:05:30,720
things that are useful, but really not getting
at the nub of like, what is truth on the

65
00:05:30,720 --> 00:05:35,910
Internet? Who gets to tell it? Who's guarding
it? How are we going to protect it, these are

66
00:05:35,910 --> 00:05:40,560
like the real big issues there. So that's what I
sort of see as the takeaway, and what's really at

67
00:05:40,560 --> 00:05:47,370
stake in 2024 for Wikipedia and the movement.
So that's why I really like to write about it.

68
00:05:47,370 --> 00:05:52,650
So the things I'm writing about are like about
the Russian story. Similarly, I think the other

69
00:05:52,650 --> 00:05:57,960
thing I was telling Andrew, when he listed his
four principles, I really feel like the fifth

70
00:05:57,960 --> 00:06:04,830
principle of assume good faith is really, really
important. And that's a community based idea. I

71
00:06:04,830 --> 00:06:08,880
remember Jimmy Wales, saying something that I
thought was really deep, which he said, you

72
00:06:08,880 --> 00:06:15,330
could be at a steak restaurant, and everyone's
given a steak knife. And he's like, Oh, my God,

73
00:06:15,360 --> 00:06:19,500
there are people with knives right next to me
what's going on? Everyone should be in a cage,

74
00:06:19,500 --> 00:06:24,060
that way, I'll be safe. Or you can take the attitude
like, no, people don't use steak knives to attack each

75
00:06:24,060 --> 00:06:28,740
other. And that's cool. And so I think that
principle that Wikipedia works on, of

76
00:06:28,740 --> 00:06:34,080
assuming good faith is huge. And it's the other
story I've been working on. It's about how, how

77
00:06:34,080 --> 00:06:39,000
English Wikipedia has been writing about, you
know, the war in Israel, the

78
00:06:39,000 --> 00:06:44,610
Israel-Gaza war. And the fact that there is a
community there where there is a lot of tension, a lot of

79
00:06:44,610 --> 00:06:51,450
accusations a lot of you know, kind of people
even being disciplined, I would say over it, but

80
00:06:51,450 --> 00:06:56,280
fundamentally, it is forcing everyone in this
conflict, whatever you support, if you're

81
00:06:56,280 --> 00:07:00,900
more on the Israeli side, you are forced to
have to work on an article that's going to talk

82
00:07:00,900 --> 00:07:05,100
about all the things Israel has done violating
international law, or fill in the blank.

83
00:07:05,310 --> 00:07:09,120
Likewise, you're on the other side, you have to
like, deal with the atrocities that happen. It's

84
00:07:09,120 --> 00:07:13,470
like you cannot tunnel yourself in that way.
There are not two versions of a

85
00:07:13,470 --> 00:07:18,600
story. You know, there was a quote
I saw from Orwell where he said basically, he

86
00:07:18,600 --> 00:07:24,240
really became disillusioned in the Spanish Civil
War, because he, he felt like the media

87
00:07:24,270 --> 00:07:29,460
was contaminated. You could never believe
anything from the other side. Like even just

88
00:07:29,460 --> 00:07:33,090
basic facts. And he felt that was like, really
dangerous. Even though he was very committed to

89
00:07:33,090 --> 00:07:37,500
the Republican movement. He felt like, we should
be able to believe that our side did things

90
00:07:37,500 --> 00:07:43,140
wrong. And like, that's, to me again, that's
the innovation of Wikipedia. Like, we have a

91
00:07:43,140 --> 00:07:49,500
system in America often that's conflict-based,
where two sides are antagonists, like our legal

92
00:07:49,500 --> 00:07:53,970
system works that way. Often, our political
system works that way too, people make bold,

93
00:07:53,970 --> 00:07:58,320
outrageous claims. Oh, the truth will be in the
middle. You know, that's sort of our logic. So

94
00:07:58,320 --> 00:08:03,150
yes, both sides in a lawsuit, you just say
extreme things and be ungenerous, and interpret

95
00:08:03,150 --> 00:08:06,630
things the way they want, and then we'll get the
truth in the middle. But no, I think a better

96
00:08:06,630 --> 00:08:11,700
system is to have everyone committed to trying
to do the best. And like we all check each

97
00:08:11,700 --> 00:08:16,590
other. And we're like, in one project pulling
together. So like, I think that's the other very

98
00:08:16,590 --> 00:08:20,370
valuable thing that Wikipedia does. It's like,
it has huge problems with bias. And that

99
00:08:20,370 --> 00:08:25,470
makes me really sad. Because this great model is
hurt by not having enough inclusivity. And like

100
00:08:25,650 --> 00:08:29,730
the story about Israel and Hamas I'm working on,
it's about the Egyptian American editor who

101
00:08:29,730 --> 00:08:33,990
feels definitely outnumbered, and we talked at
really great length about how does he deal

102
00:08:33,990 --> 00:08:37,830
with it? How does he deal with stress? What
keeps him going? How frustrated is he? I mean,

103
00:08:37,830 --> 00:08:42,210
like, I want to tell those stories, too, because
it's like, Wikipedia needs to reflect the

104
00:08:42,210 --> 00:08:46,710
world much better to do this important job. But
I feel like the basic parameters are really

105
00:08:46,710 --> 00:08:51,210
useful. So that's why I have optimism about it.
But I feel like the stakes are very high. And I

106
00:08:51,210 --> 00:08:52,920
feel that's why I'm glad to be here.

107
00:08:53,759 --> 00:09:00,389
Thank you, Noam, that's great insight.
Yes, thank you. I think you noticed

108
00:09:00,389 --> 00:09:04,109
something that STS University mentioned
this morning as well, in terms of contamination,

109
00:09:04,139 --> 00:09:08,189
he was worried about India and their media
ecosystem. And I think we're all worrying about

110
00:09:08,279 --> 00:09:12,779
how good is the source? How good is the sourcing
that we have to put into Wikipedia, right?

111
00:09:12,779 --> 00:09:16,889
Garbage in, garbage out; we need to have good,
reliable sourcing. So this is a great segue,

112
00:09:16,889 --> 00:09:23,609
because I asked the panelists up here before
they came up, and we put this into the Etherpad.

113
00:09:23,609 --> 00:09:28,049
So I encourage you folks to participate as well.
I know it's a bit tough in Etherpad to do this,

114
00:09:28,049 --> 00:09:32,969
but we actually have left some space in there.
So I pretty much gave three prompts. The first

115
00:09:32,969 --> 00:09:39,539
prompt is: what excites you about the future of
Wikipedia or Wikimedia? What concerns you about

116
00:09:39,539 --> 00:09:45,839
the future of Wikipedia or Wikimedia? And then fill
in the blank: in 20 years, Wikipedia will be dot

117
00:09:45,839 --> 00:09:50,369
dot dot. Right. Pretty open ended, but I think
we got a really interesting range of things. I'd

118
00:09:50,369 --> 00:09:54,629
love to start with Sherry because it hits
exactly on what you said about bias and how

119
00:09:54,779 --> 00:10:00,749
contaminated or how good quality are the sources
that we have. And Sherry, maybe you could

120
00:10:00,749 --> 00:10:03,539
explain a little bit what we talked about
before. It's something that concerns us a lot,

121
00:10:03,599 --> 00:10:11,129
because you have worked with AfroCROWD and
trying to make sure that there is a good

122
00:10:11,129 --> 00:10:16,469
representation of history that has been
forgotten in many ways. What was the, what

123
00:10:16,469 --> 00:10:22,019
concerns you about the future of Wikipedia in
terms of, you know, how AI is being used these

124
00:10:22,019 --> 00:10:28,649
days? And that we are training our AI systems on
Reddit on Wikipedia. But if we don't have a good

125
00:10:28,649 --> 00:10:32,819
Wikipedia, what, what good is training it on
there? Right? Okay.

126
00:10:34,080 --> 00:10:40,260
So, I was thinking that, when we were talking
earlier that Wikipedia is such a central, okay,

127
00:10:40,290 --> 00:10:47,550
for me, Sherry, speaking as myself. For me,
Wikipedia has always been the central nervous

128
00:10:47,550 --> 00:10:57,990
system of our lexicon, because of the organic
way that the material that we ingest, digest,

129
00:10:58,650 --> 00:11:07,890
reproduce, add, and grow. It really mimics an
organism that's ever

130
00:11:07,890 --> 00:11:15,750
changing, evolving with the world. Everything
human is flawed. But I think for the world's

131
00:11:15,840 --> 00:11:23,640
understanding of itself, they go to Wikipedia,
they go to knowledge from people who care about

132
00:11:23,640 --> 00:11:28,860
knowledge, even if it's a passing thing, and
you're like, you know, I want to share this. So

133
00:11:29,040 --> 00:11:35,370
what we know about the world is what AI knows
about the world, in part, in large part, because

134
00:11:35,400 --> 00:11:44,130
a lot of the information that Wikipedia and
wiki projects actually produce is

135
00:11:44,130 --> 00:11:50,400
the data that the world is digesting, and now
even more so because of AI, it's even more

136
00:11:50,400 --> 00:11:58,620
important than ever before, the way that
algorithms are trained, or that the knowledge

137
00:11:58,620 --> 00:12:03,060
that goes in and is being used, not just
obviously, from Wikipedia, but from the entire

138
00:12:03,060 --> 00:12:10,260
Internet, a large chunk of that I do believe is
Wikipedia. So at the foundation of that

139
00:12:10,260 --> 00:12:19,980
information, are a lot of people in this room.
And in the general, you know, wiki body. So

140
00:12:20,040 --> 00:12:26,550
therefore, the information that's produced from
within the community, and the hive mind of the

141
00:12:26,550 --> 00:12:36,780
community, is really rising to an
apex of importance going forward, for our

142
00:12:36,780 --> 00:12:41,640
understanding of history, of ourselves, of the
world, and so forth. So it's even more important

143
00:12:41,640 --> 00:12:48,480
than ever before, that more voices are heard,
more perspectives are included. So that

144
00:12:48,480 --> 00:12:55,530
bias, on whatever level it is,
because there are different levels of bias, you

145
00:12:55,530 --> 00:13:01,650
know, systemic bias, and you know, there's
different kinds of ways that you can look at

146
00:13:01,650 --> 00:13:09,030
bias. But at the same time, we are at a very
interesting place, because,

147
00:13:09,180 --> 00:13:16,410
number one, we're here and talking about it, and
coming up with ways of approaching it so that

148
00:13:16,410 --> 00:13:22,710
it's even better than before. So the better the
information is coming from this huge,

149
00:13:22,950 --> 00:13:33,720
important, influential chunk of information
for, you know, AI going forward, the more

150
00:13:33,720 --> 00:13:39,540
important everyone in this room is, and what
this group, this global body, is doing.

151
00:13:39,900 --> 00:13:47,940
But it also means that there's more impetus and
more onus on us to ensure that we don't back

152
00:13:47,940 --> 00:13:54,720
down on improving that information, but we go
full force ahead, and really work

153
00:13:54,720 --> 00:13:56,370
on it even more so. Right.

154
00:13:56,430 --> 00:13:59,310
Thank you. Yeah, that was something that we
haven't really talked about at all at this

155
00:13:59,310 --> 00:14:03,090
conference. There actually is a Wikimedia
movement strategy process that we are

156
00:14:03,090 --> 00:14:07,740
implementing. And one of those goals is
knowledge equity, right? So, nothing about us

157
00:14:07,740 --> 00:14:12,900
without us. We want to have many folks involved,
more than we see today. And Rosie, one

158
00:14:12,900 --> 00:14:16,620
thing you wrote about what excites you is
related to this as well, the ability to morph

159
00:14:17,190 --> 00:14:20,550
ourselves into the Galactic and I didn't notice
until you put collective future probably

160
00:14:20,550 --> 00:14:25,770
referring to your lightning talk today, in terms
of your point there. But tell us a little

161
00:14:25,770 --> 00:14:28,710
bit about what you're thinking about what
excites you about Wikipedia?

162
00:14:31,080 --> 00:14:38,850
Right. So there's been a question in the last
year that some people have been asking is

163
00:14:38,880 --> 00:14:46,950
Wikipedia just the encyclopedia of our
generation, and will the next generation have

164
00:14:46,980 --> 00:14:53,040
another encyclopedia just like there was the
Encyclopedia Britannica and no one contributes

165
00:14:53,040 --> 00:15:02,820
really to it anymore. And so I think I am
the eternal optimist, and so that may, in its

166
00:15:02,850 --> 00:15:12,030
own way, you know, influence what I'm going to
say. But I see Wikipedia as having the potential

167
00:15:12,240 --> 00:15:18,480
for becoming the encyclopedia Galactica, in
that, you know, when we move off of this planet

168
00:15:18,510 --> 00:15:25,440
and populate other planets in the galaxy,
that, you know, really will extend ourselves

169
00:15:25,440 --> 00:15:33,420
that far away into the future. I make that point
only to say that, I think the only way that this

170
00:15:33,420 --> 00:15:41,970
can happen is if we are willing as a movement to
change, if we are not willing to change with the

171
00:15:41,970 --> 00:15:51,000
times, if we are not willing to see how AI
can be used in a thoughtful way that works for

172
00:15:51,030 --> 00:15:58,290
us. If we draw lines in the sand and
say, this is the policy on notability, and we

173
00:15:58,290 --> 00:16:04,410
are not going to change it, and it's been the
same policy that was written two decades ago.

174
00:16:04,560 --> 00:16:11,730
If we are not willing to make changes in so many
different ways, then we are going to go the way

175
00:16:11,730 --> 00:16:20,100
of the dinosaur. And we will become obsolete.
And so if we become obsolete, how do we attract

176
00:16:20,100 --> 00:16:25,530
new editors? How do we get them? You heard me
talk about, you know, seven out of 10 readers

177
00:16:25,560 --> 00:16:33,990
are men. You know, that in itself is kind of,
like, really? Yeah, that's really true. So,

178
00:16:34,020 --> 00:16:39,240
for me, it's about change, being willing to
change, being willing to accept ambiguity, and

179
00:16:39,240 --> 00:16:45,360
how we deal with change, like AI, I am not the
expert here on that. But I know that we must go,

180
00:16:46,110 --> 00:16:52,260
we must understand it better, and see how to
harness it. Because otherwise we will become

181
00:16:52,260 --> 00:17:00,840
Encyclopedia Britannica, and something else will
come out from society and become the new

182
00:17:00,840 --> 00:17:04,170
kind of global encyclopedia. Right?

183
00:17:04,770 --> 00:17:09,240
Thank you. And, Annie, I love your comment,
because we're all focusing on the nuts and

184
00:17:09,240 --> 00:17:13,830
bolts, and you bring it back to the people. So I
think that's also what Noam appreciates, that

185
00:17:13,830 --> 00:17:17,820
you're kind of our resident ethnographer in many
ways of the community. Tell us a little bit

186
00:17:17,820 --> 00:17:19,590
more about what excites you?

187
00:17:21,600 --> 00:17:26,580
Um, well, I wrote it up there. Every time I come
to one of these things, I meet someone, they say

188
00:17:26,580 --> 00:17:32,850
their username, and I'm shocked that they're,
like, extremely young. There are people that I

189
00:17:32,850 --> 00:17:37,380
have worked with on Wikipedia who were born
after 2010. They're half my age, and I'm not

190
00:17:37,380 --> 00:17:45,750
even that old. I feel so inspired and excited
that even today in 2024, there are a lot of

191
00:17:45,780 --> 00:17:50,910
really bright, really, like, socialized. There are
all these fears that the kids are not okay. I

192
00:17:50,910 --> 00:17:55,440
think that many of them are okay. And they're
still finding Wikipedia, and they still really

193
00:17:55,440 --> 00:17:58,920
like it. That's what makes me happy. Great.
Thank

194
00:17:58,920 --> 00:18:04,560
you. And please, folks keep adding stuff. I love
seeing the stuff that you're adding here. Even

195
00:18:04,560 --> 00:18:08,070
if we don't get to read them all. I love having
this as a kind of a record of what we're

196
00:18:08,070 --> 00:18:13,320
thinking in 2024. So let's go to some of the
concerns. But you also said, Annie, that you also

197
00:18:13,320 --> 00:18:17,250
recognize that we're kind of not necessarily
assuming good faith and doing the best to

198
00:18:17,250 --> 00:18:18,630
welcome new folks, right?

199
00:18:20,010 --> 00:18:26,820
Oh, yeah, well, I have met people here today,
even that have said, I have felt a little bit

200
00:18:26,820 --> 00:18:32,460
bad that some of the articles they submitted to
Articles for Creation did not get accepted.

201
00:18:32,850 --> 00:18:41,490
And I hope that there is no guilt in that. Even
though it's hard to not have guilt. Yeah, when

202
00:18:41,490 --> 00:18:45,630
you meet people in person, and you see the
humanity of Wikipedia, it's easy to bounce back

203
00:18:45,630 --> 00:18:51,210
and say, Oh, I see like, I am new to the
project. We're all learning all the time. But

204
00:18:51,210 --> 00:18:57,390
when it's just the white screen in front of you,
and you see the red rejection, it can be easy to

205
00:18:57,390 --> 00:19:00,960
take things personally. And sometimes we bite
the newbies without realizing we're biting the

206
00:19:00,960 --> 00:19:07,230
newbies at all. So, the people that go out
of their way to be really friendly, the welcoming

207
00:19:07,230 --> 00:19:12,240
committees, birthday stuff, the things that
aren't high priority, I really do think that

208
00:19:12,240 --> 00:19:13,020
stuff matters.

209
00:19:14,190 --> 00:19:18,090
Right? I'm wondering, I mean, Rosie, when you
talk about things changing, I do think

210
00:19:18,120 --> 00:19:22,200
there has been a tilt toward exclusion, you
know, deletionism. And I never understood

211
00:19:22,200 --> 00:19:26,520
that. I was like, what's the big harm of having,
you know, when you don't want promotion? Or

212
00:19:26,520 --> 00:19:30,360
things that are truly not, you know. But if
something's obscure, what's the harm of

213
00:19:30,360 --> 00:19:33,570
having an article about it? And is that kind of
what you're getting at? Right? That having these

214
00:19:33,570 --> 00:19:38,250
vague, you know, barriers and having people
being so on top of it, I assume that's why, you

215
00:19:38,250 --> 00:19:41,970
know, someone will win a Nobel Prize later, you
know, and will not have an article because

216
00:19:42,000 --> 00:19:45,210
someone was, you know, on the watch, and is that
the kind of thing we're talking about?

217
00:19:47,640 --> 00:19:52,020
There's an interesting joke
that, you know, my book is called The Wikipedia

218
00:19:52,020 --> 00:19:56,970
Revolution, but after 10 or 20 years, I've
realized it's a revolution in making a very

219
00:19:56,970 --> 00:20:02,970
conventional encyclopedia, right? We actually
are very conservative, we have policies against

220
00:20:02,970 --> 00:20:08,250
too many photos, right? We say, oh, we're taking
it really seriously now, we're a top 10 website. We

221
00:20:08,250 --> 00:20:13,080
can't do weird things anymore. Right. So we see
that sentiment in our community often, right.

222
00:20:13,920 --> 00:20:14,400
Richard? Yeah,

223
00:20:14,430 --> 00:20:20,880
My concern, my concern was that
Wikipedia wouldn't be fun. I think that, you

224
00:20:20,880 --> 00:20:23,610
know, we've heard this from everyone today, I
mean, things like, you know,

225
00:20:23,640 --> 00:20:28,320
Annie, she mentioned, citation
needed, the whales and the dolphins, the

226
00:20:28,320 --> 00:20:32,130
porpoises, that's like one of the few things that
are left. I think that's really important, that

227
00:20:32,190 --> 00:20:35,970
that gets people involved, that gets younger
people involved, it gets all sorts of people

228
00:20:35,970 --> 00:20:42,090
involved. A lot of people, you know, editing pop
culture topics, a lot of them start with that

229
00:20:42,090 --> 00:20:45,180
way. At some point, we decided we shouldn't
cover pop culture in great detail that we should

230
00:20:45,180 --> 00:20:49,080
put them on some, like, specialist wiki. And a
lot of the modern pop culture that's on

231
00:20:49,080 --> 00:20:53,010
social media, we don't cover that at all,
because it doesn't meet, like, official standards. The

232
00:20:53,010 --> 00:20:57,450
New York Times doesn't write reviews of social
media, you know, type, like comments and things

233
00:20:57,450 --> 00:21:02,640
like that. I think we have to, like, you know,
be playful, be experimental. I think that also

234
00:21:03,090 --> 00:21:07,920
involves, like, using AI. Like, we should use AI in
a playful way where we're generating

235
00:21:07,920 --> 00:21:11,460
and doing interesting things with it. We
shouldn't you know, we shouldn't use AI in the

236
00:21:11,490 --> 00:21:15,270
sense that it becomes like this boring robotic
voice thing, which is, which is gonna come for

237
00:21:15,270 --> 00:21:20,310
most of the Internet. And I hope that we can be
like, uh, you know, we've been using bots since

238
00:21:20,310 --> 00:21:23,610
you know, before I was involved in Wikipedia and
alternative media, for a long time. But we

239
00:21:23,610 --> 00:21:29,220
will use them thoughtfully and with like, the
human touch. And I hope we can be that way. When

240
00:21:29,220 --> 00:21:34,140
the rest of the Internet becomes these, like,
droning, boring robots, or at least

241
00:21:34,140 --> 00:21:39,750
not boring is okay. But like not fun robots, you
know, it's okay. Yeah,

242
00:21:39,750 --> 00:21:44,220
That's a good point. A lot of the fun parts of our
community are actually off-wiki, right? Because

243
00:21:44,220 --> 00:21:49,020
they've been forced off to Discord, Facebook, other
places. And that's kind of, I don't wanna say

244
00:21:49,020 --> 00:21:53,100
sad, but a little bit regretful that that's the
case. Yeah, yeah.

245
00:21:53,220 --> 00:21:56,790
And I think that's what would
actually kill the Wikipedia community.

246
00:21:57,090 --> 00:22:01,590
there are all sorts of governments and
businesses that, you know, maybe don't like the

247
00:22:01,590 --> 00:22:06,240
Wikipedia community or don't like the way that
it works. They have rival approaches of, you

248
00:22:06,240 --> 00:22:09,570
know, varying quality. But I think the thing
that would ultimately kill the Wikipedia

249
00:22:09,570 --> 00:22:13,890
community is if it's not fun anymore. If it's not
fun anymore, then people wouldn't contribute.

250
00:22:14,640 --> 00:22:18,150
And, you know, I mean, obviously, people
contribute, because they, they support, they

251
00:22:18,150 --> 00:22:22,380
believe in the ideas, they find the topics
interesting. But I think fundamentally, having a

252
00:22:22,380 --> 00:22:26,640
fun and sociable community is a very important
part of sustaining a project like this.

253
00:22:27,510 --> 00:22:30,690
Great point. And there's something that,
yeah, go ahead. Sure.

254
00:22:31,890 --> 00:22:40,080
Why can't it be fun? I mean, I remember when,
okay, so one of my first impressions

255
00:22:40,110 --> 00:22:49,110
of the wiki community, when Alice brought me on,
sorry, Alice Backer of AfroCROWD, brought me on, was

256
00:22:49,140 --> 00:22:58,380
how open it was, and how
open-minded at least some of the folks that we

257
00:22:58,380 --> 00:23:08,550
connected with were. And I am finding that,
in some ways,

258
00:23:08,550 --> 00:23:13,500
we've fulfilled the purpose of scaling up,
for the purpose of what you were talking

259
00:23:13,500 --> 00:23:19,560
about. Well, we've reached, you know, the point of
being this very influential platform, and in some

260
00:23:19,560 --> 00:23:28,920
ways, we've lost some of that, that wild,
wild west nature, that kind of anything could

261
00:23:28,920 --> 00:23:36,990
happen feeling. And I feel like
sometimes we can be a little bit too caught up

262
00:23:36,990 --> 00:23:37,590
in

263
00:23:39,450 --> 00:23:41,760
like being reputable, right? I feel

264
00:23:41,760 --> 00:23:48,600
a little bit, and it's not in a negative way,
per se. But, you know, sometimes when

265
00:23:48,600 --> 00:23:58,470
you grow up, you know, you lose some of that,
some of what you had in your younger stages when

266
00:23:58,470 --> 00:24:05,460
you're just figuring out who you were. And I'm
wondering if, if we're not losing some of that

267
00:24:05,460 --> 00:24:11,490
along the way too, because sometimes it kind of
feels almost like communities come against each

268
00:24:11,490 --> 00:24:17,760
other. And that hurts the information, rather
than working together. Even if you don't agree,

269
00:24:18,210 --> 00:24:24,270
you know, sometimes it feels like a zero sum
game. And then what happens to the knowledge in

270
00:24:24,270 --> 00:24:34,380
between, that, you know, maybe that country could
have two names, just to use a trivial example.

271
00:24:34,560 --> 00:24:42,180
Maybe we could have two names, and that's okay.
You know, but, or, you know, you know, there are

272
00:24:42,180 --> 00:24:48,300
some things, I feel, that we lose,
because of our humanity and the desire to be

273
00:24:48,300 --> 00:24:57,030
right, versus the desire to be understood,
or to have information included that is also

274
00:24:57,060 --> 00:25:04,620
right for some. So I don't know if that is
very AI-discussion oriented. But I do feel

275
00:25:04,620 --> 00:25:12,090
it is something that helps the community to grow
and thereby helps the information also to

276
00:25:12,090 --> 00:25:19,200
continue to develop, as it attracts more people
who want to be a part of the community. In

277
00:25:19,200 --> 00:25:19,830
general,

278
00:25:21,060 --> 00:25:29,520
I want to segue on what you were saying there,
Sherry, about communication, and how different

279
00:25:29,520 --> 00:25:39,390
groups can react to each other. And it's because
a couple of things here. It was only in the last

280
00:25:39,420 --> 00:25:46,740
day or two that I learned there was a discord
channel related to people who were doing some of

281
00:25:46,740 --> 00:25:56,910
the organizing work panelists and such for this
conference, and I was like, Okay, well, can

282
00:25:56,910 --> 00:26:03,930
someone send me a link? So they sent me a link,
and then I'm in this discord channel now. And so

283
00:26:03,930 --> 00:26:11,670
that's fine. And generally, there are all these
Telegram channels that Wikimedians have used

284
00:26:11,700 --> 00:26:18,840
initially for conferences. But now there's just
such a plethora, it is for me impossible to keep

285
00:26:18,840 --> 00:26:27,510
up with all of them. So I don't, although I was
in a conversation with Andrew, in a Google Chat

286
00:26:27,690 --> 00:26:34,410
regarding something to do with AI. And he said,
Oh, it's in that AI chat group. And I'm like,

287
00:26:34,920 --> 00:26:41,910
okay, like, which one? And then he gives me a
link. It's an AI chat group in Telegram. Great.

288
00:26:41,910 --> 00:26:46,260
Now I'm in that one, too. So now I can follow
that conversation. And it makes me think like,

289
00:26:46,440 --> 00:26:52,230
Okay, are you on signal? Because there's stuff
happening there? Are you on Discord? There's

290
00:26:52,290 --> 00:26:57,720
stuff happening there? Are you on telegram?
Where are you? What are you hearing? What are

291
00:26:57,720 --> 00:27:04,500
you reading? And how about all the people who
aren't on any of those? Or are on some of those

292
00:27:04,530 --> 00:27:10,230
or just none of those, or a few of those? Like,
how are we communicating? And who's outside the

293
00:27:10,230 --> 00:27:17,400
loop? And isn't part of the conversation? And
are they left behind? Or are we just chatting

294
00:27:17,400 --> 00:27:23,370
because we want fun, which is part of it. And we
want to be connected? Are we chatting so that we

295
00:27:23,370 --> 00:27:31,920
don't have two groups that are at odds with each
other? I have no answers. But I do think about

296
00:27:31,920 --> 00:27:37,830
how we communicate and where we communicate and
how much we communicate. And the overload at

297
00:27:37,830 --> 00:27:44,400
least that I sometimes feel from communications.
So yeah, and

298
00:27:44,400 --> 00:27:48,720
you touched on a pet peeve of Richard's, and
he's interested in trying to create a kind of

299
00:27:48,720 --> 00:27:53,790
communication suite that we can rely on, not
just a smattering of WhatsApp Signal, Telegram

300
00:27:53,790 --> 00:27:55,080
and all these other things. Right, Richard?

301
00:27:55,290 --> 00:27:59,610
Yeah, I think, you know, I think that the Wikimedia
movement is, you know, the main

302
00:27:59,610 --> 00:28:04,260
nonprofit technology project in the world. I
mean, it started as an encyclopedia thing, but I

303
00:28:04,260 --> 00:28:08,970
think there's a lot of important nonprofit
technology needs in the world. And, you know,

304
00:28:09,060 --> 00:28:13,560
the, there's gonna be, you know, like, the
Wikipedia started, you know, it was around the

305
00:28:13,560 --> 00:28:17,580
same time as MySpace, and, you know, now
TikTok is the latest, there's probably something

306
00:28:17,580 --> 00:28:22,380
that's hotter than TikTok, Snapchat
might be a little, but there's gonna

307
00:28:22,380 --> 00:28:25,200
be something, there's something else and they're
all gonna be like commercial, and they're all

308
00:28:25,200 --> 00:28:29,250
gonna have weird motives, and they're all gonna
like, and now they're gonna have more AI stuff

309
00:28:29,250 --> 00:28:34,140
like, you know, trying to appeal to your most
extreme nature on any given topic. And I think

310
00:28:34,140 --> 00:28:39,690
that there is maybe a role for nonprofit
technology in the social media space. And I kind

311
00:28:39,690 --> 00:28:44,100
of think that Wikimedia could be a part of that.
And it's a little bit of a bold thing to do. But

312
00:28:44,250 --> 00:28:50,010
I think that, you know, the world is
demanding it, having some

313
00:28:50,010 --> 00:28:54,120
reasonable way to talk to each other. That's not
mediated by, you know, tech companies or

314
00:28:54,120 --> 00:28:57,720
governments. And maybe we can be a
part of that, probably just for our own

315
00:28:57,720 --> 00:29:01,500
intercultural purposes to start with, but I
think that, you know, there is some need for

316
00:29:01,500 --> 00:29:06,000
that, and maybe some of our culture could
provide some of that. There are all these weird

317
00:29:06,330 --> 00:29:10,530
technology corporations looking at,
like, our moderating systems and

318
00:29:10,530 --> 00:29:14,790
trying to copy it and doing weird things with
it. But, you know, our community does have

319
00:29:14,790 --> 00:29:18,750
something to offer, and I hope we can be
independent of the tech stuff, of the

320
00:29:18,990 --> 00:29:20,430
technology companies, yeah,

321
00:29:20,520 --> 00:29:24,600
That's a good point. Final thing we're gonna do,
because we're the last thing standing between

322
00:29:24,600 --> 00:29:30,780
you and I think a pretty unusual cake. That is
Wikipedia themed. So last thing we're going to

323
00:29:30,780 --> 00:29:33,990
do, and I'll give them some time to think about
this, while we march through other people's

324
00:29:33,990 --> 00:29:40,020
answers, in 20 years, Wikipedia will be dot, dot
dot, and it's interesting. A lot of our answers

325
00:29:40,020 --> 00:29:46,080
are based around how human it's going to be, or
what roles humans will have. But Annie, what is

326
00:29:46,080 --> 00:29:47,190
your response to this?

327
00:29:48,990 --> 00:29:56,820
I only did one word. Yeah, I just wrote human. I
mean, that's the only reason I like Wikipedia.

328
00:29:56,850 --> 00:30:01,050
Every reason that I like Wikipedia has to do
with it being made by real really great people.

329
00:30:01,560 --> 00:30:07,110
One of the first user pages that I stumbled upon
that made me feel really empowered, was sammies,

330
00:30:07,110 --> 00:30:10,680
because it just has this very simple line. I
don't know if she even really thought very much

331
00:30:10,680 --> 00:30:15,810
when she wrote it. She just says, Welcome to
Wikipedia. This site was made by real people

332
00:30:15,810 --> 00:30:22,890
just like you, smiley face. And I thought, just
like me. And yeah, I started editing

333
00:30:22,920 --> 00:30:29,460
a lot more after seeing that, um, AI is great
for a lot of things. But if Wikipedia is the

334
00:30:29,460 --> 00:30:34,200
only search result on Google, that's written by
humans, which in a lot of cases that already is,

335
00:30:34,740 --> 00:30:36,720
that makes it even more valuable.

336
00:30:38,010 --> 00:30:44,280
Good viewpoint, and I liked that, just like you. And I
took the contrarian viewpoint, and as I said,

337
00:30:44,280 --> 00:30:49,260
it's going to be as important as ever, but less
human-created than ever. I know, tomatoes are gonna

338
00:30:49,260 --> 00:30:53,730
be thrown at me. But I think it's inevitable.
But we got to make sure we turn that into a good

339
00:30:53,730 --> 00:30:58,050
thing. Not a bad thing, right? We're not just a
victim of being used for AI. Right? We're in the

340
00:30:58,050 --> 00:31:03,870
conversation, we're actively making things
better, right? Rosie, what did you say here?

341
00:31:03,870 --> 00:31:04,860
The same and different?

342
00:31:07,860 --> 00:31:12,420
Yeah, I mean, that speaks for itself. It'll
be different, it'll be different in some ways,

343
00:31:12,420 --> 00:31:16,350
and some things will probably never change.
Richard,

344
00:31:16,380 --> 00:31:17,460
how about you? Well,

345
00:31:17,460 --> 00:31:21,300
I guess, around. I think we'll be around. I don't
think that, you know, these

346
00:31:21,300 --> 00:31:25,020
other platforms are going to be around in 20
years, just, you know, the nature of

347
00:31:25,020 --> 00:31:29,640
commercial things. And I think we're gonna be
more Wikipedia than ever, I think we got to be,

348
00:31:29,640 --> 00:31:33,960
you know, playful and experimental, and engage,
you know, we've always had this vision of

349
00:31:33,960 --> 00:31:38,580
engaging a larger percentage of the population.
And I hope we can actually do that in the next

350
00:31:38,580 --> 00:31:43,470
20 years. Because, you know, Wikipedia is,
you know, read by a certain amount of people.

351
00:31:43,470 --> 00:31:47,670
And it's not even like, you know, not enough
people are able to consume it, but not enough

352
00:31:47,670 --> 00:31:51,180
people are able to contribute to it, even more
importantly, and we suffer in terms of our

353
00:31:51,180 --> 00:31:57,360
knowledge base there. And if we can really
engage communities and individuals, and maybe,

354
00:31:57,360 --> 00:32:01,170
you know, things like journalists on like a more
meaningful level now that the journalists have

355
00:32:01,170 --> 00:32:06,390
been abandoned by the technology companies,
maybe they can find some new allies, and in a

356
00:32:06,390 --> 00:32:10,710
sort of a more nonprofit space. And I hope we
can become even more Wikipedia than yesterday,

357
00:32:10,860 --> 00:32:12,510
tomorrow. Next, how

358
00:32:12,540 --> 00:32:13,230
about you, Noam?

359
00:32:16,950 --> 00:32:21,750
You know, I guess it will certainly, I
think it'll be human. And I guess the thing,

360
00:32:22,080 --> 00:32:27,360
another thought, another strain I mentioned
earlier, but it relates to this, is the idea of

361
00:32:27,390 --> 00:32:32,010
primary sources. Another one here, rules about
like secondary sources. And it just, you know,

362
00:32:32,010 --> 00:32:37,680
occurred to me that primary sources are
gonna be very important. You know, I recently

363
00:32:37,680 --> 00:32:42,630
learned about a French project within
Wikimedia called Lingua Libre,

364
00:32:42,630 --> 00:32:47,700
which was designed to allow people, you
know, it sort of has the

365
00:32:47,700 --> 00:32:52,740
French government supporting it, because it's
designed to preserve obscure dialects, and it

366
00:32:52,740 --> 00:32:58,950
makes it very, very easy to record a word or
phrase and upload it to the Commons. Like, for me,

367
00:32:58,950 --> 00:33:03,180
it's so easy that I could do it, I could record
some New York-isms that I wanted to put up

368
00:33:03,180 --> 00:33:07,500
there. And I just feel like it was an incredible
tool because it was so easy to use. And it

369
00:33:07,500 --> 00:33:13,350
preserves people's speech. And it makes it
available, you know, under a permissive copyright license,

370
00:33:13,350 --> 00:33:19,440
it just was a brilliant thing. I can just
imagine a million uses for it, of other places,

371
00:33:19,470 --> 00:33:24,930
of cultures and communities that are not easily
reached, to the point that you wouldn't even need

372
00:33:24,930 --> 00:33:29,640
like a researcher to go there. You just need
Internet access, and that's about it, and a

373
00:33:29,640 --> 00:33:34,140
computer, I guess, of some sort, or phone, but
then you can really have your story your

374
00:33:34,140 --> 00:33:39,660
pronunciation, uploaded, and I just feel like
Wikipedia, I think in the future, will have to

375
00:33:39,660 --> 00:33:45,960
be more amenable to video and audio. And I think
it will have to sort of, if it really wants to

376
00:33:45,960 --> 00:33:51,180
be more inclusive, we'll have to, again, bend on
this idea that only Secondary sources are what

377
00:33:51,840 --> 00:33:56,610
you can use to write about things. Oral
histories... there are just things that

378
00:33:56,610 --> 00:34:03,180
are worth preserving that are whatever you want
to call knowledge that needs to be preserved and

379
00:34:03,180 --> 00:34:07,440
kept, and belong on Wikipedia. And there, it will
need to change the rules. And there needs to be

380
00:34:07,440 --> 00:34:11,520
some innovative technology, which I thought this
Lingua Libre was really very brilliant

381
00:34:11,520 --> 00:34:15,960
technology to me, and very simple. So I
could just see a Wikipedia that would be much

382
00:34:15,960 --> 00:34:26,460
more visual, more audio, and less based
on secondary sources. It might... it's a hope, I

383
00:34:26,460 --> 00:34:27,360
guess. Right. Right.

384
00:34:27,810 --> 00:34:34,380
All right. Thank you so much for the panel for
your responses. Thank you. And then thank you

385
00:34:34,380 --> 00:34:39,120
for your responses here. Look forward to reading
them. I'm gonna hand it over to Richard and Mark

386
00:34:39,120 --> 00:34:42,810
for the last instructions. We're going to go out
and try to take a group photo before the light

387
00:34:42,810 --> 00:34:45,150
goes out. And then the cake. Right, go ahead.

388
00:34:45,720 --> 00:34:51,300
So yeah, before we take our group photo and get
to the fabulous cake, the cake is a

389
00:34:51,300 --> 00:34:57,120
combination of AI and human design, which is
what we're aiming for. So hopefully we're not

390
00:34:57,150 --> 00:35:02,370
too robotic and, you know, we're building
things. So I wanted to welcome everybody to

391
00:35:02,400 --> 00:35:08,130
Wikimedia New York City, we're your
friendly local wiki neighborhood group, and

392
00:35:08,580 --> 00:35:14,400
you're welcome to join our events, we
nyc.org, check out our events. We have two events

393
00:35:14,400 --> 00:35:18,780
coming up next month. On February 8, we have a
hacking night for those who are technically

394
00:35:18,780 --> 00:35:23,790
inclined. So you can go hack and work on things
and play with things and learn things. And

395
00:35:23,790 --> 00:35:28,170
maybe you can just hack a Wikipedia article.
That's a valid way to hack and February 21, we

396
00:35:28,170 --> 00:35:34,410
have our Wiki Wednesday salon, which is sort of
a miniature of this conference. Our

397
00:35:34,410 --> 00:35:39,300
events are often at 424 West 54th Street, a
place called Prime Produce, it's sort

398
00:35:39,300 --> 00:35:44,100
of an art collective, and other stuff, and
you're very welcome to join our mailing list and

399
00:35:44,100 --> 00:35:49,290
join our Discord server. And I hope you can join
and participate and if you want ideas of events

400
00:35:49,290 --> 00:35:53,880
and things you want to contribute to Wikipedia
and other Wikimedia projects. We're here where

401
00:35:53,910 --> 00:35:56,940
you know, your local volunteers, we're very
interested in engaging with all the New

402
00:35:56,940 --> 00:36:03,780
York communities and stuff beyond, so thanks.
Well, the next local hackathon is February 8,

403
00:36:04,050 --> 00:36:10,740
and the next Wiki Wednesday
salon, which is more of a social slash non-tech

404
00:36:10,740 --> 00:36:14,670
event is February 21. And they're both on West
54th street and I hope you can join thanks

405
00:36:20,250 --> 00:36:20,550
oh,