Digital Parish: Establishing safe spaces online

Is it safe to do ministry digitally? Many people can foresee some scenario in which our digital ministries enable someone with nefarious motivations to do harm to a member of the community.

Churches and youth organizations have long utilized guidelines to help protect both vulnerable populations of people and the caring people who work with them. But how do we translate those guidelines for digital ministry?

In this session of Pastoring in the Digital Parish, our adjunct professor is an old friend, Pastor Nathan Webb. Nathan shares the procedures his all-digital ministry, Checkpoint Church, is putting in place. Through their system, you should be able to discern some necessary procedures you can put into place to keep your online ministry a safe space for all people.

The Episode

Listen on Apple Podcasts, Google Podcasts, Spotify, or Amazon.

Show Notes 

This is the SafePoint policy from Checkpoint Church.

Nathan Webb also suggested letting some AI bots do the work as a first wave of content control. Checkpoint Church uses MEE6 to monitor their Discord server.
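For readers curious what this kind of first-wave automation actually does, here is a minimal sketch of a keyword-flagging moderation bot. It is not MEE6 itself (MEE6 is configured through its own web dashboard rather than code); it uses the discord.py library, and the word list, "guardian-log" channel name, and token variable are placeholder assumptions, not anything Checkpoint Church actually uses.

```python
# A minimal sketch (not MEE6) of first-wave keyword moderation on Discord,
# written with the discord.py library. The word list, log channel name, and
# token variable are placeholders.

import os
import discord

FLAGGED_WORDS = {"example-banned-word", "example-scam-domain.com"}  # placeholder list
MOD_LOG_CHANNEL = "guardian-log"  # placeholder channel for human moderators

intents = discord.Intents.default()
intents.message_content = True  # needed to read message text (discord.py 2.x)
client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message):
    # Ignore bots (including this one) and direct messages
    if message.author.bot or message.guild is None:
        return

    if any(word in message.content.lower() for word in FLAGGED_WORDS):
        await message.delete()  # first-wave removal, like an auto-moderation rule
        # Hand the nuanced follow-up to the human "guardians"
        log_channel = discord.utils.get(message.guild.text_channels, name=MOD_LOG_CHANNEL)
        if log_channel is not None:
            await log_channel.send(
                f"Removed a flagged message from {message.author.mention} "
                f"in #{message.channel.name}; please follow up."
            )

client.run(os.environ["DISCORD_BOT_TOKEN"])  # keep the bot token out of the code
```

In practice, MEE6 and similar bots expose this kind of banned-word and link filtering through a dashboard, so most communities configure it with checkboxes rather than code; the sketch only illustrates what the bot is doing on your behalf before a human moderator steps in.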

The United Methodist Church produces resources for community safety called Safe Sanctuaries. They have produced a workbook for digital safety called Safe Sanctuaries in a Virtual World.

A couple of recommended episodes relate to this one:
Our season one session with Nathan Webb on Discord
Our session with Dana Malstaff on Facebook groups

Ryan Dunn (00:00):

This is Pastoring in the Digital Parish, your resource for community and insights for ministry in the digital realm. I'm Ryan Dunn, the host of this podcast and a fellow practitioner of digital ministry.

(00:14):

Is it safe to do digital ministry? I ask because, no doubt, a lot of people have foreseen some possible scenario where our digital ministries enable someone with nefarious motivations to do harm to a member of the community. A blessing of digital space is that it's so wide open and accessible, right? But that can also be a detraction, so much so that it sometimes keeps us from endeavoring to minister in digital space. Churches and youth organizations have long utilized guidelines to help protect both vulnerable populations of people and the caring people who work with them. But how do we translate those guidelines for digital ministry?

(01:03):

In this session of Pastoring in the Digital Parish, our adjunct professor is an old friend of the podcast, Pastor Nathan Webb. Nathan shares with us the procedures his all-digital ministry, called Checkpoint Church, is putting into place. By sharing their system, you should be able to discern some necessary procedures that you can put into place in order to keep your online ministry a safe space for all people.

Ryan Dunn (01:34):

Nathan is a self-proclaimed major nerd in just about every way--his words, not mine. He loves video games, anime, cartoons, comic books, tech, and his fellow nerds. And that nerdly love spurred Nathan to found Checkpoint Church, which is an all-digital community for nerds, geeks, and gamers. He lives in North Carolina, kind of the Charlotte area, correct?

Nathan Webb (01:57):

Yes, that's right.

Ryan Dunn (01:58):

And he's ordained in the Western North Carolina Conference of the United Methodist Church. And he's soon to greet his second child, which I guess could happen at any moment. So in case we have to cut this short, we understand, right?

Nathan Webb (02:11):

That's right. I'll let you know if the hospital calls; we'll be on the way <laugh>.

Ryan Dunn (02:15):

Well, Nathan, you're our first repeat adjunct professor on Pastoring in the Digital Parish. Back in season one, we talked about Discord from the perspective of what you're doing in Discord with Checkpoint Church. So for the listener who missed that episode, shame on you. <Laugh> Just kidding, but seriously, go back and listen to it. Describe for us what Checkpoint Church is.

Nathan Webb (02:39):

Sure. So Checkpoint Church is an online church. We're online-first; we have maybe a goal in the future to figure out what it means to be kind of a both/and congregation, but we are very much online. And we started out of a vision, like you said, to reach nerds, geeks and gamers. And we discovered that the best place to reach people was on Twitch, which is an online streaming platform kind of like Facebook Live or YouTube Live, but Twitch tends to lean towards the nerds, the geeks and the gamers. So we were like, well, that's where the people are; that's where we need to be. And then we discovered that we needed a place to send them. And so that turned out to be Discord, which we kind of see as our digital church building. And I could talk about Discord until I'm blue in the face, so do be sure to check out that podcast, or any of the podcasts or instructional content about Discord, wherever it may be. I like to talk about Discord a lot. And then we've been most recently working on kind of figuring out leadership within our ranks, and just continuing to grow and discover what it means to be a part of an online church and lead in an online church.

Ryan Dunn (03:38):

Yeah. Well, real quick, like, what does a community gathering look like for Checkpoint Church? Like, so we're gonna talk about how we have fair practices in our relationships <laugh> and, you know, not imbalanced power and all that kind of stuff. Mm-Hmm <affirmative>. And so with that in mind, like, how are people relating to one another in Checkpoint Church?

Nathan Webb (04:02):

It's so weird, you know. So I served as a student pastor before planting Checkpoint, and I served for four years, and it was like, man, it's a struggle to get people to hold on for that whole hour without rushing off to the Golden Corral after worship on Sunday mornings. And so I was so used to that model of, like, an hour on Sunday, an hour on Wednesday for Bible study, and that's your gathering, and your really devoted people might come for some meetings and things like that. Maybe have a potluck once a quarter; that was, like, my custom. And then once Checkpoint got started, I discovered that that was nowhere near enough time. And so we gather for intentional synchronous gathering time nine hours a week; we stream three hours a day for three days.

Nathan Webb (04:44):

And so sometimes we're playing Pokemon. Sometimes we're playing games together. Most recently we started a campaign of Stardew Valley, which is a farming game; we have four people farming together on Wednesday nights. So those are the gatherings that we have, for sure. But that's kind of our synchronous gathering. Our asynchronous stuff is pretty much 24/7, around the clock on Discord. Yeah. So I'm not on Discord 24/7, because I am <laugh> not that devoted. I couldn't possibly live that monastic life, but it is almost monastic, the gathering that we have of just kind of constantly being in community with each other on Discord, people constantly sharing thoughts. And we respond to a question of the day every Monday through Friday; people just kind of respond on their own time. Whenever new news comes out about anime or video games, people post that. Whenever people have prayer requests, they post that. We have community creators.

Nathan Webb (05:33):

So, people in our community that'll create art or host streams of their own, and they post them on the Discord, and people from our Discord go and support them. So there's things happening 24/7, but our intentional space is that nine hours a week. And then every Sunday, and this is kind of like a blend between synchronous and asynchronous, we post a 10 to 15 minute nerdy sermon. So we're working on what it means to watch those together and figure out what it means to maybe even do a Bible study on those, but that's still coming in the months ahead. But this is just the basic gist of what goes on in the week. Yeah. And there's all sorts of things that come up along the way sporadically with the seasons.

Ryan Dunn (06:09):

Yeah. You touched on the big thing that gets me super excited about digital ministry. When I was in youth ministry, I used to bemoan that I would get like an hour and a half on a good week to interact with students. Right? But here, you're talking about the possibility of not just nine hours a week of, you know, sort of face-to-face or in-person or synchronous relation, but the access that you're granted 24/7 for people to be able to interact. That is a huge opportunity. But, you know, that is a lot of space for things to happen as well. Right? So we have to be, I guess, on guard for some of the ways that that time might be taken advantage of by people with less than noble impulses or motivations. Right. So let's talk about the why of what you've done with implementing safe practices and protocols in your digital space. Like, why should we consider having safety guidelines in digital ministry spaces?

Nathan Webb (07:18):

So I think there's a couple reasons for our community in particular why this was just necessary. We grew to a size where the critical mass was just like, okay, it's time. Like, we've got to tackle this question, because we're not just a group of like 10 to 20 people maybe gathering every now and again; now we have 250 people in our Discord, and that's just not manageable for one person to constantly be overlooking and figuring out what it looks like. So we knew that we needed to create something so that I'm not the only person trying to maintain a 24/7... not watch, watch sounds too harsh, but just a 24/7 availability. If something does go wrong and we need to be notified, I can't be up at three o'clock in the morning. And so there's just all these different challenges along the way. So we knew that we needed...

Ryan Dunn (08:05):

Well, maybe one way to think of that is that that's too big of a community for you to be the sole person of accountability. Right?

Nathan Webb (08:13):

Exactly. And so we

Ryan Dunn (08:15):

Responsibility,

Nathan Webb (08:16):

We needed to allow some people to serve in that way. And so we knew that we wanted to step into that role, but then the question became, well, how do we educate? How do we make sure that we're all on the same page? And then how do we make sure that we're creating this space in a truly safe way, beyond just my own discernment? That's not enough discernment to know exactly what needs to go on. So we decided in that moment that we needed to do this for the reasons that every other church ever should at least have some conversation about safety: because, you know, in the Methodist church alone, we say we're gonna do good, we're gonna do no harm. We're gonna do those things. And so those are our goals as well in our community. Those are the only rules that we actually have: do good, do no harm, and strive to grow.

Nathan Webb (08:55):

And so doing safety is doing good. Practicing safety is actively avoiding harm. And so we wanted to create these spaces because of those pursuits in our community, because there was just the need to, like you said, diversify that accountability and to allow for responsibility to be shared amongst this community of believers. And then also, we wanted to intentionally create a holy space, and the way I've always defined the term holy is to be set apart. The internet is huge; if people want to go find a community where they can just kind of do whatever they want and have no control or no safety guidelines or no healthy practices, they can. So what makes our space different is that we are intentionally creating that set-apart space for people to exist and be safe in.

Ryan Dunn (09:45):

Hmm. You know, I've heard from people who are cultivating online communities that actually setting guidelines and safety practices around their communities makes them more attractive and makes it more likely that people will engage, as opposed to shutting down interaction. We want to know in the back of our minds that the places where we are are safe, and that definitely includes the digital realm as well. So what was the process that you utilized for developing these practices?

Nathan Webb (10:29):

Yeah, I leaned on a couple different aspects. I leaned first and foremost on our precedent within the denomination that I serve in, the United Methodist Church. So I used our Safe Sanctuaries policies. I read through all of the different ones that are available. I made adjustments, I took language, I tried to process as much as possible with what's already been done. I leaned on some of my peers and predecessors in the digital space and saw what they had kind of come up with in bits and pieces and chunks across the internet. Okay. And then after I got done leaning on that, I leaned on Discord. So Discord itself is dealing with this, right? They are a community in and of themselves; they're a community app. And so they wanna also practice safety.

Nathan Webb (11:08):

Now, theirs might not be faith-based. It might not be based around even vulnerable populations, but it is something that they're considering. And so they actually have a whole moderation program available online, where it's just a series of courses that you can take. It doesn't cost anything; you can just take that course as a member of a Discord and look into what it means to practice good moderation. And that's the word that keeps coming up. So, you know, I'm used to Safe Sanctuaries, just given my upbringing. Yeah. But the reality of online safe space is that moderation is the key word that you're gonna keep finding and looking for: taking members within a community and giving them greater responsibility to observe and maintain that safe space. And so once I figured that out, okay, I've leaned on the United Methodist Church, I've leaned on Discord as a platform.

Nathan Webb (12:04):

Then I leaned on the community. And so I went to the members, the leaders of Checkpoint Church, and I said, hey, what are you seeing? What do you think that we need to do? And so we sat down over a series of a couple meetings. We worked through the content together. I told them my thoughts, I listened to their thoughts. And in the end we wound up coming up with this policy that we were all proud of, that we voted on and decided this was what we want. And we all agreed that, like any good Safe Sanctuaries policy, we're gonna review it annually and see what it looks like. And once we came up with that, I presented it to the conference and, very importantly, to some of the conference's lawyers; we wanted to make sure that they got a good look at it. And after all those boxes were checked, we're pretty happy with what we've come up with. And so we're starting to work with it and see where the cracks are. We've found a couple already where things don't work how they should, or where maybe we need to make adjustments, and we've made those as needed. And we'll continue to discern what is the best implementation of this policy.

Ryan Dunn (13:04):

Do you mind sharing what one or two of those cracks, or not-yet-refined policies, have been?

Nathan Webb (13:12):

Yeah, I think it comes down to: you can't know what you don't know. Yeah. And so a lot of the lived moments where you're like... and I think we're gonna come to this question soon, but the challenge, right, the challenge of online ministry is that there is so much nuance <laugh>, and the internet is such a huge space. And so a lot of the, like, red flags for us... we don't get a lot of red flags. We get a lot of yellow flags. We get a lot of, like, burnt orange flags. <Laugh> There aren't many of them that are just immediate red flags. And so we have to come together and say, like, hey, what do you think about this? Is this wrong? Is this, you know, something that's not gonna go well? I can give you one example. It would be the thing that happened at the Oscars with Will Smith, right?

Nathan Webb (13:59):

Mm. Can we post memes about that? Yea or nay? I don't know. That's a touchy thing, right? That's a pretty divisive subject, but on the other hand, where could you not find it on the internet? Who knows? So that's something that came up in our conversation: okay, so let's say something like this happens again. How do we wanna address it? Do we want to address it? What does it look like for us to manage this, and what does it look like for us to be micromanaging this? What should we nip in the bud, and what should we allow for us to process through humor, if that's a way that we're coping with something that is a national conversation topic? What do we allow? What do we not allow? Where do we draw that line? And that's an example of something that's not in the policy <laugh>, and that's why it is so necessary for us to have these guardians, which is what we call our moderators. It's so necessary for us to have these people that are serving in this capacity to help us discern that nuance and to work through these conversations together. And they've done an exceptional job there, but it's just the things that you can't write in the policy, yeah, that end up being more confusing than anything else.

Ryan Dunn (15:07):

Yeah. Well, and memes represent such a prime example of that, because they're so interpretive, right? I mean, a meme can mean one thing to one person and something completely different to somebody else. So when something like that gets posted that could possibly be offensive or hurtful to somebody else, like, what's the protocol for working through that? What do you do with that?

Nathan Webb (15:36):

Yeah. If something is inflammatory, or something that we find to be maybe just a tick too far, but not so obviously far that we need to have it removed, typically the best measure for us is to direct message the poster and to say: totally appreciate you taking advantage of our platform, using this and sharing this creation, right? Especially with user-generated content, say a meme that somebody made on their own. And we say, we so appreciate you sharing this; here are some concerns that we might have about this image. And so Discord has a feature called spoilers, where you can essentially make an image blurred until you click to reveal what's underneath the image. Twitter has a similar thing. I don't know if Facebook does, but they all kind of have this idea of an image that you have to consent to view.

Nathan Webb (16:24):

And so we advise people that are kind of toeing that line to spoiler their post, and then put a trigger warning that's not spoilered at the beginning. So you can say, hey, if something makes you uncomfortable, this is what's in this image, so don't click this image if this makes you uncomfortable. And that's kind of our current best practice. I don't know if that's where we'll land at the end of the day. Maybe we'll have things where, when something happens similar to an Oscars event, I'll make an announcement as the pastor of the church and say, hey, we're gonna call an audible here. We're gonna say, no, we're not gonna post anything of this nature. If you want to go look at Twitter and see the things that are being posted there, that's totally cool. You can do what you want on your own time, but in this space we're gonna practice doing no harm for now. And so we're kind of taking it by circumstance and allowing for each individual incident to be treated with its own kind of sanctity, I guess.

Ryan Dunn (17:23):

Yeah. Well, and I appreciate that approach, because that in and of itself becomes a teachable moment for how we interact with one another, especially in the ways in which we might unintentionally do harm to someone. Yeah.

Nathan Webb (17:35):

I think a real difficulty is that we see our Discord as our church building. And so which conversations that happen in a church building do you have control over? There are some that you do, right? There are some that you really do. You can control who speaks in the sanctuary. You know, if you're preaching, then odds are people can't be talking about whatever they want while you're doing that <laugh>. But if you're in the fellowship hall and you step away for a moment and somebody's at their table, who's to say that they're not talking about something that you would deem inappropriate? Mm-hmm <affirmative>. It's really hard to control, and to know what should we control and what is just normal human conversation. And that's really tough nuance. Yeah.

Ryan Dunn (18:17):

Well, let's dive into kind of the uniqueness of ministry in digital space for a moment. So what are some of the procedures or policies that you had to bring up that are unique to doing ministry in the digital space?

Nathan Webb (18:31):

Yeah, I think our overlap was the two-person rule, which is pretty well known in the church setting: you wanna make sure there are two people, or at least two leaders, present in a space whenever you have anybody under the age of 18. So we kind of just honor that as being: you're gonna have two guardians present in a digital room. So if there's voice or video, then we ask that there are two guardians present. And that's just kind of a way that we try and make this work. And so I see that as kind of an overlap. What's really specific is direct messaging. We have the ability to see messages that are posted in our Discord server, but as soon as you go to a direct messaging app, we have no control over that conversation.

Nathan Webb (19:19):

We have no control over where that goes. We have no foresight; we cannot do anything about that. And so that's really tricky: the internet has so much space that, again, it comes back to, we cannot know what we do not know. And so what we typically try to do is nip it in the bud, or see in advance what could happen and advise with a series of kind of best practices. So we have a whole page in the SafePoint policy that is just a best practices page. It includes the two-person rule. It says <affirmative>, for best practice, just don't direct message. There shouldn't really be a need for a direct message to exist. It doesn't mean that if we find out that you're direct messaging, we're gonna kick you out of the group.

Nathan Webb (20:02):

Like, no, that's not what we're gonna do, but we want to acknowledge with foresight that you don't know some of these people; you cannot know some of these people. I have, you know, some of my most devoted leaders in this congregation that I have never seen in person. We have never shaken hands. And so it is such a weird scenario to facilitate any kind of communication between two people that have literally never seen each other's faces, never heard each other's voices, who do not know one another or know one another live. And there are strange things that we have to work through there. And so our best foresight is just to say: just don't direct message. Everything that's done should be done in this space. We don't mind if there's boring stuff written, as you're talking about your side tangent; I don't care.

Nathan Webb (20:49):

I would rather you have a side tangent than start an uncomfortable conversation that we know nothing about and that you maybe don't want to get yourself into. And then our final recommendation is: we have these guardians. And so we recommend, hey, if you've got a question, if you've got a concern, if something comes up, we have these people. They've been background checked. They've been, you know, interviewed. They've gone through a process. These are the people that you can direct message. If there's something uncomfortable, or something you truly cannot talk about on the Discord server out in front of everybody, then you can direct message these people. So we provide a safe space within our safe space for the more confidential conversations, beyond just me.

Ryan Dunn (21:30):

Okay. How do you identify who those, I guess, authorized people are?

Nathan Webb (21:36):

Yeah, so it's self-nominated, but they do go through a process. So we do go through a background check provider, and we give them a background check. And then we're also working on... so our first group obviously didn't have the precedent of the group, but what we're working on now is training this initial group and allowing them to grow in this capacity so that they can discern the future class, if you will. So I don't know exactly what it's gonna look like. I see guardians kind of like trustees in the traditional church context, where they're just kind of in this role where they're serving, and they're literally building this place that we have, this digital space, and trying to discern what needs to go where and when. But they basically go through just kind of a one-on-one interview process where I'll sit down with them, talk with them, get to know them.

Nathan Webb (22:25):

I've seen them; they have a recommendation from somebody within the Discord. They've spent a certain length of time on level two, which is our leadership, or our, like, first level of membership. So they've been around for a long enough time that we know them. And then we reserve the right to say no if they don't line up with something, or if we don't know them well enough, or we wanna get to know them better before we say yes to this; we reserve that right. And eventually the goal for this program is for us to have this first class trained, to kind of elect a leader within this group to be our guardian coordinator alongside myself as a guardian coordinator. And then somebody outside of the guardians will be brought through that interview process of discerning new entrants and talking about, hey, here we have a new person, this is who recommended them, let's talk about this. So that's the basic gist of what we're doing and how we find these people. But, okay, with such a new church plant, we really have to trust people just to self-nominate, and then for everything to hopefully come out as best as it can.

Ryan Dunn (23:28):

Okay. So: avoiding direct messages, empowering some people to be, I guess, authorized and safe connections within the online community. For our youth pastor who's listening, who is, you know, taking their youth group onto Discord for some meetings, what might be another policy that they want to implement?

Nathan Webb (23:56):

My best advice is to do as much research as you possibly can into automated systems. So there is a whole site, something dot gg, maybe it's discord.gg, but I know the actual site is top.gg. There are Discord bots that you can download there, and I would advise that you look into it. In particular, we have a bot called MEE6, M-E-E-6, and that is our moderation device. And so that allows us to set up automated control parameters. Sometimes it can be a little bit annoying, because some of the things that it can do are maybe a little bit much, but that would be my advice: to learn the different bots that are available.

Nathan Webb (24:47):

And just go with the max settings, you know, not necessarily on everything; discern what's best and what's not best. I don't care if people spam emoji or random things like that. But if people are sharing links, then I'd like to know what that link is, and I'd like to be able to say, here are some sites that you can't share links from. That's an important parameter to set up, and that's not something that me as the pastor, or one of my guardians, should have to waste their time with. That should be something that a bot can do automatically. So trust in that technology a little bit in the things that it's good at, and let it do the menial labor while your guardians take the more nuanced approach.

Nathan Webb (25:25):

So that's how we've kind of approached it, but I think that's a good practice: to learn as much as you can about the platform that you're on. It's part of the reason I really discourage people from Facebook; I just feel like Facebook has their own policy and they don't really let you implement extra things. Whereas with Discord, there are hundreds of bots that you can plug in. I'm sure other networks probably have bots as well; any of those third-party platforms that are intentional about forming group spaces are the ones doing it light years above the big three.

Ryan Dunn (25:59):

Within that MEE6, are you able to, like, identify keywords that would put a red flag on a post?

Nathan Webb (26:06):

Yep. You can do keywords. You know, it has a profanity limit already, but you can intensify that; you can add words to that. I think the tricky part is that there's so much out there and so many new terms being invented every day. So you've gotta kind of also have your finger on the pulse at the same time and acknowledge, like, okay, here's something new; let's make sure we're regularly updating this list, regularly updating this platform. And when things come up, say, hey, here's something that happened; what can we do in the future to make sure this doesn't happen again? And I'm thankful for the work that's being done by platforms like Discord, because there are a lot of things that come innate in the platform. Okay. They have a system where somebody can't just make an email and sign up for your Discord in five minutes.

Nathan Webb (26:52):

Sometimes people have to verify. Sometimes people have to be on your platform for a time period before they can even post anything. Yeah. You can make it so that people have to agree to certain things before they can post anything. You can lock down different channels. There are a lot of things that you can do. And so the bigger question for me is... I think the concern is always the lack of control. That's the biggest challenge with online platforms, but it's the same way in real life, right? We have the lack of control, the things that we can't control. Once people leave the doors of our youth building, we have no control over them. And so my better question for online platforms is: what can we do? What is made available to us, and how can we best utilize the incredible utility of technology to make sure that we're using all that we can, in all the ways that we can, to provide the safest space that we can?

Ryan Dunn (27:43):

You know, let's jump back to the kind of two-deep leadership thing, where you want two accountable people interacting with somebody who's part of a vulnerable population. You know, anybody who has ever rolled out that practice, whether it be in digital space or in-person space, has heard the objection: you're really limiting my ability to kind of build relationships or to have intimate conversations with people. How have you been responding to that objection in digital space?

Nathan Webb (28:15):

Yeah. It's really tricky. It's really tricky. And it is especially tricky whenever we have gone so long without having this <laugh>

Ryan Dunn (28:23):

Right, right, right. People get accustomed. Yeah.

Nathan Webb (28:25):

We started with... so this is just an example. The reason that we have our two-person policy is because of a fear, a nightmare that popped up in my head. Whenever we first started our Discord... within Discord, you can set up text channels, or you can set up voice and video channels. And so we had voice and video channels; we had six that were open rooms. The idea behind that was that I wanted people that were a part of our Discord to be able to ping somebody in the Discord and say, I wanna play Super Smash Bros.; who wants to play Super Smash Bros. with me right now? And they would say, I wanna play. And so then all these people that are playing Super Smash would go into room number one, and they'd be able to chat with each other while they play Smash.

Nathan Webb (29:06):

That was the idea. It was innocent. It was pure. It was straightforward. I knew what I wanted. It had been on other Discords; it's in other places; it seemed like a good idea. Then I realized anyone can log into these rooms, and at any time of day they can log into these rooms. I have no way of knowing the conversations that go on there. There's no recording feature; I can't set up a way to record any of the conversations. There's no way to monitor what goes on in that room. And so this is the equivalent in the church space of going into a room and closing the door, right? Without a window, without anything in the doorway. I can't see in. I do not know what's happening in that space. And so we had this fear of, like, I don't know what to do about this.

Nathan Webb (29:50):

Thankfully, we were still small enough that it wasn't really being used. And it may have never ended up being used, and it certainly may have never ended up being a problem, but the problem was there, and it was so viable, so possible, that I knew that something needed to be done. And so we slowly but surely whittled away at that. We cut it back from six rooms to three rooms first. And then in those three rooms, I started intentionally creating space. So maybe I set up a time with a potential guardian and I said, hey, would you mind just sitting in here in this room, and we will have lunch and invite people to come have lunch with us during a lunch hour? Or, would you mind just streaming this game with me for a couple hours here on the Discord in particular?

Nathan Webb (30:34):

And then after we did that for a little while, we announced the SafePoint policy, and the same day that we announced the SafePoint policy, I explained: we're closing the rooms, and here's the reason why, and that's what we're doing. And there just wasn't a question about it. I was open to questions that people wanted to ask me, but there was no negotiation in this process. And so kind of having a firm grip here and saying, this is what we're doing, allowed for a certain sense of: okay, this is fine; this is where we're going. And now the trick is encouraging our guardians, who now have the equivalent of the church key, to use these rooms, to open them back up regularly so that people feel engaged. And as a replacement, we opened up a channel called the room request.

Nathan Webb (31:19):

So you can go in this text channel anytime that you want, you can ping a guardian, they're pingable, and you can tell the guardian, hey, I'd really like to play Smash; would you mind hopping on with me? And then we have the opportunity for that kind of relationship to continue to be created in a space that is intentional and safe. And so my hope is that by creating this space, we're not actually limiting connections, because it wasn't being used, right? It wasn't being used in the first place. So to the argument of 'you're really limiting my connection': well, odds are those connections weren't happening. And second off, we are still allowing for those connections; we're actually making them more intentional by creating a space for those things to happen specifically, and we're making them more accessible in a safer way. And so my question was not so much how can we ensure that we're not limiting people, but how can we make these opportunities safer? Mm. So still considering the opportunities as the key element: how can we make the opportunities a safer space?

Ryan Dunn (32:17):

Oh, well, I'll offer this as a parent myself. I have a 14-year-old son, and I've unintentionally been fairly successful at raising kind of the prototypical preacher's kid, because he has <laugh>, you know, very little want of his own to be involved in church. Right. But he loves Checkpoint Church and is fairly well invested in there. And from that perspective, it's crucial for me to know that my son has this kind of guarded space. That's not even the right way to put it, because, you know, 'guarded' tends to make you think there are bars around it and that kind of thing, but a safe space, a non-harmful space, or a space where people are truly looking out for his best interests, you know, as he engages within the online realm. That's huge, you know, and I wanna be able to give him that opportunity to interact in online spaces, because it is such a crucial part of his culture and what he does. And so it's great to know that, you know, there is this intentional space where he is being looked out for and protected, maybe even without his knowledge.

Ryan Dunn (33:37):

That's wonderful. So it's great that you've put this together. Is there a spot where people can fishbowl your safe sanctuary policies, your safety protocols?

Nathan Webb (33:49):

Yes. In fact, I would really encourage people to take advantage of this. So: checkpointchurch.com/safepoint. You can go right now, you can watch the video where I kind of walk through this stuff real quickly with our community, and then you can also download the policy for yourself. So I have the PDF right there on our website. It is available for download. All you gotta do is click that PDF link, and you will get access to it right away. I would encourage you to use that. It is not something that I have put any kind of intense trademark or copyright on. I want people to take advantage of this. And if they have ideas for ways to adapt it even better, or ways to practice even better, I wanna be in conversation with you. Or if you are not sure how to implement this, I wanna be in conversation with you.

Nathan Webb (34:32):

I see my responsibility as a church planter online as not only serving this church plant, but also helping other church plants, to live into the connection of the digital church, the online church, because we are so connected without knowing it. I mean, we really have no choice but to be connected online. So we've got to figure out how to work together and implement these things with one another. And so the reason I created this is because I couldn't find it. And so now it is found. It is here. It is there for you. Please use it. Please take advantage of it, and make sure that your community is as supplied for as possible with this. And I think at the end of the day, our goal at Checkpoint is twofold. We wanna model healthy, safe living online for, you know, people like your son who are in our community, right?

Nathan Webb (35:23):

We want to model what it looks like to be a healthy gamer, a healthy nerd, somebody living in community that doesn't rage quit, that doesn't, you know, absolutely eviscerate people online. There's a time and a place for banter and for game talk, but even more important than that is making sure that we're keeping in the back of our minds: how are we doing good? How are we making sure we don't do harm? This policy may look totally different in five years, but at the end of the day, I hope that along the way we have modeled the best practice we possibly can for providing a safe space for all people online.

Ryan Dunn (35:58):

Yeah. And talking about all people, I do want to add the caveat that it's great for me to know that my son is engaging not just with teens; Checkpoint Church is really a cross-generational experience. And so that he is interacting with so many adults who are caring for him is wonderful. So, you know, I wanted to throw that out there, that Checkpoint Church is definitely a cross-generational experience. It's great.

Nathan Webb (36:25):

Oh yeah, absolutely. We've got people from all over. Yeah. And, you know, whenever I first started the church, I wanted to reach, like, the 29-year-old; that was who my coach made me pick. But we reached so many youth-age folks, and we reached so many pastors <laugh>. Yeah, just nerdy pastors that want to have an opportunity to talk about their video games and to play along with us. So yeah, we really do run the gamut of all ages and all different shapes and sizes and careers and backgrounds. And yeah, it really is a fascinating community to be a part of. And I'm thankful for the wide breadth of people that we do have.

Ryan Dunn (36:59):

Cool. Well, thanks for doing what you do and for offering this example for us. I think it's valuable moving forward, and I hope that we can begin to share some practices in our spaces that people are adopting as they take these policies and adapt 'em to kind of crossover settings as well. Awesome. Nathan, thank you.

Nathan Webb (37:21):

Yes, absolutely. Thanks for having me.

Ryan Dunn (37:26):

We talked a lot about Discord in this episode. If you want a refresher on Discord, then check out our previous session with Nathan, which came out in season one. Another episode of Pastoring in the Digital Parish that might interest you would be our session with Dana Malstaff, where we talked about Facebook groups. And that's a good one, because Dana gave us several guidelines and procedures that her online communities utilize in order to keep their Facebook groups relatively free from doing harm.

I'm Ryan Dunn. I'd like to thank ResourceUMC, the online destination for leaders throughout the United Methodist Church. They make this podcast possible. And of course, they host our website, which is pastoringinthedigitalparish.com, where you can find links to Checkpoint's SafePoint policy and several of the other tools that we mentioned in this particular episode. I'll speak with you again in a new episode next week. In the meantime, peace.


On this episode

Nathan Webb of Checkpoint Church

Rev. Nathan Webb is a self-proclaimed major nerd in just about every way. He loves video games, anime, cartoons, comic books, tech, and his fellow nerds. Spurred by that nerdly love, Nathan founded Checkpoint Church, an all-digital community for nerds, geeks, and gamers. He's ordained in the Western North Carolina Conference of the United Methodist Church and is based in the Charlotte area.

Ryan Dunn, co-host and producer of the Compass Podcast

Our proctor/host is the Rev. Ryan Dunn, a Minister of Online Engagement for United Methodist Communications. Ryan manages the digital brand presence of Rethink Church, co-hosts and produces the Compass Podcast, manages his personal brand, and obsesses over finding ways to offer new expressions of grace.

United Methodist Communications is an agency of The United Methodist Church

©2024 United Methodist Communications. All Rights Reserved