Five Good Ideas for values-driven digital transformation
How can non-profits advance equity, self-determination, and reciprocity through our use of technology? Data and digital technologies can be essential tools to help non-profits and charities meet our missions and multiply our impact. Yet, non-profits have not always been well-served by technology, nor have we seen our core values embedded in how technology is developed, used, and regulated. In this session, Amy and Katie offer tangible advice on how we can better incorporate our missions and values in our engagement with technology.
Five Good Ideas
Never put technology ahead of people.
Create diverse tech committees to support decisions, testing, and feedback.
Only collect data you can protect. And give it back to its owner.
Make your values the foundation for technology adoption and investment.
Make your voice heard in the technology policymaking process.
- Book: The Tech That Comes Next: How changemakers, philanthropists, and technologists can build an equitable world
- Resources: NTEN’s Equity Guide for Nonprofit Technology and Tech Accelerate assessment
- Publications: #DataBack: Asserting and supporting Indigenous data sovereignty from Animikii, A Responsibility to Rebuild: Investing in Digital Infrastructure for Civil Society from TAG, Data Empowerment Report and ready-to-use data policies for nonprofits from NTEN, How non-profit executives can build their digital leadership skills from The Philanthropist
- Engage: Sign up for working groups and to stay aware of opportunities with the Canadian Centre for Nonprofit Digital Resilience
Please note: This transcript has been edited for clarity.
Elizabeth McIsaac: Now, while many of you are dialing in from across Canada, I’m speaking to you from Toronto, and I would like to begin today’s session by acknowledging the land where we live and work, and recognizing our responsibilities and relationships where we are. As we are meeting and connecting virtually today, I encourage you to acknowledge the place you occupy. I acknowledge that I am and that Maytree is on the traditional territory of many First Nations, including the Mississaugas of the Credit, the Anishinaabe, the Chippewa, the Haudenosaunee, and the Wendat peoples, and is now home to many diverse First Nations, Inuit, and Métis peoples. We also acknowledge that Toronto is covered by Treaty 13 with the Mississaugas of the Credit. This territory is covered by the Dish With One Spoon Wampum Belt Covenant, an agreement between the Haudenosaunee and the Ojibwe and Allied Nations to peaceably share and care for the lands and resources around the Great Lakes.
It is now my pleasure to introduce today’s session and speakers. The question for us today is, how can you use technology to advance your nonprofit’s values? Data and digital tools can be essential to your mission and even multiply your impact. Yet, nonprofits have not always been well-served by technology, nor have they seen their core values embedded in how technology is developed, used, and regulated. Amy Sample Ward and Katie Gibson will offer tangible advice on how to incorporate your mission and values into your digital world. Amy believes that technology should be accessible and accountable to everyone, especially communities historically and systematically excluded from the digital world. They are the CEO of NTEN, a nonprofit creating a world where missions and movements are more successful through the skillful and equitable use of technology. Amy’s second book, Social Change Anytime Anywhere, was a Terry McAdam Book Award finalist. Their most recent book is The Tech That Comes Next with Afua Bruce.
Katie is a lawyer by training and an activist at heart. She is passionate about using entrepreneurial tools for social impact. Currently, Katie leads strategy and partnerships at the CIO Strategy Council, the Chief Information Officer Strategy Council, a nonprofit focused on Canada’s digital transformation. In this role, she co-founded the Canadian Centre for Nonprofit Digital Resilience. She also leads work on sustainable IT and responsible AI, Artificial Intelligence. We’re going to hear more about that, I think. For their full bios, ideas, and resources, please download the handout in the chat, and it is now my pleasure to welcome Amy and Katie. Over to both of you.
Katie Gibson: Great. Thanks so much, Elizabeth. We’re really delighted to have this opportunity, and we’re actually going to start with the sixth good idea, which is giving Zoom presentations without slides. So this will be more of a dialogue between Amy and me. We each have only two and a half good ideas, so hopefully, between us, we’ll get up to five, and we’d also invite all of you to add your ideas, or fractions thereof, to the chat.
So as Elizabeth said, our world is increasingly digital. Our jobs are increasingly digital. Anybody who’s played around with ChatGPT recently can see that this is only going to continue. As Elizabeth said, data and digital technologies can be essential tools to help nonprofits. Yet, we have not always been well-served by technology, nor have we seen our core values embedded in how technology is developed, used, and regulated. So we’re going to share our ideas today to advance our sector’s values like equity, self-determination, and reciprocity through our use of technology. I will pass it over to Amy.
Amy Sample Ward: Awesome. Thanks, Katie, and thanks, everybody. It is great to be with you all, and I also feel overwhelmed at the idea of five ideas in about 20 minutes. We could talk about just one of these ideas for the whole day. So I’m just putting that out there already: I feel overwhelmed, and if that is something you feel as we get into this, well, I can’t solve it, but at least we’re not alone. We are together in the overwhelm.
1. Never put technology ahead of people.
So for the first point that we want to talk through here, of course, I’m acknowledging that there are folks from so many different types of organizations, sizes of organizations, and locations of organizations here. I just wanted to say that at the start because while it may feel like some of these points we’re bringing up are further from or closer to your own experience, something you’ve done before or something you’ve never thought of before, all of these things are for all of us to talk about together. There’s no one on the call today who is disqualified from being in this conversation for any reason. The onus is on all of us, but also, the opportunity is on all of us to really be able to do this and have these conversations together, even if you would never call yourself a technologist.
So with all of the disclaimers out of the way, the first point here is that technology should never be put ahead of people. I know that it is tempting when it’s shiny, and splashy, and comes with many promises about how it’s the solution to everything you’ve ever thought of, but it’s not. It is just a tool, and it’s really important that we keep technology in that place so that you, your community, and your teams can remain the humans at the centre of whatever it is you’re working on. Even if you did have the newest, and the coolest, and the best, and the most promising technology available, if your community members, your program participants, your staff don’t know how to use it, can’t get online to use it, or don’t know what to do with it once they have gotten online to use it, it doesn’t matter that you have the best technology, because in that case, it isn’t the best. It’s just shiny and sitting somewhere that no one can use, right?
So if the point is that it is being used, we can let go of that feeling that we always have to have the latest edition of something or we have to get rid of what we already have and go for something new because that will be better. What is best is what is best for your people, for your community members, your participants, your staff, whoever is really needing to be in there. So before you make any other technology decisions, before we get to our other four ideas, we want to start by asking, who is this for? Who is using it? Who will be impacted by it? Whose data is even in this thing, and do those people get to have access to it, or does their data just disappear from them? How do we orient around those conversations about people before we get into conversations about which CRM, or which website, or anything else?
Katie Gibson: I just wanted to say, Amy, totally agree with you, and I think from a staff perspective, tech should save us time and make our jobs easier. If nothing else, it should be helping us free up our time from routine tasks to focus on the things that require people and human relationships, but I’ve certainly had the experience, I’m sure most of us have had the experience of wrestling with systems and applications that really have the opposite effect. That’s just from a staff perspective, and then when we think about our service delivery clients and beneficiaries, similar experiences there as well.
Amy Sample Ward: Yes, and I think there’s honestly a lot we could talk about when we get into conversation in the second half if folks want to go there, and I’m sure folks in the chat would love for you to share any of your thoughts. But there’s a lot to unpack about why we feel pressure, or obligation, or requirement to make technology decisions that are about the technology and not about the people. Maybe the foundation told us that’s the tool we had to use, and the power dynamic between a foundation and a nonprofit being told what to do is real; it is not the nonprofit’s job to dismantle the foundation’s power. Or maybe it was a vendor or an IT consultant. We as staff are hearing all of that myth from the sector that nonprofits aren’t technical, that we don’t have that type of knowledge, and so who are we to make these thoughtful decisions? We need to hire a vendor or a consultant.
You know your mission more than anybody, right? You and your community members are closest to your work, and that means you do have the knowledge to make these decisions because it isn’t just about fancy technical language or the name of what code they used. It’s about what you need, and who those people are, and what you’re trying to work on, and that is already enough to be in these conversations and not feeling like we have to rely on other people to tell us what it is we’re doing.
Katie Gibson: Yes. I’m glad we made it almost eight minutes before Amy started talking about dismantling power structures. So I think that’s a record. Maybe, Amy, if you want to move into the second good idea that we have here.
2. Create diverse tech committees to support decisions, testing, and feedback.
Amy Sample Ward: Yes, and I’m excited to hear if folks want to put in the chat. I am looking at the chat, even though I know on Zoom you can’t really ever tell what people are looking at, but I am watching to see if you share anything there. I’m curious if anyone here has stories they could share from doing this themselves. Our second idea is a recommendation to create diverse tech committees so that you have support for decision-making, for testing and piloting things, and for giving you feedback. Those committees, I find, are often best when they include folks from lots of different departments across an organization, as well as community members. Even if there are only three staff in total, make sure it isn’t just one person on that tech committee.
I find, and I’m very happy to hear if someone disagrees with me, that’s fine, we can have a conversation, but I very much support community members, people who have been in your programs, who have received your services, being on these tech committees long before I would ever put a board member on that committee. It’s not about who has authority in the structure of the organization, a structure you didn’t get to create and that government required anyway. It’s about the people who are closest to the impact of these decisions. So have a diversity of folks: someone who’s maybe been in programs for years and years, somebody who just participated in a program recently, somebody who loves your program, somebody who’s had a rough time in your programs. Right?
A lot of different experiences on that tech committee. And that’s not to say that the tech committee has to be, “Okay. Well, we brought it to you. Tell us which CRM.” They don’t necessarily need to do that, but it is a committee where you can say, “Hey, what’s it like registering for that event? What was it like when we asked you to share an evaluation?” “Oh, that was really rough. I don’t understand the language you’re using.” Right? There are so many pieces of the tech, right? Whether it’s the platform it was on, or how they accessed it. Could they do it on their phone? But also, things like, did they understand a dropdown versus a radio button versus a checkbox? Right? There are so many layers to this, and it helps to have these relationship-based places where you can talk things through.
Often, we’ve heard so many stories in the NTEN community over the years of folks who built these committees finding that the thing staff were dreading (“Oh my gosh, we’re going to have to do this. It’s going to be so much work.”) turns out not to be a priority for anyone else. Right? The real priority is something else, and that is already a huge insight. But then, to figure out, “Oh, it’s something we didn’t even have on our radar.” Right? It wasn’t something we had even thought to consider, and it’s the thing most talked about by the community. Right? So it’s a really great way to find open-ended, directional feedback without just saying, “Check which one you want us to vote on.” Right? It’s much more conversational and, I think, a learning space. I see some folks chatting.
Katie Gibson: Amy, do you have any examples either from NTEN or from organizations in your community of someone who’s done this and it’s worked particularly well?
Amy Sample Ward: Yes. I’m happy to always use ourselves as an example. I never ask someone else to do something that I wouldn’t do. So NTEN has a tech committee, and it has, like I said, lots of different community members. A year and a half ago, we were doing an update on our website, which included some different layouts and styles, but what it really included, from the staff’s perspective, was a reorganization of the content: fewer pages, making things more streamlined and, in our mind, easier to find. That tech committee, of course, wanted to have conversations about, “Well, how did you build it? What tools are you using?” They wanted to know some of that because they were curious. But a lot of their conversation came back to, “Oh, I’ve never seen a menu like this. This was really interesting. I was able to view everything before I had to make a selection.” Oh, that’s really helpful feedback. Okay, that’s working. Right?
So it was a place where we were able to test things that we couldn’t have written in a script. We wouldn’t have known to ask, “Please tell us if you like X, Y, and Z, or how this is working.” Instead, we could just say, “Get in there.” It’s private to the world, but it’s open to them. “Tell us how you’re using it.” Giving them that space let us identify places where we did make changes, and places where we didn’t even really know we’d made a big change, and they could feel that. So I’m happy to chat with folks more in depth, but I don’t run our tech committee, and I wouldn’t take credit for doing so. So if anybody is trying to get this off the ground, or thinking about a tech committee, or maybe you have one and it’s going sideways and you need help wrangling it back into a productive space, I know Carl, who is on the NTEN team and runs ours, would be more than happy to connect with you anytime.
3. Only collect data you can protect. And give it back to its owner.
Katie Gibson: Amazing. Of all the ideas we have today, this is one that you can start implementing tomorrow. In addition to Amy and Carl, I see folks in the chat who also have some great expertise here, which they can lend to this community. So our third good idea is: only collect data you can protect, and give it back to its owner. We know data lies at the heart of any digital transformation. Our sector’s work is becoming more data-driven, and that’s a good thing, but it also comes with risks. A lot of nonprofits are custodians of very sensitive data, whether that’s personal health information and other kinds of information that our clients give to us, or sensitive financial information from our donors. To coin a phrase, “With big data comes big responsibility.” So, for example, if you’re collecting your client data on a clipboard, the risks and consequences of a breach are low. Once that data is in digital form, the risks and consequences start to multiply. So, quite simply, if you can’t protect it, don’t collect it.
A couple of pointers on how to do that. For data that you already collect, start today with a cybersecurity assessment. Also, design a data governance plan that really looks at the decision-making and accountabilities around data. This ties very nicely back to your tech committee. You’ll also want to think about executive-level and board-level accountabilities. Another piece, and I know NTEN has an example of this on their website, is designing a data breach plan. So those are some things you could start with today. For data that you don’t currently collect but are considering collecting, it’s a good idea to undertake a privacy impact assessment before you get started. That can help you identify and mitigate privacy risks right off the bat.
So that’s one piece of this. Amy also mentioned this before, this idea of extractive data practices, you being at one end of a strong data vacuum where you’re sucking up all the data and nothing is going back the other direction. Tech companies are infamous for this, but this is, I think, an area where values-driven nonprofits really need to consciously push back. Fundamentally, data about your constituents belongs to your constituents. What’s more, they should be given access to their own data, and even better, they should be able to benefit from it directly. I know this can sound like a tall order, but again, it’s something to start thinking about from the beginning and a principle to build in to your work.
Amy Sample Ward: Yes. I just wanted to add to that, Katie. One thing that we hear a lot in the NTEN community, all around the world, honestly, is this feeling of, “Well, what we collect isn’t sensitive.” You don’t get to decide what’s sensitive. If that data can be tied to a certain person, that person can consider it sensitive. Right? When Katie says, “If you can’t protect it, don’t collect it,” you don’t get to decide that it isn’t worth protecting. Right? If you are collecting that from your constituents, it really should be protected.
The other piece that we hear a lot is, “Well, our data is what’s in our database,” but if you are, say, an organization that is supporting immigrants and refugees navigating legal services for free, that’s already a number of check boxes that make them vulnerable offline and make them very vulnerable online. As an organization, if you are putting up events on Facebook and asking people to RSVP on Facebook, even though that’s not in your database, you have created a data trail for them that is really, really vulnerable.
I think it’s important that while we don’t own Facebook and we cannot tell Facebook how to operate… I mean, the list would be very long if we could. Even though we don’t have that power, you do have the power to say, “Actually, this is not safe for our constituents,” and to say, “Here’s the event. There is no RSVP option. You have to go to our website, or you have to call us, or just show up.” Right? But really, you need to be the steward in that situation because your constituents are doing all they can to navigate very difficult circumstances, and you need to be able to have the space to say, “This is not safe,” and we need to own that. So I just want folks to remember, even if it’s not on your own website or in your own database, you have a role to play in helping your constituents stay safe with their data.
Katie Gibson: Amazing. Yes. Let’s keep rolling.
Amy Sample Ward: Okay.
Katie Gibson: We could talk about data all day and happy to chat more about it at the end.
4. Make your values the foundation for technology adoption and investment.
Amy Sample Ward: I think we will. Yes. So the fourth piece here really ties back into our title, and we wanted to make sure that we didn’t just talk about values as an undefined term. Especially when we’re talking about nonprofits, there are a lot of different perceptions of what we mean when we say the word “values.” When we recommend, for our fourth point here, that you make values the foundation of your technology adoption and investment, what we’re really talking about are very specific values.
I’m going to share those in a moment, but first, I just want to say that while we may have a lot of different ideas about what values are, and organizations may have a thing on their website that says, “We value blah, blah, blah,” even if we never said the word “value,” even if we never said what those values were, we are, as people, making decisions based on our values every day. That means it’s not a matter of, “Oh, when we choose to, we do.” We are always making those decisions. What we value is how we navigate things, and that is why we need to talk about it explicitly, why we need to make sure that we are all focused on the same values, so that we can make decisions about technology, but also about our work and our missions, in a really intentional way. How we make those decisions influences, of course, the specific technology, but it also influences our organizational culture, and whether and how our organization makes an impact. It influences our future decisions. It influences everything.
So one of the resources that we offered and that Liz mentioned at the start was The Tech That Comes Next, the book that Afua Bruce and I wrote last year. I’m not expecting that you buy it or have read it, but I do want to offer the six values that we start that book with. In the book, we frame those values as the things that are required if we are to build an equitable world. So not a world filled with technology or a world that works for technology, but an equitable world for people. So I just want to briefly share what those six values are and offer them here as maybe something we come back to in the second half when we’re in conversation.
So the first is that we value the knowledge and wisdom of lived experience. So we’re not making assumptions about what someone may know or how someone might contribute to that tech committee, for example, just based on what we think they know or maybe whether or not they have credentials. But really, we are saying the lived experience they are bringing is enough and is valuable.
The second is that we value the participation of a diversity of people in everything, in the whole process. Right? So in decision-making, and planning, and budgeting, but also in prioritizing and figuring out what something might be, what the priority should even be. Community members should be involved throughout, and especially at the beginning, when they can scope what is a priority for them.
The third is that we value accessibility as a priority. Accessibility when we’re talking about technology, of course, means things like web standards, and that’s part of it, but just as the other values we’ve talked about and the other points we’ve already made today, valuing accessibility also means how do you communicate with your community members? How and where do you meet to have these conversations? Do you use jargon or industry language, or do you change it so that everyone can be part of the conversation? So really making sure that how you are navigating all of this is accessible in all different ways to everyone.
The fourth is that we value the multiple ways that change is made, knowing the world needs to be different; currently, it is not working for a lot of us. So how do we balance the fact that we need immediate supports, that there are people who need shelter, and food, and school, and health, and family support in order to participate, with the reality that the folks who are most impacted right now are the ones best placed to be centred in, and to inform, larger system-change work?
The fifth is that we value the strength of collectively creating a vision. I think together, we are always going to be better, but that also comes when we’re talking about technology. It comes when we’re talking about our missions. It comes when we’re thinking about where do we want to go. So it’s never just one person or even just an internal team’s opportunity to be part of that vision, but a collective vision.
Last, we value that we are always going to continue pursuing learning, and skill-building, and lived experience in different ways. So the goal of that tech committee, the goal of your organization, isn’t to get everybody to learn the same one thing, but to let everyone continue learning different things, learning in different ways, developing different skills, because those will all continue to support us as we change and evolve and, hopefully, get even better technology that we’ve created as a community based on our needs together. Those are the values. I feel like that was like a, “Let’s go do this,” but I’m open to your thoughts, Katie.
Katie Gibson: Yes. Well, on the topic of, “Let’s go do this.” So I know in conversation with a lot of organizations, especially smaller ones, the rubber hits the road when they’re actually trying to select a vendor or purchase a product. Wondering if you have any examples of how small organizations actually go about applying some of these values when they’re starting out that process or when they’re in that process.
Amy Sample Ward: Mm-hmm. Yes. When you’re trying to adopt a new tool, I think the first piece of that process is always figuring out, “What are our needs? What’s our dream list? What do we absolutely require that this thing do for us, or hold, or whatever?” That, again, is a perfect place where the community should be part of it from the start. Sure, maybe a program participant doesn’t know which report the program staff member needs to pull to run the event, but they know so much about what it’s like experiencing those systems. Both staff and community members can be part of that process. Then, even invite community members to join when you are having meetings, a pitch from a vendor, a demo. They’re going to have different questions, they’re going to have different priorities, and that only makes those demos stronger.
A piece of that, too, really gets at the accessibility note. Vendors often assume high-speed internet, that you can just hop on that Zoom and do the demo. Asking them, “Okay, how many different people are joining? What’s the best way to do this demo? Might some of us be in a room together while other people are directly on their own computers?” surfaces those different access needs and already gives you an insight into whether that vendor is going to work with you in that way. Right? So there are lots of moments along the way, but just as an example of getting started and including community members, and how that process can be a little different, hopefully that helps.
5. Make your voice heard in the technology policymaking process.
Katie Gibson: Amazing. So I’m going to move on to our last good idea. This is the one that makes my policy heart sing with joy: make your voice heard in the technology policymaking process. Of course, many of you engage in some form of policy advocacy work, but most of you may not have considered the relevance of technology policymaking to your work. I’d say, if you and your communities aren’t well-served by the tech that’s available today, get involved in building the tech that comes next. Get involved in setting some of those legislative, regulatory, and policy guardrails. The rules of the digital economy are being written as we speak, and we need to be around the table. We need to be there advocating for the needs and values of our communities.
So just to give the example of an organization that’s working with children and youth: are you advocating for a policy response to the mental health harms caused by the social media companies? Many of you may have seen that Seattle’s public school district is actually suing tech companies for this harm. Are you making noise about the amount of data that educational technology platforms collect about students? This includes platforms that public schools, funded by taxpayer dollars, require students to use.
Another possibility: are you advocating for an age-appropriate design code, like the one they now have in the UK, which curtails companies’ abilities to collect, and share, and use children’s data? So that’s just a children and youth example, but you can think of policy issues, and we’re happy to talk about policy issues related to tech, right across the range of issues that nonprofits care about and work on. We really urge you to get involved. We were going to end with this line, but I’m going to ask Amy to add on top of it.
Amy Sample Ward: Yes. I do want you to talk about, or maybe just quickly name, a couple of other examples. But one thing I wanted to offer as a tee-up, as you think of those other examples: there are lots of nonprofits in Canada, the US, all over the world that I talk to who have had conversations with lawmakers, whether that’s local, regional, whatever, and they feel like, “In that time, I was really trying to educate on this issue, but technology is not our issue.” As Katie is saying, tech doesn’t have to be your issue to be able to say even one sentence: “Cybersecurity is really a concern for us because we do have data from our constituents, and it’s important.” Right?
Katie Gibson: Sure. Yes. So Maytree talks about social and economic rights and poverty elimination. There’s this idea of the digital welfare state that academics and others talk about. This is really the idea of governments using data analytics and AI to police access to benefits: income supports, housing benefits, and so on. They do that in the name of detecting and reducing fraud. The most famous case of this came from the Netherlands, and you can imagine the inevitable result: it was poorer communities, more racialized communities, that bore the brunt of these false claims of fraud, which led to really tragic outcomes for some of the families.
So that’s one example there. On the environmental front, we’re all climate change advocates, I think, at this point. There’s a lot of advocacy work to be done relating to sustainable IT, and one concrete example of a policy issue there is a right-to-repair law, which we don’t currently have in Canada, but which would allow you to actually open up your Mac and fix it rather than having to toss it. I think, broadly speaking, we really need to take an intersectional approach to understanding how these emerging technologies affect our communities. Unfortunately, and I say this as a lawyer, right now the tech policy conversation really is dominated by lawyers and academics, and our sector’s voices aren’t being heard around the table.
Just one last concrete example: a few months ago, the Toronto Police Services Board came out with a consultation on what guardrails, guidelines, and policy they were going to have around the use of artificial intelligence by police services. This is obviously a huge issue for over-policed communities in Toronto, and that’s an area where we do need to have our voice heard, or we’re just going to hear from the academics and lawyers on these kinds of questions. I did want to say: if you’re not at the table, you’re on the menu. Amy, anything else you want to add before we move on to Q&A?
Amy Sample Ward: No, no. Thank you for those examples. They all rile me up and make me want to turn this into a call to action. I’m based in Portland, Oregon, and we’ve been having these ongoing consultations around the city’s investment in surveillance technology and what that means, and getting so many diverse organizations involved who’ve never been part of technology conversations before but are able to speak to what happens when surveillance technology is used around their organization’s location or affects the folks in their organization’s community. There’s no time like now to start crafting these policies because once they’re in place, they become the bar against which everything is measured, and we want that bar to be accountable to us. So, let’s go. Also, I see there’s lots of questions already. Liz, I haven’t been able to read them, but I did just open it, and I see a whole list.
Elizabeth McIsaac: There are questions, and thank you to the audience who have been participating actively in the chat as well. I think you’ve touched off a bunch of nerves, a bunch of ideas in people, and that was just fantastic. I love the values. I loved all of the different elements that you’ve woven into this imperative around dealing with a digital world. Katie, I really loved the example you gave around monitoring, social welfare policies, and that type of thing. The interesting thing is that it has never been done in the inverse, where systems are digitally designed with defaults so that people automatically get the benefits they are entitled to. So it really comes back to what values are in place when you design the systems themselves.
I want to jump to some of the questions that came from the chat and carried over into the Q&A. One of them captures a question that many people in the sector have, which is: we’re small. We’re not a big organization with all kinds of capacity, all kinds of money, all kinds of people. So how do we scale some of these ideas that you’ve put forward? Or are there even places to go, like shared platforms with examples of digital policy tools and that type of thing? How do we recalibrate this to the scale of the organization?
Amy Sample Ward: Totally. Great question. I’ll go first and give Katie time to come up with a better answer while I answer. In the resources that we shared, in the handout that’s been in the chat, one piece is the Data Empowerment Report, which includes template policies, things that you can just use and adapt right away. You don’t have to hire anyone to help you come up with what a policy can be. You can just take that and adjust it. Another is The Equity Guide, which was created by community practitioners from nonprofits, foundations, tech companies, and more, who worked together to create guidance around technology use, creation, and funding.
So lots of places where you can start there, and none of that is built on the assumption that you have 100 staff. It’s built on the assumption that you have a mission, there are people moving that mission forward, whether they’re staff or whomever, and there are community members involved in some way. That’s it. If you are bringing people together in that structure, regardless of size, those are the considerations. That guide is not copy-pastable; not all the answers are there. It is meant to be directional and let you say, “Hey, how are you going to handle this, and what works for your community given your circumstance?” So those are two resources that you can use, and they’re totally free to use, to download, to access. Go use those today as a starting point.
Elizabeth McIsaac: Terrific. Open source.
Amy Sample Ward: Katie? Yes.
Katie Gibson: Yes, and I’d just add a couple of other perspectives on that. One is that when we’re talking about small organizations, we know the executive director wears multiple, multiple hats. I would say one way to get at this is to make sure the executive director is wearing that CTO, Chief Technology Officer, hat as well. We do hear people saying, “Oh, I’m not a tech person.” At this stage, that would be like saying, “I’m not a money person,” or “I’m not a finance or budget person.” If you’re leading an organization, you have to be a tech person. So that’s one place to start, and it can be baby steps. There are lots of resources out there. Just start learning one thing at a time and building up your skills.
Another piece I’d say is that you have existing processes, so use those. Everybody does strategic planning; make sure you’re talking about technology while you’re doing that. You have your annual budgeting; make sure you’re budgeting for technology in there. So it doesn’t have to feel like a whole added piece of work, just integrate it into your existing work. In terms of shared platforms, I’m not going to do constant shout-outs to the Canadian Centre for Nonprofit Digital Resilience, but you’ve hit on exactly our reason for existing: it doesn’t make sense to address the digital enablement and resilience of the sector organization by organization. It is much better for us to come together as a sector and work together to build some of these tools and platforms that work for everybody.
Amy Sample Ward: I just wanted to add to what Katie is saying, Liz. Sometimes there is a bit of a reality gap between a hundred-person organization and a two-person organization; there are very practical differences in what is happening there. But in the 23 years that NTEN has existed, operated, and done research into how the sector works and how nonprofits are using technology, we have seen that the practices and policies organizations use around technology are not determined by how much money you have or how many people you have. When we look at our research data and see the organizations that are most effective with technology, they are every size and every age, hundred-year-old and two-year-old organizations. What’s most influential is a handful of practices that anyone can adopt, like Katie said. Naming technology in your strategic plan is one of the number one indicators that you will be successful with technology, regardless of size, budget, or age.
The next is budgeting for technology separately from other expenses. Your printer paper is not in the same line item as the printer itself; your laptop is not in the same line item as your post-its, your pens, and your notebooks. Right? Technology costs are separated out so that you can actually plan for them, evaluate them, et cetera. And technology is something that folks on the leadership team feel responsible for, feel like they are leaders of, without having to be technical people. Just to underscore everything Katie said: those are things we have seen consistently for 15 years be the indicators of success, not your budget.
Elizabeth McIsaac: Okay. We’ve got a list of questions to go through. Some of them are straightforward, and it will be a quick answer.
Amy Sample Ward: Okay.
Elizabeth McIsaac: With regards to data protection and having a retention policy to determine the length of time for which personal data is retained by nonprofits: what are some best practices for reconciling that duration (the shorter, the better) with requirements such as audit purposes?
Amy Sample Ward: Katie, do you want to offer a quick answer or…
Elizabeth McIsaac: Maybe there is no quick answer.
Katie Gibson: I don’t have a quick answer.
Amy Sample Ward: Yes. I would say the answer I have, quick or not, is having a data retention policy that is specific to different types of data. There are requirements: payroll data, for example, needs to be saved for a certain amount of time, right? So where there is a requirement, great, put that into the policy as its own component of the retention policy. Not all data has to be saved for the same duration or in the same way. While a more granular policy requires more attention and maintenance, it does at least let you get rid of data when you can get rid of it. I’m happy to share NTEN’s data retention policy or chat with folks if you have questions about that anytime, for sure.
Katie Gibson: Yes. Just to add on the different classes of data: I know this has come up with youth-serving organizations, where your insurance providers want you to retain data for a very, very long time. You have different kinds of auditable data, you have data for litigation purposes, you have all these different classes, and that’s going to be unique to each organization.
Elizabeth McIsaac: Yes. So disaggregate it; it’s not all data. Okay. This is a quick one because I think it also got answered in the chat, and you may also have answered it with the open source stuff, but are there existing resources around creating a privacy impact assessment? I think it was suggested in the chat that it’s a term that’s been around for a while. Is there new stuff out there?
Katie Gibson: Yes. So here’s what I’d add to that; I saw it getting answered in the chat. Yesterday, CCNDR hosted a vendor and consultant showcase for organizations that are serving nonprofits with their digital needs, and a few spoke to these kinds of issues. I’m not endorsing anybody’s solution, but I do remember from yesterday that HelpSeeker has a specific product that is a privacy impact assessment. We can find the link to that event from yesterday, and anybody who wants to check out those resources can look for other vendors there.
Elizabeth McIsaac: Here’s a tricky one. It’s about human resources, and I think that’s really important because some of this stuff is getting automated or outsourced and dealt with in different ways. Given what we’ve heard about biases in IT programming and data exposure, is there a values-driven way to use tech to help screen potential staff or volunteers? We know HR firms do not all share values like equity and so forth.
Amy Sample Ward: Great question. Very curious on Katie’s answer. I had seen your mouth is already opening.
Katie Gibson: I mean, my very short answer would be: don’t use AI at this point for your HR needs. It’s just a completely unregulated space. You can look to New York City; they now have a law relating to the use of AI for employment-related decisions, and seeing some of the guardrails they’re putting up around it, we don’t currently have those. So I’d say cut that out of the picture for the moment. Amy, I don’t know if you have anything positive to offer there.
Amy Sample Ward: Little positive to say. While it’s maybe not the answer folks want, because I think we’ve been told that technology will make things streamlined and efficient and save us all this time, there’s a big cost that comes with saving that time, and that cost is harm. Right? So I think the answer is that technology isn’t a solution for everything. A lot of HR-related things require people. While people are also filled with bias, having multiple people create those processes together and work through hiring together is going to be far better and far less harmful than trying to have technology be the solution there.
Elizabeth McIsaac: So this, I think, is an interesting question on the state of the mindset of the sector. Can you comment on the imperative of digital transformation in the sector? Much of what you’ve been saying is amazing for those with a digital mindset, but can you speak to the fixed mindset that many charities have as to why all of this applies to them? There might be a bit of a push that needs to happen, and is there a value proposition?
Katie Gibson: I have two quick ideas on this. One is that I see the question comes from Dan Kershaw at Furniture Bank, so one answer is: ask Dan. Dan is a very passionate advocate for this. I believe it was from him that I heard the phrase that our sector is facing an extinction-level event if we don’t appreciate and reckon with the kind of disruption that’s happening. There are certainly really strong advocates out there in the sector, and I’m happy to share more ideas there. The second quick idea is Amy’s book. Not to plug it, but it’s also very helpful for making that case, and there are a lot of good ideas in there.
Amy Sample Ward: The last thing I would offer, and I agree with everything Katie said: I don’t know what digital transformation means. It’s not a term that I use. What are we transforming? I honestly don’t know what that means. So I would offer that, maybe instead of feeling like there are the pro-digital-transformation people and the people who are anti-digital-transformation, it’s more about folks who are confident and ready to have technology conversations at a strategic level and folks who aren’t ready for that.
So when we change our mindset to that being the frame, and we say, “Oh, wow. These are folks who are not supported. They are not confident. They’re not ready to make equitable, strong, strategic technology decisions,” well, then what’s the angle? Is it that there are some equity impacts happening here? Likely. Is it that they have big strategic decisions ahead? Likely. Go through that route to technology, because that’s where it will connect, versus feeling like they’re against digital adoption. Right?
Elizabeth McIsaac: So we have a question that I think brings a layer of complexity to engaging people around the table.
Amy Sample Ward: It’s so simple so far, Liz.
Elizabeth McIsaac: We just want to take it up a notch for the last bit. Do you have recommendations for federations that are collecting data from member organizations, where it’s the member organizations that collect data from the program users? Are there unique needs or challenges that need to be considered there?
Amy Sample Ward: Yes. Yes, there are.
Elizabeth McIsaac: Can you say a little more?
Amy Sample Ward: And I think folks can think about what that looks like for a user to see it in their… when you get your first email that says, “Oh, you just created an account with us to register for this event.” There’s a link. “This is what we will do with the data we collect from you today and ever.” Right? I think there’s a fear that talking about that or showing it to users will make users scared and go away. Maybe a couple will, but they probably would’ve gone away anyway. Right? What you are really doing is saying, “We trust you to know, and we will be honest with you,” and that’s going to do far more to build your relationship with that organization than one that hides it, obfuscates it, and makes it difficult to navigate.
Katie Gibson: Yes. We’re just about to publish a report that emerged from our working group on building the cybersecurity and resilience of Canada’s nonprofits. This issue of federations came up time and again, loud and clear, when we talked about cybersecurity: the question of distributed responsibility, accountability, and risk profiles across the different layers and levels. One thing that came out of that conversation is that, to the extent the national organization can take some leadership around cybersecurity specifically, that can be very helpful. Some of these vendors and solutions are better accessed at scale in a comprehensive and common way, so that could be one way to approach it. But certainly, it gets complicated when those questions of responsibility and accountability aren’t fully understood.
Elizabeth McIsaac: So just one last quick question. Maybe it’s quick. Somebody is asking about the value of cyber insurance. Is that something to be contemplating, and what are the challenges of qualifying for that at this point in time?
Amy Sample Ward: Yes. I think it is important. The pushback I often hear from organizations is, “We don’t have really interesting or controversial data. No hacker is coming for us because we don’t have interesting data.” But hackers don’t care whether your data is interesting to them. They are coming for you because they know the data is important to you, and you are obligated to protect it. So that means, actually, everyone is vulnerable here. It’s not something only for that really big organization that does big campaigns with the spotlight on them. It’s on all of us, because all of us care about our data. So when you think of it that way, cyber insurance is really important, and it’s worth talking to whoever your insurance broker is about what options you have given the size of your organization, the tools you use, et cetera.
Elizabeth McIsaac: Katie?
Katie Gibson: Yes. This ties back to Dan’s question around motivation and incentive to become more digitally enabled. Helen Knight, a consultant in the sector who always has interesting and intelligent things to say, talks about how, for some people, risk is their love language. In some cases, it’s actually this question around cybersecurity risk, and the associated question of insurance, that focuses the mind on the broader question of digital capability and confidence. So it can be a good way to create a bit of a burning platform and say, “Look, if we want to get this insurance, here’s the cost. Here’s what we need to do. Let’s get to work.”
Elizabeth McIsaac: With that, we’re at 1:59, so I’m going to close us out and say, first of all, a huge thank you to both of you. I have learned so much. It’s a complicated field and a broad landscape with a lot in it, so I think we’re all in learning mode, and you’ve done an excellent job of bringing us through some of that learning, or at the very least the things we need to learn about, because it’s going to be an ongoing journey.