How Will AI Transform Learning Space Design?

For years, higher education has designed learning spaces around technology as a tool for display, capture, collaboration, and connectivity. AI changes that equation: "Technology is no longer just an install in a room; it's a participant in the learning experience itself," says Craig Park, director of digital experience design and associate principal at Clark & Enersen. In a recent Spaces4Learning webcast, Park moderated a panel of technology leaders in a vibrant discussion of how AI is influencing learning space design from a facilities and infrastructure perspective.

Our panelists were:

  • Byron Tarry
  • Joe Way, UCLA
  • Andrew Vogel, The Ohio State University

The following conversation has been edited for length and clarity.

Craig Park: Let's get started with how technology is considered at the outset of a new project, and how AI impacts that today. When you think about needs assessment, standards development, and room typologies, how does AI fundamentally change those early phases of learning space design?

Byron Tarry: There are so many players that touch these large-scale, physical projects that we deliver, and there's a huge opportunity to create centralized knowledge bases and see transformation in the journey that happens from those early needs analysis discussions with stakeholders, all the way through to keeping the team aligned. But in serving those stakeholders, the very foundation of needs analysis becomes incredibly challenging in this moment, because none of us has answers more than a year or two out on what AI's impact is going to be. The very foundation of going and asking a stakeholder to tell me what you want and need in this room, and I'll go deliver it for you, is starting to shift. That partnership that we play in building scalability and agility to answer tomorrow's needs as much as today's is going to be foundational to how we really rethink what that design process looks like.

Joe Way: As we look at our spaces, we like to talk about standardization, yet we have to couple that with personalization. That's where I think AI has a lot of opportunities on a needs analysis level. When we're sitting here as a technology owner at a school, we're looking at things in the realm of hundreds — hundreds of standardized room types. Yet our customers are very individual. That's where AI can help us out. In that early phase, we're able to bring that in and ask, how can I double down on what you need, but how can I also align it with our standards?

Andrew Vogel: When we design classrooms or even new buildings, it's a 20-year plan, and it's trying to adapt to what's moving day by day with these technologies. For example, we've been making a push to reimagine how we do our computer labs. We're a very bring-your-own-device campus, so we're looking at the infrastructure of more wireless access points and power in the classrooms. But also, do we transform our computer labs into a proctoring station because of AI and academic integrity issues? So we're looking at retaining our current modalities, but shifting how those operate in our physical spaces.

Byron Tarry: The personalization proposition is such an important consideration. It is one of the foundational strengths of what this AI opportunity can bring us. For so many years, standardization solved a whole bunch of problems for us, but actually, what it didn't do was answer the true stakeholder: the user. What we were doing was taking the average and hoping that everyone can align to the average. We have a unique opportunity now, as we start to look at what needs analysis means: We're not looking at the average anymore. We need to create agile spaces that can adapt and be reflexive to the moment. That's a real shift in the mindset.

Park: As we look forward as planners of educational spaces, as architects and engineers, that project that we get hired to design will take about three years from the day we get hired to the day it opens. At the university, that project has probably already been in the pipeline, from master planning to budgeting and fundraising, for maybe a year or two. So we're really looking five years ahead at a technology we have only a couple of years of basic understanding or experience with, in terms of how it's going to impact the future. We really are, in many ways, writing science fiction.

Let's move on to the concept of infrastructure. If I'm renovating a building in 2026 or 2027, what makes a learning space truly AI-ready?

Way: We have to start with understanding where the borders are going to be and where it's okay to let AI have freedom while allowing us to control the output. Those are two very important things. When I'm looking at my infrastructure and the mistakes we're making, the fact is that nobody is an AI expert yet. There is no understanding of what we can bring to the final stakeholder.

At UCLA now, my team is doing a lot of things with AI, but it's not just about using AI for AI's sake. It's about making sure that when we do this, we understand the boundaries and make sure that final stakeholder voice is there. So when we are in that infrastructure planning phase and making a space AI-ready, it's about, okay, what is the final result? And can we control that result?

Park: We need to talk about it and set a governance standard. Any technology should have a governance program before it's implemented.

Tarry: That premise of governance is ultimately responsibility. It's what we in management roles are tasked with doing: to think about risk, to think about what is right and wrong. But then we've got to think about the practical first steps as well. That's where we often get lost, in the abstract of the infinite number of experts out there and the cacophony of advice.

Vogel: The ethics question sticks out to me a lot. We're a big university with a lot of different departments and a lot of different local-context needs. There are going to be situations where AI is not applicable to the classroom, and the best way to navigate that is to talk more. Have those committees and that governance. Not everything has to have AI thrown into it, but we can talk about it. We can talk about best practices of how to not use it, or when it's permitted. The biggest thing is just making spaces flexible enough to either utilize AI or not.

Park: One of the things that we talk about when we're doing long-term planning is pilot projects and sandbox spaces, where new technologies can be tried and proven — or tried and failed. Are your campuses doing any showcase rooms that demonstrate how AI can be used, and do you see them becoming campuswide standards?

Vogel: The barriers I see are barriers we've always dealt with: Is the furniture flexible enough? Is there enough power for students to use the software to run AI? Is your software secure enough? Those are the biggest sandbox issues we're seeing.

The challenge with AI is that it's been an add-in to what we're currently doing. We don't want to use it to replace current services, so we're adding it on to things — but that means faculty and GTA staff have to be trained up and all speak the same language. On small scales, we're looking at partnering with course coordinators of different units and seeing how they allow AI. Of course, we have a blanket policy to provide guidance across the university, but what works for Arts and Sciences won't translate one-to-one to Engineering.

Way: When you're working with your faculty there at The Ohio State University, does "AI" mean they just turn on the AI transcriber in the Zoom call or in the LMS, or are there actual tools that are being used in that teaching and learning experience? Because those are two different things, and I don't know that as an industry we've separated out where AI actually allows teaching and learning to move forward and where it's just a note-taker.

Vogel: It's a bit of both. You have folks who want students using AI and experimenting with it to see what works and what doesn't work. And for some folks, that's moving too slow. The School of Music is probably our most innovative. I have seen them using AI to help people with low vision read sheet music, for example. It really gets into those concentrated fields and having those faculty champions. My biggest thing is just trying to train folks and get them to a level where they're comfortable, and then they can figure out if AI is the right tool for them or not. It's important for everyone to be fluent, so they can determine whether or not they want to use it.

Way: You're almost treating every different vertical — whether it be music or film or whatever — as its own sandbox, and saying, "Here you go. Try it. Find out how this works for you."

Vogel: That's what we're trying to do. And the biggest earth-shattering thing is that it's bringing people who are siloed back together to say, "Hey, this works for us. You don't have to do it. But maybe you'll borrow an idea."

Tarry: I use an analogy of the greenhouse: Cast seeds; get as many and varied as you can. And that's not just people with a specific responsibility for this. It's people who go home on a weekend and fiddle around with AI. Let them cast as many seeds as they can into the wild, and then have a mandate to sit and watch that and see which seeds start to sprout and grow and show promise. But then you need to be able to bring those that are progressing and growing healthfully into the greenhouse: That's where we foster them, nurture them, fertilize them, and allow them to strengthen and grow. Ultimately, it has to go back into the wild and it has to become self-sufficient and sustaining.

When you think of that model, it's a shift in our role as leaders and managers to say, "My job is not to be the brilliant innovator who comes up with all the ideas. My job is to create nurturing environments that allow ideas to grow." We're not the people who are going to transform what ChatGPT or Google or anyone is doing. And we're not sitting at the vendor level, putting AI as a technology into a black box that is going to do magic. We're the architects that sit in the middle and have to figure out what all this means.

To give an example, at ISE we had Andy Lee from the University of the Arts London talk about how audio brings context to a classroom. And we started talking about what Shure is doing with their new DCA901 broadcast microphone array: They took their original ceiling microphone and put it into a sporting environment, and instead of just picking up audio, they're looking at different layers of context, from the squeak of the sneakers, to the thud of players banging into each other, to the noise of the basketball hitting the hoop. In the context of education, we can start to think about how to equip the environment so that it's not just about recording a room, but about layering in the context. Ultimately, context is the most critical part of how we leverage technology, whether that's in real time with analytics or in building knowledge bases that we can draw upon after a lecture.

Way: As the great [Managing Director of Integrated Systems Events] Mike Blackman has said, and I've stolen it from him many times: AV teams are the eyes and ears and voice of AI. We are the final deliverer and intaker of everything. If we are the ones who are taking it in, and we are the ones who are putting it out for that user, there's an opportunity for us there. As an AV industry, why are we not sitting at every table? Because if you want AI to be in there, you have to capture it and you have to send it out. We are the ones that can bring it to you, because what are the two things that we do? We capture it and we send it out.

Park: And we process it in the middle, which is the part that gets really interesting. What do you do with that? How can you leverage AI to customize the experience each student gets to meet their particular needs, from learning levels to neurodiversity to anything else?

Tarry: The canvas of what we can do with this is so wide. I love the analogy that we're the people who have taken the signal in, done something with it, and then shared it out. That was a very narrow context in the past. I challenge us to go bigger and wider and broader than that within the classroom, and think of analytics instead of just "it comes in and it goes out." In the digital signage industry, we capture engagement from a dwell time of people looking at a screen. In the call center environment, we capture audio and we look at engagement levels of call center staff. Why aren't we looking at that in our own environment? Within a classroom or a board room or a meeting room, we can do more than just take the stuff in and amplify it and put it on a bigger screen or a louder speaker. We can actually start to look at the context of what is happening there, and then we can start to add value in different ways.

There's a foundational concept in the question: Where is value locked right now that we traditionally haven't had the capacity to unlock? At a conference, for example, there's 200 hours of content that got recorded and captured, and even if it is shared with attendees, no one is going to go look for the needle in the haystack of that content. But if we can bring in AI to be able to hyper-personalize and say, "Go find me the needle in that haystack, the little nugget of information highly relevant to me," we can unlock a huge amount of value. The same applies in a teaching environment. AI offers hyper-personalization, at scale, allowing us to expose something unique that we never could have delivered before.

Way: I don't think many people talk about that value. Are we doing AI for the sake of AI, because it's what we're told we have to do, or avoiding it because we're scared of it, or are we actually asking the question of what is the true value we're bringing?

Vogel: Where I would love to see analytics be used in a way that's not too invasive is things like student attendance. I hate taking student attendance. I don't like to micromanage, but I also know I have students who are going through certain things in life that they may share with me or they may not. If we're capturing their data, whether it's presence data or bring-your-own-device analytics, where is the line where I can have the system nudge them: "Hey, you've been missing class. What's going on?" Our job is to have that emotional connection to students, but when you have a lecture hall of hundreds, that's harder to capture. That's the value lock that I would like to see: a way to use predictive analytics ethically to support students who are struggling, who may not want to vocalize, and who may not be aware of resources such as student advocacy.

Park: The manufacturers of smart cameras for security surveillance have now built in machine learning — when the camera sees a certain kind of activity, it can predict a certain kind of output, like a fight or somebody stealing. Can we apply that in the classroom itself to capture the facial expression or the body language in a way that would say, "This person is either not paying attention or can't pay attention." I don't know if that's too invasive. That's where we cross that line of, how can we help without being overly "Big Brother"?

Tarry: That's where governance steps in: We make those decisions, and we give people opt-in or opt-out propositions. The question is first, is there a capacity to do something? And then, how do we go and apply that? We can come up with a thousand things to do once we understand there's a capability, but then we look for what that value unlock is and say, "I think there's an opportunity to exploit that more."

With the University of the Arts London example I shared earlier, it started with recording group conversations within a classroom. And the initial feeling from the students was that Big Brother is watching and listening. But when the instructor started to flip that around and started to give back deliverables and outputs that came from those discussions, suddenly the whole mindset of those students shifted to, "Wow, that was great." And look, we do it all the time. We give all sorts of information to Facebook, but it's on an expectation of some value delivered back. And it's that mutual win-win that becomes the critical piece. I'm willing to accept you listening in on my group conversation in the classroom, because you're delivering something back to me that elevated me, amplified me, gave me a huge amount of value in return.

Way: Let's not lose something that Andy pointed out: When we talk about AI, the camera is taking attendance, but in the end, it's also recognizing that the student hasn't been there three times in a row. Maybe there's something going on that they might not tell you. That turns it to student well-being, and we can now connect that to student services, to mental health. That human-centered approach is where there's opportunity. As AV people, we look at AI as translation, transcription, all these little things, and I don't think a lot of us connect it to human-centered. That's my takeaway. I'm going to go talk to my team: "I know we're doing AI tools, but are we making it human-centered? How can we connect it to the livelihood of the student?" That's where AI has true purpose.

Tarry: There's a whole lot of efficiency that AI is driving. I use a financial analogy: You don't take every profit that you get out of your 401(k) and go spend it. You reinvest it and reinvest it and grow it. We need to spend more time in these types of discussions than ever before. Rather than just taking the efficiency proposition and investing in more "doing," we need to start to budget some of that efficiency to reinvest in the discussion, in the ideation, in the experimentation.

Park: Thinking like a science fiction author, what comes after what comes next, looking out five or 10 years? What does a classroom look like that is enabled with AI?

Vogel: What I would like to see is rooms that have a profile built into them. When the faculty member gets into a room, the login already knows them. It dims the lights the way they like. Students know what chairs are open. And in almost a flipped learning style, every student's phone or smart device has already been pre-loaded with the course content and a to-do list.

Way: The webinar we're going to do five years from now won't be called "How Will AI Transform Learning Space Design?" but "How Did AI Transform the Learning Experience?"

Tarry: We tend to overestimate the impact of new technology in the short term, and we tend to drastically underestimate the impact of technology in the longer term. That's carried true for every new technology shift for decades and decades. Even as we make predictions, we're ill-equipped to actually be able to understand what that means. We tend to look at the jobs of today and how they are iteratively going to be better and more efficient, as opposed to what the transformation will be.

When you start to think about agentic AI and the Model Context Protocol — MCP is an agentic communication protocol that isn't "I ask you this, you tell me the answer"; it's much more broad-based and contextual. Imagine if agentic systems in a classroom or smart building are talking to each other and starting to learn and understand and transform. That's where we can start to challenge ourselves, not to build a "slightly better mousetrap" than what we've got today, but to develop a transformationally different way of learning and thinking in a few years' time.

Higher education is, and always has been, the preparation to send people out into the workforce to have impact and make change. What happens in the higher education institution has never been more important, in sending people out into the world to have that change impact that is needed to reinvent the way we work in the future.
