An interview with Matthew Brummund
Matthew Brummund, Junior in Computer Systems Engineering at Arizona State University
Matthew Brummund, a junior at ASU’s Fulton Schools of Engineering, excels in Computer Systems Engineering with a 4.0 GPA. He’s contributed to DORA, a CubeSat project, developing Rust-based software for data retrieval from orbit, and led Project SPYN, creating a maze-solving autonomous vehicle. With over 800 hours of tutoring experience, he’s skilled in Java, Rust, Python, and web development.
Welcome back to Bringing AI to Campus: An Interview Series. This interview features Matthew Brummund, a junior in Computer Systems Engineering at Arizona State University (ASU), who works at ASU’s Cloud Innovation Center in collaboration with AWS. Matthew brings hands-on experience using GenAI for projects and offers a fresh perspective on how students are leveraging AI in the classroom and beyond.
Absolutely! I’m a junior at ASU, studying Computer Systems Engineering. My real dive into AI began when I started working at the ASU Cloud Innovation Center. That’s where I first started using AI in a work setting. A lot of the projects are open-ended—like designing a control system for traffic lights—so I used AI to brainstorm how to approach the logic behind it. While AI doesn’t always give you the exact answer, it helps you get started and refine your ideas.
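The traffic-light brainstorming Matthew describes might start from something like the small state-machine sketch below. The states, timings, and class name are illustrative assumptions for this article, not the actual project's logic.

```python
# Illustrative sketch of traffic-light control logic of the kind one might
# brainstorm with an AI assistant. States, durations, and names are
# hypothetical, not taken from the ASU project.

class TrafficLight:
    # Each state maps to (next_state, hold_duration_in_seconds)
    TRANSITIONS = {
        "green": ("yellow", 30),
        "yellow": ("red", 5),
        "red": ("green", 25),
    }

    def __init__(self, state="red"):
        self.state = state

    def step(self):
        """Advance to the next state and return how long to hold it."""
        next_state, duration = self.TRANSITIONS[self.state]
        self.state = next_state
        return duration

light = TrafficLight("red")
hold = light.step()        # red -> green, hold for 25 seconds
print(light.state, hold)   # green 25
```

A tiny finite-state machine like this is exactly the sort of starting point an AI can help refine, for example by extending it with sensor-triggered transitions or pedestrian phases.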
Definitely. I’m a huge fan of generative AI, and I’ve noticed two main groups among students: those who are enthusiastic and use AI extensively, and those who are more cautious or unfamiliar. In my classes, especially this semester, AI usage has surged.
However, the effectiveness depends on how it’s used. Some students try to get direct answers from AI, which doesn’t help them learn and often leads to mixed results. The most successful approach I’ve seen—and personally use—is treating AI as a brainstorming partner. When I hit a roadblock, I use AI to think about the problem differently. It doesn’t replace learning but enhances it by offering new perspectives.
Currently, I prefer using Claude 3.5 Sonnet for coding assistance. It performs well in code generation and understanding. That said, the AI landscape is rapidly evolving. Six months from now, the leading tools could be different. It’s essential to stay adaptable and keep up with emerging technologies to find the best fit for our needs.
AI literacy is increasingly important. Students who don’t engage with AI risk falling behind academically and professionally. However, there’s an accessibility issue. Advanced AI tools can be costly, and not all students have the resources to access them.
Communication is key. Faculty should be clear about their AI policies—what’s allowed, what’s not, and how to incorporate it appropriately. This helps students use AI responsibly and effectively.
For universities, adapting curricula to account for AI’s capabilities is crucial. Assignments should encourage critical thinking and learning rather than tasks that AI can complete easily. AI should augment the educational experience, not replace the fundamental learning process.