Generative AI tools may be used for your code contributions to the group project, and for the flier you make to advertise your features.
The core policy is one I (Rob) believe reflects a least common denominator for AI usage in responsible software engineering contexts: you are responsible for the code you contribute to the project, and you are responsible for the code you review in the project. Telling a groupmate or course staff member “I don’t know, it’s what the AI produced” or “I don’t know, the AI said it made sense” is not in line with the minimal expectations of the course. Repeatedly failing to be accountable for the code you write or the code reviews you sign off on will result in failing the course.
— Rob
Changelog:
- June 2: fixed some typos
Where you’re more than welcome to use AI tools
Search and question-answering
While thinking a bit about your search query can help with locating authoritative sources (appending “mdn”, “docs”, or just a bit of profanity to a search query are three of my go-to approaches), there’s little question that search engine results today are dramatically less useful for the kind of questions that folks leaned on Stack Overflow to answer a decade ago. If you’re going to get a Google AI answer preview shoved down your throat anyway, you’re just as welcome to use ChatGPT.
Codebase navigation
In the first two individual projects, we wanted you to think about questions like “where should I add a new file to put the React code for Tic-Tac-Toe?” without generative AI tools; learning to navigate a codebase from scratch and with simple and predictable tools like right-click-go-to-definition is very powerful. For the final project, you can make your own choices.
What’s wrong with this code?
I firmly believe there’s great value in being stuck on a problem, much as there’s great value in learning to navigate Boston without GPS; that said, I certainly use GPS. Getting lost in code (or on the road) can cost a lot of time, and time is at a premium. Make your own choices here, but beware letting the chatbot lead you on wild goose chases when its first or second suggestion doesn’t nail the problem.
Where you should avoid generative AI tools
Don’t vibe code, please
Because you’re responsible for your code, I strongly recommend writing the first draft of code — where you go from a function’s signature to a first implementation — yourself. Tools should be used only to generate the smallest code components and to help you get details right.
Don’t avoid humans, please
Please do not use generative AI tools as a replacement for communicating with your teammates, pair programming, and learning from other humans.
Do your own reflections, assessments, and reports
The point of reflections is what happens in your brain, not in producing text that course staff gets to read.
Where your group should decide on your approach to AI as a team
The general theme here is “stuff your group may not have expertise in; in the real world, I implore you to seek out people with actual expertise rather than just vibe coding your way through it.”
- Design/CSS/HTML-presentation-layer stuff. We didn’t talk about this in class, so if you produce some unmaintainable vibe-coded CSS nonsense to get your code to do what you want, we’re not going to wreck your grade over it. Please try to indicate in your code where you’ve done this.
- AI art: to be frank, I’m opposed to the use of AI art on both an aesthetic and a philosophical level, but I will limit my reactions to maybe making an unhappy face in the final project presentation, and we won’t grade based on it.
- Testing: a lot of people described success at getting tests written by the chatbot. I’d rather you practice test-driven development with AI-generated tests than not practice test-driven development at all, but your group should be on the same page about what the expectations are. Ultimately, being responsible for all the code you commit does include tests, but there’s room for teams to take significantly different approaches here.