Definitely, but then again any AI worth a damn isn't going to have a "sad flag" either.
My point was that understanding the nature of an emotion in a trivial way should be orthogonal to how we think about what rights that being should have. At some level, we're all machines. Just because one's software runs in silicon versus gray matter, and just because one's hardware was deliberately built and is understandable in computing terms, doesn't mean that we really understand what it is to be sentient with respect to the rights to be free and to exist.