
I use it many times a day now for:

a.) Searching a broad domain that I'm not familiar with. It finds concepts I simply would have missed if I were using Google.

b.) Asking dumb questions I'm too embarrassed to ask a colleague, since it often feels like everyone's familiar with the idea but me.

c.) Understanding usage of math symbols. Authors often use slightly different symbols in expressions, and that trips me up. Rather than guessing and hoping I'm interpreting symbols correctly, I ask ChatGPT. For this, I often just take a screenshot of the expression(s) I'm curious about and ask ChatGPT to explain them without providing other context. Then, if ChatGPT gets the domain and context correct, I feel confident that its explanation of the terms is reasonably trustworthy.

d.) Similar to 'c', but deeper: clarifying content in textbooks or online courses, where I can't or don't want to wait for an instructor or peer response. There are so many frustrating occurrences in educational material where an author glosses over a concept as "being obvious" or "leaves it to the reader as an exercise" to derive. Really chaps my ass - I bought the material to learn efficiently, not to spend hours pulling my hair out deriving a concept that's been omitted. So, I ask ChatGPT. I recently had an epic 30-minute exchange with ChatGPT-4 on ROC curves. There is actually an unspoken assumption that after plotting measured specificity and sensitivity values (between 0 and 1), you must artificially connect the last point to coordinate (1,1), even though a small subset of models will not actually measure a value of (1,1). The discussion with ChatGPT helped me tease out the logic and see that the reason this step is omitted is that it's highly unlikely to have a model with no (1,1) ROC value, but I did confirm my suspicion that something was indeed being omitted, if not actually misunderstood, by educators explaining the concept. Could just be me, but that kind of thing will stick in my craw for years if I let it go unresolved. Bottom line: I think ChatGPT, in this sense, is a good sounding board for reflecting how most everyone in the world "thinks" about a concept (especially very niche ones - not that ROC is niche), and you can indeed have a useful logical argument with it.
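The (1,1) endpoint question above is easy to see in code. Here's a minimal sketch (the labels and scores are made up, purely for illustration): sweeping a threshold over the observed scores alone never produces the (1,1) corner - you only get it by including a threshold below every score, which classifies everything as positive.

```python
# Hypothetical labels (1 = positive) and model scores, for illustration only.
labels = [1, 1, 0, 1, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3]

def roc_point(threshold, labels, scores):
    """Return (FPR, TPR) = (1 - specificity, sensitivity) at a threshold.

    A sample is classified positive when its score >= threshold.
    """
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    pos = sum(labels)
    neg = len(labels) - pos
    return fp / neg, tp / pos

# Sweep thresholds at the observed score values, highest first.
thresholds = sorted(set(scores), reverse=True)
points = [roc_point(t, labels, scores) for t in thresholds]

# The lowest observed score still classifies itself as positive, so the
# curve ends at the point for that threshold - not necessarily (1, 1).
# Appending a threshold below every score pins the endpoint at (1, 1):
# everything is called positive, so TPR = FPR = 1.
points.append(roc_point(0.0, labels, scores))
assert points[-1] == (1.0, 1.0)
```

This is exactly the "artificial" connection to (1,1): it corresponds to the degenerate classifier that labels every sample positive, which exists for any model, which is presumably why most materials skip the step.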

I expect these uses will only improve with time, but it's helping me right now.

I will say that writing prompts comprehensively, with proper syntax, grammar, and punctuation, seems to yield considerably better results. Perhaps that's because a lot of the information archived in literature and white papers is written that way, and ChatGPT is effectively just an echo chamber for archived material. When people use caveman-like "Google Search English" or chat/text slang, the results seem incomplete or off the mark. I speak English, so I'm not sure whether this observation extends to other languages.





