My goal for my project is to build a tool that transcribes interviews (e.g., in sales or recruiting) and puts the transcription through ChatGPT (waiting for the API atm) to produce a summary that reads like the notes of the call. Speaker diarization is important so that I don't end up with more than 4000 tokens of input to ChatGPT. I'll see how it goes, but if it's reliable enough (it looks like it so far), it will save the time it takes to write meeting notes and rewrite them to send to someone after the call (hiring managers etc.). Imagine a 10x Otter.ai or something like that.
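For anyone curious, here's a minimal sketch of the summarization step, assuming the transcript is already transcribed and speaker-labeled, and using the existing Completions endpoint (text-davinci-003) as a stand-in until the ChatGPT model ships. The prompt wording, file name, and parameters are all placeholders, not the actual implementation:

```python
import openai  # openai-python < 1.0 style Completions API

openai.api_key = "YOUR_API_KEY"  # placeholder

# Assumes the transcript is already diarized, e.g. "Recruiter: ...\nCandidate: ..."
transcript = open("interview_transcript.txt").read()

prompt = (
    "Summarize the following interview transcript as concise meeting notes, "
    "grouped by speaker, suitable for sending to a hiring manager:\n\n"
    + transcript
)

response = openai.Completion.create(
    model="text-davinci-003",  # stand-in until the ChatGPT-specific model is available
    prompt=prompt,
    max_tokens=500,            # tokens reserved for the summary itself
    temperature=0.3,           # keep the notes factual rather than creative
)

print(response["choices"][0]["text"].strip())
```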
Why are you waiting for the API? The OpenAI Playground has API examples you can copy-paste. You can go over 4000 tokens if you have a business justification and a payment method. You have access to most of their models, even the new Codex ones.
Edit: I looked at your link and had misunderstood. I take it you're waiting for the ChatGPT-specific model?
You are correct that I was incorrect. Thank you for correcting me; I misread their documentation. It sounds like they might increase the token limit in the future, but right now it's 4097 tokens shared between the prompt and the completion.
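A quick way to see how much of that shared 4097-token budget the prompt eats is to count tokens with tiktoken before sending the request. This is just a sketch; the model-to-encoding lookup is my assumption, so check it against whichever model you end up using:

```python
import tiktoken

MODEL_LIMIT = 4097  # prompt and completion share this budget on text-davinci-003

# Example prompt: instructions plus the (diarized) transcript text
prompt = (
    "Summarize the following interview transcript as meeting notes:\n\n"
    + open("interview_transcript.txt").read()
)

enc = tiktoken.encoding_for_model("text-davinci-003")  # assumption: maps to the p50k_base encoding
prompt_tokens = len(enc.encode(prompt))

# Whatever the prompt doesn't use is the most the completion can return.
max_completion = MODEL_LIMIT - prompt_tokens
print(f"Prompt: {prompt_tokens} tokens; at most {max_completion} left for the summary.")
```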