I agree! Especially now that the data analysis tools have been integrated by default. It even writes and executes code to validate most of its mathy answers. I tried for a few months to find a good physics tutor for my high-school-aged daughter and eventually just started photographing her homework and feeding the photos to GPT-4. I’d ask it to solve the problems and explain its solution to me, then I’d check the answers and teach her myself. It was correct more than 90% of the time over three months, and I relearned high school physics in the process. Even human tutors aren’t always accurate, and in my experience, they also sound confident when they are wrong. Eventually, I decided to just remove the monkey from the machine and got her an account of her own. Almost every day she tells me about something she “finally understands” that she’s been struggling with in class. Her in-class, no-access-to-GPT test scores (after I got her the account) went from the high 50s to the high 80s.
This is basically how I respond to requests myself. Sometimes a single short sentence will cause me to slowly spit out a few words. Other times I can respond instantly to paragraphs of technical information with high accuracy and detailed explanations. There seems to be no way to predict my performance.
Early on, I noticed that if I ask ChatGPT a unique question that might not have been asked before, it'll spit out a response slowly, but repeating the same question results in a much quicker response.
Is it possible that you have a caching system too so that you are able to respond instantly with paragraphs of technical information to some types of requests that you have seen before?
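For what it's worth, the simplest version of that idea is an exact-match response cache keyed by the normalized prompt. Here is a purely hypothetical sketch (nothing OpenAI has confirmed), with a stubbed-out `generate` function standing in for the slow model call:

```python
import time
from functools import lru_cache

def generate(prompt: str) -> str:
    # Stand-in for an expensive, token-by-token model call (purely illustrative).
    time.sleep(2)
    return f"answer to: {prompt}"

def normalize(prompt: str) -> str:
    # Collapse case and whitespace so trivially re-typed repeats still hit the cache.
    return " ".join(prompt.lower().split())

@lru_cache(maxsize=100_000)
def cached_answer(normalized_prompt: str) -> str:
    # First request pays the full generation cost; identical repeats return instantly.
    return generate(normalized_prompt)

def answer(prompt: str) -> str:
    return cached_answer(normalize(prompt))
```

A production system would more plausibly cache on embedding similarity rather than exact text, but either approach would produce exactly the pattern described above: a slow first answer and a near-instant repeat.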
I cannot tell if this comment was made in jest or in earnest.
As far as I understand, the earlier GPT generations required a fixed amount of compute per token inferred.
But given the tremendous load on their systems, I wouldn’t be surprised if OpenAI is playing games with running a smaller model when they predict they can get away with it. (Is there evidence for this?)
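To make "running a smaller model when they predict they can get away with it" concrete, here is a hypothetical sketch of load-based model routing. The heuristic, the model names, and the `call_model` stub are all invented for illustration; there is no public evidence this is what OpenAI actually does:

```python
def call_model(name: str, prompt: str) -> str:
    # Stand-in for the real inference call to whichever model was chosen.
    return f"[{name}] response to: {prompt[:40]}"

def looks_easy(prompt: str) -> bool:
    # Crude stand-in heuristic: short prompts with no obvious code or math markers.
    return len(prompt) < 200 and not any(m in prompt for m in ("def ", "\\int", "SELECT "))

def route(prompt: str, system_load: float) -> str:
    # Under heavy load, shunt "easy" prompts to a smaller, cheaper model;
    # everything else still goes to the large model.
    if system_load > 0.8 and looks_easy(prompt):
        return call_model("small-model", prompt)
    return call_model("large-model", prompt)
```

If something like this were in place, perceived quality would vary with load in exactly the unpredictable way people describe, which is why the theory keeps coming up.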
Every time this happens at the companies I’ve worked at, it’s been because the business didn’t think the need was worth prioritizing over the other business needs. I’ve never been in an environment where IT got to make the decisions about what gets prioritized.
If the business doesn’t think something should be prioritized, but a marketer decides that they really want to have it, I’m not sure it’s a great idea to go around what the business has prioritized. Most of the cost in software comes from maintenance of legacy systems. I hope this new no-code solution is something a department has decided to maintain indefinitely! In my experience, after the marketing person quits, the no-code solution gets chucked across the fence to IT, the no-code provider stops supporting their tool, and IT is forced to build a new one as an emergency a few years later because “the business has been using it for years.”
In a healthy org, if something is truly good for the business and will create global gains across the company, it should hop to the top of the priority list. Admittedly, I’m still hunting for this theoretical healthy org, though.
Usually the marketer is stuck between a rock and a hard place. A semi-technical marketer will report into a completely non-technical org led by a CMO who thinks pivot tables are black magic. But that same CMO has promised the CEO and the board that they can integrate their product analytics into their pipeline performance reports, and have it by next week.
Sure, the marketer can say no, but the CMO doesn't understand why it's a no, because their buddy CMO's marketing team already does it (not understanding the work that went into making it happen at that company). So the CMO interprets the no as a performance issue, not a technical or process issue.
And this happens all the time. So at some point the marketer is stuck choosing between building a one-off report in a spreadsheet and using the tools they have access to (usually something like Zapier, or some marketing automation platform) to build a good-enough solution that gets presented at the board meeting, and everyone is happy.
Play this on repeat with marketing, sales, HR, finance, etc. every month.
The alternative to low-code isn't code; it's Excel spreadsheets.
No time to write a full comment now, but I would compare the fully planned org to a failed communist state, versus the more chaotic capitalism that does better.
I guess this makes sense if you don’t use a Second Brain/PARA/Zettelkasten/etc. system, but my entire life is structured around Obsidian (at home) and OneNote (at work).
If you’re just throwing notes into a note-taking app with no way of processing them, I can see how this would be true, but my system is constantly resurfacing old thoughts, and I make conscious choices about what gets archived and preserved.
I read the full article, and it sounds like the author hasn’t heard of these. Confident article. Not deeply researched in my opinion.
I came here to basically say this same thing. I also use Obsidian with the "Building a Second Brain" system and I can't imagine trying to function without it at this point. Notes can be functional and useful if done well.