I feel like this could really take off if it were presented through some kind of wacky, friendly front end... maybe some kind of animated assistant, like a cute dog, or something office-related... a stapler? No, something more dynamic yet mundane, like a Paperclip!
The problem isn't writing formulas. The issue is that large, complex XL spreadsheets become virtually impossible to maintain because formulas end up spread all over the place, sometimes across sheets. I have complex workbooks from 15+ years ago that fall into this category, even though I go out of my way to document as much as possible of what goes into these monsters.
The gap XL has yet to bridge (officially; there may be projects out there, but those won't establish a standard) is a translation layer to something that looks like real code, with the flexibility and maintainability of real code.
There are other issues, of course. For example, there is no easy way to create test suites for complex spreadsheets. Imagine a complex solver of some sort that goes through generations of updates by different people. Imagine having no way to run a quick set of standardized tests to ensure your latest rocket calculations or investment crystal-ball spreadsheet isn't going to crash the rocket into a mountain or collapse the portfolio because of a coding error.
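There's no standard answer here, but you can get part of the way with ordinary tooling. Below is a minimal sketch (hypothetical workbook, sheet, and cell names) of a regression check over a saved workbook using openpyxl; with data_only=True it only reads the values Excel last cached, so it verifies saved outputs rather than re-evaluating formulas, which is still enough to catch a silently broken cell between revisions.

    # Minimal spreadsheet regression test. Workbook path, sheet, and cell
    # addresses are hypothetical; adjust to your own "monster" workbook.
    from openpyxl import load_workbook

    EXPECTED = {
        "Results!B2": 42.0,      # e.g. total thrust (hypothetical)
        "Results!B7": 0.0731,    # e.g. portfolio variance (hypothetical)
    }

    def test_cached_outputs(path="solver.xlsx", tol=1e-9):
        # data_only=True returns the last values Excel computed and saved,
        # not the formulas themselves.
        wb = load_workbook(path, data_only=True)
        for ref, want in EXPECTED.items():
            sheet, cell = ref.split("!")
            got = wb[sheet][cell].value
            assert got is not None, f"{ref} has no cached value; re-save the file in Excel"
            assert abs(got - want) <= tol, f"{ref}: expected {want}, got {got}"

    if __name__ == "__main__":
        test_cached_outputs()
        print("all spreadsheet checks passed")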
The widespread use of spreadsheet environments by billions of users presents a unique opportunity for formula-authoring assistance. Although large language models, such as Codex, can assist in general-purpose languages, they are expensive to train and challenging to deploy due to their large model sizes (up to billions of parameters). Moreover, they require hundreds of gigabytes of training data. We present FLAME, a T5-based model trained on Excel formulas that leverages domain insights to achieve competitive performance with a substantially smaller model (60M parameters) and two orders of magnitude less training data. We curate a training dataset using sketch deduplication, introduce an Excel-specific formula tokenizer for our model, and use domain-specific versions of masked span prediction and noisy auto-encoding as pretraining objectives. We evaluate FLAME on formula repair, formula auto-completion, and a novel task called syntax reconstruction. FLAME (60M) can outperform much larger models, such as Codex-Davinci (175B), Codex-Cushman (12B), and CodeT5 (220M), in 6 out of 10 settings.
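To make the "Excel-specific formula tokenizer" point concrete: a generic code tokenizer happily splits things like B100 or 'My Sheet'! into meaningless pieces. Here is a toy, regex-based illustration of formula-aware tokenization; it is emphatically not the tokenizer from the paper, just a sketch of the idea.

    import re

    # Toy formula tokenizer: keeps string literals, sheet references, cell
    # references, numbers, and function/defined names as single tokens.
    TOKEN_RE = re.compile(r"""
        "(?:[^"]|"")*"              |  # string literal, e.g. ">10"
        '[^']+'!                    |  # quoted sheet prefix, e.g. 'My Sheet'!
        \$?[A-Z]{1,3}\$?[0-9]+      |  # cell reference, e.g. B2, $A$10
        [0-9]+(?:\.[0-9]+)?         |  # number
        [A-Za-z_][A-Za-z0-9_.]*     |  # function or defined name, e.g. SUMIF
        <>|>=|<=|[-+*/^&%=<>(),:;!]    # operators and punctuation
    """, re.VERBOSE)

    def tokenize(formula):
        """Split an Excel formula into coarse, formula-aware tokens."""
        body = formula[1:] if formula.startswith("=") else formula
        return TOKEN_RE.findall(body)

    print(tokenize('=SUMIF(Sheet2!A:A, ">10", B1:B100)'))
    # ['SUMIF', '(', 'Sheet2', '!', 'A', ':', 'A', ',', '">10"',
    #  ',', 'B1', ':', 'B100', ')']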
What a silly path to walk down - rather than cleaning up the hopelessly messy and user-unfriendly programming language embedded inside Excel, create an elaborate AI assistant that can write even more messy formula code.
I don't need help fixing my formulas. The formulas I write are fine (possibly thanks to Excel's brevity and immediate feedback). The hard part is writing them in the first place.
I would love a model that would translate natural language like "Sum up A1231 and upwards until you hit a blank cell" into a formula.
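For that particular request there is an old-school workaround: build the range with INDEX and a MAX(IF(...)) lookup of the last blank above the target row. Here is a rough Python sketch of the intended semantics, with one possible (untested) legacy array formula in a comment; the cell addresses are just the ones from the example above.

    def sum_up_until_blank(column, end_row):
        """Sum column values from end_row upward, stopping at the first blank.

        `column` maps 1-based row numbers to cell values (None = blank); this
        only pins down what the natural-language request should mean.
        """
        total = 0
        row = end_row
        while row >= 1 and column.get(row) is not None:
            total += column[row]
            row -= 1
        return total

    # One possible Excel equivalent (untested; legacy array formula entered
    # with Ctrl+Shift+Enter): find the last blank above row 1231, then sum
    # from the row below it down to A1231:
    #   =SUM(INDEX(A:A, MAX(IF(A1:A1230="", ROW(A1:A1230))) + 1):A1231)

    print(sum_up_until_blank({1228: 2, 1229: 3, 1230: None, 1231: 5}, 1231))  # -> 5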
That said, this model is a useful stepping stone in showing what's possible with limited resources. I am happy people are working on it.
My regions are dynamic, so they wouldn't help me. In the example above, I enter daily work logs under a functional table header, with dates separated by a blank line.
Ctrl+Shift+; is perfect for entering start and end times. I then compute the daily total, which is what I bill. It has been better than any manual time tracker I've tried.
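The daily-total arithmetic is nothing fancy, just end minus start per entry; a small sketch with hypothetical log entries (not my actual sheet):

    from datetime import datetime

    # Hypothetical start/end pairs for one day, as entered with Ctrl+Shift+;
    # (which inserts the current time into the active cell).
    entries = [("09:05", "12:30"), ("13:15", "17:40")]

    def billable_hours(pairs, fmt="%H:%M"):
        seconds = sum(
            (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()
            for start, end in pairs
        )
        return seconds / 3600

    print(f"{billable_hours(entries):.2f} h")  # -> 7.83 h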
Compared with ActivityWatch, which I also use in case I forget to enter a time, I see roughly 5-10% error.