I use LLMs as a sounding board, a rubber duck of sorts, for logical, creative, or reasoning tasks all the time: they offer different points of view that make ME think about a problem differently. They make plenty of mistakes, but if you know the field, those are usually (though not always) easy to spot and brush off.
Whether that's worth the environmental or social cost, of course, remains open for debate.
Reasoning is the manipulation and transformation of symbols (i.e. stand-ins for real objects) by well-defined rules, often with the object of finding equivalences or other classes of relationships between seemingly unrelated things.
For example, logical reasoning applies common logical transformations to propositions to determine the truth relationship between statements. Spatial reasoning applies spatial transformations (rotation, translation, sometimes slight deformation) to shapes to determine their spatial relationship, such as "can I fit this couch through that doorway if I rotate it the right way?"
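The couch-and-doorway case can be sketched as a toy computation. This is my own simplified illustration, not anything from the text: I assume a rigid 2D rectangle passing straight through a gap in a thin wall, where rotating the couch lets you present its narrower side.

```python
# Toy 2D model (an assumption for illustration): a rigid rectangular couch
# can be rotated freely, so it passes through a doorway gap exactly when
# its smaller side is no wider than the gap.
def fits_through(width: float, height: float, gap: float) -> bool:
    return min(width, height) <= gap

print(fits_through(2.0, 0.9, 1.0))  # True: turn it onto the 0.9 m side
print(fits_through(2.0, 1.2, 1.0))  # False: no rotation helps
```

Real spatial reasoning is harder (three dimensions, corridors with corners), but the structure is the same: apply allowed transformations and check the resulting relationship.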
Reasoning has the property that a valid rule applied to true premises always produces a true conclusion.
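That validity property can be checked mechanically for propositional logic. Below is a minimal sketch of my own (names like `is_valid` are illustrative, not from the text): a rule is valid when no truth assignment makes every premise true and the conclusion false, so brute-forcing all assignments confirms modus ponens.

```python
from itertools import product

def is_valid(premises, conclusion, variables):
    """Check validity by enumerating every truth assignment: the rule is
    valid iff no assignment makes all premises true and the conclusion false."""
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # counterexample found
    return True

# Modus ponens: from p and (p implies q), conclude q.
premises = [
    lambda e: e["p"],                  # p
    lambda e: (not e["p"]) or e["q"],  # p -> q
]
conclusion = lambda e: e["q"]          # q

print(is_valid(premises, conclusion, ["p", "q"]))  # True
```

Running the same check on an invalid rule, such as affirming the consequent (from q and p implies q, conclude p), finds the counterexample p=False, q=True and returns False.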