In the Zed editor, GitHub Copilot autocomplete is enabled in the chat assistant, and it’s incredibly useful when I’m iterating on code generations.
The autocomplete is so good that even for non-coding interactions I tend to just use the Zed chat assistant panel (which can be configured to use different LLMs via a dropdown).
More generally, in multi-turn conversations with an LLM you’re often refining things that were said before, and a context-aware autocomplete is very useful there. It should at least be configurable.
macOS’s default Dictation is OK for non-technical things, but for anything code-related it would suck, e.g. if I’m referring to identifiers like MyCustomClass.