Replies: 5 comments
-
Great idea! I actually had this as part of v1: CleanShot.2023-03-21.at.16.36.49.mp4
Tiptap doesn't support this natively, so I had to build a custom Placeholder component: https://github.com/steven-tey/novel/blob/44878691b9acf237476675c5c6ac8aa9105c8c3b/lib/tiptap/placeholder.ts There were 2 main issues though:
So I ended up scrapping it 😅 If anyone can figure out a stellar implementation for this based on the work I've done before, I'm all ears 😄
-
@steven-tey Thank you for the fast response!
Have you already tried using gpt-3.5-turbo via the Azure OpenAI service? From what I understand from the many threads on the OpenAI forum, many people are experiencing much slower responses from the OpenAI API than from ChatGPT (Plus) using the same model. The current answer in those threads is that the OpenAI API runs on a shared engine while ChatGPT uses an isolated/separate environment. The reason I am bringing up Azure OpenAI is that many posts mention seeing maybe 2x faster response times for the same model, and since the pricing, as far as I can tell, is the same, it's perhaps worth giving it a try.
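For reference, this is roughly the shape of a chat completion request against Azure OpenAI; the resource name, deployment name, and API version below are placeholders for your own Azure setup, and I haven't benchmarked the latency myself. The main differences from the regular OpenAI API are that the model is chosen by the deployment in the URL and that auth uses an api-key header instead of a Bearer token:

```ts
// Placeholders: substitute your own Azure OpenAI resource and deployment names.
const resource = "my-openai-resource";
const deployment = "gpt-35-turbo";

export async function azureChatCompletion(prompt: string): Promise<Response> {
  return fetch(
    `https://${resource}.openai.azure.com/openai/deployments/${deployment}` +
      `/chat/completions?api-version=2023-05-15`,
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Azure authenticates with an api-key header, not `Authorization: Bearer`.
        "api-key": process.env.AZURE_OPENAI_API_KEY ?? "",
      },
      body: JSON.stringify({
        // No `model` field: the deployment in the URL decides which model runs.
        messages: [{ role: "user", content: prompt }],
        max_tokens: 200,
        stream: true,
      }),
    }
  );
}
```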
-
I really love this feature!
-
Gonna have to try Azure OpenAI soon! Any of y'all wanna whip up a PR for this? 👀
-
Would be awesome to have an API for this so I can use my own backend for it!
-
Description
It would be nice if, instead of adding the AI text directly to the document, it were shown as ghost text while you write, similar to how GitHub Copilot works, and required pressing tab to accept the generated text.
Implementation
I don't know how feasible this is using tiptap. I checked whether there is any plugin that allows you to add ghost text, i.e. text that is shown but not selectable and disappears if you don't accept it via tab, but couldn't find any. So this may require creating one; a rough sketch of one possible approach follows at the end of this write-up.
Consideration
Since, similar to GitHub Copilot, I would expect the autocomplete to be rerun if I don't accept the text and continue writing, this could result in a higher number of calls to OpenAI. Therefore, it would probably be good to limit the output tokens.
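For anyone who wants to take a stab at this, here is a minimal sketch of what such an extension could look like, assuming a ProseMirror decoration approach on top of Tiptap. The extension name, plugin key, CSS class, and the choice to clear the suggestion on any document change are all illustrative, not an existing Tiptap API. The part that actually fetches the completion (debounced, with a small max_tokens cap as suggested above) is left out; it would simply dispatch a transaction that sets the plugin's meta to the suggested text.

```ts
import { Extension } from "@tiptap/core";
import { Plugin, PluginKey } from "prosemirror-state";
import { Decoration, DecorationSet } from "prosemirror-view";

// Holds the current (not yet accepted) suggestion string, or null.
const ghostTextKey = new PluginKey<string | null>("ghostText");

export const GhostText = Extension.create({
  name: "ghostText",

  addProseMirrorPlugins() {
    return [
      new Plugin<string | null>({
        key: ghostTextKey,
        state: {
          init: () => null,
          apply(tr, prev) {
            // A transaction can explicitly set or clear the suggestion...
            const meta = tr.getMeta(ghostTextKey);
            if (meta !== undefined) return meta;
            // ...and any edit to the document invalidates the old one.
            return tr.docChanged ? null : prev;
          },
        },
        props: {
          decorations(state) {
            const suggestion = ghostTextKey.getState(state);
            if (!suggestion) return DecorationSet.empty;
            // Render the suggestion as a greyed-out widget at the cursor;
            // it is never part of the document until accepted.
            const widget = Decoration.widget(state.selection.to, () => {
              const span = document.createElement("span");
              span.className = "ghost-text"; // e.g. opacity: 0.4 in CSS
              span.textContent = suggestion;
              return span;
            });
            return DecorationSet.create(state.doc, [widget]);
          },
          handleKeyDown(view, event) {
            const suggestion = ghostTextKey.getState(view.state);
            if (!suggestion) return false;
            if (event.key === "Tab") {
              // Tab accepts the ghost text by inserting it for real.
              view.dispatch(
                view.state.tr
                  .insertText(suggestion, view.state.selection.to)
                  .setMeta(ghostTextKey, null)
              );
              return true;
            }
            if (event.key === "Escape") {
              // Escape dismisses the suggestion.
              view.dispatch(view.state.tr.setMeta(ghostTextKey, null));
              return true;
            }
            return false;
          },
        },
      }),
    ];
  },
});
```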