Wasm support #102
Thank you 😌 I'm not very familiar with the wasm ecosystem, but it seems tokio has work in progress for it: tokio-rs/tokio#4827. Perhaps a feature flag to switch between tokio and wasm could be an initial starting point for wasm support?
Yes, tokio could be an optional feature.
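One way to sketch the optional-tokio idea (a hypothetical manifest, with made-up feature names, not the crate's actual Cargo.toml): put tokio behind a default-on feature so wasm users can opt out with `default-features = false`.

```toml
# Hypothetical Cargo.toml sketch: tokio behind an optional, default-on feature.
[features]
default = ["native"]
# Native targets pull in tokio.
native = ["dep:tokio"]
# Wasm targets would use browser-friendly async primitives instead.
wasm = ["dep:wasm-bindgen-futures"]

[dependencies]
tokio = { version = "1", features = ["rt", "macros"], optional = true }
wasm-bindgen-futures = { version = "0.4", optional = true }
```

A downstream wasm consumer would then declare the dependency with `default-features = false, features = ["wasm"]`.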
I would love to be able to use a different async solution. I am building a new app and was trying to make it non-async, using this crate to make async calls by blocking with `futures::executor::block_on`, when I realized the tokio requirement.

I think futures-rs would be a great choice for async, as it is also compatible with no_std environments. I would love to have no_std async access to the OpenAI API. https://github.com/rust-lang/futures-rs

My use case is personal devices that connect to the OpenAI API for voice-to-text and then to ChatGPT. I think wasm devs would also appreciate using this crate.

I may try to help here at some point in the near future. I am currently making a bot to make code upgrades automatically using your library, so maybe I will point it in this direction to test it out.
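For reference, the blocking pattern mentioned above does not require a full runtime. This is a minimal sketch of a `block_on`-style single-future executor built with only the standard library (a simplified stand-in for `futures::executor::block_on`, not this crate's code):

```rust
use std::future::Future;
use std::pin::pin;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};
use std::thread::{self, Thread};

/// A waker that unparks the thread that is blocked in `block_on`.
struct ThreadWaker(Thread);

impl Wake for ThreadWaker {
    fn wake(self: Arc<Self>) {
        self.0.unpark();
    }
}

/// Drive a single future to completion on the current thread,
/// similar in spirit to `futures::executor::block_on`.
fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = pin!(fut);
    let waker = Waker::from(Arc::new(ThreadWaker(thread::current())));
    let mut cx = Context::from_waker(&waker);
    loop {
        match fut.as_mut().poll(&mut cx) {
            Poll::Ready(out) => return out,
            // Sleep until the waker unparks us, then poll again.
            Poll::Pending => thread::park(),
        }
    }
}

fn main() {
    // A trivial future that is immediately ready.
    let value = block_on(async { 21 * 2 });
    println!("{value}"); // prints 42
}
```

Note that this only blocks on one future at a time; anything spawned internally by a tokio-based library would still need tokio's runtime, which is exactly the coupling this issue is about.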
Hi @cosmikwolf, it seems that support for a different async executor should be a separate issue?
Hello @64bit, I have little experience with WASM architecture but would like to pick this up in the coming week.
Thank you @cosmikwolf and @Doordashcon for offering to contribute! I'll let you two coordinate on this thread. To consider this resolved we should have at least one working example for AWS Lambda or Cloudflare (or both, if you're feeling adventurous :))
+1. I skimmed through the code searching for it.

Update: see async-openai/async-openai/src/client.rs, line 293 in faaa89f.
Getting started without streaming and file support, but testable through examples, would still be a good first step!
Hi all! If you can help test #120 and/or try it on wasm, that would be great.
Updates from release notes in 0.17.0:
I am maintaining the code for wasm support and I am trying to stabilize the wasm target(s) in main. So perhaps we can discuss a more detailed plan here?

Current State
Implementation Plan (not complete)

If you have something in mind, please make a comment or help out with the implementation.

Tracking List:
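One common pattern for a plan like this (a sketch under assumptions, not this crate's actual manifest) is to select dependencies per target in Cargo.toml, so native builds keep tokio while wasm32 builds get browser-compatible crates:

```toml
# Hypothetical sketch: per-target dependencies in Cargo.toml.
[target.'cfg(not(target_arch = "wasm32"))'.dependencies]
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }

[target.'cfg(target_arch = "wasm32")'.dependencies]
wasm-bindgen-futures = "0.4"
# reqwest can build for wasm32 (via the browser's fetch) when
# native-only default features are disabled.
reqwest = { version = "0.11", default-features = false }
```

Code can then branch with `#[cfg(target_arch = "wasm32")]` where the two stacks diverge.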
Since I want to publish a crate that depends on the WASM feature of this crate,
Thank you for the heads up. I think that's a good way forward to keep it sustainable. Do you plan to make it permanent? If so, you're welcome to link the new crate in the README, and then we can also close the experiments branch and remove the related doc from the README.
Yeah, I've made a PR. Thanks! |
WASM support has a new home: https://github.com/ifsheldon/async-openai-wasm, hence closing this issue.
Thanks for making this crate, it seems very useful :)
Currently the crate always depends on tokio (among other native-only dependencies), which means it can't be compiled to wasm for use in frontends (or serverless wasm workers like those on AWS/Cloudflare) that want to make OpenAI API requests.
It would be great if this crate could also be compiled to wasm.
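For anyone reproducing the problem, a quick way to check whether a crate compiles for wasm (assuming a rustup-managed toolchain, run from the crate's directory):

```shell
# Install the wasm target once.
rustup target add wasm32-unknown-unknown
# Type-check the crate for wasm without producing a final artifact.
cargo check --target wasm32-unknown-unknown
```

With the current tokio dependency this check is expected to fail.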