feat: add LLM function calling support #724
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files:

```diff
@@           Coverage Diff            @@
##           master     #724    +/-  ##
========================================
- Coverage   59.46%   59.28%   -0.19%
========================================
  Files          37       39       +2
  Lines        2810     2957     +147
========================================
+ Hits         1671     1753      +82
- Misses       1034     1091      +57
- Partials      105      113       +8
```

☔ View full report in Codecov by Sentry.
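As a sanity check on the coverage table, the hit/miss/partial counts should sum to the line totals, and each coverage percentage should follow from hits ÷ lines (displayed truncated to two decimals). A quick verification sketch, not part of the PR:

```go
package main

import (
	"fmt"
	"math"
)

// truncate2 truncates a percentage to two decimal places,
// matching how the figures in the table above are displayed.
func truncate2(pct float64) float64 {
	return math.Floor(pct*100) / 100
}

func main() {
	// master: 2810 lines = 1671 hits + 1034 misses + 105 partials
	fmt.Println(1671+1034+105 == 2810) // true
	// PR head: 2957 lines = 1753 hits + 1091 misses + 113 partials
	fmt.Println(1753+1091+113 == 2957) // true

	// Coverage = 100 * hits / lines
	fmt.Println(truncate2(100 * 1671.0 / 2810.0)) // 59.46
	fmt.Println(truncate2(100 * 1753.0 / 2957.0)) // 59.28
}
```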
Commits:
- fix: ai provider data race
- update dependencies
- wip: remove app.go
- wip: split interface and service
- wip: read conf
- wip: pass credential to function calling service from api server
- update comment
- fix test
- code comments
- code comments
- rename main.go to ai.go
- service will not be expired in BasicAPIServer
- wip: check finish_reason
- prompt
- remove app_id
- refactor FunctionCallContext
- fix bug
- BasicAPIServer response with struct data and retrieval data
- update example readme
- groq.com style response json key naming
- rename to FunctionCall instead of FunctionCallObject
- follow SSE format
- use api request instead mock for get_exchange_rate example
- typo
- typo
- fix: make lint
- Update README.md
- fix: undo delete doc.go
Description
add LLM function calling support