
How can I set gRPC message size #4192

Open
dyjfighting opened this issue Jun 16, 2024 · 3 comments

Comments

@dyjfighting

In the rpc server:

s := zrpc.MustNewServer(c.RpcServerConf,
	func(grpcServer *grpc.Server) {
		//grpcServer.SetOption(grpc.MaxSendMsgSize(maxMessageSize))
		userCenterClient.RegisterUserServiceServer(grpcServer, userserviceServer.NewUserServiceServer(ctx))
		userCenterClient.RegisterShopServiceServer(grpcServer, shopserviceServer.NewShopServiceServer(ctx))
		userCenterClient.RegisterSubmitTaskTempServiceServer(grpcServer, submittasktempserviceServer.NewSubmitTaskTempServiceServer(ctx))
		userCenterClient.RegisterSubmitTaskServiceServer(grpcServer, submittaskserviceServer.NewSubmitTaskServiceServer(ctx))
		userCenterClient.RegisterShopGoodsLibraryServiceServer(grpcServer, shopgoodslibraryserviceServer.NewShopGoodsLibraryServiceServer(ctx))
		userCenterClient.RegisterFilterFilesServiceServer(grpcServer, filterfilesserviceServer.NewFilterFilesServiceServer(ctx))

		if c.Mode == service.DevMode || c.Mode == service.TestMode {
			reflection.Register(grpcServer)
		}
	})
//s.AddOptions(serverOpts...)
s.AddOptions(grpc.MaxRecvMsgSize(maxMessageSize), grpc.MaxSendMsgSize(maxMessageSize))
defer s.Stop()

But I still get this error:

rpc error: code = ResourceExhausted desc = grpc: received message larger than max (10591746 vs. 4194304)
@kevwan
Contributor

kevwan commented Jun 22, 2024

Set the option on both the server side and the client side.

@liuwen766

liuwen766 commented Nov 25, 2024

How do I set the option on both the server and client sides? Would you please give an example?

@kevwan
Contributor

kevwan commented Jan 28, 2025

I see you're trying to set the gRPC message size limits in go-zero. The error you're getting (grpc: received message larger than max (10591746 vs. 4194304)) indicates that your message size exceeds the default 4MB limit.

You're on the right track with using MaxRecvMsgSize and MaxSendMsgSize, but there are a few ways to set these options in go-zero:

  1. Using AddOptions after server creation (as you're doing):
s := zrpc.MustNewServer(c.RpcServerConf, ...)
s.AddOptions(
    grpc.MaxRecvMsgSize(maxMessageSize),
    grpc.MaxSendMsgSize(maxMessageSize),
)
  2. Alternatively, you can set these options directly in your RpcServerConf configuration:
type Config struct {
    zrpc.RpcServerConf
}

c.RpcServerConf.MaxMessageSize = 20 * 1024 * 1024  // Set to 20MB for example

A few important notes:

  • Make sure to set an appropriate value for maxMessageSize that meets your needs but doesn't consume too much memory
  • Both client and server need to be configured with compatible message size limits
  • Consider if you really need such large messages, as they might impact performance

For reference, here's a complete example:

const maxMessageSize = 20 * 1024 * 1024  // 20MB

s := zrpc.MustNewServer(c.RpcServerConf,
    func(grpcServer *grpc.Server) {
        // Your service registrations...
        userCenterClient.RegisterUserServiceServer(grpcServer, userserviceServer.NewUserServiceServer(ctx))
        // ... other registrations ...

        if c.Mode == service.DevMode || c.Mode == service.TestMode {
            reflection.Register(grpcServer)
        }
    })

// Set both MaxRecvMsgSize and MaxSendMsgSize
s.AddOptions(
    grpc.MaxRecvMsgSize(maxMessageSize),
    grpc.MaxSendMsgSize(maxMessageSize),
)

This should resolve your "ResourceExhausted" error while maintaining the rest of your server configuration.
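For the client side, a minimal sketch of the matching configuration, assuming the client is built with zrpc.MustNewClient, go-zero's zrpc.WithDialOption, and grpc-go's default call options; the config field name c.UserCenterRpc is hypothetical, use whatever RpcClientConf your service actually has:

const maxMessageSize = 20 * 1024 * 1024 // 20MB, keep in sync with the server limit

// Apply the message size limits to every call made through this client connection.
client := zrpc.MustNewClient(
    c.UserCenterRpc, // hypothetical RpcClientConf field from your config
    zrpc.WithDialOption(
        grpc.WithDefaultCallOptions(
            grpc.MaxCallRecvMsgSize(maxMessageSize),
            grpc.MaxCallSendMsgSize(maxMessageSize),
        ),
    ),
)

// Then create the generated stub from the underlying connection as usual.
userService := userCenterClient.NewUserServiceClient(client.Conn())

Without the client-side call options, the client still rejects responses larger than the default 4MB, even if the server accepts them.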
