An RPC framework based on Netty in 20 minutes

Netty is a high-performance network transport framework that is widely used as the underlying communication component of RPC frameworks. For example, the Dubbo protocol uses it for inter-node communication, and in Hadoop the Avro component uses it to share data files between nodes. So let's try to use Netty to implement a simple RPC framework.

First of all, just as we would with Dubbo, we abstract the service into an API interface. The service provider implements the methods of this interface, and the service consumer calls the interface directly:

public interface TestService {
    String test(String message);
}

The service provider implements the interface for consumers to call:

public class TestServiceImpl implements TestService {
    @Override
    public String test(String message) {
        System.out.println("Server has received:" + message);
        if (message != null) {
            return "hi client, Server has Received:[" + message + "]";
        } else {
            return "empty message";
        }
    }
}

Next, we use Netty to create the server side of the service:

public class NettyServer {
    public static void startServer(String hostname, int port) {
        EventLoopGroup bossGroup = new NioEventLoopGroup(1);
        EventLoopGroup workerGroup = new NioEventLoopGroup();
        try {
            ServerBootstrap serverBootstrap = new ServerBootstrap();
            serverBootstrap.group(bossGroup, workerGroup)
                    .channel(NioServerSocketChannel.class)
                    .childHandler(new ChannelInitializer<SocketChannel>() {
                        @Override
                        protected void initChannel(SocketChannel ch) throws Exception {
                            ChannelPipeline pipeline = ch.pipeline();
                            pipeline.addLast(new StringDecoder());
                            pipeline.addLast(new StringEncoder());
                            pipeline.addLast(new NettyServerHandler());
                        }
                    });
            ChannelFuture future = serverBootstrap.bind(hostname, port).sync();
            System.out.println("Server start");
            future.channel().closeFuture().sync();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            bossGroup.shutdownGracefully();
            workerGroup.shutdownGracefully();
        }
    }
}

When creating the server, we add Netty's String encoder and decoder to the ChannelPipeline, and finally add the handler that contains our business logic.

Just as Dubbo uses its own Dubbo protocol for its calls, we need to define our own protocol before calling the service. If a received message does not conform to the protocol we defined, it will not be processed. Here we define a very simple protocol that only specifies how a message must begin:

public class Protocol {
    public static final String HEADER = "My#Protocol#Header#";
}
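For example, with this header a request travels over the wire as one plain string, and the message body can be recovered by stripping everything up to the last '#', which is exactly what the server handler below does. The payload used here is just the sample message sent later by the consumer:

String request = Protocol.HEADER + "hi,i am client";
// request is "My#Protocol#Header#hi,i am client"
String body = request.substring(request.lastIndexOf("#") + 1);
// body is "hi,i am client"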

Next, create a server-side handler for the business logic. This new class extends ChannelInboundHandlerAdapter and receives the message sent by the client in its channelRead method. There it checks whether the message starts with our custom protocol header, extracts the message body, invokes the local method, and finally returns the result of the call through writeAndFlush.

public class NettyServerHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
        System.out.println("msg=" + msg);
        // Only process messages that start with our protocol header
        if (msg.toString().startsWith(Protocol.HEADER)) {
            // Strip the header, call the local service, and write the result back
            String result = new TestServiceImpl().test(msg.toString().substring(msg.toString().lastIndexOf("#") + 1));
            ctx.writeAndFlush(result);
        }
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) throws Exception {
        cause.printStackTrace();
        ctx.close();
    }
}

At this point the server is finished, and we can start writing the client. Because the client code is a little more involved, we first write the NettyClientHandler that handles the business logic, and then implement the client's Netty initialization method.

In the handler we need to call the server's service from a separate thread and receive the server's response in channelRead. Therefore, in addition to extending ChannelInboundHandlerAdapter, the handler also implements the Callable interface and overrides its call method.

public class NettyClientHandler extends ChannelInboundHandlerAdapter implements Callable<String> {
    private ChannelHandlerContext context;
    // Result returned by the server
    private String result;
    // Parameter passed in when the client calls the method
    private String param;

    @Override
    public void channelActive(ChannelHandlerContext ctx) throws Exception {
        context = ctx;
    }

    @Override
    public synchronized void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
        result = msg.toString();
        // Wake up the thread waiting in call()
        notify();
    }

    @Override
    public synchronized String call() throws Exception {
        context.writeAndFlush(param);
        // Release the lock and wait until channelRead receives the response
        wait();
        return result;
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) throws Exception {
        cause.printStackTrace();
        ctx.close();
    }

    public void setParam(String param) {
        this.param = param;
    }
}

In the code above, the context field stores the ChannelHandlerContext of the current handler so that the call method can use it to send messages. Once the connection to the server has been established, channelActive is executed first and assigns context.

Note that the synchronized keyword on the call and channelRead methods is essential. When wait() is executed inside call, the lock is released so that channelRead can acquire it. After channelRead has read the message returned by the server, notify() wakes up the thread blocked in call, which then returns the result.
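The same handshake could also be written with a CompletableFuture instead of wait/notify. The sketch below is only an illustration of that alternative (the class name FutureClientHandler and its send method are invented for this example) and is not part of the framework built in this article:

public class FutureClientHandler extends ChannelInboundHandlerAdapter {
    private volatile ChannelHandlerContext context;
    private volatile CompletableFuture<String> pending;

    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        context = ctx;
    }

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        // Completing the future wakes up the caller blocked in send()
        pending.complete(msg.toString());
    }

    public String send(String message) throws Exception {
        pending = new CompletableFuture<>();
        context.writeAndFlush(message);
        // Blocks until channelRead completes the future
        return pending.get();
    }
}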

With NettyClientHandler done, let's move on to NettyClient, the startup class of the Netty client. First we create a thread pool that will later execute the requests. Its size is set to the number of processors available to the JVM.

private static ExecutorService executor
        = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
// The client handler instance, created lazily by initClient() and reused afterwards
private static NettyClientHandler clientHandler;

Because the consumer only calls the interface, we need the proxy pattern to create a proxy object. We therefore create a getProxy method that returns the proxy object and intercepts the method call:

public Object getProxy(final Class<?> serviceClass, final String protocolHead) {
    return Proxy.newProxyInstance(this.getClass().getClassLoader(), new Class<?>[]{serviceClass}, new InvocationHandler() {
        @Override
        public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
            if (clientHandler == null) {
                initClient();
            }
            clientHandler.setParam(protocolHead + args[0]);
            return executor.submit(clientHandler).get();
        }
    });
}

Here the thread pool's submit method is called to submit the task, which in turn executes the call method of the handler to send the request. The args[0] above is the argument of the invocation. The initClient method initializes the Netty client; its code is as follows:

private static void initClient() {
    clientHandler = new NettyClientHandler();
    NioEventLoopGroup group = new NioEventLoopGroup();
    try {
        Bootstrap bootstrap = new Bootstrap();
        bootstrap.group(group)
                .channel(NioSocketChannel.class)
                .option(ChannelOption.TCP_NODELAY, true)
                .handler(
                        new ChannelInitializer<SocketChannel>() {
                            @Override
                            protected void initChannel(SocketChannel ch) throws Exception {
                                ChannelPipeline pipeline = ch.pipeline();
                                pipeline.addLast(new StringDecoder());
                                pipeline.addLast(new StringEncoder());
                                pipeline.addLast(clientHandler);
                            }
                        }
                );
        bootstrap.connect("127.0.0.1", 7000).sync();
        System.out.println("Client start");
    } catch (Exception e) {
        e.printStackTrace();
    }
}

The client's channel pipeline, like the server's, contains the String codec plus our own business-logic handler.

At this point both the client and the server are complete. We create a startup class and start the server first:

public class ProviderBootstrap {
    public static void main(String[] args) {
        NettyServer.startServer("127.0.0.1", 7000);
    }
}

Then start the client:

public class ConsumerBootstrap {
    public static void main(String[] args) {
        NettyClient consumer = new NettyClient();
        TestService proxy = (TestService) consumer.getProxy(TestService.class, Protocol.HEADER);
        String result = proxy.test("hi,i am client");
        System.out.println("result: " + result);
    }
}

 

Finally, let's look at the running results, starting with the service provider.
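Based on the code above, the provider's console output should look something like this:

Server start
msg=My#Protocol#Header#hi,i am client
Server has received:hi,i am client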

The received message starts with our protocol header. After stripping the header, the message body is obtained and passed as the argument of the RPC call to the requested method. Now look at the service consumer side.
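Again based on the code, the consumer's console should print something like:

Client start
result: hi client, Server has Received:[hi,i am client]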

The consumer has received the response from the service provider. With that, a simple RPC framework is complete.

Tags: Netty Dubbo network Hadoop
