I have to implement a server that handles the following protocol over an Ethernet (TCP) connection:
Establishing a connection
The client connects to the configured server via TCP/IP.
After the connection has been established, the client initially sends a heartbeat message to the server:
{
    "MessageID": "Heartbeat"
}
Response:
{
    "ResponseCode": "Ok"
}
Communication process
To keep the connection alive, the client sends a heartbeat message every 10 seconds while it is otherwise inactive.
Server and client must close the connection if they have not received any message for more than 20 seconds.
A response to a request must be sent within 5 seconds; if no response is received in time, the connection must also be closed.
The protocol contains no message numbering or any other form of identification.
The communication partner sending the responses makes sure that they are in the same sequence as the requests.
Message structure:
The messages are embedded in an STX-ETX frame:
STX (0x02) message ETX (0x03)
Escaping of STX and ETX within the message is not necessary, since the payload is JSON and control characters are escaped there anyway:
JSON.stringify({"a": "\x02\x03\x10"}) → "{\"a\":\"\u0002\u0003\u0010\"}"
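The same escaping can be observed on the Java side with Jackson (which Spring Boot ships by default); this little demo is my own illustration, not part of the protocol specification:
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Map;

public class EscapingDemo {
    public static void main(String[] args) throws Exception {
        // Jackson escapes control characters such as STX/ETX as \uXXXX, so the framing
        // bytes can never occur inside a serialized JSON payload.
        String json = new ObjectMapper().writeValueAsString(Map.of("a", "\u0002\u0003\u0010"));
        System.out.println(json); // prints {"a":"\u0002\u0003\u0010"}
    }
}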
Not only heartbeat messages are used; a typical business message looks like this:
{
    "MessageID": "CheckAccess",
    "Parameters": {
        "MediaType": "type",
        "MediaData": "data"
    }
}
And the appropriate response:
{
    "ResponseCode": "some-code",
    "DisplayMessage": "some-message",
    "SessionID": "some-id"
}
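For reference, a pair of hypothetical DTOs mapping these payloads could look like the following sketch; the Java names are my own, only the JSON property names come from the protocol:
import com.fasterxml.jackson.annotation.JsonProperty;
import java.util.Map;

// Hypothetical request/response DTOs for the CheckAccess exchange.
record CheckAccessRequest(@JsonProperty("MessageID") String messageId,
                          @JsonProperty("Parameters") Map<String, String> parameters) {}

record CheckAccessResponse(@JsonProperty("ResponseCode") String responseCode,
                           @JsonProperty("DisplayMessage") String displayMessage,
                           @JsonProperty("SessionID") String sessionId) {}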
It should be a multi-client server, and the protocol does not carry any client identification.
However, we have to identify each client at least by the IP address its messages come from.
I could not find a solution for adding such a server to a Spring Boot application, starting it on application startup, and handling its input and output logic.
Any suggestions are highly appreciated.
Solution
I configured the following TCP server:
@Slf4j
@Component
@RequiredArgsConstructor
public class TCPServer {

    private final InetSocketAddress hostAddress;
    private final ServerBootstrap serverBootstrap;

    private Channel serverChannel;

    @PostConstruct
    public void start() {
        try {
            ChannelFuture serverChannelFuture = serverBootstrap.bind(hostAddress).sync();
            log.info("Server is STARTED : port {}", hostAddress.getPort());
            // Keep a reference to the bound channel. Do not sync() on closeFuture() here,
            // or @PostConstruct would block application startup until the server shuts down.
            serverChannel = serverChannelFuture.channel();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    @PreDestroy
    public void stop() {
        if (serverChannel != null) {
            // Closing the server channel stops accepting new connections; the event loop
            // groups are shut down by their own bean destroyMethod ("shutdownGracefully").
            serverChannel.close();
        }
    }
}
@PostConstruct launches the server during application startup.
And the configuration for it:
@Configuration
@RequiredArgsConstructor
@EnableConfigurationProperties(NettyProperties.class)
public class NettyConfiguration {

    private final LoggingHandler loggingHandler = new LoggingHandler(LogLevel.DEBUG);
    private final NettyProperties nettyProperties;

    @Bean(name = "serverBootstrap")
    public ServerBootstrap bootstrap(SimpleChannelInitializer initializer) {
        ServerBootstrap bootstrap = new ServerBootstrap();
        bootstrap.group(bossGroup(), workerGroup())
                .channel(NioServerSocketChannel.class)
                .handler(loggingHandler)
                .childHandler(initializer);
        bootstrap.option(ChannelOption.SO_BACKLOG, nettyProperties.getBacklog());
        bootstrap.childOption(ChannelOption.SO_KEEPALIVE, nettyProperties.isKeepAlive());
        return bootstrap;
    }

    @Bean(destroyMethod = "shutdownGracefully")
    public NioEventLoopGroup bossGroup() {
        return new NioEventLoopGroup(nettyProperties.getBossCount());
    }

    @Bean(destroyMethod = "shutdownGracefully")
    public NioEventLoopGroup workerGroup() {
        return new NioEventLoopGroup(nettyProperties.getWorkerCount());
    }

    @Bean
    public InetSocketAddress tcpSocketAddress() {
        return new InetSocketAddress(nettyProperties.getTcpPort());
    }
}
Initialization logic:
@Component
@RequiredArgsConstructor
public class SimpleChannelInitializer extends ChannelInitializer<SocketChannel> {

    private final StringEncoder stringEncoder = new StringEncoder();
    private final StringDecoder stringDecoder = new StringDecoder();
    private final QrReaderProcessingHandler readerServerHandler;
    private final NettyProperties nettyProperties;

    @Override
    protected void initChannel(SocketChannel socketChannel) {
        ChannelPipeline pipeline = socketChannel.pipeline();
        pipeline.addLast(new DelimiterBasedFrameDecoder(1024 * 1024, Delimiters.lineDelimiter()));
        pipeline.addLast(new ReadTimeoutHandler(nettyProperties.getClientTimeout()));
        pipeline.addLast(stringDecoder);
        pipeline.addLast(stringEncoder);
        pipeline.addLast(readerServerHandler);
    }
}
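Note that the protocol actually frames messages with STX/ETX rather than line endings; the lineDelimiter() decoder above is a simplification that happens to work with telnet. If needed, the frame-decoder line inside initChannel could be swapped for an ETX-based delimiter along these lines (an untested sketch of mine, not part of the original setup):
// Frame on ETX (0x03) instead of line endings; the leading STX (0x02) of each frame
// can then be stripped in the business handler before parsing the JSON.
pipeline.addLast(new DelimiterBasedFrameDecoder(1024 * 1024, true,
        Unpooled.wrappedBuffer(new byte[]{0x03})));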
Properties configuration:
@Getter
@Setter
// @Validated enables the JSR-303 constraints below on this @ConfigurationProperties class.
@Validated
@ConfigurationProperties(prefix = "netty")
public class NettyProperties {

    // @Size applies to strings/collections; a numeric range needs @Min/@Max.
    @Min(1000)
    @Max(65535)
    private int tcpPort;

    @Min(1)
    private int bossCount;

    @Min(2)
    private int workerCount;

    private boolean keepAlive;

    private int backlog;

    private int clientTimeout;
}
And a snippet from application.yml:
netty:
  tcp-port: 9090
  boss-count: 1
  worker-count: 14
  keep-alive: true
  backlog: 128
  client-timeout: 20
And the handler is quite trivial.
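For completeness, a rough sketch of what such a handler could look like (my own sketch, not the production code): it identifies the client by its IP address, which is the only identification the protocol allows, and answers heartbeats.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.netty.channel.ChannelHandler;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import java.net.InetSocketAddress;

@Slf4j
@Component
@RequiredArgsConstructor
// Sharable is required because the same singleton handler is added to every client's pipeline.
@ChannelHandler.Sharable
public class QrReaderProcessingHandler extends SimpleChannelInboundHandler<String> {

    private final ObjectMapper objectMapper;

    @Override
    protected void channelRead0(ChannelHandlerContext ctx, String msg) throws Exception {
        // The protocol carries no client identifier, so the remote address is the only
        // handle we have for telling clients apart.
        InetSocketAddress client = (InetSocketAddress) ctx.channel().remoteAddress();
        log.info("Message from {}: {}", client.getAddress().getHostAddress(), msg);

        JsonNode request = objectMapper.readTree(msg);
        if ("Heartbeat".equals(request.path("MessageID").asText())) {
            ctx.writeAndFlush("{\"ResponseCode\":\"Ok\"}\n");
        }
        // CheckAccess handling would go here.
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
        log.warn("Closing connection to {}", ctx.channel().remoteAddress(), cause);
        ctx.close();
    }
}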
I checked it locally by running the following in a console:
telnet localhost 9090
It works fine there, and I hope it will also be fine when accessed by real clients.
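For a slightly more realistic check than telnet, a throwaway client along these lines (again just a sketch, assuming the line-delimited pipeline shown above) can send a heartbeat and print the response:
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class HeartbeatClient {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("localhost", 9090);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true, StandardCharsets.UTF_8);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8))) {
            // Newline-terminated to match the line-delimited framing of the server pipeline.
            out.println("{\"MessageID\":\"Heartbeat\"}");
            System.out.println("Response: " + in.readLine());
        }
    }
}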