CentOS 7: port 9999 is already in use — how do I find and kill the process occupying it? The DataX job below fails to bind:
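Before killing anything, it is worth confirming which process actually holds the port. A minimal sketch for CentOS 7 (assumes `ss` from the iproute2 package is installed; the job config below uses UDP, so the filter targets UDP sockets — the port number 9999 is taken from the log):

```shell
PORT=9999
# -l listening sockets, -u UDP only, -n numeric ports, -p show the owning process (needs root)
if command -v ss >/dev/null 2>&1; then
    ss -lunp "sport = :$PORT"
else
    # fallback for older systems that only ship net-tools
    netstat -lunp 2>/dev/null | grep ":$PORT " || echo "no UDP listener on port $PORT"
fi
```

The `-p` column shows the PID and program name of whatever is bound to the port, which is the process you would then kill.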
SyslogReader$Job - port = 9999
2019-10-10 11:03:54.187 [job-0] INFO SyslogReader$Job - proxyPort = 30101
2019-10-10 11:03:54.187 [job-0] INFO SyslogReader$Job - fieldsCount=1
2019-10-10 11:03:54.187 [job-0] INFO SyslogReader$Job - fieldsName =1
2019-10-10 11:03:54.187 [job-0] INFO SyslogReader$Job - parseRule = split
2019-10-10 11:03:54.187 [job-0] INFO SyslogReader$Job - parseExpression = ,
2019-10-10 11:03:54.187 [job-0] INFO SyslogReader$Job - printOrder =1
2019-10-10 11:03:54.187 [job-0] INFO SyslogReader$Job - protocolName = UDP
2019-10-10 11:03:54.217 [job-0] WARN UnstructuredStorageWriterUtil - 您的encoding配置为空, 将使用默认值[UTF-8]
2019-10-10 11:03:54.217 [job-0] INFO JobContainer - jobContainer starts to do prepare ...
2019-10-10 11:03:54.218 [job-0] INFO JobContainer - DataX Reader.Job [syslogreader] do prepare work .
2019-10-10 11:03:54.218 [job-0] INFO JobContainer - DataX Writer.Job [txtfilewriter] do prepare work .
2019-10-10 11:03:54.218 [job-0] INFO TxtFileWriter$Job - 由于您配置了writeMode append, 写入前不做清理工作, [/root/slog] 目录下写入相应文件名前缀 [22] 的文件
2019-10-10 11:03:54.218 [job-0] INFO JobContainer - jobContainer starts to do split ...
2019-10-10 11:03:54.218 [job-0] INFO JobContainer - Job set Channel-Number to 1 channels.
2019-10-10 11:03:54.219 [job-0] INFO JobContainer - DataX Reader.Job [syslogreader] splits to [1] tasks.
2019-10-10 11:03:54.219 [job-0] INFO TxtFileWriter$Job - begin do split...
2019-10-10 11:03:54.229 [job-0] INFO TxtFileWriter$Job - splited write file name:[22__19269ef8_6c12_42b2_9e73_675796dc99c6]
2019-10-10 11:03:54.229 [job-0] INFO TxtFileWriter$Job - end do split.
2019-10-10 11:03:54.229 [job-0] INFO JobContainer - DataX Writer.Job [txtfilewriter] splits to [1] tasks.
2019-10-10 11:03:54.241 [job-0] INFO JobContainer - jobContainer starts to do schedule ...
2019-10-10 11:03:54.244 [job-0] INFO JobContainer - Scheduler starts [1] taskGroups.
2019-10-10 11:03:54.245 [job-0] INFO JobContainer - Running by standalone Mode.
2019-10-10 11:03:54.250 [taskGroup-0] INFO TaskGroupContainer - taskGroupId=[0] start [1] channels for [1] tasks.
2019-10-10 11:03:54.252 [taskGroup-0] INFO Channel - Channel set byte_speed_limit to -1, No bps activated.
2019-10-10 11:03:54.252 [taskGroup-0] INFO Channel - Channel set record_speed_limit to -1, No tps activated.
2019-10-10 11:03:54.260 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[0] attemptCount[1] is started
2019-10-10 11:03:54.262 [0-0-0-writer] INFO TxtFileWriter$Task - begin do write...
2019-10-10 11:03:54.262 [0-0-0-writer] INFO TxtFileWriter$Task - write to file : [/root/slog/22__19269ef8_6c12_42b2_9e73_675796dc99c6]
2019-10-10 11:03:54.396 [0-0-0-reader] WARN Bootstrap - Unknown channel option: SO_BACKLOG=128
2019-10-10 11:03:54.406 [0-0-0-reader] ERROR ReaderRunner - Reader runner Received Exceptions:
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method) ~[na:1.8.0_191]
at sun.nio.ch.Net.bind(Net.java:433) ~[na:1.8.0_191]
at sun.nio.ch.DatagramChannelImpl.bind(DatagramChannelImpl.java:691) ~[na:1.8.0_191]
at sun.nio.ch.DatagramSocketAdaptor.bind(DatagramSocketAdaptor.java:91) ~[na:1.8.0_191]
at io.netty.channel.socket.nio.NioDatagramChannel.doBind(NioDatagramChannel.java:191) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:554) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1237) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.ChannelHandlerInvokerUtil.invokeBindNow(ChannelHandlerInvokerUtil.java:109) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.DefaultChannelHandlerInvoker.invokeBind(DefaultChannelHandlerInvoker.java:214) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.PausableChannelEventExecutor.invokeBind(PausableChannelEventExecutor.java:101) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:1013) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:236) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:357) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:328) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
at io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126) ~[netty-all-5.0.0.Alpha2.jar:5.0.0.Alpha2]
Exception in thread "taskGroup-0" com.alibaba.datax.common.exception.DataXException: Code:[Framework-13], Description:[DataX插件运行时出错, 具体原因请参看DataX运行结束时的错误诊断信息 .]. - java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.DatagramChannelImpl.bind(DatagramChannelImpl.java:691)
at sun.nio.ch.DatagramSocketAdaptor.bind(DatagramSocketAdaptor.java:91)
at io.netty.channel.socket.nio.NioDatagramChannel.doBind(NioDatagramChannel.java:191)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:554)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1237)
at io.netty.channel.ChannelHandlerInvokerUtil.invokeBindNow(ChannelHandlerInvokerUtil.java:109)
at io.netty.channel.DefaultChannelHandlerInvoker.invokeBind(DefaultChannelHandlerInvoker.java:214)
at io.netty.channel.PausableChannelEventExecutor.invokeBind(PausableChannelEventExecutor.java:101)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:1013)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:236)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:328)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
at io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
at io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
at io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
at io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
at io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
at io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
- java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.DatagramChannelImpl.bind(DatagramChannelImpl.java:691)
at sun.nio.ch.DatagramSocketAdaptor.bind(DatagramSocketAdaptor.java:91)
at io.netty.channel.socket.nio.NioDatagramChannel.doBind(NioDatagramChannel.java:191)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:554)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1237)
at io.netty.channel.ChannelHandlerInvokerUtil.invokeBindNow(ChannelHandlerInvokerUtil.java:109)
at io.netty.channel.DefaultChannelHandlerInvoker.invokeBind(DefaultChannelHandlerInvoker.java:214)
at io.netty.channel.PausableChannelEventExecutor.invokeBind(PausableChannelEventExecutor.java:101)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:1013)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:236)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:328)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
at io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
at io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
at io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
at io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
at io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
at io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
at com.alibaba.datax.common.exception.DataXException.asDataXException(DataXException.java:40)
at com.alibaba.datax.core.taskgroup.TaskGroupContainer.start(TaskGroupContainer.java:195)
at com.alibaba.datax.core.taskgroup.runner.TaskGroupContainerRunner.run(TaskGroupContainerRunner.java:24)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
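The root cause is the repeated `java.net.BindException: Address already in use`: another process is already bound to UDP port 9999, so the syslogreader cannot bind to it. One way to free the port on CentOS 7 is sketched below (assumes `lsof` is installed; it tries a plain `kill` first and only falls back to `kill -9`):

```shell
PORT=9999
# lsof -t prints only PIDs; -iUDP:$PORT matches sockets bound to that UDP port
PID=$(lsof -t -iUDP:"$PORT" 2>/dev/null || true)
if [ -n "$PID" ]; then
    echo "UDP port $PORT is held by PID(s): $PID"
    kill $PID                           # ask the process to exit gracefully
    sleep 1
    kill -9 $PID 2>/dev/null || true    # force only if it is still alive
else
    echo "no process is holding UDP port $PORT"
fi
```

If the psmisc package is available, `fuser -k 9999/udp` is a one-line alternative that sends SIGKILL to every process using the port. After the port is free, re-running the DataX job should get past the bind step.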