How to capture network packets into MySQL
I'm going to design a network analyzer for WiFi (802.11).
Currently I use tshark to capture and parse the WiFi frames, and then pipe the output to a Perl script that stores the parsed information in a MySQL database.
I just found out that I miss a lot of frames in this process. I checked, and the frames seem to be lost in the pipe (when the output is delivered to Perl to get stored in MySQL).
Here is how it goes:
(Tshark) -------frames are lost----> (Perl) --------> (MySQL)
This is how I pipe the output of tshark to the script:
sudo tshark -i mon0 -t ad -T fields -e frame.time -e frame.len -e frame.cap_len -e radiotap.length | perl tshark-sql-capture.pl
This is a simple template of the Perl script I use (tshark-sql-capture.pl):
# preparing the MySQL
my $dns = "DBI:mysql:capture;localhost";
my $dbh = DBI->connect($dns,user,pass);
my $db = "captured";
while (<STDIN>) {
chomp($data = <STDIN>);
($time, $frame_len, $cap_len, $radiotap_len) = split " ", $data;
my $sth = $dbh-> prepare("INSERT INTO $db VALUES (str_to_date('$time','%M %d, %Y %H:%i:%s.%f'), '$frame_len', '$cap_len', '$radiotap_len'\n)" );
$sth->execute;
}
#Terminate MySQL
$dbh->disconnect;
Any idea that can help improve the performance is appreciated, or maybe there is an alternative mechanism that can do better.
Right now my performance is 50%, meaning I can store in MySQL around half of the packets I've captured.
Comments (3)
Things written into a pipe don't get lost. What's probably really going on is that tshark tries to write to the pipe, but Perl+MySQL is too slow to process the input, so the pipe fills up, the write would block, and tshark simply drops the packets.
The bottleneck could be either MySQL or Perl itself, but it's probably the DB. Check CPU usage and measure the insert rate, then pick a faster DB or write to multiple DBs. You can also try batch inserts and increasing the size of the pipe buffer.
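As a rough illustration of the batch-insert suggestion, here is a minimal Perl/DBI sketch (not the answerer's code): it buffers parsed rows and flushes them as one multi-row INSERT per batch. The table name `captured` and the date format come from the question; the batch size of 500, the placeholder credentials, and the tab field separator (tshark's default for -T fields) are assumptions.

#!/usr/bin/perl
# Minimal sketch of the batch-insert idea: buffer rows in memory and flush
# them as one multi-row INSERT per batch, instead of one INSERT per frame.
# Batch size, credentials and the tab separator are assumptions.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("DBI:mysql:capture;host=localhost", "user", "pass",
                       { RaiseError => 1, AutoCommit => 0 });

my $batch_size = 500;
my @batch;

sub flush_batch {
    return unless @batch;
    # Build one INSERT with a placeholder group per buffered row.
    my $row = "(str_to_date(?, '%M %d, %Y %H:%i:%s.%f'), ?, ?, ?)";
    my $sql = "INSERT INTO captured VALUES " . join(",", ($row) x @batch);
    $dbh->prepare($sql)->execute(map { @$_ } @batch);
    $dbh->commit;
    @batch = ();
}

while (my $line = <STDIN>) {
    chomp $line;
    my ($time, $frame_len, $cap_len, $radiotap_len) = split /\t/, $line;
    push @batch, [ $time, $frame_len, $cap_len, $radiotap_len ];
    flush_batch() if @batch >= $batch_size;
}
flush_batch();    # flush whatever is left when tshark exits
$dbh->disconnect;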
Update
while (<STDIN>) { reads a line into $_, and then you ignore it: the chomp($data = <STDIN>) on the next line reads a second line into $data, so every other line from tshark is silently discarded.
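A minimal corrected loop, as a sketch of what the update is pointing at: read each line exactly once into a named variable, and prepare the INSERT a single time with placeholders. The table name, date format and placeholder credentials are taken from the question; the tab separator is tshark's default for -T fields.

#!/usr/bin/perl
# Sketch of a corrected read loop: each line from tshark is read exactly once
# (into $data, not thrown away via $_), and the INSERT statement is prepared
# once with placeholders instead of being re-prepared for every frame.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("DBI:mysql:capture;host=localhost", "user", "pass",
                       { RaiseError => 1 });
my $sth = $dbh->prepare(
    "INSERT INTO captured VALUES (str_to_date(?, '%M %d, %Y %H:%i:%s.%f'), ?, ?, ?)");

while (my $data = <STDIN>) {
    chomp $data;
    my ($time, $frame_len, $cap_len, $radiotap_len) = split /\t/, $data;
    $sth->execute($time, $frame_len, $cap_len, $radiotap_len);
}

$dbh->disconnect;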
For pipe problems, you can improve packet capture with GULP (http://staff.washington.edu/corey/gulp/); see its man page for the details.
You can use a FIFO file, then read the packets from it and insert into MySQL using INSERT DELAYED.
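A minimal sketch of that idea, under the assumption that tshark's output is redirected into a named pipe created beforehand (for example with mkfifo /tmp/capture.fifo, a hypothetical path), and that the target table uses a storage engine that still supports INSERT DELAYED (e.g. MyISAM; the feature was removed in later MySQL versions):

#!/usr/bin/perl
# Sketch of the FIFO approach: a separate reader process drains the named pipe
# that tshark writes into, and uses INSERT DELAYED so the client does not wait
# for each row to be written. FIFO path, table name and credentials are
# placeholders; INSERT DELAYED only works with some storage engines (e.g.
# MyISAM) and has been removed in recent MySQL versions.
use strict;
use warnings;
use DBI;

my $fifo = "/tmp/capture.fifo";    # created beforehand with mkfifo
open(my $in, "<", $fifo) or die "cannot open FIFO $fifo: $!";

my $dbh = DBI->connect("DBI:mysql:capture;host=localhost", "user", "pass",
                       { RaiseError => 1 });
my $sth = $dbh->prepare(
    "INSERT DELAYED INTO captured VALUES (str_to_date(?, '%M %d, %Y %H:%i:%s.%f'), ?, ?, ?)");

while (my $line = <$in>) {
    chomp $line;
    my ($time, $frame_len, $cap_len, $radiotap_len) = split /\t/, $line;
    $sth->execute($time, $frame_len, $cap_len, $radiotap_len);
}

$dbh->disconnect;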