Ajax autocomplete performance issue: hit the database or filter pre-cached results?
In my existing ASP.NET application, I have an Ajax autocomplete control that uses a web method to get the autocomplete results, which in turn hits a stored procedure. But since hitting the DB every time is costly and a performance issue, we wanted to cache all the lookup results (around 45,000 rows) and use the cached data to filter and return matches.
We are using a LINQ query with the Contains method to check the prefix. But when I filter the cached data, it takes much longer than the original implementation that hits the DB every time.
Is there any approach you can suggest that would give a much quicker result set when the user types into the UI?
I know that maintaining 45k rows and filtering them in memory can be a real pain, and hitting the DB may well be the better approach.
Since we are facing performance issues, please let me know of any better approach.
Does replacing the Ajax autocomplete with the jQuery autocomplete plugin make any difference?
Code:
Just like any other Ajax autocomplete setup:
<ajaxToolkit:AutoCompleteExtender
runat="server"
ID="autoComplete1"
TargetControlID="myTextBox"
ServiceMethod="GetUserList"></ajaxToolkit:AutoCompleteExtender>
[WebMethod]
public string[] GetUserList(string prefix)
{
    return UserManager.GetUserNamesBySearch(prefix);
}

public string[] GetUserNamesBySearch(string prefix)
{
    List<User> userCollection = UserServiceMgr.GetUserList(prefix);
    var filteredUsers = from user in userCollection
                        where user.FirstName.Contains(prefix)
                        select user.FirstName;
    return filteredUsers.ToArray();
}
Thanks in advance
Suri
IMO, 45k+ records is too many to build a cache strategy around.
Given that the "cache key" is a partial string, you will end up with duplicate results in the cache that you then have to deal with.
We also use autocomplete (jQuery) in our ASP.NET MVC application, and we have millions of records in the DB.
We use a stored procedure with Full-Text Search over indexed views on the backend.
And given the amount of data, it performs reasonably well.
So I would suggest optimizing your backend with an FTS provider such as SQL Server or Lucene, rather than using the ASP.NET cache.
FTS is a natural fit for autocomplete, since it has built-in smarts for things like noise words and thesauruses.
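As a rough sketch of the backend side of this approach (the catalog, table, column, and procedure names here are assumptions for illustration, not from the original post), a full-text prefix search in SQL Server might be set up like this:

```sql
-- Assumes a dbo.Users table whose primary key index is named PK_Users.
CREATE FULLTEXT CATALOG UserSearchCatalog;

CREATE FULLTEXT INDEX ON dbo.Users (FirstName)
    KEY INDEX PK_Users            -- the table's unique key index
    ON UserSearchCatalog
    WITH CHANGE_TRACKING AUTO;    -- keep the index updated as rows change
GO

-- CONTAINS with a trailing * does a word-prefix match against the
-- full-text index, so the filtering happens in the engine, not in LINQ.
CREATE PROCEDURE dbo.GetUserNamesByPrefix
    @prefix nvarchar(100)
AS
BEGIN
    DECLARE @term nvarchar(110) = N'"' + @prefix + N'*"';
    SELECT TOP (10) FirstName
    FROM dbo.Users
    WHERE CONTAINS(FirstName, @term)
    ORDER BY FirstName;
END
```

The web method would then call this procedure per keystroke instead of filtering a cached list in memory.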
If your underlying data source isn't going to change very often, you can consider storing the results in the application's Cache and establishing a timeout to refresh the data. If the data changes every so often, you can set a dependency on the source table so that when the table's data gets modified it invalidates the cached object, which you then rebuild. If the data changes often, then I would recommend doing exactly what you're doing now.
When I say "rebuild" the cached object, it is your application's responsibility to test for the presence of the cached object, and to rebuild it and re-establish it in the cache if it is not present.
Keep in mind that if you use a SqlDependency, you must set the database up for it; aspnet_regsql.exe can set up your database. Also, if your application relies on the data being guaranteed accurate, then I would either do what you're doing now or use SqlDependency.
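One reason the in-memory filtering was slower than the DB may be the linear Contains() scan over all 45k names on every keystroke. A minimal sketch of the cached object itself (class and method names are illustrative, not from the post): keep the names sorted once, then answer each keystroke with a binary search followed by a short forward walk.

```csharp
using System;
using System.Collections.Generic;

// Illustrative prefix index: sort once at build time, then each lookup is
// O(log n) to find the first candidate instead of scanning all 45k rows.
public class SortedPrefixIndex
{
    private readonly string[] _names;

    public SortedPrefixIndex(IEnumerable<string> names)
    {
        var sorted = new List<string>(names);
        sorted.Sort(StringComparer.OrdinalIgnoreCase);
        _names = sorted.ToArray();
    }

    public string[] Lookup(string prefix, int maxResults = 10)
    {
        // Binary search for the first name >= prefix; a negative result
        // encodes the insertion point as its bitwise complement.
        int index = Array.BinarySearch(_names, prefix, StringComparer.OrdinalIgnoreCase);
        if (index < 0) index = ~index;

        // Walk forward while names still start with the prefix.
        var results = new List<string>();
        while (index < _names.Length && results.Count < maxResults &&
               _names[index].StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
        {
            results.Add(_names[index]);
            index++;
        }
        return results.ToArray();
    }
}
```

You would build one instance from the stored-procedure results and store it via Cache.Insert with an absolute expiration (or a SqlCacheDependency, as above), rebuilding it on a cache miss.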
Caching Example
SqlDependency
aspnet_regsql.exe