Spark View Engine: How to set the default master page name?
I use Spark View Engine with nested master pages. I have Application.spark which defines the basic layout of the website. Then there are several other masters which themselves use Application.spark as their master page (Default.spark, SingleColumn.spark, Gallery.spark, ...)
If no master page is specified in a view file, then Application.spark is automatically chosen by the Spark View Engine. Since almost all my pages use "Default.spark" as master, is there a way to configure this globally?
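For reference, a minimal sketch of that kind of nesting in Spark markup; the file locations and placeholder markup are illustrative, not the actual files:

    <!-- Views\Layouts\Application.spark : base layout for the whole site -->
    <html>
      <body>
        <use content="view" />
      </body>
    </html>

    <!-- Views\Layouts\Default.spark : second-level master nested inside Application.spark -->
    <use master="Application" />
    <div class="default-layout">
      <use content="view" />
    </div>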
The other possibilities would be:

- Set the master in each spark file individually with <use master="Default" />. But that's really annoying.
- Rename my master files (Default.spark <-> Application.spark), but that really doesn't make any sense naming-wise.
3 Answers
There are two ways to solve your issue:
Less Work
With any ActionResult, you can simply add a second parameter to the return statement to specify the name of the master to use: the first parameter is the view name and the second signifies a master page override. But that does mean you need to repeat it everywhere, and it isn't much better than what you had before.
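For illustration, a minimal sketch using ASP.NET MVC's View(viewName, masterName) overload; the controller, action, and view names here are hypothetical:

    using System.Web.Mvc;

    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            // The second argument overrides the master page for this result only.
            return View("Index", "Default");
        }
    }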
More Work
The way that Spark locates the master page to use is via the following code in the framework:
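The code referred to lives in Spark.Web.Mvc's DefaultDescriptorBuilder. The sketch below is an approximation based on older Spark versions, so the exact method name, signature, and search paths may differ in yours:

    // Approximate shape of the framework method in Spark.Web.Mvc.DefaultDescriptorBuilder
    // (requires System.Collections.Generic and System.IO).
    public virtual IList<string> PotentialDefaultMasterLocations(string controllerName,
                                                                 IDictionary<string, object> extra)
    {
        return new[]
        {
            // Note the hardcoded Application.spark convention.
            string.Format("Layouts{0}Application.spark", Path.DirectorySeparatorChar),
            string.Format("Shared{0}Application.spark", Path.DirectorySeparatorChar)
        };
    }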
Notice the hardcoded Application.spark in there - that's a Spark convention. What it seems you want to do is override this method and put something like the following in instead.
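A sketch of such an override, under the same assumptions about the method name and signature; it simply lists Default.spark ahead of Application.spark:

    using System.Collections.Generic;
    using System.IO;
    using Spark.Web.Mvc;

    // Hypothetical descriptor builder that prefers Default.spark as the fallback master.
    public class MyDescriptorBuilder : DefaultDescriptorBuilder
    {
        public override IList<string> PotentialDefaultMasterLocations(string controllerName,
                                                                      IDictionary<string, object> extra)
        {
            return new[]
            {
                string.Format("Layouts{0}Default.spark", Path.DirectorySeparatorChar),
                string.Format("Shared{0}Default.spark", Path.DirectorySeparatorChar),
                // Keep Application.spark as a last resort, or drop these two lines entirely.
                string.Format("Layouts{0}Application.spark", Path.DirectorySeparatorChar),
                string.Format("Shared{0}Application.spark", Path.DirectorySeparatorChar)
            };
        }
    }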
Then it will find your Default.spark before it finds Application.spark, or you can get rid of Application.spark entirely if that convention doesn't ring true for you and you prefer your own. To wire this up, all you need to do is create a class that inherits from Spark.Web.Mvc.DefaultDescriptorBuilder and overrides the method mentioned above (as sketched), then use it when you register the view engine (see the sketch below). This means you can now direct where Spark should look for the master views and what their names will be.
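A sketch of the registration, assuming the service-container API of older Spark.Web.Mvc releases (CreateContainer, SetServiceBuilder, RegisterViewEngine); MyDescriptorBuilder is the hypothetical subclass above, and the exact calls may differ in your Spark version:

    // In Global.asax.cs, Application_Start (container calls assumed from older Spark.Web.Mvc).
    var settings = new SparkSettings();
    var container = SparkEngineStarter.CreateContainer(settings);

    // Swap in the descriptor builder that prefers Default.spark.
    container.SetServiceBuilder<IDescriptorBuilder>(c => new MyDescriptorBuilder());

    SparkEngineStarter.RegisterViewEngine(container);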
Hope that helps,
All the best,
RobertTheGrey
Don't know if it works, but try adding the master declaration (e.g. <use master="Default" />) in a file called Views\_global.spark. It's automatically included in all views. I usually add namespaces and global vars to my _global.spark. Example:
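A hypothetical Views\_global.spark along those lines; the namespaces and the global variable are illustrative, and the master declaration is the one suggested above:

    <!-- Views\_global.spark : automatically included in every view -->
    <use master="Default" />
    <use namespace="System.Linq" />
    <use namespace="System.Web.Mvc.Html" />
    <global type="string" siteName="'My Spark Site'" />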
Another cool feature
Since you are a new Spark user I would like to show you another cool Spark feature. Instead of writing an HTML helper expression inline in your views, you can write it as a clean custom element. Makes the views a bit cleaner, huh? The magic is done with something called bindings. All you need to do is create a Views\Bindings.xml file and add bindings to it (see the sketch below). Read more in this blog entry: http://blog.robertgreyling.com/2010/08/spark-bindings-are-you-tired-of-eating.html
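An illustrative before/after, assuming a binding named ActionLink has been declared in Views\Bindings.xml; the element and attribute names here are made up, and the linked post shows the actual binding syntax:

    <!-- Without a binding: call the HTML helper inline in the view -->
    ${Html.ActionLink("Home", "Index", "Home")}

    <!-- With a binding defined in Views\Bindings.xml, the same link reads as plain markup -->
    <ActionLink action="Index" controller="Home">Home</ActionLink>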
Instead of using Application.spark as your base, and Default.spark as your 2nd level...
rename Default.spark to Application.spark, and rename the original Application.spark to something else like BaseApplication.spark.
This way your Application.spark is picked up by default, but if you wish to override it you can reference BaseApplication.spark.
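A minimal sketch of what the renamed second-level master could look like after the swap; the wrapper markup is placeholder content:

    <!-- Views\Layouts\Application.spark (formerly Default.spark): picked up by default,
         and itself nested inside the renamed base layout -->
    <use master="BaseApplication" />
    <div class="default-layout">
      <use content="view" />
    </div>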
See the three pass rendering section for an example.
http://sparkviewengine.com/documentation/master-layouts#Threepassrendering