Deserialization error - failed to fill whole buffer
I'm writing a simple Rust app using the Rocket web framework, a Postgres database, and Diesel to manage database migrations. The code compiles OK, and other parts of the application run properly, but for some reason my Expense endpoints don't seem to be working.
When hitting the /expense endpoint, for example to get all the Expenses, I get the following error in the log:
Err(
    DeserializationError(
        Error {
            kind: UnexpectedEof,
            message: "failed to fill whole buffer"
        }
    )
)
This error is obviously not very helpful, and doesn't have a lot of detail. Why am I receiving this error, and how can I resolve this problem?
Here are the relevant parts of the code:
Expense Migration
CREATE TABLE expenses (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL UNIQUE,
    description TEXT NULL,
    amount DECIMAL NOT NULL,
    tax_year INT NOT NULL,
    purchase_date TIMESTAMP WITH TIME ZONE NULL,
    created_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW()
);
Expense Model
#[derive(Debug, Serialize, AsChangeset, Deserialize, Queryable, Insertable)]
#[table_name = "expenses"]
pub struct Expense {
    pub id: Option<i32>,
    pub name: String,
    pub description: Option<String>,
    pub amount: BigDecimal,
    pub tax_year: i32,
    pub purchase_date: Option<DateTime<Utc>>,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
}

impl Expense {
    pub fn get_all(conn: &PgConnection) -> Result<Vec<Expense>, Error> {
        expenses::table.order(expenses::id.desc()).load::<Expense>(conn)
    }

    ...
}
Controller
#[get("/", format = "json")]
pub fn get_all(conn: db::Connection) -> Result<ApiResponse, ApiError> {
let result = Expense::get_all(&conn);
match result {
Ok(r) => Ok(success(json!(r))),
Err(e) => Err(db_error(e)),
}
}
Schema
table! {
    expenses (id) {
        id -> Nullable<Int4>,
        name -> Text,
        description -> Nullable<Text>,
        amount -> Numeric,
        tax_year -> Int4,
        purchase_date -> Nullable<Timestamptz>,
        created_at -> Timestamptz,
        updated_at -> Timestamptz,
    }
}
1 Answer
In the database migration, you are not specifying a precision for the amount column (DECIMAL), which Diesel expects. Try adding a precision to the amount column as shown below, and reapplying the migration.
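The snippet the answer refers to is not included above, so here is a minimal sketch of the suggested change; the precision and scale in DECIMAL(12, 2) are assumed values for illustration and should be chosen to fit your data:

-- Sketch of the revised migration; DECIMAL(12, 2) is an assumed
-- precision/scale, adjust it to whatever your amounts actually need.
CREATE TABLE expenses (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL UNIQUE,
    description TEXT NULL,
    amount DECIMAL(12, 2) NOT NULL,
    tax_year INT NOT NULL,
    purchase_date TIMESTAMP WITH TIME ZONE NULL,
    created_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW()
);

After editing the migration, you can reapply it with the Diesel CLI, for example with diesel migration redo, which reverts and re-runs the latest migration so the expenses table picks up the new column definition.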