How to mask timesteps of a tensor prior to an event

Posted 2025-02-10 16:35:41


I have time-series data in the form of [batch_size, horizon, feature]. Events occur every so often, and I demarcate them in a separate "meta" tensor as a boolean flag. i.e., it's a tensor of the same shape filled with zeros except for when a given event occurs (in which case it's a 1).

I need to be able to prevent my model from looking at data prior to the event if an event has occurred within the horizon; so by default within the 2nd dimension, the mask should be all ones, and timesteps before a detected event should be all zeros.

Only the last event should be considered, and all prior timesteps should be 0 even if there were prior events.

One-dimensional examples (meta -> mask):

[0, 0, 1, 0] -> [0, 0, 1, 1]
[0, 0, 0, 1] -> [0, 0, 0, 1]
[1, 0, 1, 0] -> [0, 0, 1, 1]
[1, 0, 0, 0] -> [1, 1, 1, 1]
[0, 0, 0, 0] -> [1, 1, 1, 1]


Comments (1)

夜清冷一曲。 2025-02-17 16:35:41


Maybe something like this:

# copy, paste, acknowledge

import tensorflow as tf

the_example = tf.constant([[0, 0, 1, 0], 
                           [0, 0, 0, 1], 
                           [1, 0, 1, 0], 
                           [1, 0, 0, 0],
                           [0, 0, 0, 0]]) 

# rows with no event keep an all-ones mask by default
the_zero_mask = tf.reduce_all(the_example == 0, axis=-1)
x = tf.boolean_mask(the_example, ~the_zero_mask)
this_shape = tf.shape(x)

# full (row, column) index pairs for the rows that contain an event
something_special = tf.stack([tf.repeat(tf.where(~the_zero_mask), this_shape[-1]),
                              tf.cast(tf.tile(tf.range(this_shape[-1]), [this_shape[0]]), dtype=tf.int64)], axis=-1)
# column of the last event in each of those rows
tell_me_where = tf.where(x == 1)
here = tf.math.unsorted_segment_max(data=tell_me_where[:, 1], segment_ids=tell_me_where[:, 0], num_segments=this_shape[0])
# ones from the last event to the end of the horizon, left-padded with zeros
raggidy_ragged = tf.reverse(tf.ones_like(tf.ragged.range(here, tf.cast(this_shape[-1], tf.int64))).to_tensor(), axis=[-1])
raggidy_ragged = tf.pad(raggidy_ragged, [[0, 0], [this_shape[1] - tf.shape(raggidy_ragged)[1], 0]])
# scatter the per-row masks into an all-ones tensor
we_made_it = tf.tensor_scatter_nd_update(tf.ones_like(the_example, dtype=tf.int64), something_special, tf.reshape(raggidy_ragged, [-1]))
print(we_made_it)
tf.Tensor(
[[0 0 1 1]
 [0 0 0 1]
 [0 0 1 1]
 [1 1 1 1]
 [1 1 1 1]], shape=(5, 4), dtype=int64)
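An equivalent, much shorter route (my own sketch, not part of the original answer): a timestep should be kept exactly when no event occurs strictly after it, which an exclusive reverse cumulative sum checks in one call.

```python
import tensorflow as tf

meta = tf.constant([[0, 0, 1, 0],
                    [0, 0, 0, 1],
                    [1, 0, 1, 0],
                    [1, 0, 0, 0],
                    [0, 0, 0, 0]])

# cumsum(..., exclusive=True, reverse=True)[t] counts events strictly after t;
# the mask is 1 wherever that count is zero (including all-zero rows).
mask = tf.cast(tf.cumsum(meta, axis=-1, exclusive=True, reverse=True) == 0,
               meta.dtype)
print(mask)
```

This also generalizes to a `[batch_size, horizon, feature]` meta tensor by pointing `axis` at the horizon dimension.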