How to use tf.keras.layers.Normalization with more than one feature in a model


This reproducible example creates a basic regression model predicting MPG given Horsepower (I hope it is OK to just provide the link). As far as I understand, this bakes the transformation of the Horsepower feature into the model itself, also referred to as doing the preprocessing "inside the model". This is appealing because the model then performs the necessary transformation of raw data during scoring/inference, e.g. after deployment (please correct me if I misunderstood). I am wondering how this could be implemented when one has more than one independent variable. This is taken from the reproducible code quoted above:

horsepower_normalizer = tf.keras.layers.Normalization(input_shape=[1, ], axis=None)
horsepower_normalizer.adapt(horsepower)

horsepower_model = Sequential([
    horsepower_normalizer,
    layers.Dense(units=1)
])

So let us say we have a list of numeric features X, Y, Z. Could the model definition code be produced based on this (e.g. via the functional API)? Any pointers would be very much welcome. Thanks!
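
For concreteness, this is the sort of thing I imagine: an untested sketch where x, y and z merely stand in for the real feature columns:

import tensorflow as tf

# Untested sketch: x, y and z are placeholder feature columns, not real data
x = tf.random.uniform((100, 1))
y = tf.random.uniform((100, 1))
z = tf.random.uniform((100, 1))

# One Normalization layer per feature, each adapted on its own column,
# then the normalized features are concatenated inside the model
inputs, normalized = [], []
for feature in (x, y, z):
    inp = tf.keras.Input(shape=(1,))
    norm = tf.keras.layers.Normalization(axis=None)
    norm.adapt(feature)
    inputs.append(inp)
    normalized.append(norm(inp))

concatenated = tf.keras.layers.Concatenate()(normalized)
output = tf.keras.layers.Dense(units=1)(concatenated)
model = tf.keras.Model(inputs=inputs, outputs=output)

# Training would then take the features as a list: model.fit([x, y, z], targets)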

PS:

I am currently trying to learn Keras + TF and ideally I want the normalisation to be part of the model/training. I use very rudimentary code (to be improved!) along these lines:

import pandas as pd
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import Adam

train_data = pd.read_csv('train.csv')
val_data = pd.read_csv('val.csv')

target_name = 'ze_target'

y_train = train_data[target_name]
X_train = train_data.drop(target_name, axis=1)

y_val = val_data[target_name]
X_val = val_data.drop(target_name, axis=1)

def create_model():
    model = Sequential()
    # input_dim is only needed on the first layer
    model.add(Dense(20, input_dim=X_train.shape[1], activation='relu'))
    model.add(Dense(20, activation='relu'))
    model.add(Dense(20, activation='relu'))
    model.add(Dense(1))
    # Compile model
    model.compile(optimizer=Adam(learning_rate=0.0001), loss='mse')
    return model

model = create_model()
model.summary()

model.fit(X_train, y_train, validation_data=(X_val,y_val), batch_size=128, epochs=30)
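
Roughly, what I am hoping to end up with is something like this (an untested sketch reusing X_train, y_train, X_val and y_val from above; the Normalization layer would be adapted on the training features so the scaling lives inside the model):

import tensorflow as tf

# Rough sketch: adapt a Normalization layer on the training features so the
# per-feature mean/variance is learned once and carried inside the model
normalizer = tf.keras.layers.Normalization(axis=-1)
normalizer.adapt(X_train.to_numpy(dtype='float32'))

norm_model = tf.keras.Sequential([
    tf.keras.Input(shape=(X_train.shape[1],)),
    normalizer,
    tf.keras.layers.Dense(20, activation='relu'),
    tf.keras.layers.Dense(20, activation='relu'),
    tf.keras.layers.Dense(20, activation='relu'),
    tf.keras.layers.Dense(1),
])
norm_model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.0001), loss='mse')
norm_model.fit(X_train, y_train, validation_data=(X_val, y_val), batch_size=128, epochs=30)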


菊凝晚露 2025-02-17 07:22:49


You can use tf.concat to concatenate the three features on axis=1 and then use tf.keras.layers.Normalization on the result, as below. Because we want to normalize over three features, make sure to set input_shape=(3,) and axis=-1.

import tensorflow as tf

# Three toy features, 100 samples each
x = tf.random.uniform((100, 1))
y = tf.random.uniform((100, 1))
z = tf.random.uniform((100, 1))

# Concatenate column-wise into a single (100, 3) tensor
xyz = tf.concat([x, y, z], 1)

# axis=-1 keeps a separate mean and variance per feature column
horsepower_normalizer = tf.keras.layers.Normalization(input_shape=(3,), axis=-1)
horsepower_normalizer.adapt(xyz)

horsepower_model = tf.keras.models.Sequential([
    horsepower_normalizer,
    tf.keras.layers.Dense(units=1)
])

horsepower_model(xyz)

Output:

<tf.Tensor: shape=(100, 1), dtype=float32, numpy=
array([[-0.17135675],
       [-0.48248804],
       [-2.2847023 ],
       [-0.05702276],
       [ 2.9332483 ],
       [ 0.64826846],
       [-2.1490448 ],
       [-1.1697797 ],
       [-0.01030668],
            ...
       [-1.880199  ],
       [ 1.2854142 ],
       [-0.5471661 ]], dtype=float32)>
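
To actually train this, one could compile the model and fit it on the concatenated features, for instance against a made-up random target t (purely illustrative):

# Purely illustrative: a random target for the 100 samples
t = tf.random.uniform((100, 1))

horsepower_model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001), loss='mse')
horsepower_model.fit(xyz, t, epochs=5, batch_size=32)

# Raw feature values can be passed at inference time, because the adapted
# Normalization layer rescales them inside the model
predictions = horsepower_model.predict(xyz[:5])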
