How to filter a DataFrame's records by their size (in bytes)
I have a DataFrame in PySpark and need to filter its records based on their size (in bytes).
We are applying the logic below, but it doesn't seem to work in all cases.
import json

from pyspark.sql.functions import lit, struct, to_json, udf
from pyspark.sql.types import IntegerType

purge_timestamp = getPurgeTimestamp()
updatedFrame = df.withColumn('ttlExpDate', lit(purge_timestamp).cast("long"))
...
columns = updatedFrame.columns
# Serialize all columns of each row to a JSON string and measure it with the UDF
dfFinal = updatedFrame.withColumn('size', getSize(to_json(struct([updatedFrame[x] for x in columns]))))
dfFinal = dfFinal.withColumn('s3Key', lit(""))
...
dfFinal = updatedFrame.withColumn('size', getSize(to_json(struct([updatedFrame[x] for x in columns]))))
...
@udf(returnType=IntegerType())
def getSize(value):
    # value arrives as the JSON string produced by to_json; dumps re-encodes it
    value = json.dumps(value)
    # UTF-8 byte length, integer-divided by 1024 (i.e. whole KiB, not bytes)
    return len(value.encode('utf-8', 'ignore')) // 1024
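For context, here is what the size measurement is meant to do, as a minimal plain-Python sketch with no Spark involved (the `json_byte_size` helper and the sample row are hypothetical, not from my job):

```python
import json

def json_byte_size(row: dict) -> int:
    """Serialize a row (as a dict) to JSON and return its UTF-8 size in bytes."""
    serialized = json.dumps(row, ensure_ascii=False)
    return len(serialized.encode("utf-8"))

# Example row: note that non-ASCII characters take more than one byte in UTF-8,
# so character count and byte count can differ.
row = {"id": 1, "name": "café", "ttlExpDate": 1700000000}
size_bytes = json_byte_size(row)
```

In the Spark job the equivalent input is the string returned by `to_json(struct(...))` for each row, and the filter would compare `size_bytes` against a threshold.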
I do not have logs to show since we are processing millions of records and could not print the outcome of each step.
Appreciate the help :)
Thanks!