Unit testing by mocking an S3 bucket
I am new to unit testing and I need to write some simple unit tests for an object storage class.
I have a class named OSBucket as follows:
def __initBucket(self):
    ecs_session = boto3.Session(
        aws_access_key_id="OSKEY",
        aws_secret_access_key="SECRETKEY"
    )
    OS_resource = ecs_session.resource('s3', verify=cert, endpoint_url=endpoint)
    self.mybucket = OS_resource.Bucket(OS_BUCKET)

def get_mybucket(self):
    return self.mybucket

def download_file(self, fileName, filepath):
    self.mybucket.download_file(fileName, filepath)

def upload_file(self, filepath, dest_file_name):
    self.mybucket.upload_file(filepath, '%s%s' % ("/", dest_file_name))
The method __initBucket is called in the constructor of the class.
How could I start creating a unit test class to test, for example, the download_file method?
UPDATE 1
import os

import boto3
import moto

import OSBucket

moto_fake = moto.mock_s3()  # assumed setup; not shown in the original snippet
moto_fake.start()
conn = boto3.resource('s3', aws_access_key_id="fake_id",
                      aws_secret_access_key="fake_secret")
conn.create_bucket(Bucket="OS_BUCKET")
os_bucket = OSBucket.OSBucket(thisRun)
sourcefile = "testingMoto.txt"
filePath = os.path.join("/", sourcefile)
os_bucket.upload_file(filePath, sourcefile)
This snippet executes moto_fake.start() before the os_bucket object is created, but it does not work for me.
UPDATE 2
Using patch.object to change the endpoint variable to None makes the test pass.
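For reference, the patch.object mechanism mentioned above can be sketched like this. The real target would be the endpoint variable in the OSBucket module; the namespace object here is a hypothetical stand-in so the example is self-contained:

```python
import types
from unittest import mock

# Hypothetical stand-in for the OSBucket module, which defines a
# module-level `endpoint` variable used by __initBucket.
fake_module = types.SimpleNamespace(endpoint="https://real-object-store.example")

with mock.patch.object(fake_module, "endpoint", None):
    # Inside the patch, code reading the variable sees None, so boto3
    # would fall back to its default (moto-interceptable) endpoint.
    assert fake_module.endpoint is None

# Outside the with-block the original value is restored automatically.
assert fake_module.endpoint == "https://real-object-store.example"
```

In the real test, the target of patch.object would be the imported OSBucket module itself rather than a SimpleNamespace.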
1 Answer
First Approach: using Python mocks
You can mock the S3 bucket using standard Python mocks (unittest.mock) and then check that you are calling the methods with the arguments you expect.
However, this approach won't actually guarantee that your implementation is correct, since you won't be connecting to S3. For example, you can call non-existent boto functions if you misspell their names: the mock won't raise any exception.
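A minimal sketch of this approach, using a cut-down stand-in for the class (the no-argument constructor and the RuntimeError body are assumptions made for brevity):

```python
from unittest import mock


class OSBucket:
    # Minimal stand-in for the asker's class: the real __initBucket
    # builds a boto3 bucket, which the test below never lets run.
    def __init__(self):
        self.__initBucket()

    def __initBucket(self):
        raise RuntimeError("would connect to the object store")

    def download_file(self, fileName, filepath):
        self.mybucket.download_file(fileName, filepath)


def test_download_file():
    # __initBucket is private, so patch it via its name-mangled form.
    with mock.patch.object(OSBucket, "_OSBucket__initBucket"):
        os_bucket = OSBucket()

    # Replace the bucket with a mock, then verify the delegated call.
    os_bucket.mybucket = mock.MagicMock()
    os_bucket.download_file("myfile.txt", "/tmp/myfile.txt")
    os_bucket.mybucket.download_file.assert_called_once_with(
        "myfile.txt", "/tmp/myfile.txt"
    )


test_download_file()
```

Note that the mock bucket happily records any call, correct or not, which is exactly the weakness described above.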
Second Approach: using moto
Moto is the recommended way to test boto code. It simulates AWS on your local machine, creating buckets locally so you can write complete end-to-end tests.
I used moto to write a couple of tests for your class (and I may have found a bug in your implementation: check the last test line; those are the arguments needed to make it find the file rather than raise an exception).
Final notes: