Hi all, we’re trying to integrate NetDrive into our workflow as a way to mirror and distribute files across multiple people working remotely.
We purchased a license for one of our remote workers and set up an FTP server on premises so this person can access the files needed to produce the work.
We selected the 50 GB cache option so the data would be cached locally for him, but every time he opens the same file, NetDrive loads it again from scratch and erases the cache folder contents (at least partially, as we see its size shrink substantially).
This would be fine for a small 1 MB file, but we’re talking about clusters of hundreds of files, in the range of 5–15 GB per scene, which makes the whole process completely ineffective.
Is there any way to configure NetDrive to cache to a fast drive using 500 GB of space, instead of the default maximum of 50 GB?
Is there something we’re missing?
We noticed the following messages in the log:
[2021/11/03 01:57:25.225] [ERROR ] [ 19700] [BDRIVE ] Failed to Request/Reply with error = 11, rc = -1 [C:\buildbot\slave-win\netdrive3_release\netdrive3…\common\ZClientSession.cpp:324]
[2021/11/03 01:57:26.229] [ERROR ] [ 19700] Failed to initialize ZCLIENT [C:\buildbot\slave-win\netdrive3_release\netdrive3…\common\ZClientSession.cpp:233]
Could these errors have anything to do with the caching behavior?
Kind regards, thanks for your input!